1. When wood is heated with a limited supply of air to a temperature not less than 280°C, the resulting fuel is





Similar Questions and Answers
QA->A is taller than B; B is taller than C; D is taller than E and E is taller than B. Who is the shortest?....
QA->The Government of India acquired the ownership and control of major banks in 1969 whose deposits were not less than:....
QA->According to PFA rule buffalo milk should contain not less than a :....
QA->According to PFA table butter should contain not less than -----% fat :....
QA->Under Payment of Bonus Act, an employee is eligible to get bonus if he has worked for not less than ……days in the preceding year.....
MCQ->When wood is heated with a limited supply of air to a temperature not less than 280°C, the resulting fuel is....
MCQ-> The broad scientific understanding today is that our planet is experiencing a warming trend over and above natural and normal variations that is almost certainly due to human activities associated with large-scale manufacturing. The process began in the late 1700s with the Industrial Revolution, when manual labor, horsepower, and water power began to be replaced by or enhanced by machines. This revolution, over time, shifted Britain, Europe, and eventually North America from largely agricultural and trading societies to manufacturing ones, relying on machinery and engines rather than tools and animals.

The Industrial Revolution was at heart a revolution in the use of energy and power. Its beginning is usually dated to the advent of the steam engine, which was based on the conversion of chemical energy in wood or coal to thermal energy and then to mechanical work, primarily the powering of industrial machinery and steam locomotives. Coal eventually supplanted wood because, pound for pound, coal contains twice as much energy as wood (measured in BTUs, or British thermal units, per pound) and because its use helped to save what was left of the world's temperate forests. Coal was used to produce heat that went directly into industrial processes, including metallurgy, and to warm buildings, as well as to power steam engines. When crude oil came along in the mid-1800s, still a couple of decades before electricity, it was burned, in the form of kerosene, in lamps to make light, replacing whale oil. It was also used to provide heat for buildings and in manufacturing processes, and as a fuel for engines used in industry and propulsion.

In short, one can say that the main forms in which humans need and use energy are for light, heat, mechanical work and motive power, and electricity, which can be used to provide any of the other three, as well as to do things that none of those three can do, such as electronic communications and information processing.
Since the Industrial Revolution, all these energy functions have been powered primarily, but not exclusively, by fossil fuels that emit carbon dioxide (CO2). To put it another way, the Industrial Revolution gave a whole new prominence to what Rochelle Lefkowitz, president of Pro-Media Communications and an energy buff, calls "fuels from hell": coal, oil, and natural gas. All these fuels from hell come from underground, are exhaustible, and emit CO2 and other pollutants when they are burned for transportation, heating, and industrial use. These fuels are in contrast to what Lefkowitz calls "fuels from heaven": wind, hydroelectric, tidal, biomass, and solar power. These all come from above ground, are endlessly renewable, and produce no harmful emissions.

Meanwhile, industrialization promoted urbanization, and urbanization eventually gave birth to suburbanization. This trend, which was repeated across America, nurtured the development of the American car culture, the building of a national highway system, and a mushrooming of suburbs around American cities, which rewove the fabric of American life. Many other developed and developing countries followed the American model, with all its upsides and downsides. The result is that today we have suburbs and ribbons of highways that run in, out, and around not only America's major cities, but China's, India's, and South America's as well. And as these urban areas attract more people, the sprawl extends in every direction.

All the coal, oil, and natural gas inputs for this new economic model seemed relatively cheap, relatively inexhaustible, and relatively harmless, or at least relatively easy to clean up afterward. So there wasn't much to stop the juggernaut of more people and more development and more concrete and more buildings and more cars and more coal, oil, and gas needed to build and power them.
Summing it all up, Andy Karsner, the Department of Energy's assistant secretary for energy efficiency and renewable energy, once said to me: "We built a really inefficient environment with the greatest efficiency ever known to man."

Beginning in the second half of the twentieth century, a scientific understanding began to emerge that an excessive accumulation of largely invisible pollutants, called greenhouse gases, was affecting the climate. The buildup of these greenhouse gases had been under way since the start of the Industrial Revolution in a place we could not see and in a form we could not touch or smell. These greenhouse gases, primarily carbon dioxide emitted from human industrial, residential, and transportation sources, were not piling up along roadsides or in rivers, in cans or empty bottles, but, rather, above our heads, in the earth's atmosphere. If the earth's atmosphere was like a blanket that helped to regulate the planet's temperature, the CO2 buildup was having the effect of thickening that blanket and making the globe warmer.

Those bags of CO2 from our cars float up and stay in the atmosphere, along with bags of CO2 from power plants burning coal, oil, and gas, and bags of CO2 released from the burning and clearing of forests, which releases all the carbon stored in trees, plants, and soil. In fact, many people don't realize that deforestation in places like Indonesia and Brazil is responsible for more CO2 than all the world's cars, trucks, planes, ships, and trains combined; that is, about 20 percent of all global emissions. And when we're not tossing bags of carbon dioxide into the atmosphere, we're throwing up other greenhouse gases, like methane (CH4) released from rice farming, petroleum drilling, coal mining, animal defecation, solid waste landfill sites, and yes, even from cattle belching. Cattle belching? That's right: the striking thing about greenhouse gases is the diversity of sources that emit them.
A herd of cattle belching can be worse than a highway full of Hummers. Livestock gas is very high in methane, which, like CO2, is colorless and odorless. And like CO2, methane is one of those greenhouse gases that, once released into the atmosphere, also absorb heat radiating from the earth's surface. "Molecule for molecule, methane's heat-trapping power in the atmosphere is twenty-one times stronger than carbon dioxide, the most abundant greenhouse gas," reported Science World (January 21, 2002). "With 1.3 billion cows belching almost constantly around the world (100 million in the United States alone), it's no surprise that methane released by livestock is one of the chief global sources of the gas, according to the U.S. Environmental Protection Agency ... 'It's part of their normal digestion process,' says Tom Wirth of the EPA. 'When they chew their cud, they regurgitate [spit up] some food to rechew it, and all this gas comes out.' The average cow expels 600 liters of methane a day, climate researchers report."

What is the precise scientific relationship between these expanded greenhouse gas emissions and global warming? Experts at the Pew Center on Climate Change offer a handy summary in their report "Climate Change 101." Global average temperatures, notes the Pew study, "have experienced natural shifts throughout human history. For example, the climate of the Northern Hemisphere varied from a relatively warm period between the eleventh and fifteenth centuries to a period of cooler temperatures between the seventeenth century and the middle of the nineteenth century. However, scientists studying the rapid rise in global temperatures during the late twentieth century say that natural variability cannot account for what is happening now."
The new factor is the human factor: our vastly increased emissions of carbon dioxide and other greenhouse gases from the burning of fossil fuels such as coal and oil, as well as from deforestation, large-scale cattle-grazing, agriculture, and industrialization. "Scientists refer to what has been happening in the earth's atmosphere over the past century as the 'enhanced greenhouse effect,'" notes the Pew study. By pumping man-made greenhouse gases into the atmosphere, humans are altering the process by which naturally occurring greenhouse gases, because of their unique molecular structure, trap the sun's heat near the earth's surface before that heat radiates back into space.

"The greenhouse effect keeps the earth warm and habitable; without it, the earth's surface would be about 60 degrees Fahrenheit colder on average. Since the average temperature of the earth is about 45 degrees Fahrenheit, the natural greenhouse effect is clearly a good thing. But the enhanced greenhouse effect means even more of the sun's heat is trapped, causing global temperatures to rise. Among the many scientific studies providing clear evidence that an enhanced greenhouse effect is under way was a 2005 report from NASA's Goddard Institute for Space Studies. Using satellites, data from buoys, and computer models to study the earth's oceans, scientists concluded that more energy is being absorbed from the sun than is emitted back to space, throwing the earth's energy out of balance and warming the globe."

Which of the following statements is correct?
(I) Greenhouse gases are responsible for global warming. They should be eliminated to save the planet.
(II) CO2 is the most dangerous of the greenhouse gases. Reduction in the release of CO2 would surely bring down the temperature.
(III) The greenhouse effect could be traced back to the Industrial Revolution, but the current development and the patterns of life have enhanced their emissions.
(IV) Deforestation has been one of the biggest factors contributing to the emission of greenhouse gases.
Choose the correct option:....
MCQ-> The second plan we have to examine is that of giving to each person what she deserves. Many people, especially those who are comfortably off, think this is what happens at present: that the industrious and sober and thrifty are never in want, and that poverty is due to idleness, improvidence, drinking, betting, dishonesty, and bad character generally. They can point to the fact that a labourer whose character is bad finds it more difficult to get employment than one whose character is good; that a farmer or country gentleman who gambles and bets heavily, and mortgages his land to live wastefully and extravagantly, is soon reduced to poverty; and that a man of business who is lazy and does not attend to it becomes bankrupt. But this proves nothing more than that you cannot eat your cake and have it too; it does not prove that your share of the cake was a fair one. It shows that certain vices make us rich. People who are hard, grasping, selfish, cruel, and always ready to take advantage of their neighbours, become very rich if they are clever enough not to overreach themselves. On the other hand, people who are generous, public spirited, friendly, and not always thinking of the main chance, stay poor when they are born poor unless they have extraordinary talents. Also, as things are today, some are born poor and others are born with silver spoons in their mouths: that is to say, they are divided into rich and poor before they are old enough to have any character at all. The notion that our present system distributes wealth according to merit, even roughly, may be dismissed at once as ridiculous.
Everyone can see that it generally has the contrary effect; it makes a few idle people very rich, and a great many hardworking people very poor.

On this, intelligent Lady, your first thought may be that if wealth is not distributed according to merit, it ought to be; and that we should at once set to work to alter our laws so that in future the good people shall be rich in proportion to their goodness and the bad people poor in proportion to their badness. There are several objections to this; but the very first one settles the question for good and all. It is, that the proposal is impossible and impractical. How are you going to measure anyone's merit in money? Choose any pair of human beings you like, male or female, and see whether you can decide how much each of them should have on her or his merits. If you live in the country, take the village blacksmith and the village clergyman, or the village washerwoman and the village schoolmistress, to begin with. At present, the clergyman often gets less pay than the blacksmith; it is only in some villages he gets more. But never mind what they get at present: you are trying whether you can set up a new order of things in which each will get what he deserves. You need not fix a sum of money for them: all you have to do is to settle the proportion between them. Is the blacksmith to have as much as the clergyman? Or twice as much as the clergyman? Or half as much as the clergyman? Or how much more or less? It is no use saying that one ought to have more and the other less; you must be prepared to say exactly how much more or less in calculable proportion.

Well, think it out. The clergyman has had a college education; but that is not any merit on his part: he owes it to his father; so you cannot allow him anything for that. But through it he is able to read the New Testament in Greek; so that he can do something the blacksmith cannot do. On the other hand, the blacksmith can make a horse-shoe, which the parson cannot.
How many verses of the Greek Testament are worth one horse-shoe? You have only to ask the silly question to see that nobody can answer it.

Since measuring their merits is no use, why not try to measure their faults? Suppose the blacksmith swears a good deal, and gets drunk occasionally! Everybody in the village knows this; but the parson has to keep his faults to himself. His wife knows them; but she will not tell you what they are if she knows that you intend to cut off some of his pay for them. You know that as he is only a mortal human being, he must have some faults; but you cannot find them out. However, suppose he has some faults: he is a snob; he cares more for sport and fashionable society than for religion! Does that make him as bad as the blacksmith, or twice as bad, or twice and a quarter as bad, or only half as bad? In other words, if the blacksmith is to have a shilling, is the parson to have six pence, or five pence and one-third, or two shillings? Clearly these are fools' questions: the moment they bring us down from moral generalities to business particulars it becomes plain to every sensible person that no relation can be established between human qualities, good or bad, and sums of money, large or small.

It may seem scandalous that a prize-fighter, for hitting another prize-fighter so hard at Wembley that he fell down and could not rise within ten seconds, received the same sum that was paid to the Archbishop of Canterbury for acting as Primate of the Church of England for nine months; but none of those who cry out against the scandal can express any better in money the difference between the two. Not one of the persons who think that the prize-fighter should get less than the Archbishop can say how much less.
What the prize-fighter got for his six or seven months' boxing would pay a judge's salary for two years; and we all agree that nothing could be more ridiculous, and that any system of distributing wealth which leads to such absurdities must be wrong. But to suppose that it could be changed by any possible calculation that an ounce of archbishop or three ounces of judge is worth a pound of prize-fighter would be sillier still. You can find out how many candles are worth a pound of butter in the market on any particular day; but when you try to estimate the worth of human souls the utmost you can say is that they are all of equal value before the throne of God. And that will not help you in the least to settle how much money they should have. You must simply give it up, and admit that distributing money according to merit is beyond mortal measurement and judgement.

Which of the following is not a vice attributed to the poor by the rich?
 ....
MCQ-> The passage below is accompanied by a set of six questions. Choose the best answer to each question.

During the frigid season... it's often necessary to nestle under a blanket to try to stay warm. The temperature difference between the blanket and the air outside is so palpable that we often have trouble leaving our warm refuge. Many plants and animals similarly hunker down, relying on snow cover for safety from winter's harsh conditions. The small area between the snowpack and the ground, called the subnivium... might be the most important ecosystem that you have never heard of.

The subnivium is so well-insulated and stable that its temperature holds steady at around 32 degrees Fahrenheit (0 degrees Celsius). Although that might still sound cold, a constant temperature of 32 degrees Fahrenheit can often be 30 to 40 degrees warmer than the air temperature during the peak of winter. Because of this large temperature difference, a wide variety of species... depend on the subnivium for winter protection.

For many organisms living in temperate and Arctic regions, the difference between being under the snow or outside it is a matter of life and death. Consequently, disruptions to the subnivium brought about by climate change will affect everything from population dynamics to nutrient cycling through the ecosystem.

The formation and stability of the subnivium requires more than a few flurries. Winter ecologists have suggested that eight inches of snow is necessary to develop a stable layer of insulation. Depth is not the only factor, however. More accurately, the stability of the subnivium depends on the interaction between snow depth and snow density. Imagine being under a stack of blankets that are all flattened and pressed together. When compressed, the blankets essentially form one compacted layer. In contrast, when they are lightly placed on top of one another, their insulative capacity increases because the air pockets between them trap heat.
Greater depths of low-density snow are therefore better at insulating the ground. Both depth and density of snow are sensitive to temperature. Scientists are now beginning to explore how climate change will affect the subnivium, as well as the species that depend on it. At first glance, warmer winters seem beneficial for species that have difficulty surviving subzero temperatures; however, as with most ecological phenomena, the consequences are not so straightforward. Research has shown that the snow season (the period when snow is more likely than rain) has become shorter since 1970. When rain falls on snow, it increases the density of the snow and reduces its insulative capacity. Therefore, even though winters are expected to become warmer overall from future climate change, the subnivium will tend to become colder and more variable, with less protection from the above-ground temperatures.

The effects of a colder subnivium are complex... For example, shrubs such as crowberry and alpine azalea that grow along the forest floor tend to block the wind and so retain higher depths of snow around them. This captured snow helps to keep soils insulated and in turn increases plant decomposition and nutrient release. In field experiments, researchers removed a portion of the snow cover to investigate the importance of the subnivium's insulation. They found that soil frost in the snow-free area resulted in damage to plant roots and sometimes even the death of the plant.

The purpose of this passage is to
 ....
MCQ-> Read the passage carefully and answer the questions given at the end of each passage:

Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors. In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies.

Traditionally, a long chain of partners was involved in getting a product to the customer. Let's say you have a factory building a PC we'll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, "I've got model #4000. Come and buy it." If the consumer says, "But I want model #8000," the dealer replies, "Sorry, I only have model #4000." Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can't sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling price). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older inventory. This dangerous and inefficient practice is called "channel stuffing".
Worst of all, the customer ends up paying for it by purchasing systems that are already out of date. Because we were building directly to fill our customers' orders, we didn't have finished goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors.

The direct model turns conventional manufacturing inside out. Conventional manufacturing means building to stock, because your plant can't keep going otherwise. But if you don't know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information. The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information, that is, you know exactly what people want and how much, you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let's say that Dell has six days of inventory. Compare that to an indirect competitor who has twenty-five days of inventory with another thirty in their distribution channel. That's a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent.
Then there's the threat of getting stuck with obsolete inventory if you're caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don't have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins.

Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don't need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system. We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results.
We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like "flying low to the ground at 300 knots." He was worried that we wouldn't see the trees.

In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $33 million. We're now down to six days of inventory and we're starting to measure it in hours instead of days. Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O, short for "excess and obsolete", became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you're down in the cents range, you're approaching stellar performance.

Find out the TRUE statement:
 ....
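The inventory arithmetic quoted in the Dell passage can be sanity-checked with a short sketch. This is purely illustrative: the 6-day and 25 + 30 day inventory figures, the roughly 6 percent cost decline over 49 days, and the 1993/1997 sales and inventory numbers all come from the passage, while the implied daily decline rate and days-of-inventory values are derived here, not stated in the text.

```python
# Sanity-check of the inventory arithmetic in the Dell passage.
dell_days = 6
competitor_days = 25 + 30          # own inventory plus distribution channel
gap_days = competitor_days - dell_days
assert gap_days == 49              # matches the "forty-nine days" in the text

decline_over_gap = 0.06            # ~6% component-cost decline over 49 days (from the text)
# Implied compound daily decline rate (derived, not quoted in the passage)
daily_rate = 1 - (1 - decline_over_gap) ** (1 / gap_days)

# Days of inventory implied by the 1993 and 1997 figures (inventory / annual sales * 365)
days_1993 = 220 / 2900 * 365       # $220M inventory on $2.9B sales
days_1997 = 33 / 12300 * 365       # $33M inventory on $12.3B sales

print(f"implied daily cost decline: {daily_rate:.4%}")
print(f"days of inventory, 1993: {days_1993:.1f}")
print(f"days of inventory, 1997: {days_1997:.1f}")
```

The 1997 figure works out to roughly one day of inventory, which is consistent with the passage's remark about starting to measure inventory in hours rather than days.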
Terms and Service: We do not guarantee the accuracy of the available data. We provide information based on public data. Please consult an expert before using this data for commercial or personal use.
© 2002-2017 Omega Education PVT LTD. Privacy | Terms and Conditions