1. In which of the following non-conventional machining processes is material removed by an abrasive slurry between the tool and the workpiece?





Similar Questions and Answers
QA->Anil can do a work in 12 days. Basheer can do it in 15 days. Chandran can do the same work in 20 days. If they all work together, the number of days needed to complete the work is:....
QA->A and B can do a work in 10 days, B and C can do it in 12 days, C and A can do it in 15 days. If A, B, and C work together, they will complete the work in :....
QA->A and B can do a work in 12 days. B and C together can do it in 15 days and C and A together in 20 days. If A, B, C work together, they will complete the work in ?....
QA->A non-conventional source of power is....
MCQ-> The story begins as the European pioneers crossed the Alleghenies and started to settle in the Midwest. The land they found was covered with forests. With incredible effort they felled the trees, pulled the stumps and planted their crops in the rich, loamy soil. When they finally reached the western edge of the place we now call Indiana, the forest stopped and ahead lay a thousand miles of the great grass prairie. The Europeans were puzzled by this new environment. Some even called it the "Great Desert". It seemed untillable. The earth was often very wet and it was covered with centuries of tangled and matted grasses. With their cast iron plows, the settlers found that the prairie sod could not be cut and the wet earth stuck to their plowshares. Even a team of the best oxen bogged down after a few yards of tugging. The iron plow was a useless tool to farm the prairie soil. The pioneers were stymied for nearly two decades. Their western march was halted and they filled in the eastern regions of the Midwest. In 1837, a blacksmith in the town of Grand Detour, Illinois, invented a new tool. His name was John Deere and the tool was a plow made of steel. It was sharp enough to cut through matted grasses and smooth enough to cast off the mud. It was a simple tool, the "sod buster", that opened the great prairies to agricultural development. Sauk County, Wisconsin is the part of that prairie where I have a home. It is named after the Sauk Indians. In 1673 Father Marquette was the first European to lay his eyes upon their land. He found a village laid out in regular patterns on a plain beside the Wisconsin River. He called the place Prairie du Sac. The village was surrounded by fields that had provided maize, beans and squash for the Sauk people for generations reaching back into the unrecorded time. When the European settlers arrived at the Sauk prairie in 1837, the government forced the native Sauk people west of the Mississippi River.
The settlers came with John Deere's new invention and used the tool to open the area to a new kind of agriculture. They ignored the traditional ways of the Sauk Indians and used their sod-busting tool for planting wheat. Initially, the soil was generous and the farming thrived. However, each year the soil lost more of its nurturing power. It was only thirty years after the Europeans arrived with their new technology that the land was depleted. Wheat farming became uneconomic and tens of thousands of farmers left Wisconsin seeking new land with sod to bust. It took the Europeans and their new technology just one generation to make their homeland into a desert. The Sauk Indians who knew how to sustain themselves on the Sauk prairie land were banished to another kind of desert called a reservation. And they even forgot about the techniques and tools that had sustained them on the prairie for generations unrecorded. And that is how it was that three deserts were created: Wisconsin, the reservation and the memories of a people. A century later, the land of the Sauks is now populated by the children of a second wave of European farmers who learned to replenish the soil through the regenerative powers of dairying, ground cover crops and animal manures. These third and fourth generation farmers and townspeople do not realise, however, that a new settler is coming soon with an invention as powerful as John Deere's plow. The new technology is called 'bereavement counselling'. It is a tool forged at the great state university, an innovative technique to meet the needs of those experiencing the death of a loved one, a tool that can "process" the grief of the people who now live on the Prairie of the Sauk.
As one can imagine the final days of the village of the Sauk Indians before the arrival of the settlers with John Deere's plow, one can also imagine these final days before the arrival of the first bereavement counsellor at Prairie du Sac. In these final days, the farmers and the townspeople mourn the death of a mother, brother, son or friend. The bereaved is joined by neighbours and kin. They meet grief together in lamentation, prayer and song. They call upon the words of the clergy and surround themselves in community. It is in these ways that they grieve and then go on with life. Through their mourning they are assured of the bonds between them and renewed in the knowledge that this death is a part of the Prairie of the Sauk. Their grief is common property, an anguish from which the community draws strength and gives the bereaved the courage to move ahead. It is into this prairie community that the bereavement counsellor arrives with the new grief technology. The counsellor calls the invention a service and assures the prairie folk of its effectiveness and superiority by invoking the name of the great university while displaying a diploma and certificate. At first, we can imagine that the local people will be puzzled by the bereavement counsellor's claim. However, the counsellor will tell a few of them that the new technique is merely to assist the bereaved's community at the time of death. To some other prairie folk who are isolated or forgotten, the counsellor will approach the County Board and advocate the right to treatment for these unfortunate souls. This right will be guaranteed by the Board's decision to reimburse those too poor to pay for counselling services. There will be others, schooled to believe in the innovative new tools certified by universities and medical centres, who will seek out the bereavement counsellor by force of habit.
And one of these people will tell a bereaved neighbour who is unschooled that unless his grief is processed by a counsellor, he will probably have major psychological problems in later life. Several people will begin to use the bereavement counsellor because, since the County Board now taxes them to ensure access to the technology, they will feel that to fail to be counselled is to waste their money, and to be denied a benefit, or even a right. Finally, one day, the aged father of a Sauk woman will die. And the next door neighbour will not drop by because he doesn't want to interrupt the bereavement counsellor. The woman's kin will stay home because they will have learned that only the bereavement counsellor knows how to process grief the proper way. The local clergy will seek technical assistance from the bereavement counsellor to learn the correct form of service to deal with guilt and grief. And the grieving daughter will know that it is the bereavement counsellor who really cares for her because only the bereavement counsellor comes when death visits this family on the Prairie of the Sauk. It will be only one generation between the time the bereavement counsellor arrives and the community of mourners disappears. The counsellor's new tool will cut through the social fabric, throwing aside kinship, care, neighbourly obligations and communal ways of coming together and going on. Like John Deere's plow, the tools of bereavement counselling will create a desert where a community once flourished. And finally, even the bereavement counsellor will see the impossibility of restoring hope in clients once they are genuinely alone with nothing but a service for consolation. In the inevitable failure of the service, the bereavement counsellor will find the desert even in herself. Which one of the following best describes the approach of the author?
 ....
MCQ->In which of the following non-conventional machining processes is material removed by an abrasive slurry between the tool and the workpiece?....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off. In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome. Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, D.C., exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field, a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor (a reservoir of electrical charge) that is either empty or full, to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line. Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions. Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one. To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr.
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs. Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor, a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain. It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch, such as optical, biological and quantum computing, remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own. In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
MCQ-> A distinction should be made between work and occupation. Work implies necessity; it is something that must be done as contributing to the means of life in general and to one's own subsistence in particular. Occupation absorbs time and energy so long as we choose to give them; it demands constant initiative, and it is its own reward. For the average person the element of necessity in work is valuable, for he is saved the mental stress involved in devising outlets for his energy. Work has for him obvious utility, and it brings the satisfaction of tangible rewards. Whereas occupation is an end in itself, and we therefore demand that it shall be agreeable, work is usually the means to other ends, ends which present themselves to the mind as sufficiently important to compensate for any disagreeableness in the means. There are forms of work, of course, which, since external compulsion is reduced to a minimum, are hardly to be differentiated from occupation. The artist, the imaginative writer, the scientist, the social worker, for instance, find their pleasure in the constant spontaneous exercise of creative energy, and the essential reward of their work is in the doing of it. In all work performed by a suitable agent there must be a pleasurable element, and the greater the amount of pleasure that can be associated with work, the better. But for most people the pleasure of occupation needs the addition of the necessity provided in work. It is better for them to follow a path of employment marked out for them than to have to find their own. When, therefore, we look ahead to the situation likely to be produced by the continued rapid extension of machine production, we should think not so much about providing occupation for leisure as about limiting the amount of leisure to that which can be profitably used. We shall have to put the emphasis on the work-providing rather than the goods-providing aspect of the economic process.
In the earlier and more ruthless days of capitalism the duty of the economic system to provide work was overlooked. The purpose of competitive enterprise was to realize a profit. When profit ceased or was curtailed, production also ceased or was curtailed. Thus the workers, who were regarded as units of labour forming part of the costs of production, were taken on when required and dismissed when not required. They hardly thought of demanding work as a right. And so long as British manufacturers had their eyes mainly on the markets awaiting them abroad, they could conveniently neglect the fact that since workers are also consumers, unemployment at home means loss of trade. Moral considerations did not yet find a substitute in ordinary business prudence. The labour movements arose largely as a revolt against the conception of workers as commodities to be bought and sold without regard to their needs as human beings. In a socialist system it is assumed that they will be treated with genuine consideration, for, the making of profit not being essential, central planning will not only adjust the factors of production to the best advantage but will secure regularity of employment. But has the socialist thought about what he would do if, owing to technological advance, the amount of human labour were catastrophically reduced? So far as I know, he has no plan beyond drastically limiting the hours of work, and sharing out as much work as there may be. And, of course, he would grant monetary relief to those who were actually unemployed. But has he considered what would be the moral effect of life imagined as possible in the highly mechanized state of the future? Has he thought of the possibility of bands of unemployed and under-employed workers marching on the capital to demand not income (which they will have) but work? Future, according to the passage, may find the workers
 ....
MCQ-> The broad scientific understanding today is that our planet is experiencing a warming trend over and above natural and normal variations that is almost certainly due to human activities associated with large-scale manufacturing. The process began in the late 1700s with the Industrial Revolution, when manual labor, horsepower, and water power began to be replaced by or enhanced by machines. This revolution, over time, shifted Britain, Europe, and eventually North America from largely agricultural and trading societies to manufacturing ones, relying on machinery and engines rather than tools and animals. The Industrial Revolution was at heart a revolution in the use of energy and power. Its beginning is usually dated to the advent of the steam engine, which was based on the conversion of chemical energy in wood or coal to thermal energy and then to mechanical work, primarily the powering of industrial machinery and steam locomotives. Coal eventually supplanted wood because, pound for pound, coal contains twice as much energy as wood (measured in BTUs, or British thermal units, per pound) and because its use helped to save what was left of the world's temperate forests. Coal was used to produce heat that went directly into industrial processes, including metallurgy, and to warm buildings, as well as to power steam engines. When crude oil came along in the mid-1800s, still a couple of decades before electricity, it was burned, in the form of kerosene, in lamps to make light, replacing whale oil. It was also used to provide heat for buildings and in manufacturing processes, and as a fuel for engines used in industry and propulsion. In short, one can say that the main forms in which humans need and use energy are for light, heat, mechanical work and motive power, and electricity, which can be used to provide any of the other three, as well as to do things that none of those three can do, such as electronic communications and information processing.
Since the Industrial Revolution, all these energy functions have been powered primarily, but not exclusively, by fossil fuels that emit carbon dioxide (CO2). To put it another way, the Industrial Revolution gave a whole new prominence to what Rochelle Lefkowitz, president of Pro-Media Communications and an energy buff, calls "fuels from hell": coal, oil, and natural gas. All these fuels from hell come from underground, are exhaustible, and emit CO2 and other pollutants when they are burned for transportation, heating, and industrial use. These fuels are in contrast to what Lefkowitz calls "fuels from heaven": wind, hydroelectric, tidal, biomass, and solar power. These all come from above ground, are endlessly renewable, and produce no harmful emissions. Meanwhile, industrialization promoted urbanization, and urbanization eventually gave birth to suburbanization. This trend, which was repeated across America, nurtured the development of the American car culture, the building of a national highway system, and a mushrooming of suburbs around American cities, which rewove the fabric of American life. Many other developed and developing countries followed the American model, with all its upsides and downsides. The result is that today we have suburbs and ribbons of highways that run in, out, and around not only America's major cities, but China's, India's, and South America's as well. And as these urban areas attract more people, the sprawl extends in every direction. All the coal, oil, and natural gas inputs for this new economic model seemed relatively cheap, relatively inexhaustible, and relatively harmless, or at least relatively easy to clean up afterward. So there wasn't much to stop the juggernaut of more people and more development and more concrete and more buildings and more cars and more coal, oil, and gas needed to build and power them.
Summing it all up, Andy Karsner, the Department of Energy's assistant secretary for energy efficiency and renewable energy, once said to me: "We built a really inefficient environment with the greatest efficiency ever known to man." Beginning in the second half of the twentieth century, a scientific understanding began to emerge that an excessive accumulation of largely invisible pollutants, called greenhouse gases, was affecting the climate. The buildup of these greenhouse gases had been under way since the start of the Industrial Revolution in a place we could not see and in a form we could not touch or smell. These greenhouse gases, primarily carbon dioxide emitted from human industrial, residential, and transportation sources, were not piling up along roadsides or in rivers, in cans or empty bottles, but, rather, above our heads, in the earth's atmosphere. If the earth's atmosphere was like a blanket that helped to regulate the planet's temperature, the CO2 buildup was having the effect of thickening that blanket and making the globe warmer. Those bags of CO2 from our cars float up and stay in the atmosphere, along with bags of CO2 from power plants burning coal, oil, and gas, and bags of CO2 released from the burning and clearing of forests, which releases all the carbon stored in trees, plants, and soil. In fact, many people don't realize that deforestation in places like Indonesia and Brazil is responsible for more CO2 than all the world's cars, trucks, planes, ships, and trains combined; that is, about 20 percent of all global emissions. And when we're not tossing bags of carbon dioxide into the atmosphere, we're throwing up other greenhouse gases, like methane (CH4) released from rice farming, petroleum drilling, coal mining, animal defecation, solid waste landfill sites, and yes, even from cattle belching. Cattle belching? That's right: the striking thing about greenhouse gases is the diversity of sources that emit them.
A herd of cattle belching can be worse than a highway full of Hummers. Livestock gas is very high in methane, which, like CO2, is colorless and odorless. And like CO2, methane is one of those greenhouse gases that, once released into the atmosphere, also absorb heat radiating from the earth's surface. "Molecule for molecule, methane's heat-trapping power in the atmosphere is twenty-one times stronger than carbon dioxide, the most abundant greenhouse gas," reported Science World (January 21, 2002). "With 1.3 billion cows belching almost constantly around the world (100 million in the United States alone), it's no surprise that methane released by livestock is one of the chief global sources of the gas, according to the U.S. Environmental Protection Agency ... 'It's part of their normal digestion process,' says Tom Wirth of the EPA. 'When they chew their cud, they regurgitate [spit up] some food to rechew it, and all this gas comes out.' The average cow expels 600 liters of methane a day, climate researchers report." What is the precise scientific relationship between these expanded greenhouse gas emissions and global warming? Experts at the Pew Center on Climate Change offer a handy summary in their report "Climate Change 101." Global average temperatures, notes the Pew study, "have experienced natural shifts throughout human history. For example, the climate of the Northern Hemisphere varied from a relatively warm period between the eleventh and fifteenth centuries to a period of cooler temperatures between the seventeenth century and the middle of the nineteenth century. However, scientists studying the rapid rise in global temperatures during the late twentieth century say that natural variability cannot account for what is happening now."
The new factor is the human factor: our vastly increased emissions of carbon dioxide and other greenhouse gases from the burning of fossil fuels such as coal and oil as well as from deforestation, large-scale cattle-grazing, agriculture, and industrialization. "Scientists refer to what has been happening in the earth's atmosphere over the past century as the 'enhanced greenhouse effect'", notes the Pew study. By pumping man-made greenhouse gases into the atmosphere, humans are altering the process by which naturally occurring greenhouse gases, because of their unique molecular structure, trap the sun's heat near the earth's surface before that heat radiates back into space. "The greenhouse effect keeps the earth warm and habitable; without it, the earth's surface would be about 60 degrees Fahrenheit colder on average. Since the average temperature of the earth is about 45 degrees Fahrenheit, the natural greenhouse effect is clearly a good thing. But the enhanced greenhouse effect means even more of the sun's heat is trapped, causing global temperatures to rise. Among the many scientific studies providing clear evidence that an enhanced greenhouse effect is under way was a 2005 report from NASA's Goddard Institute for Space Studies. Using satellites, data from buoys, and computer models to study the earth's oceans, scientists concluded that more energy is being absorbed from the sun than is emitted back to space, throwing the earth's energy out of balance and warming the globe." Which of the following statements is correct? (I) Greenhouse gases are responsible for global warming. They should be eliminated to save the planet (II) CO2 is the most dangerous of the greenhouse gases. Reduction in the release of CO2 would surely bring down the temperature (III) The greenhouse effect could be traced back to the industrial revolution.
But the current development and patterns of life have enhanced their emissions (IV) Deforestation has been one of the biggest factors contributing to the emission of greenhouse gases. Choose the correct option:....