1. For taking more copies on a typewriter, we use





Similar Questions and Answers
QA->The transparency provided in a distributed system where users cannot tell how many copies of a resource exist is termed as:....
QA->Eggs and shoes were hurled at Tony Blair (former British Prime Minister) by anti-war activists, in Dublin (Ireland) where he was signing copies of his new book. What is the name of that book?....
QA->In India, which campaign seeks to curb over-the-counter use of antibiotics without prescription and create awareness of the dangers of taking antibiotics?....
QA->Who discovered the typewriter?....
QA->Who is the inventor of Typewriter?....
MCQ-> Read the following passage carefully and answer the questions given. Certain words/phrases have been given in bold to help you locate them while answering some of the questions. From a technical and economic perspective, many assessments have highlighted the presence of cost-effective opportunities to reduce energy use in buildings. However, several bodies note the significance of multiple barriers that prevent the take-up of energy efficiency measures in buildings. These include lack of awareness and concern, limited access to reliable information from trusted sources, fear about risk, disruption and other ‘transaction costs’, concerns about up-front costs and inadequate access to suitably priced finance, a lack of confidence in suppliers and technologies and the presence of split incentives between landlords and tenants. The widespread presence of these barriers led experts to predict that without a concerted push from policy, two-thirds of the economically viable potential to improve energy efficiency will remain unexploited by 2035. These barriers are an albatross around the neck that represents a classic market failure and a basis for governmental intervention. While these assessments focus on the technical, financial or economic barriers preventing the take-up of energy efficiency options in buildings, others emphasise the significance of the often deeply embedded social practices that shape energy use in buildings. These analyses focus not on the preferences and rationalities that might shape individual behaviours, but on the ‘entangled’ cultural practices, norms, values and routines that underpin domestic energy use. Focusing on the practice-related aspects of consumption generates very different conceptual framings and policy prescriptions than those that emerge from more traditional or mainstream perspectives. But the underlying case for government intervention to help to promote retrofit and the diffusion of more energy efficient practices is still apparent, even though the forms of intervention advocated are often very different to those that emerge from a more technical or economic perspective. Based on the recognition of the multiple barriers to change and the social, economic and environmental benefits that could be realised if they were overcome, government support for retrofit (renovating existing infrastructure to make it more energy efficient) has been widespread. Retrofit programmes have been supported and adopted in diverse forms in many settings and their ability to recruit householders and then to impact their energy use has been discussed quite extensively. Frequently, these discussions have criticised the extent to which retrofit schemes rely on incentives and the provision of new technologies to change behaviour whilst ignoring the many other factors that might limit either participation in the schemes or their impact on the behaviours and practices that shape domestic energy use. These factors are obviously central to the success of retrofit schemes, but evaluations of different schemes have found that despite these they can still have significant impacts. A few experts suggest that the best estimate of the gap between the technical potential and the actual in-situ performance of energy efficiency measures is 50%, with 35% coming from performance gaps and 15% coming from ‘comfort taking’ or direct rebound effects.
They further suggest that the direct rebound effect of energy efficiency measures related to household heating is likely to be less than 30%, while rebound effects for various domestic energy efficiency measures vary from 5 to 15% and arise mostly from indirect effects (i.e., where savings from energy efficiency lead to increased demand for goods and services). Other analyses also note that the gap between technical potential and actual performance is likely to vary by measure, with the range extending from 0% for measures such as solar water heating to 50% for measures such as improved heating controls. And others note that levels of comfort taking are likely to vary according to the levels of consumption and fuel poverty in the sample of homes where insulation is installed, with the range extending from 30% when considering homes across all income groups to around 60% when considering only lower income homes. The scale of these gaps is significant because it materially affects the impacts of retrofit schemes, and expectations and perceptions of these impacts go on to influence levels of political, financial and public support for these schemes. The literature on retrofit highlights the presence of multiple barriers to change and the need for government support, if these are to be overcome. Although much has been written on the extent to which different forms of support enable the wider take-up of domestic energy efficiency measures, behaviours and practices, various areas of contestation remain and there is still an absence of robust ex-post evidence on the extent to which these schemes actually do lead to the social, economic and environmental benefits that are widely claimed. Which of the following is most nearly the OPPOSITE in meaning to the word ‘CONCERTED’ as used in the passage?
 ....
MCQ-> The broad scientific understanding today is that our planet is experiencing a warming trend over and above natural and normal variations that is almost certainly due to human activities associated with large-scale manufacturing. The process began in the late 1700s with the Industrial Revolution, when manual labor, horsepower, and water power began to be replaced by or enhanced by machines. This revolution, over time, shifted Britain, Europe, and eventually North America from largely agricultural and trading societies to manufacturing ones, relying on machinery and engines rather than tools and animals. The Industrial Revolution was at heart a revolution in the use of energy and power. Its beginning is usually dated to the advent of the steam engine, which was based on the conversion of chemical energy in wood or coal to thermal energy and then to mechanical work, primarily the powering of industrial machinery and steam locomotives. Coal eventually supplanted wood because, pound for pound, coal contains twice as much energy as wood (measured in BTUs, or British thermal units, per pound) and because its use helped to save what was left of the world's temperate forests. Coal was used to produce heat that went directly into industrial processes, including metallurgy, and to warm buildings, as well as to power steam engines. When crude oil came along in the mid-1800s, still a couple of decades before electricity, it was burned, in the form of kerosene, in lamps to make light, replacing whale oil. It was also used to provide heat for buildings and in manufacturing processes, and as a fuel for engines used in industry and propulsion. In short, one can say that the main forms in which humans need and use energy are for light, heat, mechanical work and motive power, and electricity, which can be used to provide any of the other three, as well as to do things that none of those three can do, such as electronic communications and information processing. Since the Industrial Revolution, all these energy functions have been powered primarily, but not exclusively, by fossil fuels that emit carbon dioxide (CO2). To put it another way, the Industrial Revolution gave a whole new prominence to what Rochelle Lefkowitz, president of Pro-Media Communications and an energy buff, calls "fuels from hell" - coal, oil, and natural gas. All these fuels from hell come from underground, are exhaustible, and emit CO2 and other pollutants when they are burned for transportation, heating, and industrial use. These fuels are in contrast to what Lefkowitz calls "fuels from heaven" - wind, hydroelectric, tidal, biomass, and solar power. These all come from above ground, are endlessly renewable, and produce no harmful emissions. Meanwhile, industrialization promoted urbanization, and urbanization eventually gave birth to suburbanization. This trend, which was repeated across America, nurtured the development of the American car culture, the building of a national highway system, and a mushrooming of suburbs around American cities, which rewove the fabric of American life. Many other developed and developing countries followed the American model, with all its upsides and downsides. The result is that today we have suburbs and ribbons of highways that run in, out, and around not only America's major cities, but China's, India's, and South America's as well.
And as these urban areas attract more people, the sprawl extends in every direction. All the coal, oil, and natural gas inputs for this new economic model seemed relatively cheap, relatively inexhaustible, and relatively harmless-or at least relatively easy to clean up afterward. So there wasn't much to stop the juggernaut of more people and more development and more concrete and more buildings and more cars and more coal, oil, and gas needed to build and power them. Summing it all up, Andy Karsner, the Department of Energy's assistant secretary for energy efficiency and renewable energy, once said to me: "We built a really inefficient environment with the greatest efficiency ever known to man." Beginning in the second half of the twentieth century, a scientific understanding began to emerge that an excessive accumulation of largely invisible pollutants-called greenhouse gases - was affecting the climate. The buildup of these greenhouse gases had been under way since the start of the Industrial Revolution in a place we could not see and in a form we could not touch or smell. These greenhouse gases, primarily carbon dioxide emitted from human industrial, residential, and transportation sources, were not piling up along roadsides or in rivers, in cans or empty bottles, but, rather, above our heads, in the earth's atmosphere. If the earth's atmosphere was like a blanket that helped to regulate the planet's temperature, the CO2 buildup was having the effect of thickening that blanket and making the globe warmer. Those bags of CO2 from our cars float up and stay in the atmosphere, along with bags of CO2 from power plants burning coal, oil, and gas, and bags of CO2 released from the burning and clearing of forests, which releases all the carbon stored in trees, plants, and soil. In fact, many people don't realize that deforestation in places like Indonesia and Brazil is responsible for more CO2 than all the world's cars, trucks, planes, ships, and trains combined - that is, about 20 percent of all global emissions. And when we're not tossing bags of carbon dioxide into the atmosphere, we're throwing up other greenhouse gases, like methane (CH4) released from rice farming, petroleum drilling, coal mining, animal defecation, solid waste landfill sites, and yes, even from cattle belching. Cattle belching? That's right-the striking thing about greenhouse gases is the diversity of sources that emit them. A herd of cattle belching can be worse than a highway full of Hummers. Livestock gas is very high in methane, which, like CO2, is colorless and odorless. And like CO2, methane is one of those greenhouse gases that, once released into the atmosphere, also absorb heat radiating from the earth's surface. "Molecule for molecule, methane's heat-trapping power in the atmosphere is twenty-one times stronger than carbon dioxide, the most abundant greenhouse gas," reported Science World (January 21, 2002). “With 1.3 billion cows belching almost constantly around the world (100 million in the United States alone), it's no surprise that methane released by livestock is one of the chief global sources of the gas, according to the U.S. Environmental Protection Agency ... 'It's part of their normal digestion process,' says Tom Wirth of the EPA. 'When they chew their cud, they regurgitate [spit up] some food to rechew it, and all this gas comes out.' The average cow expels 600 liters of methane a day, climate researchers report."
What is the precise scientific relationship between these expanded greenhouse gas emissions and global warming? Experts at the Pew Center on Climate Change offer a handy summary in their report "Climate Change 101." Global average temperatures, notes the Pew study, "have experienced natural shifts throughout human history. For example, the climate of the Northern Hemisphere varied from a relatively warm period between the eleventh and fifteenth centuries to a period of cooler temperatures between the seventeenth century and the middle of the nineteenth century. However, scientists studying the rapid rise in global temperatures during the late twentieth century say that natural variability cannot account for what is happening now." The new factor is the human factor-our vastly increased emissions of carbon dioxide and other greenhouse gases from the burning of fossil fuels such as coal and oil as well as from deforestation, large-scale cattle-grazing, agriculture, and industrialization. “Scientists refer to what has been happening in the earth’s atmosphere over the past century as the ‘enhanced greenhouse effect’”, notes the Pew study. By pumping man-made greenhouse gases into the atmosphere, humans are altering the process by which naturally occurring greenhouse gases, because of their unique molecular structure, trap the sun’s heat near the earth’s surface before that heat radiates back into space. "The greenhouse effect keeps the earth warm and habitable; without it, the earth's surface would be about 60 degrees Fahrenheit colder on average. Since the average temperature of the earth is about 45 degrees Fahrenheit, the natural greenhouse effect is clearly a good thing. But the enhanced greenhouse effect means even more of the sun's heat is trapped, causing global temperatures to rise. Among the many scientific studies providing clear evidence that an enhanced greenhouse effect is under way was a 2005 report from NASA's Goddard Institute for Space Studies. Using satellites, data from buoys, and computer models to study the earth's oceans, scientists concluded that more energy is being absorbed from the sun than is emitted back to space, throwing the earth's energy out of balance and warming the globe." Which of the following statements is correct? (I) Greenhouse gases are responsible for global warming. They should be eliminated to save the planet (II) CO2 is the most dangerous of the greenhouse gases. Reduction in the release of CO2 would surely bring down the temperature (III) The greenhouse effect could be traced back to the industrial revolution. But the current development and the patterns of life have enhanced their emissions (IV) Deforestation has been one of the biggest factors contributing to the emission of greenhouse gases Choose the correct option:....
MCQ-> The current debate on intellectual property rights (IPRs) raises a number of important issues concerning the strategy and policies for building a more dynamic national agricultural research system, the relative roles of public and private sectors, and the role of agribusiness multinational corporations (MNCs). This debate has been stimulated by the international agreement on Trade Related Intellectual Property Rights (TRIPs), negotiated as part of the Uruguay Round. TRIPs, for the first time, seeks to bring innovations in agricultural technology under a new worldwide IPR regime. The agribusiness MNCs (along with pharmaceutical companies) played a leading part in lobbying for such a regime during the Uruguay Round negotiations. The argument was that incentives are necessary to stimulate innovations, and that this calls for a system of patents which gives innovators the sole right to use (or sell/lease the right to use) their innovations for a specified period and protects them against unauthorised copying or use. With strong support of their national governments, they were influential in shaping the agreement on TRIPs, which eventually emerged from the Uruguay Round. The current debate on TRIPs in India - as indeed elsewhere - echoes wider concerns about ‘privatisation’ of research and allowing a free field for MNCs in the sphere of biotechnology and agriculture. The agribusiness corporations, and those with unbounded faith in the power of science to overcome all likely problems, point to the vast potential that new technology holds for solving the problems of hunger, malnutrition and poverty in the world. The exploitation of this potential should be encouraged and this is best done by the private sector for which patents are essential. Some, who do not necessarily accept this optimism, argue that fears of MNC domination are exaggerated and that farmers will accept their products only if they decisively outperform the available alternatives. Those who argue against agreeing to introduce an IPR regime in agriculture and encouraging private sector research are apprehensive that this will work to the disadvantage of farmers by making them more and more dependent on monopolistic MNCs. A different, though related apprehension is that extensive use of hybrids and genetically engineered new varieties might increase the vulnerability of agriculture to outbreaks of pests and diseases. The larger, longer-term consequences of reduced biodiversity that may follow from the use of specially bred varieties are also another cause for concern. Moreover, corporations, driven by the profit motive, will necessarily tend to underplay, if not ignore, potential adverse consequences, especially those which are unknown and which may manifest themselves only over a relatively long period. On the other hand, high-pressure advertising and aggressive sales campaigns by private companies can seduce farmers into accepting varieties without being aware of potential adverse effects and the possibility of disastrous consequences for their livelihood if these varieties happen to fail. There is no provision under the laws, as they now exist, for compensating users against such eventualities. Excessive preoccupation with seeds and seed material has obscured other important issues involved in reviewing the research policy. We need to remind ourselves that improved varieties by themselves are not sufficient for sustained growth of yields. 
In our own experience, some of the early high yielding varieties (HYVs) of rice and wheat were found susceptible to widespread pest attacks; and some had problems of grain quality. Further research was necessary to solve these problems. This largely successful research was almost entirely done in public research institutions. Of course, it could in principle have been done by private companies, but whether they choose to do so depends crucially on the extent of the loss in market for their original introductions on account of the above factors and whether the companies are financially strong enough to absorb the ‘losses’, invest in research to correct the deficiencies and recover the lost market. Public research, which is not driven by profit, is better placed to take corrective action. Research for improving common pool resource management, maintaining ecological health and ensuring sustainability is both critical and also demanding in terms of technological challenge and resource requirements. As such research is crucial to the impact of new varieties, chemicals and equipment in the farmer’s field, private companies should be interested in such research. But their primary interest is in the sale of seed materials, chemicals, equipment and other inputs produced by them. Knowledge and techniques for resource management are not ‘marketable’ in the same way as those inputs. Their application to land, water and forests has a long gestation and their efficacy depends on resolving difficult problems such as designing institutions for proper and equitable management of common pool resources. Only public or quasi-public research institutions informed by broader, long-term concerns can do such work. The public sector must therefore continue to play a major role in the national research system. It is both wrong and misleading to pose the problem in terms of public sector versus private sector or of privatisation of research. We need to address problems likely to arise on account of the public-private sector complementarity, and ensure that the public research system performs efficiently. Complementarity between various elements of research raises several issues in implementing an IPR regime. Private companies do not produce new varieties and inputs entirely as a result of their own research. Almost all technological improvement is based on knowledge and experience accumulated from the past, and the results of basic and applied research in public and quasi-public institutions (universities, research organisations). Moreover, as is increasingly recognised, accumulated stock of knowledge does not reside only in the scientific community and its academic publications, but is also widely diffused in traditions and folk knowledge of local communities all over. The deciphering of the structure and functioning of DNA forms the basis of much of modern biotechnology. But this fundamental breakthrough is a ‘public good’ freely accessible in the public domain and usable free of any charge. Various techniques developed using that knowledge can however be, and are, patented for private profit. Similarly, private corporations draw extensively, and without any charge, on germplasm available in varieties of plant species (neem and turmeric are by now famous examples). Publicly funded gene banks as well as new varieties bred by public sector research stations can also be used freely by private enterprises for developing their own varieties and seeking patent protection for them.
Should private breeders be allowed free use of basic scientific discoveries? Should the repositories of traditional knowledge and germplasm collections, which are maintained and improved by publicly funded organisations, also be freely accessible? Or should users be made to pay for such use? If they are to pay, what should be the basis of compensation? Should the compensation be for individuals or for the communities/institutions to which they belong? Should individual institutions be given the right of patenting their innovations? These are some of the important issues that deserve more attention than they now get and need serious detailed study to evolve reasonably satisfactory, fair and workable solutions. Finally, the tendency to equate the public sector with the government is wrong. The public space is much wider than government departments and includes co-operatives, universities, public trusts and a variety of non-governmental organisations (NGOs). Giving greater autonomy to research organisations from government control and giving non-government public institutions the space and resources to play a larger, more effective role in research, is therefore an issue of direct relevance in restructuring the public research system. Which one of the following statements describes an important issue, or important issues, not being raised in the context of the current debate on IPRs?
 ....
MCQ->A placed three sheets with two carbons to get two extra copies of the original. Then he decided to get more carbon copies and folded the paper in such a way that the upper half of the sheets were on top of the lower half. Then he typed. How many carbon copies did he get?....
MCQ-> Read the passage carefully. Answer the questions by selecting the most appropriate option (with reference to the passage). PASSAGE 4: While majoring in computer science isn't a requirement to participate in the Second Machine Age, what skills do liberal arts graduates specifically possess to contribute to this brave new world? Another major oversight in the debate has been the failure to appreciate that a good liberal arts education teaches many skills that are not only valuable to the general world of business, but are in fact vital to innovating the next wave of breakthrough tech-driven products and services. Many defenses of the value of a liberal arts education have been launched, of course, with the emphasis being on the acquisition of fundamental thinking and communication skills, such as critical thinking, logical argumentation, and good communication skills. One aspect of liberal arts education that has been strangely neglected in the discussion is the fact that the humanities and social sciences are devoted to the study of human nature and the nature of our communities and larger societies. Students who pursue degrees in the liberal arts disciplines tend to be particularly motivated to investigate what makes us human: how we behave and why we behave as we do. They're driven to explore how our families and our public institutions-such as our schools and legal systems-operate, and could operate better, and how governments and economies work, or as is so often the case, are plagued by dysfunction. These students learn a great deal from their particular courses of study and apply that knowledge to today's issues, the leading problems to be tackled, and various approaches for analyzing and addressing those problems. The greatest opportunities for innovation in the emerging era are in applying evolving technological capabilities to finding better ways to solve human problems like social dysfunction and political corruption; finding ways to better educate children; helping people live healthier and happier lives by altering harmful behaviors; improving our working conditions; discovering better ways to tackle poverty; improving healthcare and making it more affordable; making our governments more accountable, from the local level up to that of global affairs; and finding optimal ways to incorporate intelligent, nimble machines into our work lives so that we are empowered to do more of the work that we do best, and to let the machines do the rest. Workers with a solid liberal arts education have a strong foundation to build on in pursuing these goals. One of the most immediate needs in technology innovation is to invest products and services with more human qualities, with more sensitivity to human needs and desires. Companies and entrepreneurs that want to succeed today and in the future must learn to consider in all aspects of their product and service creation how they can make use of the new technologies to make them more humane. Still, many other liberal arts disciplines also have much to provide the world of technological innovation. The study of psychology, for example, can help people build products that are more attuned to our emotions and ways of thinking. Experience in anthropology can additionally help companies understand cultural and individual behavioural factors that should be considered in developing products and in marketing them.
As technology allows for more machine intelligence and our lives become increasingly populated by the Internet of things, and as the gathering of data about our lives and analysis of it allows for more discoveries about our behaviour, consideration of how new products and services can be crafted for the optimal enhancement of our lives and the nature of our communities, workplaces and governments will be of vital importance. Those products and services developed with the keenest sense of how they can serve our human needs and complement our human talents will have a distinct competitive advantage. Much of the criticism of the liberal arts is based on the false assumption that liberal arts students lack rigor in comparison to those participating in the STEM disciplines and that they are 'soft' and unscientific, whereas those who study STEM fields learn the scientific method. In fact, the liberal arts teach many methods of rigorous inquiry and analysis, such as close observation and interviewing, in ways that hard science adherents don't always appreciate. Many fields have long incorporated the scientific method and other types of data-driven scientific inquiry and problem solving. Sociologists have developed sophisticated mathematical models of societal networks. Historians gather voluminous data on centuries-old household expenses, marriage and divorce rates, and world trade, and use the data to conduct statistical analyses, identifying trends and contributing factors to the phenomena they are studying. Linguists have developed high-tech models of the evolution of language, and they've made crucial contributions to the development of one of the technologies behind the rapid advance of automation - natural language processing, whereby computers are able to communicate with the accuracy and personality of Siri and Alexa. It's also important to debunk the fallacy that liberal arts students who don't study these quantitative analytical methods have no 'hard' or relevant skills. This gets us back to the arguments about the fundamental ways of thinking, inquiring, problem solving and communicating that a liberal arts education teaches. What is the central theme of the passage?
 ....