1. The term used to refer to an individual who gains unauthorized access to a computer system for the purpose of stealing and corrupting data is:
Comments

  • By: remshad on 18 Dec 2017 04.24 pm
    The standard difference is that a hacker attacks systems and probes security vulnerabilities for fun, exploration, fame, proving that they can, or discovering weaknesses which can assist owners. Crackers are the profit side of the coin: their motivation is financial gain and/or to cause damage.
Similar Questions and Answers
QA->...............refer to the amount spent on fuel, coal, diesel and fresh water used for the purpose of voyage.....
QA->Unauthorized attempts to bypass the security mechanisms of an information system or network is called:....
QA->A computer with a 32 bit wide data bus implements its memory using 8 K x 8 static RAM chips. The smallest memory that this computer can have is:....
QA->A farmer has 50 kg wheat in hand, part of which he sells at 8% profit and the rest at 18% profit. He gains 14% altogether. What is the quantity of wheat sold by him at 18% profit?....
QA->By whom was the word Adivasi used for the first time to refer to the tribal people?....
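Two of the numeric stubs above are self-contained enough to sanity-check with a few lines of arithmetic. This is a sketch of one standard way to work them out, not the site's official answers:

```python
# Wheat question (50 kg, part at 8% profit, rest at 18%, 14% overall):
# by alligation, the share sold at 18% is (14 - 8) / (18 - 8) of the total.
total_kg = 50
kg_at_18 = total_kg * (14 - 8) / (18 - 8)
print(kg_at_18)  # 30.0 kg sold at 18% profit

# Memory question (32-bit data bus built from 8K x 8 static RAM chips):
# the bus needs 32 / 8 = 4 chips side by side, so the smallest memory
# is 8K addresses x 4 bytes per address = 32 KB.
chips_per_bank = 32 // 8
smallest_memory_kb = 8 * chips_per_bank
print(smallest_memory_kb)  # 32
```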
MCQ->The term used to refer to an individual who gains unauthorized access to a computer system for the purpose of stealing and corrupting data is:....
MCQ-> The broad scientific understanding today is that our planet is experiencing a warming trend over and above natural and normal variations that is almost certainly due to human activities associated with large-scale manufacturing. The process began in the late 1700s with the Industrial Revolution, when manual labor, horsepower, and water power began to be replaced by or enhanced by machines. This revolution, over time, shifted Britain, Europe, and eventually North America from largely agricultural and trading societies to manufacturing ones, relying on machinery and engines rather than tools and animals. The Industrial Revolution was at heart a revolution in the use of energy and power. Its beginning is usually dated to the advent of the steam engine, which was based on the conversion of chemical energy in wood or coal to thermal energy and then to mechanical work, primarily the powering of industrial machinery and steam locomotives. Coal eventually supplanted wood because, pound for pound, coal contains twice as much energy as wood (measured in BTUs, or British thermal units, per pound) and because its use helped to save what was left of the world's temperate forests. Coal was used to produce heat that went directly into industrial processes, including metallurgy, and to warm buildings, as well as to power steam engines. When crude oil came along in the mid-1800s, still a couple of decades before electricity, it was burned, in the form of kerosene, in lamps to make light, replacing whale oil. It was also used to provide heat for buildings and in manufacturing processes, and as a fuel for engines used in industry and propulsion. In short, one can say that the main forms in which humans need and use energy are for light, heat, mechanical work and motive power, and electricity, which can be used to provide any of the other three, as well as to do things that none of those three can do, such as electronic communications and information processing.
Since the Industrial Revolution, all these energy functions have been powered primarily, but not exclusively, by fossil fuels that emit carbon dioxide (CO2). To put it another way, the Industrial Revolution gave a whole new prominence to what Rochelle Lefkowitz, president of Pro-Media Communications and an energy buff, calls "fuels from hell" - coal, oil, and natural gas. All these fuels from hell come from underground, are exhaustible, and emit CO2 and other pollutants when they are burned for transportation, heating, and industrial use. These fuels are in contrast to what Lefkowitz calls "fuels from heaven" - wind, hydroelectric, tidal, biomass, and solar power. These all come from above ground, are endlessly renewable, and produce no harmful emissions. Meanwhile, industrialization promoted urbanization, and urbanization eventually gave birth to suburbanization. This trend, which was repeated across America, nurtured the development of the American car culture, the building of a national highway system, and a mushrooming of suburbs around American cities, which rewove the fabric of American life. Many other developed and developing countries followed the American model, with all its upsides and downsides. The result is that today we have suburbs and ribbons of highways that run in, out, and around not only America's major cities, but China's, India's, and South America's as well. And as these urban areas attract more people, the sprawl extends in every direction. All the coal, oil, and natural gas inputs for this new economic model seemed relatively cheap, relatively inexhaustible, and relatively harmless - or at least relatively easy to clean up afterward. So there wasn't much to stop the juggernaut of more people and more development and more concrete and more buildings and more cars and more coal, oil, and gas needed to build and power them.
Summing it all up, Andy Karsner, the Department of Energy's assistant secretary for energy efficiency and renewable energy, once said to me: "We built a really inefficient environment with the greatest efficiency ever known to man."Beginning in the second half of the twentieth century, a scientific understanding began to emerge that an excessive accumulation of largely invisible pollutants-called greenhouse gases - was affecting the climate. The buildup of these greenhouse gases had been under way since the start of the Industrial Revolution in a place we could not see and in a form we could not touch or smell. These greenhouse gases, primarily carbon dioxide emitted from human industrial, residential, and transportation sources, were not piling up along roadsides or in rivers, in cans or empty bottles, but, rather, above our heads, in the earth's atmosphere. If the earth's atmosphere was like a blanket that helped to regulate the planet's temperature, the CO2 buildup was having the effect of thickening that blanket and making the globe warmer.Those bags of CO2 from our cars float up and stay in the atmosphere, along with bags of CO2 from power plants burning coal, oil, and gas, and bags of CO2 released from the burning and clearing of forests, which releases all the carbon stored in trees, plants, and soil. In fact, many people don't realize that deforestation in places like Indonesia and Brazil is responsible for more CO2 than all the world's cars, trucks, planes, ships, and trains combined - that is, about 20 percent of all global emissions. And when we're not tossing bags of carbon dioxide into the atmosphere, we're throwing up other greenhouse gases, like methane (CH4) released from rice farming, petroleum drilling, coal mining, animal defecation, solid waste landfill sites, and yes, even from cattle belching. Cattle belching? That's right-the striking thing about greenhouse gases is the diversity of sources that emit them. 
A herd of cattle belching can be worse than a highway full of Hummers. Livestock gas is very high in methane, which, like CO2, is colorless and odorless. And like CO2, methane is one of those greenhouse gases that, once released into the atmosphere, also absorb heat radiating from the earth's surface. "Molecule for molecule, methane's heat-trapping power in the atmosphere is twenty-one times stronger than carbon dioxide, the most abundant greenhouse gas," reported Science World (January 21, 2002). "With 1.3 billion cows belching almost constantly around the world (100 million in the United States alone), it's no surprise that methane released by livestock is one of the chief global sources of the gas, according to the U.S. Environmental Protection Agency ... 'It's part of their normal digestion process,' says Tom Wirth of the EPA. 'When they chew their cud, they regurgitate [spit up] some food to rechew it, and all this gas comes out.' The average cow expels 600 liters of methane a day, climate researchers report." What is the precise scientific relationship between these expanded greenhouse gas emissions and global warming? Experts at the Pew Center on Climate Change offer a handy summary in their report "Climate Change 101." Global average temperatures, notes the Pew study, "have experienced natural shifts throughout human history. For example, the climate of the Northern Hemisphere varied from a relatively warm period between the eleventh and fifteenth centuries to a period of cooler temperatures between the seventeenth century and the middle of the nineteenth century. However, scientists studying the rapid rise in global temperatures during the late twentieth century say that natural variability cannot account for what is happening now."
The new factor is the human factor - our vastly increased emissions of carbon dioxide and other greenhouse gases from the burning of fossil fuels such as coal and oil as well as from deforestation, large-scale cattle-grazing, agriculture, and industrialization. "Scientists refer to what has been happening in the earth's atmosphere over the past century as the 'enhanced greenhouse effect'," notes the Pew study. By pumping man-made greenhouse gases into the atmosphere, humans are altering the process by which naturally occurring greenhouse gases, because of their unique molecular structure, trap the sun's heat near the earth's surface before that heat radiates back into space. "The greenhouse effect keeps the earth warm and habitable; without it, the earth's surface would be about 60 degrees Fahrenheit colder on average. Since the average temperature of the earth is about 45 degrees Fahrenheit, the natural greenhouse effect is clearly a good thing. But the enhanced greenhouse effect means even more of the sun's heat is trapped, causing global temperatures to rise. Among the many scientific studies providing clear evidence that an enhanced greenhouse effect is under way was a 2005 report from NASA's Goddard Institute for Space Studies. Using satellites, data from buoys, and computer models to study the earth's oceans, scientists concluded that more energy is being absorbed from the sun than is emitted back to space, throwing the earth's energy out of balance and warming the globe." Which of the following statements is correct? (I) Greenhouse gases are responsible for global warming. They should be eliminated to save the planet (II) CO2 is the most dangerous of the greenhouse gases. Reduction in the release of CO2 would surely bring down the temperature (III) The greenhouse effect could be traced back to the industrial revolution.
But the current development and the patterns of life have enhanced their emissions (IV) Deforestation has been one of the biggest factors contributing to the emission of greenhouse gases Choose the correct option:....
MCQ-> DIRECTIONS for questions 24 to 50: Each of the five passages given below is followed by questions. For each question, choose the best answer. The World Trade Organisation (WTO) was created in the early 1990s as a component of the Uruguay Round negotiation. However, it could have been negotiated as part of the Tokyo Round of the 1970s, since that negotiation was an attempt at a 'constitutional reform' of the General Agreement on Tariffs and Trade (GATT). Or it could have been put off to the future, as the US government wanted. What factors led to the creation of the WTO in the early 1990s? One factor was the pattern of multilateral bargaining that developed late in the Uruguay Round. Like all complex international agreements, the WTO was a product of a series of trade-offs between principal actors and groups. For the United States, which did not want a new Organisation, the dispute settlement part of the WTO package achieved its longstanding goal of a more effective and more legal dispute settlement system. For the Europeans, who by the 1990s had come to view GATT dispute settlement less in political terms and more as a regime of legal obligations, the WTO package was acceptable as a means to discipline the resort to unilateral measures by the United States. Countries like Canada and other middle and smaller trading partners were attracted by the expansion of a rules-based system and by the symbolic value of a trade Organisation, both of which inherently support the weak against the strong. The developing countries were attracted due to the provisions banning unilateral measures. Finally, and perhaps most important, many countries at the Uruguay Round came to put a higher priority on the export gains than on the import losses that the negotiation would produce, and they came to associate the WTO and a rules-based system with those gains. This reasoning - replicated in many countries - was contained in U.S.
Ambassador Kantor's defence of the WTO, and it amounted to a recognition that international trade and its benefits cannot be enjoyed unless trading nations accept the discipline of a negotiated rules-based environment. A second factor in the creation of the WTO was pressure from lawyers and the legal process. The dispute settlement system of the WTO was seen as a victory of legalists over pragmatists but the matter went deeper than that. The GATT, and the WTO, are contract organisations based on rules, and it is inevitable that an Organisation created to further rules will in turn be influenced by the legal process. Robert Hudec has written of the 'momentum of legal development', but what is this precisely? Legal development can be defined as promotion of the technical legal values of consistency, clarity (or, certainty) and effectiveness; these are values that those responsible for administering any legal system will seek to maximise. As it played out in the WTO, consistency meant integrating under one roof the whole lot of separate agreements signed under GATT auspices; clarity meant removing ambiguities about the powers of contracting parties to make certain decisions or to undertake waivers; and effectiveness meant eliminating exceptions arising out of grandfather-rights and resolving defects in dispute settlement procedures and institutional provisions. Concern for these values is inherent in any rules-based system of co-operation, since without these values rules would be meaningless in the first place. Rules, therefore, create their own incentive for fulfilment. The momentum of legal development has occurred in other institutions besides the GATT, most notably in the European Union (EU). Over the past two decades the European Court of Justice (ECJ) has consistently rendered decisions that have expanded incrementally the EU's internal market, in which the doctrine of 'mutual recognition' handed down in the case Cassis de Dijon in 1979 was a key turning point.
The Court is now widely recognised as a major player in European integration, even though arguably such a strong role was not originally envisaged in the Treaty of Rome, which initiated the current European Union. One means the Court used to expand integration was the 'teleological method of interpretation', whereby the actions of member states were evaluated against 'the accomplishment of the most elementary community goals set forth in the Preamble to the [Rome] treaty'. The teleological method represents an effort to keep current policies consistent with stated goals, and it is analogous to the effort in GATT to keep contracting party trade practices consistent with stated rules. In both cases legal concerns and procedures are an independent force for further cooperation. In large part the WTO was an exercise in consolidation. In the context of a trade negotiation that created a near-revolutionary expansion of international trade rules, the formation of the WTO was a deeply conservative act needed to ensure that the benefits of the new rules would not be lost. The WTO was all about institutional structure and dispute settlement: these are the concerns of conservatives and not revolutionaries, which is why lawyers and legalists took the lead on these issues. The WTO codified the GATT institutional practice that had developed by custom over three decades, and it incorporated a new dispute settlement system that was necessary to keep both old and new rules from becoming a sham. Both the international structure and the dispute settlement system were necessary to preserve and enhance the integrity of the multilateral trade regime that had been built incrementally from the 1940s to the 1990s. What could be the closest reason why the WTO was not formed in the 1970s?
 ....
MCQ-> The current debate on intellectual property rights (IPRs) raises a number of important issues concerning the strategy and policies for building a more dynamic national agricultural research system, the relative roles of public and private sectors, and the role of agribusiness multinational corporations (MNCs). This debate has been stimulated by the international agreement on Trade Related Intellectual Property Rights (TRIPs), negotiated as part of the Uruguay Round. TRIPs, for the first time, seeks to bring innovations in agricultural technology under a new worldwide IPR regime. The agribusiness MNCs (along with pharmaceutical companies) played a leading part in lobbying for such a regime during the Uruguay Round negotiations. The argument was that incentives are necessary to stimulate innovations, and that this calls for a system of patents which gives innovators the sole right to use (or sell/lease the right to use) their innovations for a specified period and protects them against unauthorised copying or use. With strong support of their national governments, they were influential in shaping the agreement on TRIPs, which eventually emerged from the Uruguay Round. The current debate on TRIPs in India - as indeed elsewhere - echoes wider concerns about ‘privatisation’ of research and allowing a free field for MNCs in the sphere of biotechnology and agriculture. The agribusiness corporations, and those with unbounded faith in the power of science to overcome all likely problems, point to the vast potential that new technology holds for solving the problems of hunger, malnutrition and poverty in the world. The exploitation of this potential should be encouraged and this is best done by the private sector for which patents are essential. Some, who do not necessarily accept this optimism, argue that fears of MNC domination are exaggerated and that farmers will accept their products only if they decisively outperform the available alternatives. 
Those who argue against agreeing to introduce an IPR regime in agriculture and encouraging private sector research are apprehensive that this will work to the disadvantage of farmers by making them more and more dependent on monopolistic MNCs. A different, though related apprehension is that extensive use of hybrids and genetically engineered new varieties might increase the vulnerability of agriculture to outbreaks of pests and diseases. The larger, longer-term consequences of reduced biodiversity that may follow from the use of specially bred varieties are also another cause for concern. Moreover, corporations, driven by the profit motive, will necessarily tend to underplay, if not ignore, potential adverse consequences, especially those which are unknown and which may manifest themselves only over a relatively long period. On the other hand, high-pressure advertising and aggressive sales campaigns by private companies can seduce farmers into accepting varieties without being aware of potential adverse effects and the possibility of disastrous consequences for their livelihood if these varieties happen to fail. There is no provision under the laws, as they now exist, for compensating users against such eventualities. Excessive preoccupation with seeds and seed material has obscured other important issues involved in reviewing the research policy. We need to remind ourselves that improved varieties by themselves are not sufficient for sustained growth of yields. In our own experience, some of the early high yielding varieties (HYVs) of rice and wheat were found susceptible to widespread pest attacks; and some had problems of grain quality. Further research was necessary to solve these problems. This largely successful research was almost entirely done in public research institutions.
Of course, it could in principle have been done by private companies, but whether they choose to do so depends crucially on the extent of the loss in market for their original introductions on account of the above factors and whether the companies are financially strong enough to absorb the ‘losses’, invest in research to correct the deficiencies and recover the lost market. Public research, which is not driven by profit, is better placed to take corrective action. Research for improving common pool resource management, maintaining ecological health and ensuring sustainability is both critical and also demanding in terms of technological challenge and resource requirements. As such research is crucial to the impact of new varieties, chemicals and equipment in the farmer’s field, private companies should be interested in such research. But their primary interest is in the sale of seed materials, chemicals, equipment and other inputs produced by them. Knowledge and techniques for resource management are not ‘marketable’ in the same way as those inputs. Their application to land, water and forests has a long gestation and their efficacy depends on resolving difficult problems such as designing institutions for proper and equitable management of common pool resources. Public or quasi-public research institutions informed by broader, long-term concerns can only do such work. The public sector must therefore continue to play a major role in the national research system. It is both wrong and misleading to pose the problem in terms of public sector versus private sector or of privatisation of research. We need to address problems likely to arise on account of the public-private sector complementarity, and ensure that the public research system performs efficiently. Complementarity between various elements of research raises several issues in implementing an IPR regime. Private companies do not produce new varieties and inputs entirely as a result of their own research. 
Almost all technological improvement is based on knowledge and experience accumulated from the past, and the results of basic and applied research in public and quasi-public institutions (universities, research organisations). Moreover, as is increasingly recognised, accumulated stock of knowledge does not reside only in the scientific community and its academic publications, but is also widely diffused in traditions and folk knowledge of local communities all over. The deciphering of the structure and functioning of DNA forms the basis of much of modern biotechnology. But this fundamental breakthrough is a 'public good' freely accessible in the public domain and usable free of any charge. Various techniques developed using that knowledge can however be, and are, patented for private profit. Similarly, private corporations draw extensively, and without any charge, on germplasm available in varieties of plant species (neem and turmeric are by now famous examples). Publicly funded gene banks as well as new varieties bred by public sector research stations can also be used freely by private enterprises for developing their own varieties and seek patent protection for them. Should private breeders be allowed free use of basic scientific discoveries? Should the repositories of traditional knowledge and germplasm, which are maintained and improved by publicly funded organisations, also be freely accessible? Or should users be made to pay for such use? If they are to pay, what should be the basis of compensation? Should the compensation be for individuals or for communities/institutions to which they belong? Should individual institutions be given the right of patenting their innovations? These are some of the important issues that deserve more attention than they now get and need serious detailed study to evolve reasonably satisfactory, fair and workable solutions. Finally, the tendency to equate the public sector with the government is wrong.
The public space is much wider than government departments and includes co-operatives, universities, public trusts and a variety of non-governmental organisations (NGOs). Giving greater autonomy to research organisations from government control and giving non-government public institutions the space and resources to play a larger, more effective role in research, is therefore an issue of direct relevance in restructuring the public research system. Which one of the following statements describes an important issue, or important issues, not being raised in the context of the current debate on IPRs?
 ....
MCQ-> Question Numbers: (55 to 58) In a square layout of size 5m × 5m, 25 equal-sized square platforms of different heights are built. The heights (in metres) of the individual platforms are as shown below: Individuals (all of the same height) are seated on these platforms. We say an individual A can reach individual B if all three of the following conditions are met: (i) A and B are in the same row or column; (ii) A is at a lower height than B; (iii) if there is/are any individual(s) between A and B, such individual(s) must be at a height lower than that of A. Thus, in the table given above, consider the individual seated at height 8 in the 3rd row and 2nd column. He can be reached by four individuals: by the individual on his left at height 7, by the two individuals on his right at heights of 4 and 6, and by the individual above at height 5. How many individuals in this layout can be reached by just one individual?
 ....
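The reachability rule in the last question above is easy to mechanise. Since the original height table did not survive here, the sketch below uses a hypothetical 5×5 grid whose 3rd row and 2nd column are filled in to match the worked example in the passage (7 on the left, 4 and 6 to the right, 5 above); all other heights are arbitrary filler:

```python
def can_reach(grid, a, b):
    """True if the individual at cell a can reach the one at cell b.

    Conditions (from the question): same row or column, a strictly
    lower than b, and everyone strictly between them lower than a.
    """
    (ra, ca), (rb, cb) = a, b
    if a == b or (ra != rb and ca != cb):
        return False                      # must share a row or a column
    if grid[ra][ca] >= grid[rb][cb]:
        return False                      # A must be strictly lower than B
    if ra == rb:                          # same row: scan the cells between
        step = 1 if cb > ca else -1
        between = [grid[ra][c] for c in range(ca + step, cb, step)]
    else:                                 # same column
        step = 1 if rb > ra else -1
        between = [grid[r][ca] for r in range(ra + step, rb, step)]
    return all(h < grid[ra][ca] for h in between)

def reachers(grid, b):
    """All cells whose occupant can reach the individual at cell b."""
    n = len(grid)
    return [(r, c) for r in range(n) for c in range(n)
            if can_reach(grid, (r, c), b)]

# Hypothetical height grid: only the 3rd row and 2nd column are
# constrained by the passage's example; the 3s are filler.
grid = [[3, 9, 3, 3, 3],
        [3, 5, 3, 3, 3],
        [7, 8, 4, 6, 5],
        [3, 9, 3, 3, 3],
        [3, 2, 3, 3, 3]]

print(len(reachers(grid, (2, 1))))  # 4, as in the passage's example
```

With the real table in place, the question itself reduces to counting the cells b for which `len(reachers(grid, b)) == 1`.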