1. The administrative reform that introduced the element of direct election for the first time:





Similar Questions and Answers
QA->Which country's Prime Minister resigned on December 5, 2016, after being defeated in a referendum on his plan to reform the constitution?
QA->What has happened to the share of direct tax in the post-economic-reform period?
QA->What are the total earnings of a worker from the following data? Standard time for completing the job: 50 hours; actual time taken: 45 hours; time rate: Rs. 20 per hour; premium bonus: 60% of time saved. (A worked sketch follows this list.)
QA->In Travancore, who introduced the direct management of Devaswams by the Government?
QA->Which of the following reform movements was the first to be started in the 19th century?
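For the premium-bonus earnings question listed above, here is a minimal worked sketch. It assumes the conventional time-rate-plus-premium-bonus method (basic wages for the hours actually worked, plus 60% of the time saved paid at the time rate); the question itself does not spell out the formula, so that method is an assumption.

    # Worked sketch of the premium-bonus question (assumed standard formula).
    standard_hours = 50        # standard time allowed for the job
    actual_hours = 45          # time actually taken
    rate_per_hour = 20         # time rate, Rs. per hour
    bonus_fraction = 0.60      # premium bonus as a fraction of time saved

    time_saved = standard_hours - actual_hours               # 5 hours
    basic_wages = actual_hours * rate_per_hour               # Rs. 900
    bonus = bonus_fraction * time_saved * rate_per_hour      # Rs. 60
    total_earnings = basic_wages + bonus                      # Rs. 960
    print(total_earnings)

Under that assumption the total earnings come to Rs. 960.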
MCQ-> The conventional wisdom says that this is an issue-less election. There is no central personality of whom voters have to express approval or dislike; no central matter of concern that makes this a one-issue referendum like so many elections in the past; no central party around which everything else revolves — the Congress has been displaced from its customary pole position, and no one else has been able to take its place. Indeed, given the all-seeing video cameras of the Election Commission, and the detailed pictures they are putting together on campaign expenditure, there isn't even much electioneering: no slogans on the walls, no loudspeakers blaring forth at all hours of the day and night, no cavalcades of cars heralding the arrival of a candidate at the local bazaar. Forget it being an issue-less election; is this an election at all?

Perhaps the 'fun' of an election lies in its featuring someone whom you can love or hate. But Narasimha Rao has managed to reduce even a general election, involving nearly 600 million voters, to the boring non-event that is the trademark of his election rallies, and indeed of everything else that he does. After all, the Nehru-Gandhi clan has disappeared from the political map, and the majority of voters will not even be able to name P.V. Narasimha Rao as India's Prime Minister. There could be as many as a dozen prime ministerial candidates, ranging from Jyoti Basu to Ramakrishna Hegde, and from Chandra Shekar to (believe it or not) K.R. Narayanan. The sole personality who stands out, therefore, is none of the players, but the umpire: T.N. Seshan.

As for the parties, they are like the blind men of Hindustan, trying in vain to gauge the contours of the animal they have to confront. But it doesn't look as if the issue will be the mandir-masjid, nor will it be Hindutva or economic nationalism. The Congress would like it to be stability, but what does that mean for the majority? Economic reform is a non-issue for most people; with inflation down to barely 4 per cent, prices are not top of the mind either. In a strange twist, after the hawala scandal, corruption has been pushed off the map too.

But ponder for a moment: isn't this state of affairs astonishing, given the context? Consider that so many ministers have had to resign over the hawala issue; that a governor who was a cabinet minister has also had to quit, in the wake of judicial displeasure; that the prime minister himself is under investigation for his involvement in not one scandal but two; that the main prime ministerial candidate from the opposition has had to bow out because he too has been charged in the hawala case; and that the head of the 'third force' has his own little (or not so little) fodder scandal to face. Why then is corruption not an issue — not as a matter of competitive politics, but as an issue on which the contenders for power feel that they have to offer the prospect of genuine change? If all this does not make the parties (almost all of whom have broken the law, in not submitting their audited accounts every year to the income tax authorities) realise that the country both needs, and is ready for, change, nothing will. Witness the new activism in the Supreme Court; the assertiveness of the Election Commission, giving new life to a model code of conduct that has been ignored for a quarter century; the independence that has been thrust upon the Central Bureau of Investigation; and the fresh zeal on the part of tax collectors out to nab corporate no-gooders.
Think also that at no other point since the Emergency of 1975-77 have so many people in power been hounded by the system for their misdeeds. Is this just a case of a few individuals outside the political system doing the job, or is the country heading for a new era? The seventies saw the collapse of the national consensus that marked the Nehruvian era, and ideology took over in the Indira Gandhi years. That too was buried by Rajiv Gandhi and his technocratic friends. And now we have these issue-less elections. One possibility is that the country is heading for a period of constitutionalism, as the other arms of the state reclaim some of the powers they lost, or yielded, to the political establishment. Economic reform freed one part of Indian society from the clutches of the political class. Now this could spread to other parts of the system. Against such a dramatic backdrop, it should be obvious that people (voters) are looking for accountability, for ways in which to make a corrupted system work again. And the astonishing thing is that no party has sought to ride this particular wave; instead, all are on the defensive, desperately evading the real issues. No wonder this is an 'issue-less' election.

Why does the author probably say that the sole personality who stands out in the elections is T.N. Seshan?
 ....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off.

In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome.

Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, DC, exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field — a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor. This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive.

Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor, a reservoir of electrical charge that is either empty or full, to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques.
But it will be some years before the first chips roll off the production line.

Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard. IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same direction or in opposite directions.

Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one.

To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr. Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs.

Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor — a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain.

It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented.
Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out, "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch — such as optical, biological and quantum computing — remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies. But computing with magnetism evidently has an attraction all its own.

In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
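The passage above describes reading a magnetic bit by measuring electrical resistance: in a GMR element the resistance itself encodes the orientation, and in an MTJ the tunnelling current through the sandwich is higher when the two layers are polarised in the same direction. As a purely illustrative sketch of that read/write logic (not taken from the passage; the names and resistance values below are hypothetical):

    # Illustrative model of reading a magnetic-tunnel-junction (MTJ) bit.
    # Tunnelling is easier when the two layers are aligned (parallel), so
    # resistance is lower for parallel and higher for anti-parallel; comparing
    # the measured resistance against a threshold recovers the stored bit.
    # All numbers here are made-up illustration values, not data from the passage.

    R_PARALLEL = 1_000        # ohms: layers polarised the same way  -> stores 0
    R_ANTIPARALLEL = 2_000    # ohms: layers polarised opposite ways -> stores 1
    READ_THRESHOLD = 1_500    # ohms: midpoint used to decide 0 vs 1

    def write_bit(bit: int) -> int:
        """'Write' a bit by setting the free layer; return the element's resistance."""
        return R_ANTIPARALLEL if bit else R_PARALLEL

    def read_bit(resistance: float) -> int:
        """Read a bit back by comparing the measured resistance to the threshold."""
        return 1 if resistance > READ_THRESHOLD else 0

    for bit in (0, 1):
        assert read_bit(write_bit(bit)) == bit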
MCQ-> Read the passage carefully and answer the questions given at the end of each passage: Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors. In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies.

Traditionally, a long chain of partners was involved in getting a product to the customer. Let's say you have a factory building a PC we'll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, "I've got model #4000. Come and buy it." If the consumer says, "But I want model #8000," the dealer replies, "Sorry, I only have model #4000." Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can't sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling price). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older targets. This dangerous and inefficient practice is called "channel stuffing". Worst of all, the customer ends up paying for it by purchasing systems that are already out of date.

Because we were building directly to fill our customers' orders, we didn't have finished-goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw-material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors.

The direct model turns conventional manufacturing inside out. In conventional manufacturing you keep building to forecast, because your plant can't keep stopping and starting. But if you don't know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information. The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information (that is, you know exactly what people want and how much), you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let's say that Dell has six days of inventory.
Compare that to an indirect competitor who has twenty-five days of inventory, with another thirty in their distribution channel. That's a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent. Then there's the threat of getting stuck with obsolete inventory if you're caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don't have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins.

Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don't need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system.

We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results. We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like "flying low to the ground at 300 knots." He was worried that we wouldn't see the trees. In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $33 million. We're now down to six days of inventory and we're starting to measure it in hours instead of days.

Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O (short for "excess and obsolete") became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you're down in the cents range, you're approaching stellar performance.

Find out the TRUE statement:
 ....
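The inventory comparison in the passage above can be checked with a short calculation. The figures below are the ones quoted in the passage (six days of inventory for Dell versus twenty-five plus thirty days for the indirect competitor), and the 6% figure is the passage's own estimate of how far component costs fall over that gap, not an independently derived number:

    # Checking the inventory-days arithmetic quoted in the passage.
    dell_days = 6                     # Dell's inventory, in days
    competitor_own_days = 25          # competitor's own inventory
    competitor_channel_days = 30      # inventory sitting in the distribution channel

    competitor_total = competitor_own_days + competitor_channel_days   # 55 days
    gap_days = competitor_total - dell_days                            # 49 days

    # Per the passage, component costs decline about 6% over those 49 days,
    # which is the extra material-cost exposure the competitor carries.
    cost_decline_over_gap = 0.06
    print(gap_days, cost_decline_over_gap)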
MCQ-> Directions: Read the following passage carefully and answer the questions given below it. Following the end of the Second World War, the United Kingdom enjoyed a long period without a major recession (from 1945 to 1973) and a rapid growth in prosperity in the 1950s and 1960s. According to the OECD, the annual rate of growth (percentage change) between 1960 and 1973 averaged 2.9%, although this figure was far behind the rates of other European countries such as France, West Germany and Italy. However, following the 1973 oil crisis and the 1973-1974 stock market crash, the British economy fell into recession and the government of Edward Heath was ousted by the Labour Party under Harold Wilson. Wilson formed a minority government on 4 March 1974 after the general election on 28 February ended in a hung parliament. Wilson subsequently secured a three-seat majority in a second election in October that year.

The UK recorded weaker growth than many other European nations in the 1970s; even after the early 1970s recession ended, the economy was still blighted by rising unemployment and double-digit inflation. In 1976, the UK was forced to request a loan of $2.3 billion from the International Monetary Fund. The then Chancellor of the Exchequer, Denis Healey, was required to implement public spending cuts and other economic reforms in order to secure the loan. Following the Winter of Discontent, the government of James Callaghan lost a vote of no confidence. This triggered the May 1979 general election, which resulted in Margaret Thatcher's Conservative Party forming a new government.

A new period of neo-liberal economics began in 1979 with the election of Margaret Thatcher, who won the general election on 3 May that year to return the Conservative Party to government after five years of Labour government. During the 1980s most state-owned enterprises were privatised, taxes cut and markets deregulated. GDP fell 5.9% initially, but growth subsequently returned and rose to 5% at its peak in 1988, one of the highest rates of any European nation.

The UK economy had been one of the strongest economies in terms of inflation, interest rates and unemployment, all of which remained relatively low until the 2008-09 recession. Unemployment has since reached a peak of just under 2.5 million (7.8%), the highest level since the early 1990s, although still far lower than in some other European nations. However, interest rates have reduced to 0.5% p.a. During August 2008 the IMF warned that the UK economic outlook had worsened due to a twin shock: financial turmoil and rising commodity prices. Both developments harm the UK more than most developed countries, as the UK obtains revenue from exporting financial services while recording deficits in finished goods and commodities, including food. In 2007, the UK had the world's third largest current account deficit, due mainly to a large deficit in manufactured goods. During May 2008, the IMF advised the UK government to broaden the scope of fiscal policy to promote external balance. Although the UK's "labour productivity per person employed" has been progressing well over the last two decades and has overtaken productivity in Germany, it still lags around 20% behind France, where workers have a 35-hour working week. The UK's labour productivity per hour worked is currently on a par with the average for the old EU (15 countries). In 2010, the United Kingdom ranked 26th on the Human Development Index.
The UK entered a recession in Q2 of 2008, according to the Office for National Statistics, and exited it in Q4 of 2009. The subsequently revised ONS figures show that the UK suffered six consecutive quarters of negative growth, making it the longest recession since records began. As of the end of Q4 2009, revised statistics from the Office for National Statistics demonstrate that the UK economy shrank by 7.2% from peak to trough. The Blue Book 2013 confirms that UK growth in Q2 of 2013 was 0.7%, and that the volume of output of GDP remains 3.2% below its pre-recession peak; the UK economy's recovery has thus been more lacklustre than previously thought. Furthermore, The Blue Book 2013 demonstrates that the UK experienced a deeper initial downturn than all of the G7 economies save for Japan, and has experienced a slower recovery than all but Italy.

A report released by the Office for National Statistics on 14 May 2013 revealed that over the six-year period between 2005 and 2011, the UK dropped from 5th place to 12th place in terms of household income on an international scale; the drop was partially attributed to the devaluation of sterling over this time frame. However, the report also concluded that, during this period, inflation was relatively less volatile, the UK labour market was more resilient in comparison to other recessions, and household spending and wealth in the UK remained relatively strong in comparison with other OECD countries.

According to a report by Moody's Corporation, Britain's debt-to-GDP ratio continues to increase in 2013 and is expected to reach 93% at the end of the year. The UK has lost its triple-A credit rating on the basis of a poor economic outlook. Economic growth in 2013 has surprised many economists and ministers; the OBR in the 2013 budget projected annual growth of just 0.6%. In Q1 of 2013 the economy grew by 0.4%, in Q2 it grew by 0.7%, and in Q3 it is predicted to have grown by 0.8%.

A new period of neo-liberal economics began in the United Kingdom with the election of Margaret Thatcher after five years of Labour government. Margaret Thatcher came to power in
 ....