1. In a series resonant circuit, Q is more than 10. Then the lower half-power frequency ω1 and the resonant frequency ω0 are related as
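The relation being tested can be checked numerically. The sketch below (an illustration added here, not part of the original question) uses the exact half-power frequencies of a series RLC circuit, ω1,2 = ω0(√(1 + 1/4Q²) ∓ 1/2Q), to show that for Q > 10 the lower half-power frequency is well approximated by ω1 ≈ ω0 − ω0/2Q, i.e. the resonant frequency minus half the bandwidth.

```python
import math

def half_power_freqs(w0, Q):
    """Exact half-power (-3 dB) frequencies of a series RLC circuit.

    w1, w2 = w0 * (sqrt(1 + 1/(4*Q^2)) -/+ 1/(2*Q))
    """
    root = math.sqrt(1 + 1 / (4 * Q**2))
    w1 = w0 * (root - 1 / (2 * Q))   # lower half-power frequency
    w2 = w0 * (root + 1 / (2 * Q))   # upper half-power frequency
    return w1, w2

w0, Q = 1000.0, 10.0                 # illustrative values
w1, w2 = half_power_freqs(w0, Q)
approx_w1 = w0 - w0 / (2 * Q)        # high-Q approximation: w1 ~ w0 - BW/2
print(w1, approx_w1)                 # the two agree to within ~0.13% at Q = 10
print(w2 - w1, w0 / Q)               # bandwidth w2 - w1 equals w0/Q exactly
```

Note that the bandwidth ω2 − ω1 = ω0/Q holds exactly for any Q; only the symmetric placement of ω1 and ω2 about ω0 is an approximation that becomes good when Q exceeds about 10.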





Show Similar Question And Answers
QA->At the time of short-circuit, what will be the current in the circuit?....
QA->A is taller than B; B is taller than C; D is taller than E and E is taller than B. Who is the shortest?....
QA->Which dam has broken the world record for annual hydroelectric power production, more than a decade after it became the world's largest power plant?....
QA->The former Rwandan army colonel who was sentenced to life in prison by the United Nations tribunal for genocide and crimes against humanity for masterminding the killings of more than half a million people in a 100-day slaughter in 1994?....
MCQ-> The broad scientific understanding today is that our planet is experiencing a warming trend over and above natural and normal variations that is almost certainly due to human activities associated with large-scale manufacturing. The process began in the late 1700s with the Industrial Revolution, when manual labor, horsepower, and water power began to be replaced by or enhanced by machines. This revolution, over time, shifted Britain, Europe, and eventually North America from largely agricultural and trading societies to manufacturing ones, relying on machinery and engines rather than tools and animals. The Industrial Revolution was at heart a revolution in the use of energy and power. Its beginning is usually dated to the advent of the steam engine, which was based on the conversion of chemical energy in wood or coal to thermal energy and then to mechanical work - primarily the powering of industrial machinery and steam locomotives. Coal eventually supplanted wood because, pound for pound, coal contains twice as much energy as wood (measured in BTUs, or British thermal units, per pound) and because its use helped to save what was left of the world's temperate forests. Coal was used to produce heat that went directly into industrial processes, including metallurgy, and to warm buildings, as well as to power steam engines. When crude oil came along in the mid-1800s, still a couple of decades before electricity, it was burned, in the form of kerosene, in lamps to make light - replacing whale oil. It was also used to provide heat for buildings and in manufacturing processes, and as a fuel for engines used in industry and propulsion. In short, one can say that the main forms in which humans need and use energy are for light, heat, mechanical work and motive power, and electricity - which can be used to provide any of the other three, as well as to do things that none of those three can do, such as electronic communications and information processing.
Since the Industrial Revolution, all these energy functions have been powered primarily, but not exclusively, by fossil fuels that emit carbon dioxide (CO2). To put it another way, the Industrial Revolution gave a whole new prominence to what Rochelle Lefkowitz, president of Pro-Media Communications and an energy buff, calls "fuels from hell" - coal, oil, and natural gas. All these fuels from hell come from underground, are exhaustible, and emit CO2 and other pollutants when they are burned for transportation, heating, and industrial use. These fuels are in contrast to what Lefkowitz calls "fuels from heaven" - wind, hydroelectric, tidal, biomass, and solar power. These all come from above ground, are endlessly renewable, and produce no harmful emissions. Meanwhile, industrialization promoted urbanization, and urbanization eventually gave birth to suburbanization. This trend, which was repeated across America, nurtured the development of the American car culture, the building of a national highway system, and a mushrooming of suburbs around American cities, which rewove the fabric of American life. Many other developed and developing countries followed the American model, with all its upsides and downsides. The result is that today we have suburbs and ribbons of highways that run in, out, and around not only America's major cities, but China's, India's, and South America's as well. And as these urban areas attract more people, the sprawl extends in every direction. All the coal, oil, and natural gas inputs for this new economic model seemed relatively cheap, relatively inexhaustible, and relatively harmless - or at least relatively easy to clean up afterward. So there wasn't much to stop the juggernaut of more people and more development and more concrete and more buildings and more cars and more coal, oil, and gas needed to build and power them.
Summing it all up, Andy Karsner, the Department of Energy's assistant secretary for energy efficiency and renewable energy, once said to me: "We built a really inefficient environment with the greatest efficiency ever known to man." Beginning in the second half of the twentieth century, a scientific understanding began to emerge that an excessive accumulation of largely invisible pollutants - called greenhouse gases - was affecting the climate. The buildup of these greenhouse gases had been under way since the start of the Industrial Revolution in a place we could not see and in a form we could not touch or smell. These greenhouse gases, primarily carbon dioxide emitted from human industrial, residential, and transportation sources, were not piling up along roadsides or in rivers, in cans or empty bottles, but, rather, above our heads, in the earth's atmosphere. If the earth's atmosphere was like a blanket that helped to regulate the planet's temperature, the CO2 buildup was having the effect of thickening that blanket and making the globe warmer. Those bags of CO2 from our cars float up and stay in the atmosphere, along with bags of CO2 from power plants burning coal, oil, and gas, and bags of CO2 released from the burning and clearing of forests, which releases all the carbon stored in trees, plants, and soil. In fact, many people don't realize that deforestation in places like Indonesia and Brazil is responsible for more CO2 than all the world's cars, trucks, planes, ships, and trains combined - that is, about 20 percent of all global emissions. And when we're not tossing bags of carbon dioxide into the atmosphere, we're throwing up other greenhouse gases, like methane (CH4) released from rice farming, petroleum drilling, coal mining, animal defecation, solid waste landfill sites, and yes, even from cattle belching. Cattle belching? That's right - the striking thing about greenhouse gases is the diversity of sources that emit them.
A herd of cattle belching can be worse than a highway full of Hummers. Livestock gas is very high in methane, which, like CO2, is colorless and odorless. And like CO2, methane is one of those greenhouse gases that, once released into the atmosphere, also absorb heat radiating from the earth's surface. "Molecule for molecule, methane's heat-trapping power in the atmosphere is twenty-one times stronger than carbon dioxide, the most abundant greenhouse gas," reported Science World (January 21, 2002). “With 1.3 billion cows belching almost constantly around the world (100 million in the United States alone), it's no surprise that methane released by livestock is one of the chief global sources of the gas, according to the U.S. Environmental Protection Agency ... 'It's part of their normal digestion process,' says Tom Wirth of the EPA. 'When they chew their cud, they regurgitate [spit up] some food to rechew it, and all this gas comes out.' The average cow expels 600 liters of methane a day, climate researchers report." What is the precise scientific relationship between these expanded greenhouse gas emissions and global warming? Experts at the Pew Center on Climate Change offer a handy summary in their report "Climate Change 101." Global average temperatures, notes the Pew study, "have experienced natural shifts throughout human history. For example, the climate of the Northern Hemisphere varied from a relatively warm period between the eleventh and fifteenth centuries to a period of cooler temperatures between the seventeenth century and the middle of the nineteenth century. However, scientists studying the rapid rise in global temperatures during the late twentieth century say that natural variability cannot account for what is happening now."
The new factor is the human factor - our vastly increased emissions of carbon dioxide and other greenhouse gases from the burning of fossil fuels such as coal and oil as well as from deforestation, large-scale cattle-grazing, agriculture, and industrialization. “Scientists refer to what has been happening in the earth’s atmosphere over the past century as the ‘enhanced greenhouse effect’”, notes the Pew study. By pumping man-made greenhouse gases into the atmosphere, humans are altering the process by which naturally occurring greenhouse gases, because of their unique molecular structure, trap the sun’s heat near the earth’s surface before that heat radiates back into space. "The greenhouse effect keeps the earth warm and habitable; without it, the earth's surface would be about 60 degrees Fahrenheit colder on average. Since the average temperature of the earth is about 45 degrees Fahrenheit, the natural greenhouse effect is clearly a good thing. But the enhanced greenhouse effect means even more of the sun's heat is trapped, causing global temperatures to rise. Among the many scientific studies providing clear evidence that an enhanced greenhouse effect is under way was a 2005 report from NASA's Goddard Institute for Space Studies. Using satellites, data from buoys, and computer models to study the earth's oceans, scientists concluded that more energy is being absorbed from the sun than is emitted back to space, throwing the earth's energy out of balance and warming the globe." Which of the following statements is correct? (I) Greenhouse gases are responsible for global warming. They should be eliminated to save the planet (II) CO2 is the most dangerous of the greenhouse gases. Reduction in the release of CO2 would surely bring down the temperature (III) The greenhouse effect could be traced back to the industrial revolution.
But the current development and the patterns of life have enhanced their emissions (IV) Deforestation has been one of the biggest factors contributing to the emission of greenhouse gases. Choose the correct option:....
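The passage's livestock-methane figures lend themselves to a rough cross-check. The sketch below (added for illustration, not part of the original text) takes the cow count, the per-cow output, and the factor of 21 straight from the passage; the methane density and the mass-basis application of that factor are assumptions made here.

```python
# Rough cross-check of the passage's livestock-methane numbers.
COWS = 1.3e9            # global cattle population (from the passage)
LITRES_PER_DAY = 600.0  # methane expelled per cow per day (from the passage)
GWP_METHANE = 21.0      # heat-trapping power relative to CO2 (from the passage)
CH4_DENSITY = 0.657     # kg per cubic metre at ~25 C (assumed value)

litres_per_year = COWS * LITRES_PER_DAY * 365
m3_per_year = litres_per_year / 1000.0            # litres -> cubic metres
tonnes_ch4 = m3_per_year * CH4_DENSITY / 1000.0   # kg -> tonnes
tonnes_co2e = tonnes_ch4 * GWP_METHANE            # CO2-equivalent mass

print(f"~{m3_per_year:.2e} cubic metres of methane per year")
print(f"~{tonnes_ch4 / 1e6:.0f} Mt of CH4, ~{tonnes_co2e / 1e9:.1f} Gt CO2-equivalent")
```

Taken at face value, the passage's numbers imply on the order of a couple of hundred megatonnes of methane per year, which is why the article can plausibly rank cattle among the chief global sources of the gas.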
MCQ-> Read the following passage carefully and answer the questions given at the end. Passage 4: Public sector banks (PSBs) are pulling back on credit disbursement to lower rated companies, as they keep a closer watch on using their own scarce capital and the banking regulator heightens its scrutiny on loans being sanctioned. Bankers say the Reserve Bank of India has started strictly monitoring how banks are utilizing their capital. Any big-ticket loan to lower rated companies is being questioned. Almost all large public sector banks that reported their first quarter results so far have shown a contraction in credit disbursal on a year-to-date basis, as most banks have shifted to a strategy of lending largely to government-owned "Navratna" companies and highly rated private sector companies. On a sequential basis too, banks have grown their loan book at an anaemic rate. To be sure, in the first quarter, loan demand is not quite robust. However, in the first quarter last year, banks had healthier loan growth on a sequential basis than this year. The country's largest lender State Bank of India grew its loan book at only 1.21% quarter-on-quarter. Meanwhile, Bank of Baroda and Punjab National Bank shrank their loan book by 1.97% and 0.66% respectively in the first quarter on a sequential basis. Last year, State Bank of India had seen sequential loan growth of 3.37%, while Bank of Baroda had seen a smaller contraction of 0.22%. Punjab National Bank had seen a growth of 0.46% in loan book between the January-March and April-June quarters last year. On a year-to-date basis, SBI's credit growth fell more than 2%, Bank of Baroda's credit growth contracted 4.71% and Bank of India's credit growth shrank about 3%. SBI chief Arundhati Bhattacharya said the bank's year-to-date credit growth fell as the bank focused on ‘A’ rated customers. About 90% of the loans in the quarter were given to high-rated companies.
"Part of this was a conscious decision and part of it is because we actually did not get good fresh proposals in the quarter," Bhattacharya said. According to bankers, while part of the credit contraction is due to the economic slowdown, capital constraints and reluctance to take on excessive risk have also played a role. "Most of the PSU banks are facing pressure on capital adequacy. It is challenging to maintain 9% core capital adequacy. The pressure on monitoring capital adequacy and maintaining capital buffer is so strict that you cannot grow aggressively," said Rupa Rege Nitsure, chief economist at Bank of Baroda. Nitsure said capital conservation pressures will substantially cut down "irrational expansion of loans" in some smaller banks, which used to grow at a rate much higher than the industry average. The companies coming to banks, in turn, will have to make themselves more creditworthy for banks to lend. "The conservation of capital is going to inculcate a lot of discipline in both banks and borrowers," she said. For every loan that a bank disburses, some amount of money is required to be set aside as provision. The lower the credit rating of the company, the riskier the loan is perceived to be. Thus, the bank is required to set aside more capital for a lower rated company than what it otherwise would do for a higher rated client. New international accounting norms, known as Basel III norms, require banks to maintain higher capital and higher liquidity. They also require a bank to set aside "buffer" capital to meet contingencies. As per the norms, a bank's total capital adequacy ratio should be 12% at any time, in which tier-I, or the core capital, should be at 9%. Capital adequacy is calculated by dividing total capital by risk-weighted assets.
If the loans have been given to lower rated companies, the risk weight goes up and capital adequacy falls. According to bankers, all loan decisions are now being assessed on the basis of the capital that needs to be set aside as provision against the loan and as a result, loans to lower rated companies are being avoided. According to a senior banker with a public sector bank, the capital adequacy situation is so precarious in some banks that if the risk weight increases a few basis points, the proposal gets cancelled. The banker did not wish to be named. One basis point is one hundredth of a percentage point. Bankers add that the Reserve Bank of India has also started strictly monitoring how banks are utilising their capital. Any big-ticket loan to lower rated companies is being questioned. In this scenario, banks are looking for safe bets, even if it means that profitability is being compromised. "About 25% of our loans this quarter was given to Navratna companies, who pay at base rate. This resulted in contraction of our net interest margin (NIM)," said Bank of India chairperson V.R. Iyer, while discussing the bank's first quarter results with the media. Bank of India's NIM, or the difference between yields on advances and cost of deposits, a key gauge of profitability, fell in the first quarter to 2.45% from 3.07% a year ago, as the bank focused on lending to highly rated customers. Analysts, however, say the strategy being followed by banks is short-sighted. "A high rated client will take loans at base rate and will not give any fee income to a bank. A bank will never be profitable that way. Besides, there are only so many PSU companies to chase. All banks cannot be chasing them all at a time. Fact is, the banks are badly hit by NPA and are afraid to lend now to big projects.
They need capital, true, but they have become risk-averse," said a senior analyst with a local brokerage who did not wish to be named. Various estimates suggest that Indian banks would require more than Rs. 2 trillion of additional capital to have this kind of capital adequacy ratio by 2019. The central government, which owns the majority share of these banks, has been cutting down on its commitment to recapitalize the banks. In 2013-14, the government infused Rs. 14,000 crore in its banks. However, in 2014-15, the government will infuse just Rs. 11,200 crore. Which of the following statements is correct according to the passage?
 ....
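The capital-adequacy mechanics the passage describes (a ratio of total capital to risk-weighted assets, with lower-rated borrowers carrying higher risk weights) can be sketched numerically. This is an illustration added here, not part of the original passage: the function, loan amounts, and risk weights are invented for the example, not Basel-prescribed values.

```python
def capital_adequacy_ratio(total_capital, exposures):
    """CAR = total capital / risk-weighted assets.

    `exposures` is a list of (loan_amount, risk_weight) pairs;
    risk-weighted assets are the weighted sum of the exposures.
    """
    rwa = sum(amount * weight for amount, weight in exposures)
    return total_capital / rwa

# Illustrative loan book: highly rated borrowers get a low risk weight,
# lower-rated corporates a full one (weights are made up for this sketch).
book = [(500.0, 0.2),   # highly rated / 'Navratna'-style borrowers
        (300.0, 1.0)]   # lower-rated corporates
print(capital_adequacy_ratio(48.0, book))      # 0.12: meets a 12% norm

# A rating downgrade raises the risk weight on the same loans...
riskier = [(500.0, 0.2), (300.0, 1.5)]
print(capital_adequacy_ratio(48.0, riskier))   # ~0.087: falls below the norm
```

This is the dynamic the bankers quoted in the passage describe: the loan book need not grow at all for capital adequacy to fall; a few risk-weight increases on existing or proposed loans are enough.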
MCQ-> Read the following passage carefully and answer the questions given below it. Certain words/phrases have been printed in bold to help you locate them while answering some of the questions. During the last few years, a lot of hype has been heaped on the BRICS (Brazil, Russia, India, China, and South Africa). With their large populations and rapid growth, these countries, so the argument goes, will soon become some of the largest economies in the world and, in the case of China, the largest of all by as early as 2020. But the BRICS, as well as many other emerging-market economies, have recently experienced a sharp economic slowdown. So, is the honeymoon over? Brazil’s GDP grew by only 1% last year, and may not grow by more than 2% this year, with its potential growth barely above 3%. Russia’s economy may grow by barely 2% this year, with potential growth also at around 3%, despite oil prices being around $100 a barrel. India had a couple of years of strong growth recently (11.2% in 2010 and 7.7% in 2011) but slowed to 4% in 2012. China’s economy grew by 10% a year for the last three decades, but slowed to 7.8% last year and risks a hard landing. And South Africa grew by only 2.5% last year and may not grow faster than 2% this year. Many other previously fast-growing emerging-market economies – for example, Turkey, Argentina, Poland, Hungary, and many in Central and Eastern Europe – are experiencing a similar slowdown. So, what is ailing the BRICS and other emerging markets? First, most emerging-market economies were overheating in 2010-2011, with growth above potential and inflation rising and exceeding targets. Many of them thus tightened monetary policy in 2011, with consequences for growth in 2012 that have carried over into this year.
Second, the idea that emerging-market economies could fully decouple from economic weakness in advanced economies was farfetched: recession in the eurozone, near-recession in the United Kingdom and Japan in 2011-2012, and slow economic growth in the United States were always likely to affect emerging market performance negatively – via trade, financial links, and investor confidence. For example, the ongoing euro zone downturn has hurt Turkey and emerging-market economies in Central and Eastern Europe, owing to trade links. Third, most BRICS and a few other emerging markets have moved toward a variant of state capitalism. This implies a slowdown in reforms that increase the private sector’s productivity and economic share, together with a greater economic role for state-owned enterprises (and for state-owned banks in the allocation of credit and savings), as well as resource nationalism, trade protectionism, import substitution industrialization policies, and imposition of capital controls. This approach may have worked at earlier stages of development and when the global financial crisis caused private spending to fall; but it is now distorting economic activity and depressing potential growth. Indeed, China’s slowdown reflects an economic model that is, as former Premier Wen Jiabao put it, “unstable, unbalanced, uncoordinated, and unsustainable,” and that now is adversely affecting growth in emerging Asia and in commodity-exporting emerging markets from Asia to Latin America and Africa. The risk that China will experience a hard landing in the next two years may further hurt many emerging economies. Fourth, the commodity super-cycle that helped Brazil, Russia, South Africa, and many other commodity-exporting emerging markets may be over.
Indeed, a boom would be difficult to sustain, given China’s slowdown, higher investment in energy-saving technologies, less emphasis on capital- and resource-oriented growth models around the world, and the delayed increase in supply that high prices induced. The fifth, and most recent, factor is the US Federal Reserve’s signals that it might end its policy of quantitative easing earlier than expected, and its hints of an eventual exit from zero interest rates, both of which have caused turbulence in emerging economies’ financial markets. Even before the Fed’s signals, emerging-market equities and commodities had underperformed this year, owing to China’s slowdown. Since then, emerging-market currencies and fixed-income securities (government and corporate bonds) have taken a hit. The era of cheap or zero-interest money that led to a wall of liquidity chasing high yields and assets – equities, bonds, currencies, and commodities – in emerging markets is drawing to a close. Finally, while many emerging-market economies tend to run current-account surpluses, a growing number of them – including Turkey, South Africa, Brazil, and India – are running deficits. And these deficits are now being financed in riskier ways: more debt than equity; more short-term debt than long-term debt; more foreign-currency debt than local-currency debt; and more financing from fickle cross-border interbank flows. These countries share other weaknesses as well: excessive fiscal deficits, above-target inflation, and stability risk (reflected not only in the recent political turmoil in Brazil and Turkey, but also in South Africa’s labour strife and India’s political and electoral uncertainties). The need to finance the external deficit and to avoid excessive depreciation (and even higher inflation) calls for raising policy rates or keeping them on hold at high levels. But monetary tightening would weaken already-slow growth.
Thus, emerging economies with large twin deficits and other macroeconomic fragilities may experience further downward pressure on their financial markets and growth rates. These factors explain why growth in most BRICS and many other emerging markets has slowed sharply. Some factors are cyclical, but others – state capitalism, the risk of a hard landing in China, the end of the commodity supercycle – are more structural. Thus, many emerging markets’ growth rates in the next decade may be lower than in the last – as may the outsize returns that investors realised from these economies’ financial assets (currencies, equities, bonds, and commodities). Of course, some of the better-managed emerging-market economies will continue to experience rapid growth and asset outperformance. But many of the BRICS, along with some other emerging economies, may hit a thick wall, with growth and financial markets taking a serious beating. Which of the following statement(s) is/are true as per the given information in the passage? A. Brazil’s GDP grew by only 1% last year, and is expected to grow by approximately 2% this year. B. China’s economy grew by 10% a year for the last three decades but slowed to 7.8% last year. C. BRICS is a group of nations — Brazil, Russia, India, China and South Africa.....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off. In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today’s electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome. Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, D.C., exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field – a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called “giant” magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor - a reservoir of electrical charge that is either empty or full - to represent a zero or a one. In the NRL’s magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element’s resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory’s elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line. Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM’s research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions. Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one. To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr.
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs. Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor — a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain. It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out “the target is to go after the logic circuits.” Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch — such as optical, biological and quantum computing — remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own. In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
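The read/write scheme the passage describes for the NRL-style design (a bit stored as the magnetic orientation of an element, read back by measuring resistance rather than charge) can be sketched as a toy model. This sketch is added for illustration only: the class, the resistance values, and the API are all invented, not taken from the passage.

```python
class MagneticMemory:
    """Toy model of a magneto-resistive memory: each element stores a bit
    as a magnetic orientation, and a read measures resistance, not charge.
    The two resistance values below are made up for illustration."""

    R_LOW, R_HIGH = 100.0, 130.0  # ohms for the two orientations (assumed)

    def __init__(self, size):
        # False = anticlockwise = 0; the state persists without power,
        # which is what makes the memory non-volatile.
        self.orientation = [False] * size

    def write(self, addr, bit):
        # Select the element via the wire matrix and set its orientation.
        self.orientation[addr] = bool(bit)

    def read(self, addr):
        # Pass a current through the element and infer the bit from its
        # resistance: above the midpoint means the 'one' orientation.
        resistance = self.R_HIGH if self.orientation[addr] else self.R_LOW
        return 1 if resistance > (self.R_LOW + self.R_HIGH) / 2 else 0

mem = MagneticMemory(8)
mem.write(3, 1)
print([mem.read(i) for i in range(8)])  # only element 3 reads back as 1
```

The MTJ approach in the passage differs only in the physical read-out (tunnelling current through the oxide barrier instead of GMR resistance); the addressing logic sketched here applies to both.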
MCQ-> Crinoline and croquet are out. As yet, no political activists have thrown themselves in front of the royal horse on Derby Day. Even so, some historians can spot the parallels. It is a time of rapid technological change. It is a period when the dominance of the world’s superpower is coming under threat. It is an epoch when prosperity masks underlying economic strain. And, crucially, it is a time when policy-makers are confident that all is for the best in the best of all possible worlds. Welcome to the Edwardian Summer of the second age of globalisation. Spare a moment to take stock of what’s been happening in the past few months. Let’s start with the oil price, which has rocketed to more than $65 a barrel, more than double its level 18 months ago. The accepted wisdom is that we shouldn’t worry our little heads about that, because the incentives are there for business to build new production and refining capacity, which will effortlessly bring demand and supply back into balance and bring crude prices back to $25 a barrel. As Tommy Cooper used to say, ‘just like that’. Then there is the result of the French referendum on the European Constitution, in which the French were seen as thick-headed luddites railing vainly against the modern world. What the French needed to realise, the argument went, was that there was no alternative to the reforms that would make the country more flexible, more competitive, more dynamic. Just the sort of reforms that allowed Gate Gourmet to sack hundreds of its staff at Heathrow after the sort of ultimatum that used to be handed out by Victorian mill owners. An alternative way of looking at the French “non” is that our neighbours translate “flexibility” as “you’re fired”. Finally, take a squint at the United States. Just like Britain a century ago, a period of unquestioned superiority is drawing to a close. China is still a long way from matching America’s wealth, but it is growing at a stupendous rate and economic strength brings geo-political clout.
Already, there is evidence of a new scramble for Africa as Washington and Beijing compete for oil stocks. Moreover, beneath the surface of the US economy, all is not well. Growth looks healthy enough, but competition from China and elsewhere has meant the world’s biggest economy now imports far more than it exports. The US is living beyond its means, but in this time of studied complacency a current account deficit worth 6 per cent of gross domestic product is seen as a sign of strength, not weakness.

In this new Edwardian summer, comfort is taken from the fact that dearer oil has not had the savage inflationary consequences of 1973-74, when a fourfold increase in the cost of crude brought an abrupt end to a postwar boom that had gone on uninterrupted for a quarter of a century. True, the cost of living has been affected by higher transport costs, but we are talking of inflation at 3 per cent, not 27 per cent.

Yet the idea that higher oil prices are of little consequence is fanciful. If people are paying more to fill up their cars, it leaves them with less to spend on everything else, but there is a reluctance to consume less. In the 1970s, unions were strong and able to negotiate large, compensatory pay deals that served to intensify inflationary pressure. In 2005, that avenue is pretty much closed off, but the abolition of all the controls on credit that existed in the 1970s means that households are invited to borrow more rather than consume less.
The knock-on effects of higher oil prices are thus felt in different ways – through high levels of indebtedness, in inflated asset prices, and in balance of payments deficits.

There are those who point out, rightly, that modern industrial capitalism has proved mightily resilient these past 250 years, and that a sign of the enduring strength of the system has been the way it apparently shrugged off everything – a stock market crash, 9/11, rising oil prices – that has been thrown at it in the half decade since the millennium. Even so, there are at least three reasons for concern.

First, we have been here before. In terms of political economy, the first era of globalisation mirrored our own. There was a belief in unfettered capital flows, in free trade, and in the power of the market. It was a time of massive income inequality and unprecedented migration. Eventually, though, there was a backlash, manifested in a struggle between free traders and protectionists, and in rising labour militancy.

Second, the world is traditionally at its most fragile at times when the global balance of power is in flux. By the end of the nineteenth century, Britain’s role as the hegemonic power was being challenged by the rise of the United States, Germany, and Japan, while the Ottoman and Hapsburg empires were clearly in rapid decline. Looking ahead from 2005, it is clear that over the next two or three decades both China and India – which together account for half the world’s population – will flex their muscles.

Finally, there is the question of what rising oil prices tell us. The emergence of China and India means global demand for crude is likely to remain high at a time when experts say production is about to top out. If supply constraints start to bite, any declines in the price are likely to be short-term cyclical affairs punctuating a long upward trend.

By the expression ‘Edwardian Summer’, the author refers to a period in which there is
© 2002-2017 Omega Education PVT LTD