1. It is possible to develop more than 16 different analog levels using 4-bit resolution.
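For reference, the statement is false: an n-bit converter can distinguish exactly 2^n quantisation levels, so 4-bit resolution yields exactly 16 levels, never more. A minimal sketch (the helper name is my own, not from the question):

```python
# Number of distinct analog levels an n-bit converter can represent.
# (Hypothetical helper for illustration only.)
def analog_levels(bits: int) -> int:
    return 2 ** bits

print(analog_levels(4))  # 16 - a 4-bit resolution gives exactly 2**4 levels, not more
```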



Similar Questions and Answers
QA->In November 2009, more than 50 employees of a nuclear plant in India were affected by high radiation levels: a water cooler supplying drinking water to the employees had been contaminated with tritiated water. Where is that nuclear plant?....
QA->A computer with a 32 bit wide data bus implements its memory using 8 K x 8 static RAM chips. The smallest memory that this computer can have is:....
QA->A is taller than B; B is taller than C; D is taller than E and E is taller than B. Who is the shortest?....
QA->Who built the first mechanical analog computer?....
QA->Scientists develop world's first wireless flexible smartphone. Researchers at Queen's University's Human Media Lab have developed the world's first full-colour, high-resolution and wireless flexible smartphone to combine multitouch with bend input.....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off.

In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome.

Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, DC, exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field - a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor - a reservoir of electrical charge that is either empty or full - to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory.

Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line.

Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions.

Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one.

To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr.
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs.

Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor - a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain.

It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch - such as optical, biological and quantum computing - remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own.

In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
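The MTJ read-out described in the passage above - easier tunnelling, and hence a larger current, when the two magnetic layers are polarised in the same direction - can be sketched as a toy model. All resistance, voltage and threshold figures below are illustrative assumptions of mine, not real device data:

```python
# Toy magnetic tunnel-junction (MTJ) read-out, following the passage:
# parallel layers  -> easier tunnelling -> lower resistance -> read as 1,
# antiparallel     -> harder tunnelling -> higher resistance -> read as 0.
# All numeric values are illustrative assumptions, not device figures.

R_PARALLEL = 1000.0      # ohms when the two layers are polarised the same way
R_ANTIPARALLEL = 2000.0  # ohms when they are polarised in opposite directions
V_READ = 0.1             # volts applied across the junction for a read
I_THRESHOLD = (V_READ / R_PARALLEL + V_READ / R_ANTIPARALLEL) / 2

def read_bit(top_layer: str, bottom_layer: str = "left") -> int:
    """Infer the stored bit from the current flowing through the sandwich."""
    resistance = R_PARALLEL if top_layer == bottom_layer else R_ANTIPARALLEL
    current = V_READ / resistance             # Ohm's law
    return 1 if current > I_THRESHOLD else 0  # high current -> parallel -> 1

print(read_bit("left"))   # parallel alignment: reads as 1
print(read_bit("right"))  # antiparallel alignment: reads as 0
```

The fixed lower layer acts as the reference, exactly as in the passage: only the free top layer is rewritten, and the bit is recovered purely from a current measurement.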
MCQ-> Studies of the factors governing reading development in young children have achieved a remarkable degree of consensus over the past two decades. The consensus concerns the causal role of phonological skills in young children's reading progress. Children who have good phonological skills, or good 'phonological awareness', become good readers and good spellers. Children with poor phonological skills progress more poorly. In particular, those who have a specific phonological deficit are likely to be classified as dyslexic by the time that they are 9 or 10 years old.

Phonological skills in young children can be measured at a number of different levels. The term phonological awareness is a global one, and refers to a deficit in recognising smaller units of sound within spoken words. Developmental work has shown that this deficit can be at the level of syllables, of onsets and rimes, or of phonemes. For example, a four-year-old child might have difficulty in recognising that a word like valentine has three syllables, suggesting a lack of syllabic awareness. A five-year-old might have difficulty in recognising that the odd word out in the set of words fan, cat, hat, mat is fan. This task requires an awareness of the sub-syllabic units of the onset and the rime. The onset corresponds to any initial consonants in a syllable, and the rime corresponds to the vowel and to any following consonants. Rimes correspond to rhyme in single-syllable words, and so the rime in fan differs from the rime in cat, hat and mat. In longer words, rime and rhyme may differ. The onsets in val:en:tine are /v/ and /t/, and the rimes correspond to the spelling patterns 'al', 'en' and 'ine'.

A six-year-old might have difficulty in recognising that plea and pray begin with the same initial sound. This is a phonemic judgement. Although the initial phoneme /p/ is shared between the two words, in plea it is part of the onset 'pl' and in pray it is part of the onset 'pr'.
Until children can segment the onset (or the rime), such phonemic judgements are difficult for them to make. In fact, a recent survey of different developmental studies has shown that the different levels of phonological awareness appear to emerge sequentially. The awareness of syllables, onsets, and rimes appears to emerge at around the ages of 3 and 4, long before most children go to school. The awareness of phonemes, on the other hand, usually emerges at around the age of 5 or 6, when children have been taught to read for about a year. An awareness of onsets and rimes thus appears to be a precursor of reading, whereas an awareness of phonemes at every serial position in a word only appears to develop as reading is taught. The onset-rime and phonemic levels of phonological structure, however, are not distinct. Many onsets in English are single phonemes, and so are some rimes (e.g. sea, go, zoo).

The early availability of onsets and rimes is supported by studies that have compared the development of phonological awareness of onsets, rimes, and phonemes in the same subjects using the same phonological awareness tasks. For example, a study by Treiman and Zudowski used a same/different judgement task based on the beginning or the end sounds of words. In the beginning-sound task, the words either began with the same onset, as in plea and plank, or shared only the initial phoneme, as in plea and pray. In the end-sound task, the words either shared the entire rime, as in spit and wit, or shared only the final phoneme, as in rat and wit. Treiman and Zudowski showed that four- and five-year-old children found the onset-rime version of the same/different task significantly easier than the version based on phonemes. Only the six-year-olds, who had been learning to read for about a year, were able to perform both versions of the tasks with an equal level of success.

From the following statements, pick out the true statement according to the passage.
 ....
MCQ-> Read the following passage carefully and answer the questions given. Certain words/phrases have been given in bold to help you locate them while answering some of the questions.

From a technical and economic perspective, many assessments have highlighted the presence of cost-effective opportunities to reduce energy use in buildings. However, several bodies note the significance of multiple barriers that prevent the take-up of energy efficiency measures in buildings. These include lack of awareness and concern, limited access to reliable information from trusted sources, fear about risk, disruption and other 'transaction costs', concerns about up-front costs and inadequate access to suitably priced finance, a lack of confidence in suppliers and technologies, and the presence of split incentives between landlords and tenants. The widespread presence of these barriers led experts to predict that without a concerted push from policy, two-thirds of the economically viable potential to improve energy efficiency will remain unexploited by 2035. These barriers are an albatross around the neck and represent a classic market failure and a basis for governmental intervention.

While these assessments focus on the technical, financial or economic barriers preventing the take-up of energy efficiency options in buildings, others emphasise the significance of the often deeply embedded social practices that shape energy use in buildings. These analyses focus not on the preferences and rationalities that might shape individual behaviours, but on the 'entangled' cultural practices, norms, values and routines that underpin domestic energy use. Focusing on the practice-related aspects of consumption generates very different conceptual framings and policy prescriptions than those that emerge from more traditional or mainstream perspectives.
But the underlying case for government intervention to help to promote retrofit and the diffusion of more energy efficient practices is still apparent, even though the forms of intervention advocated are often very different to those that emerge from a more technical or economic perspective. Based on the recognition of the multiple barriers to change and the social, economic and environmental benefits that could be realised if they were overcome, government support for retrofit (renovating existing infrastructure to make it more energy efficient) has been widespread. Retrofit programmes have been supported and adopted in diverse forms in many settings, and their ability to recruit householders and then to impact their energy use has been discussed quite extensively. Frequently, these discussions have criticised the extent to which retrofit schemes rely on incentives and the provision of new technologies to change behaviour whilst ignoring the many other factors that might limit either participation in the schemes or their impact on the behaviours and practices that shape domestic energy use. These factors are obviously central to the success of retrofit schemes, but evaluations of different schemes have found that despite these they can still have significant impacts.

A few experts suggest that the best estimate of the gap between the technical potential and the actual in-situ performance of energy efficiency measures is 50%, with 35% coming from performance gaps and 15% coming from 'comfort taking' or direct rebound effects. They further suggest that the direct rebound effect of energy efficiency measures related to household heating is likely to be less than 30%, while rebound effects for various domestic energy efficiency measures vary from 5 to 15% and arise mostly from indirect effects (i.e., where savings from energy efficiency lead to increased demand for goods and services).
Other analyses also note that the gap between technical potential and actual performance is likely to vary by measure, with the range extending from 0% for measures such as solar water heating to 50% for measures such as improved heating controls. And others note that levels of comfort taking are likely to vary according to the levels of consumption and fuel poverty in the sample of homes where insulation is installed, with the range extending from 30% when considering homes across all income groups to around 60% when considering only lower income homes. The scale of these gaps is significant because it materially affects the impacts of retrofit schemes, and expectations and perceptions of these impacts go on to influence levels of political, financial and public support for these schemes.

The literature on retrofit highlights the presence of multiple barriers to change and the need for government support if these are to be overcome. Although much has been written on the extent to which different forms of support enable the wider take-up of domestic energy efficiency measures, behaviours and practices, various areas of contestation remain and there is still an absence of robust ex-post evidence on the extent to which these schemes actually do lead to the social, economic and environmental benefits that are widely claimed.

Which of the following is most nearly the OPPOSITE in meaning to the word 'CONCERTED' as used in the passage?
 ....
MCQ-> Crinoline and croquet are out. As yet, no political activists have thrown themselves in front of the royal horse on Derby Day. Even so, some historians can spot the parallels. It is a time of rapid technological change. It is a period when the dominance of the world's superpower is coming under threat. It is an epoch when prosperity masks underlying economic strain. And, crucially, it is a time when policy-makers are confident that all is for the best in the best of all possible worlds. Welcome to the Edwardian Summer of the second age of globalisation.

Spare a moment to take stock of what's been happening in the past few months. Let's start with the oil price, which has rocketed to more than $65 a barrel, more than double its level 18 months ago. The accepted wisdom is that we shouldn't worry our little heads about that, because the incentives are there for business to build new production and refining capacity, which will effortlessly bring demand and supply back into balance and bring crude prices back to $25 a barrel. As Tommy Cooper used to say, 'just like that'.

Then there is the result of the French referendum on the European Constitution, in which the 'no' voters were seen as thick-headed Luddites railing vainly against the modern world. What the French needed to realise, the argument went, was that there was no alternative to the reforms that would make the country more flexible, more competitive, more dynamic. Just the sort of reforms that allowed Gate Gourmet to sack hundreds of its staff at Heathrow after the sort of ultimatum that used to be handed out by Victorian mill owners. An alternative way of looking at the French "non" is that our neighbours translate "flexibility" as "you're fired".

Finally, take a squint at the United States. Just like Britain a century ago, a period of unquestioned superiority is drawing to a close. China is still a long way from matching America's wealth, but it is growing at a stupendous rate, and economic strength brings geo-political clout.
Already, there is evidence of a new scramble for Africa as Washington and Beijing compete for oil stocks. Moreover, beneath the surface of the US economy, all is not well. Growth looks healthy enough, but the competition from China and elsewhere has meant the world's biggest economy now imports far more than it exports. The US is living beyond its means, but in this time of studied complacency a current account deficit worth 6 percent of gross domestic product is seen as a sign of strength, not weakness.

In this new Edwardian summer, comfort is taken from the fact that dearer oil has not had the savage inflationary consequences of 1973-74, when a fourfold increase in the cost of crude brought an abrupt end to a postwar boom that had gone on uninterrupted for a quarter of a century. True, the cost of living has been affected by higher transport costs, but we are talking of inflation at 2-3 per cent and not 27 per cent. Yet the idea that higher oil prices are of little consequence is fanciful. If people are paying more to fill up their cars it leaves them with less to spend on everything else, but there is a reluctance to consume less. In the 1970s unions were strong and able to negotiate large, compensatory pay deals that served to intensify inflationary pressure. In 2005, that avenue is pretty much closed off, but the abolition of all the controls on credit that existed in the 1970s means that households are invited to borrow more rather than consume less.
The knock-on effects of higher oil prices are thus felt in different ways - through high levels of indebtedness, in inflated asset prices, and in balance of payments deficits.

There are those who point out, rightly, that modern industrial capitalism has proved mightily resilient these past 250 years, and that a sign of the enduring strength of the system has been the way it apparently shrugged off everything - a stock market crash, 9/11, rising oil prices - that has been thrown at it in the half decade since the millennium. Even so, there are at least three reasons for concern.

First, we have been here before. In terms of political economy, the first era of globalisation mirrored our own. There was a belief in unfettered capital flows, in free trade, and in the power of the market. It was a time of massive income inequality and unprecedented migration. Eventually, though, there was a backlash, manifested in a struggle between free traders and protectionists, and in rising labour militancy.

Second, the world is traditionally at its most fragile at times when the global balance of power is in flux. By the end of the nineteenth century, Britain's role as the hegemonic power was being challenged by the rise of the United States, Germany, and Japan, while the Ottoman and Hapsburg empires were clearly in rapid decline. Looking ahead from 2005, it is clear that over the next two or three decades, both China and India - which together account for half the world's population - will flex their muscles.

Finally, there is the question of what rising oil prices tell us. The emergence of China and India means global demand for crude is likely to remain high at a time when experts say production is about to top out. If supply constraints start to bite, any declines in the price are likely to be short-term cyclical affairs punctuating a long upward trend.

By the expression 'Edwardian Summer', the author refers to a period in which there is
 ....
© 2002-2017 Omega Education PVT LTD