1. Assertion (A): In the 8085 microprocessor, DMA allows direct access to memory to speed up data transfer. Reason (R): HOLD and HLDA signals are used for DMA operations.
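For intuition, the handshake works like this: a peripheral's DMA controller asserts HOLD; the 8085 completes its current machine cycle, tri-states its address and data buses, and acknowledges with HLDA; the controller then moves data over the buses directly, without executing CPU instructions. Below is a minimal Python sketch of that sequence, invented purely for illustration (the class names, byte values and every detail are assumptions, not real 8085 firmware or a vendor API).

class CPU8085:
    def __init__(self) -> None:
        self.hlda = False          # HLDA (Hold Acknowledge) output pin
        self.bus_released = False  # address/data buses tri-stated?

    def assert_hold(self) -> None:
        # External device raises HOLD; the CPU finishes its current
        # machine cycle, floats its buses, then raises HLDA.
        self.bus_released = True
        self.hlda = True

    def release_hold(self) -> None:
        # HOLD is dropped; the CPU reclaims the buses and lowers HLDA.
        self.bus_released = False
        self.hlda = False

def dma_transfer(cpu: CPU8085, src: list, dst: list) -> None:
    cpu.assert_hold()              # request the buses via HOLD
    assert cpu.hlda                # proceed only after HLDA is seen
    dst[:] = src                   # block move with no CPU instructions
    cpu.release_hold()             # hand the buses back

source = [0x3E, 0x42, 0x76]        # arbitrary illustrative bytes
destination = [0] * 3
dma_transfer(CPU8085(), source, destination)
assert destination == source       # transfer completed while the CPU idled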

Similar Questions and Answers
QA->The DMA transfer technique in which one word of data is transferred at a time:....
QA->A car during its journey travels 30 minutes at a speed of 40 km/hr, another 45 minutes at a speed of 60 km/hr, and two hours at a speed of 70 km/hr. Find the average speed of the car?....
QA->A byte addressable computer has memory capacity of 4096 KB and can perform 64 operations. An instruction involving 3 memory operands and one operator needs:....
QA->A computer has 8 MB in main memory, 128 KB cache with block size of 4KB. If direct mapping scheme is used, how many different main memory blocks can map into a given physical cache block?....
QA->In a memory, the minimum time delay between the initiation of successive memory operations is:....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off.

In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome.

Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, D.C., exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field, a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor. This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive.

Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor, a reservoir of electrical charge that is either empty or full, to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line.

Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard. IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same direction or in opposite directions.

Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one.

To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr. Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs.

Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor, a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain.

It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out, "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch, such as optical, biological and quantum computing, remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies. But computing with magnetism evidently has an attraction all its own.

In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
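To make the MTJ read mechanism described in the passage concrete, here is a minimal Python sketch. The resistance and voltage numbers are invented, and the mapping of parallel alignment to a stored 1 is an assumption; the point is only that tunnelling current is higher when the two layers are polarised in the same direction, so a current measurement reveals the bit without disturbing it.

R_PARALLEL = 1000.0        # ohms; hypothetical low-resistance (aligned) state
R_ANTIPARALLEL = 1400.0    # ohms; hypothetical high-resistance state
READ_VOLTAGE = 0.1         # volts applied across the sandwich

class MTJCell:
    def __init__(self) -> None:
        self.bottom = +1       # pinned layer: polarisation fixed
        self.top = +1          # free layer: set by the control wires

    def write(self, bit: int) -> None:
        # Set the free layer's polarisation to store a 0 or a 1.
        self.top = +1 if bit else -1

    def read(self) -> int:
        # Tunnelling is easier (resistance is lower) when the layers align.
        resistance = R_PARALLEL if self.top == self.bottom else R_ANTIPARALLEL
        current = READ_VOLTAGE / resistance
        midpoint = READ_VOLTAGE / ((R_PARALLEL + R_ANTIPARALLEL) / 2)
        return 1 if current > midpoint else 0   # high current -> aligned -> 1

cell = MTJCell()
cell.write(1)
assert cell.read() == 1    # reading is non-destructive: the state persists
cell.write(0)
assert cell.read() == 0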
MCQ-> Read the passage carefully and answer the questions given at the end of each passage:

Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors.

In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies. Traditionally, a long chain of partners was involved in getting a product to the customer. Let's say you have a factory building a PC we'll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, "I've got model #4000. Come and buy it." If the consumer says, "But I want model #8000," the dealer replies, "Sorry, I only have model #4000." Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can't sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling price). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older products. This dangerous and inefficient practice is called "channel stuffing". Worst of all, the customer ends up paying for it by purchasing systems that are already out of date.

Because we were building directly to fill our customers' orders, we didn't have finished goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors.

The direct model turns conventional manufacturing inside out. Conventional manufacturing dictates that you keep building, because your plant can't keep going otherwise. But if you don't know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information.

The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information, that is, you know exactly what people want and how much, you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let's say that Dell has six days of inventory. Compare that to an indirect competitor who has twenty-five days of inventory with another thirty in their distribution channel. That's a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent.

Then there's the threat of getting stuck with obsolete inventory if you're caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don't have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins.

Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don't need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system.

We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results. We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like "flying low to the ground at 300 knots." He was worried that we wouldn't see the trees. In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $33 million. We're now down to six days of inventory and we're starting to measure it in hours instead of days.

Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O, short for "excess and obsolete", became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you're down in the cents range, you're approaching stellar performance.

Find out the TRUE statement:
 ....
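The forty-nine-day comparison in the passage can be checked with a few lines of Python. This sketch backs a daily price factor out of the passage's claim that component costs decline about 6 percent over 49 days (the steady compounding model is an assumption) and computes how much more each party paid for the materials sitting in its pipeline.

DELL_DAYS = 6                      # Dell's days of inventory, per the passage
COMPETITOR_DAYS = 25 + 30          # competitor's own stock plus channel stock
DECLINE_OVER_GAP = 0.06            # "cost of materials will decline about 6 percent"
GAP = COMPETITOR_DAYS - DELL_DAYS  # the forty-nine days cited in the passage

# Daily price factor implied by a 6% decline compounded over 49 days.
daily_factor = (1 - DECLINE_OVER_GAP) ** (1 / GAP)

def materials_premium(days_held: int) -> float:
    # Extra cost, versus today's prices, of stock bought `days_held` days ago.
    return daily_factor ** (-days_held) - 1

print(f"Inventory-age gap: {GAP} days")
print(f"Dell's premium:       {materials_premium(DELL_DAYS):.2%}")        # ~0.8%
print(f"Competitor's premium: {materials_premium(COMPETITOR_DAYS):.2%}")  # ~7.2%
# The roughly six-point spread matches the passage's "about 6 percent" advantage.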
MCQ-> Read passage carefully. Answer the questions by selecting the most appropriate option (with reference to the passage).

PASSAGE 4

While majoring in computer science isn't a requirement to participate in the Second Machine Age, what skills do liberal arts graduates specifically possess to contribute to this brave new world? Another major oversight in the debate has been the failure to appreciate that a good liberal arts education teaches many skills that are not only valuable to the general world of business, but are in fact vital to innovating the next wave of breakthrough tech-driven products and services. Many defenses of the value of a liberal arts education have been launched, of course, with the emphasis being on the acquisition of fundamental thinking and communication skills, such as critical thinking, logical argumentation, and good communication skills.

One aspect of liberal arts education that has been strangely neglected in the discussion is the fact that the humanities and social sciences are devoted to the study of human nature and the nature of our communities and larger societies. Students who pursue degrees in the liberal arts disciplines tend to be particularly motivated to investigate what makes us human: how we behave and why we behave as we do. They're driven to explore how our families and our public institutions, such as our schools and legal systems, operate, and could operate better, and how governments and economies work, or as is so often the case, are plagued by dysfunction. These students learn a great deal from their particular courses of study and apply that knowledge to today's issues, the leading problems to be tackled, and various approaches for analyzing and addressing those problems.

The greatest opportunities for innovation in the emerging era are in applying evolving technological capabilities to finding better ways to solve human problems like social dysfunction and political corruption; finding ways to better educate children; helping people live healthier and happier lives by altering harmful behaviors; improving our working conditions; discovering better ways to tackle poverty; improving healthcare and making it more affordable; making our governments more accountable, from the local level up to that of global affairs; and finding optimal ways to incorporate intelligent, nimble machines into our work lives so that we are empowered to do more of the work that we do best, and to let the machines do the rest. Workers with a solid liberal arts education have a strong foundation to build on in pursuing these goals.

One of the most immediate needs in technology innovation is to invest products and services with more human qualities, with more sensitivity to human needs and desires. Companies and entrepreneurs that want to succeed today and in the future must learn to consider in all aspects of their product and service creation how they can make use of the new technologies to make them more humane. Still, many other liberal arts disciplines also have much to provide the world of technological innovation. The study of psychology, for example, can help people build products that are more attuned to our emotions and ways of thinking. Experience in anthropology can additionally help companies understand cultural and individual behavioural factors that should be considered in developing products and in marketing them.

As technology allows for more machine intelligence and our lives become increasingly populated by the Internet of Things, and as the gathering of data about our lives and the analysis of it allows for more discoveries about our behaviour, consideration of how new products and services can be crafted for the optimal enhancement of our lives and the nature of our communities, workplaces and governments will be of vital importance. Those products and services developed with the keenest sense of how they can serve our human needs and complement our human talents will have a distinct competitive advantage.

Much of the criticism of the liberal arts is based on the false assumption that liberal arts students lack rigor in comparison to those participating in the STEM disciplines, and that they are 'soft' and unscientific, whereas those who study STEM fields learn the scientific method. In fact the liberal arts teach many methods of rigorous inquiry and analysis, such as close observation and interviewing, in ways that hard-science adherents don't always appreciate. Many fields have long incorporated the scientific method and other types of data-driven scientific inquiry and problem solving. Sociologists have developed sophisticated mathematical models of societal networks. Historians gather voluminous data on centuries-old household expenses, marriage and divorce rates, and world trade, and use the data to conduct statistical analyses, identifying trends and contributing factors to the phenomena they are studying. Linguists have developed high-tech models of the evolution of language, and they've made crucial contributions to the development of one of the technologies behind the rapid advance of automation, natural language processing, whereby computers are able to communicate with the accuracy and personality of Siri and Alexa.

It's also important to debunk the fallacy that liberal arts students who don't study these quantitative analytical methods have no 'hard' or relevant skills. This gets us back to the arguments about the fundamental ways of thinking, inquiring, problem solving and communicating that a liberal arts education teaches.

What is the central theme of the passage?
 ....
MCQ-> Read the following passage carefully and answer the questions given below it. Certain words/phrases have been printed in bold to help you locate them while answering some of the questions.

During the last few years, a lot of hype has been heaped on the BRICS (Brazil, Russia, India, China, and South Africa). With their large populations and rapid growth, these countries, so the argument goes, will soon become some of the largest economies in the world and, in the case of China, the largest of all by as early as 2020. But the BRICS, as well as many other emerging-market economies, have recently experienced a sharp economic slowdown. So, is the honeymoon over?

Brazil's GDP grew by only 1% last year, and may not grow by more than 2% this year, with its potential growth barely above 3%. Russia's economy may grow by barely 2% this year, with potential growth also at around 3%, despite oil prices being around $100 a barrel. India had a couple of years of strong growth recently (11.2% in 2010 and 7.7% in 2011) but slowed to 4% in 2012. China's economy grew by 10% a year for the last three decades, but slowed to 7.8% last year and risks a hard landing. And South Africa grew by only 2.5% last year and may not grow faster than 2% this year. Many other previously fast-growing emerging-market economies, for example, Turkey, Argentina, Poland, Hungary, and many in Central and Eastern Europe, are experiencing a similar slowdown. So, what is ailing the BRICS and other emerging markets?

First, most emerging-market economies were overheating in 2010-2011, with growth above potential and inflation rising and exceeding targets. Many of them thus tightened monetary policy in 2011, with consequences for growth in 2012 that have carried over into this year.

Second, the idea that emerging-market economies could fully decouple from economic weakness in advanced economies was farfetched: recession in the eurozone, near-recession in the United Kingdom and Japan in 2011-2012, and slow economic growth in the United States were always likely to affect emerging-market performance negatively, via trade, financial links, and investor confidence. For example, the ongoing eurozone downturn has hurt Turkey and emerging-market economies in Central and Eastern Europe, owing to trade links.

Third, most BRICS and a few other emerging markets have moved toward a variant of state capitalism. This implies a slowdown in reforms that increase the private sector's productivity and economic share, together with a greater economic role for state-owned enterprises (and for state-owned banks in the allocation of credit and savings), as well as resource nationalism, trade protectionism, import-substitution industrialization policies, and imposition of capital controls. This approach may have worked at earlier stages of development and when the global financial crisis caused private spending to fall; but it is now distorting economic activity and depressing potential growth. Indeed, China's slowdown reflects an economic model that is, as former Premier Wen Jiabao put it, "unstable, unbalanced, uncoordinated, and unsustainable," and that now is adversely affecting growth in emerging Asia and in commodity-exporting emerging markets from Asia to Latin America and Africa. The risk that China will experience a hard landing in the next two years may further hurt many emerging economies.

Fourth, the commodity super-cycle that helped Brazil, Russia, South Africa, and many other commodity-exporting emerging markets may be over. Indeed, a boom would be difficult to sustain, given China's slowdown, higher investment in energy-saving technologies, less emphasis on capital- and resource-oriented growth models around the world, and the delayed increase in supply that high prices induced.

The fifth, and most recent, factor is the US Federal Reserve's signals that it might end its policy of quantitative easing earlier than expected, and its hints of an eventual exit from zero interest rates, both of which have caused turbulence in emerging economies' financial markets. Even before the Fed's signals, emerging-market equities and commodities had underperformed this year, owing to China's slowdown. Since then, emerging-market currencies and fixed-income securities (government and corporate bonds) have taken a hit. The era of cheap or zero-interest money that led to a wall of liquidity chasing high yields and assets (equities, bonds, currencies, and commodities) in emerging markets is drawing to a close.

Finally, while many emerging-market economies tend to run current-account surpluses, a growing number of them, including Turkey, South Africa, Brazil, and India, are running deficits. And these deficits are now being financed in riskier ways: more debt than equity; more short-term debt than long-term debt; more foreign-currency debt than local-currency debt; and more financing from fickle cross-border interbank flows. These countries share other weaknesses as well: excessive fiscal deficits, above-target inflation, and stability risk (reflected not only in the recent political turmoil in Brazil and Turkey, but also in South Africa's labour strife and India's political and electoral uncertainties). The need to finance the external deficit and to avoid excessive depreciation (and even higher inflation) calls for raising policy rates or keeping them on hold at high levels. But monetary tightening would weaken already-slow growth. Thus, emerging economies with large twin deficits and other macroeconomic fragilities may experience further downward pressure on their financial markets and growth rates.

These factors explain why growth in most BRICS and many other emerging markets has slowed sharply. Some factors are cyclical, but others (state capitalism, the risk of a hard landing in China, the end of the commodity super-cycle) are more structural. Thus, many emerging markets' growth rates in the next decade may be lower than in the last, as may the outsize returns that investors realised from these economies' financial assets (currencies, equities, bonds, and commodities). Of course, some of the better-managed emerging-market economies will continue to experience rapid growth and asset outperformance. But many of the BRICS, along with some other emerging economies, may hit a thick wall, with growth and financial markets taking a serious beating.

Which of the following statement(s) is/are true as per the given information in the passage? A. Brazil's GDP grew by only 1% last year, and is expected to grow by approximately 2% this year. B. China's economy grew by 10% a year for the last three decades but slowed to 7.8% last year. C. BRICS is a group of nations: Brazil, Russia, India, China and South Africa.....