1. Name the two persons who were the first to develop a model of the microprocessor chip.






Similar Questions and Answers
QA->The Trivago model is often the butt of jokes. He is not a regular model. Name him, and what is his day job?....
QA->What is the name of the world's first 1000-processor chip that was developed by the University of California?....
QA->Name the First microprocessor?....
QA->....... is generally regarded as the first microprocessor?....
QA->The first operating system used in a microprocessor-based system was:....
MCQ-> Read the passage carefully and answer the questions given at the end of each passage: Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors. In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies. Traditionally, a long chain of partners was involved in getting a product to the customer. Let's say you have a factory building a PC we'll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, "I've got model #4000. Come and buy it." If the consumer says, "But I want model #8000," the dealer replies, "Sorry, I only have model #4000." Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can't sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling price). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older products. This dangerous and inefficient practice is called "channel stuffing". 
Worst of all, the customer ends up paying for it by purchasing systems that are already out of date. Because we were building directly to fill our customers' orders, we didn't have finished goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors. The direct model turns conventional manufacturing inside out. Conventional manufacturing requires you to build to stock, because your plant can't keep going otherwise. But if you don't know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information. The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information - that is, you know exactly what people want and how much - you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let's say that Dell has six days of inventory. Compare that to an indirect competitor who has twenty-five days of inventory with another thirty in their distribution channel. That's a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent. 
Then there's the threat of getting stuck with obsolete inventory if you're caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don't have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins. Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don't need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system. We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results. 
We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like "flying low to the ground at 300 knots." He was worried that we wouldn't see the trees. In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $33 million. We're now down to six days of inventory and we're starting to measure it in hours instead of days. Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O - short for "excess and obsolete" - became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you're down in the cents range, you're approaching stellar performance. Find out the TRUE statement:
 ....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off. In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome. Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, D.C., exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field - a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor. 
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than on spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor - a reservoir of electrical charge that is either empty or full - to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line. Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard. 
IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions. Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one. To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr. 
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs. Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor - a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain. It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out, "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch - such as optical, biological and quantum computing - remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies. 
But computing with magnetism evidently has an attraction all its own. In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
MCQ-> When people react to their experiences with particular authorities, those authorities and the organizations or institutions that they represent often benefit if the people involved begin with high levels of commitment to the organization or institution represented by the authorities. First, in his studies of people's attitudes toward political and legal institutions, Tyler found that attitudes after an experience with the institution were strongly affected by prior attitudes. Single experiences influence post-experience loyalty but certainly do not overwhelm the relationship between pre-experience and post-experience loyalty. Thus, the best predictor of loyalty after an experience is usually loyalty before that experience. Second, people with prior loyalty to the organization or institution judge their dealings with the organization's or institution's authorities to be fairer than do those with less prior loyalty, either because they are more fairly treated or because they interpret equivalent treatment as fairer. Although high levels of prior organizational or institutional commitment are generally beneficial to the organization or institution, under certain conditions high levels of prior commitment may actually sow the seeds of reduced commitment. When previously committed individuals feel that they were treated unfavourably or unfairly during some experience with the organization or institution, they may show an especially sharp decline in commitment. Two studies were designed to test this hypothesis, which, if confirmed, would suggest that organizational or institutional commitment has risks, as well as benefits. At least three psychological models offer predictions of how individuals' reactions may vary as a function of (a) their prior level of commitment and (b) the favorability of the encounter with the organization or institution. 
Favorability of the encounter is determined by the outcome of the encounter and the fairness or appropriateness of the procedures used to allocate outcomes during the encounter. First, the instrumental prediction is that because people are mainly concerned with receiving desired outcomes from their encounters with organizations, changes in their level of commitment will depend primarily on the favorability of the encounter. Second, the assimilation prediction is that individuals' prior attitudes predispose them to react in a way that is consistent with their prior attitudes.The third prediction, derived from the group-value model of justice, pertains to how people with high prior commitment will react when they feel that they have been treated unfavorably or unfairly during some encounter with the organization or institution. Fair treatment by the other party symbolizes to people that they are being dealt with in a dignified and respectful way, thereby bolstering their sense of self-identity and self-worth. However, people will become quite distressed and react quite negatively if they feel that they have been treated unfairly by the other party to the relationship. The group-value model suggests that people value the information they receive that helps them to define themselves and to view themselves favorably. According to the instrumental viewpoint, people are primarily concerned with the more material or tangible resources received from the relationship. Empirical support for the group-value model has implications for a variety of important issues, including the determinants of commitment, satisfaction, organizational citizenship, and rule following. Determinants of procedural fairness include structural or interpersonal factors. For example, structural determinants refer to such things as whether decisions were made by neutral, fact-finding authorities who used legitimate decision-making criteria. 
The primary purpose of the study was to examine the interactive effect of individuals' (a) commitment to an organization or institution prior to some encounter and (b) perceptions of how fairly they were treated during the encounter, on the change in their level of commitment. A basic assumption of the group-value model is that people generally value their relationships with people, groups, organizations, and institutions and therefore value fair treatment from the other party to the relationship. Specifically, highly committed members should have especially negative reactions to feeling that they were treated unfairly, more so than (a) less-committed group members or (b) highly committed members who felt that they were fairly treated. The prediction that people will react especially negatively when they previously felt highly committed but felt that they were treated unfairly also is consistent with the literature on psychological contracts. Rousseau suggested that, over time, the members of work organizations develop feelings of entitlement, i.e., perceived obligations that their employers have toward them. Those who are highly committed to the organization believe that they are fulfilling their contract obligations. However, if the organization acted unfairly, then highly committed individuals are likely to believe that the organization did not live up to its end of the bargain. The hypothesis mentioned in the passage tests at least one of the following ideas.
 ....
MCQ-> Study the following information carefully and answer the questions given below: Nine persons - G, H, I, J, K, R, S, T and U - are seated in a straight line facing North, with equal distance between each other but not necessarily in the same order. Only two persons sit to the left of I. Only one person sits between I and U. H sits fourth to the right of R. R is not an immediate neighbour of U. Less than three persons sit between R and U. The number of persons sitting between I and U is half that between H and J. Only three persons sit between K and T. K is not an immediate neighbour of J. Only two persons sit between T and G. In which of the given pairs of persons is an odd number of persons sitting between them?
 ....
MCQ-> Choose the best answer for each question. The production of histories of India has become very frequent in recent years and may well call for some explanation. Why so many and why this one in particular? The reason is a two-fold one: changes in the Indian scene requiring a re-interpretation of the facts and changes in attitudes of historians about the essential elements of Indian history. These two considerations are in addition to the normal fact of fresh information, whether in the form of archeological discoveries throwing fresh light on an obscure period or culture, or the revelations caused by the opening of archives or the release of private papers. The changes in the Indian scene are too obvious to need emphasis. Only two generations ago British rule seemed to most Indian as well as British observers likely to extend into an indefinite future; now there is a teenage generation which knows nothing of it. Changes in the attitudes of historians have occurred everywhere, changes in attitudes to the content of the subject as well as to particular countries, but in India there have been some special features. Prior to the British, Indian historiographers were mostly Muslims, who relied, as in the case of Sayyid Ghulam Hussain, on their own recollection of events and on information from friends and men of affairs. Only a few like Abu'l Fazl had access to official papers. These were personal narratives of events, varying in value with the nature of the writer. The early British writers were officials. In the 18th century they were concerned with some aspect of Company policy, or like Robert Orme in his Military Transactions gave a straight narrative in what was essentially a continuation of the Muslim tradition. In the early 19th century the writers were still, with two notable exceptions, officials, but they were now engaged in chronicling, in varying moods of zest, pride, and awe, the rise of the British power in India to supremacy. 
The two exceptions were James Mill, with his critical attitude to the Company, and John Marshman, the Baptist missionary. But they, like the officials, were Anglo-centric in their attitude, so that the history of modern India in their hands came to be the history of the rise of the British in India. The official school dominated the writing of Indian history until we get the first professional historians' approach: Ramsay Muir and P. E. Roberts in England and H. H. Dodwell in India. Then Indian historians trained in the English school joined in, of whom the most distinguished was Sir Jadunath Sarkar, and the other notable writers: Surendranath Sen, Dr Radhakumud Mukherji, and Professor Nilakanta Sastri. They, it may be said, restored India to Indian history, but their bias was mainly political. Finally have come the nationalists, who range from those who can find nothing good or true in the British to sophisticated historical philosophers like K. M. Panikkar. Along with the types of historians with their varying bias have gone changes in the attitude to the content of Indian history. Here Indian historians have been influenced both by their local situation and by changes of thought elsewhere. It is in this field that this work can claim some attention, since it seeks to break new ground, or perhaps to deepen a freshly turned furrow, in the field of Indian history. The early official historians were content with the glamour and drama of political history from Plassey to the Mutiny, from Dupleix to the Sikhs. But when the raj settled down, glamour departed from politics, and they turned to the less glorious but more solid ground of administration. Not how India was conquered but how it was governed was the theme of this school of historians. It found its archpriest in H. H. Dodwell, its priestess in Dame Lilian Penson, and its chief shrine in Volume VI of the Cambridge History of India. 
Meanwhile, in Britain other currents were moving, which led historical study into the economic and social fields. R. C. Dutt entered the first of these currents with his Economic History of India, to be followed more recently by the whole group of Indian economic historians. W. E. Moreland extended these studies to the Mughal period. Social history is now being increasingly studied, and there is also of course a school of nationalist historians who see modern Indian history in terms of the rise and the fulfillment of the national movement. All these approaches have value, but all share in the quality of being compartmental. It is not enough to remove political history from its pedestal of being the only kind of history worth having if it is merely to put other types of history in its place. Too exclusive an attention to economic, social, or administrative history can be as sterile and misleading as too much concentration on politics. A whole subject needs a whole treatment for understanding. A historian must dissect his subject into its elements and then fuse them together again into an integrated whole. The true history of a country must contain all the features just cited but must present them as parts of a single consistent theme. Which of the following may be the closest in meaning to the statement 'restored India to Indian history'?
 ....