1. The basic part numbers of ICs are the same regardless of the manufacturer because digital logic ICs have been standardized.
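The statement is true for standard-logic families: the core function number in a 74-series part is vendor-independent, and only the manufacturer prefix differs (for example, TI's SN7400 and National's DM7400 are both quad 2-input NAND gates). As a minimal illustration, here is a sketch in Python that splits a part number into its conventional fields; the lookup tables are small illustrative samples, not a full catalog.

```python
# Toy sketch (assumption: simplified 74-series naming, not a full catalog):
# a standard-logic part number like "SN74LS00" breaks into a manufacturer
# prefix, the 74/54 series, a family code, and a function number that is
# the same across vendors -- e.g. "00" is always a quad 2-input NAND.

import re

# Hypothetical lookup tables for illustration only.
PREFIXES = {"SN": "Texas Instruments", "DM": "National Semiconductor"}
FUNCTIONS = {"00": "quad 2-input NAND", "04": "hex inverter",
             "74": "dual D flip-flop"}

def parse_part(part: str):
    """Split a 74-series part number into (vendor, family, function)."""
    m = re.fullmatch(r"([A-Z]{2})(74|54)([A-Z]*)(\d{2,3})", part)
    if not m:
        raise ValueError(f"unrecognized part number: {part}")
    prefix, series, family, func = m.groups()
    return (PREFIXES.get(prefix, "unknown vendor"),
            family or "standard TTL",
            FUNCTIONS.get(func, f"function {func}"))

# Two vendors, different prefixes and families, same basic logic function:
print(parse_part("SN74LS00"))  # TI, low-power Schottky, quad 2-input NAND
print(parse_part("DM7400"))    # National, standard TTL, same function
```

The point of the standardization is visible in the last field: whoever makes the chip, the function number pins down the same logic device.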



Similar Questions and Answers:
QA->Which Indian company has been named as the 2016 Digital Innovator of the Year by GE Digital?....
QA->What can be considered as basic building blocks of a digital circuit?....
QA->~(p ∨ q) ≡ ~p ∧ ~q is a famous law in logic known as:....
QA->The fat content of standardized milk and toned milk is :....
QA->The function of Arithmetic and Logic Unit (ALU) is?....
MCQ-> Read the passage carefully and answer the questions given at the end of each passage: Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors. In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies. Traditionally, a long chain of partners was involved in getting a product to the customer. Let’s say you have a factory building a PC we’ll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, “I’ve got model #4000. Come and buy it.” If the consumer says, “But I want model #8000,” the dealer replies, “Sorry, I only have model #4000.” Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can’t sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling price). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older products. This dangerous and inefficient practice is called “channel stuffing”.
Worst of all, the customer ends up paying for it by purchasing systems that are already out of date. Because we were building directly to fill our customers’ orders, we didn’t have finished goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors. The direct model turns conventional manufacturing inside out. Conventional manufacturing dictates that you keep the plant running, because your plant can’t keep going up and down with demand. But if you don’t know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information. The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information – that is, you know exactly what people want and how much – you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let’s say that Dell has six days of inventory. Compare that to an indirect competitor who has twenty-five days of inventory with another thirty in their distribution channel. That’s a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent.
Then there’s the threat of getting stuck with obsolete inventory if you’re caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don’t have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins. Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don’t need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system. We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results.
We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like “flying low to the ground at 300 knots.” He was worried that we wouldn’t see the trees. In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $33 million. We’re now down to six days of inventory and we’re starting to measure it in hours instead of days. Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O – short for “excess and obsolete” – became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you’re down in the cents range, you’re approaching stellar performance. Find out the TRUE statement:
 ....
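The inventory arithmetic in the Dell passage above (six days of inventory versus twenty-five plus thirty, and a roughly 6 percent material-cost decline over forty-nine days) can be checked in a few lines of Python. The constant daily decline rate is an assumption derived from the passage's own figures; real component prices do not fall linearly.

```python
# Back-of-the-envelope sketch of the passage's arithmetic: if component
# prices fall ~6% over 49 days, a vendor holding 6 days of inventory buys
# parts at fresher (cheaper) prices than one holding 55 days' worth.
# The 6%/49-day rate is taken from the passage; the rest is illustration.

DAILY_DECLINE = 0.06 / 49  # fractional price drop per day (assumed constant)

def material_cost_gap(days_a: int, days_b: int) -> float:
    """Approximate material-cost advantage (%) of the leaner vendor."""
    return (days_b - days_a) * DAILY_DECLINE * 100

# Direct model (6 days) vs. indirect rival (25 days + 30 in the channel):
gap = material_cost_gap(6, 25 + 30)
print(f"cost advantage ≈ {gap:.1f}%")  # matches the ~6% cited in the passage
```

The sketch also makes the passage's broader point concrete: the gap scales with the difference in inventory days, so every day of inventory squeezed out is a direct cost advantage.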
MCQ-> I think that it would be wrong to ask whether 50 years of India's Independence are an achievement or a failure. It would be better to see things as evolving. It's not an either-or question. My idea of the history of India is slightly contrary to the Indian idea. India is a country that, in the north, outside Rajasthan, was ravaged and intellectually destroyed to a large extent by the invasions that began in about AD 1000 by forces and religions that India had no means of understanding. The invasions are in all the schoolbooks. But I don't think that people understand that every invasion, every war, every campaign, was accompanied by slaughter, a slaughter always of the most talented people in the country. So these wars, apart from everything else, led to a tremendous intellectual depletion of the country. I think that in the British period, and in the 50 years after the British period, there has been a kind of regrouping or recovery, a very slow revival of energy and intellect. This isn't an idea that goes with the vision of the grandeur of old India and all that sort of rubbish. That idea is a great simplification and it occurs because it is intellectually, philosophically easier for Indians to manage. What they cannot manage, and what they have not yet come to terms with, is that ravaging of all the north of India by various conquerors. That was ruined not by the act of nature, but by the hand of man. It is so painful that few Indians have begun to deal with it. It is much easier to deal with British imperialism. That is a familiar topic, in India and Britain. What is much less familiar is the ravaging of India before the British. What happened from AD 1000 onwards, really, is such a wound that it is almost impossible to face. Certain wounds are so bad that they can't be written about. You deal with that kind of pain by hiding from it. You retreat from reality.
I do not think, for example, that the Incas of Peru or the native people of Mexico have ever got over their defeat by the Spaniards. In both places the head was cut off. I think the pre-British ravaging of India was as bad as that. In the place of knowledge of history, you have various fantasies about the village republic and the Old Glory. There is one big fantasy that Indians have always found solace in: about India having the capacity for absorbing its conquerors. This is not so. India was laid low by its conquerors. I feel the past 150 years have been years of every kind of growth. I see the British period and what has continued after that as one period. In that time, there has been a very slow intellectual recruitment. I think every Indian should make the pilgrimage to the site of the capital of the Vijayanagar empire, just to see what the invasion of India led to. They will see a totally destroyed town. Religious wars are like that. People who see that might understand what the centuries of slaughter and plunder meant. War isn't a game. When you lost that kind of war, your town was destroyed, the people who built the towns were destroyed. You are left with a headless population. That's where modern India starts from. The Vijayanagar capital was destroyed in 1565. It is only now that the surrounding region has begun to revive. A great chance has been given to India to start up again, and I feel it has started up again. The questions about whether 50 years of India since Independence have been a failure or an achievement are not the questions to ask. In fact, I think India is developing quite marvellously. People thought — even Mr Nehru thought — that development and new institutions in a place like Bihar, for instance, would immediately lead to beauty. But it doesn't happen like that. When a country as ravaged as India, with all its layers of cruelty, begins to extend justice to people lower down, it's a very messy business.
It's not beautiful, it's extremely messy. And that's what you have now, all these small politicians with small reputations and small parties. But this is part of growth, this is part of development. You must remember that these people, and the people they represent, have never had rights before. When the oppressed have the power to assert themselves, they will behave badly. It will need a couple of generations of security, and knowledge of institutions, and the knowledge that you can trust institutions — it will take at least a couple of generations before people in that situation begin to behave well. People in India have known only tyranny. The very idea of liberty is a new idea. The rulers were tyrants. The tyrants were foreigners. And they were proud of being foreign. There's a story that anybody could run and pull a bell and the emperor would appear at his window and give justice. This is a child's idea of history — the slave's idea of the ruler's mercy. When the people at the bottom discover that they hold justice in their own hands, the earth moves a little. You have to expect these earth movements in India. It will be like this for a hundred years. But it is the only way. It's painful and messy and primitive and petty, but it's better that it should begin. It has to begin. If we were to rule people according to what we think fit, that takes us back to the past when people had no voices. With self-awareness all else follows. People begin to make new demands on their leaders, their fellows, on themselves. They ask for more in everything. They have a higher idea of human possibilities. They are not content with what they did before or what their fathers did before. They want to move. That is marvellous. That is as it should be. I think that within every kind of disorder now in India there is a larger positive movement. But the future will be fairly chaotic. Politics will have to be at the level of the people now.
People like Nehru were colonial-style politicians. They were to a large extent created and protected by the colonial order. They did not begin with the people. Politicians now have to begin with the people. They cannot be too far above the level of the people. They are very much part of the people. It is important that self-criticism does not stop. The mind has to work, the mind has to be active, there has to be an exercise of the mind. I think it's almost a definition of a living country that it looks at itself, analyses itself at all times. Only countries that have ceased to live can say it's all wonderful. The central thrust of the passage is that
 ....
MCQ-> Please read the three reports (newspaper articles) on ranking of different players and products in the smartphone industry and answer the questions that follow.
Report 1: (Feb, 2013) Apple nabs crown as current top US mobile phone vendor
Apple’s reign may not be long, as Samsung is poised to overtake Apple in April, 2013. For the first time since Apple entered the mobile phone market in 2007, it has been ranked the top mobile phone vendor in the US. For the latter quarter of 2012, sales of its iPhone accounted for 34 percent of all mobile phone sales in the US - including feature phones - according to the latest data from Strategy Analytics. While the iPhone has consistently been ranked the top smartphone sold in the US, market research firm NPD noted that feature phone sales have fallen off a cliff recently, to the point where 8 out of every 10 mobile phones sold in the US are now smartphones. That ratio is up considerably from the end of 2011, when smartphones had just cracked the 50 percent mark. Given this fact it’s no surprise that Apple, which only sells smartphones, has been able to reach the top of the overall mobile phone market domestically. For the fourth quarter of 2012, Apple ranked number one with 34 percent of the US mobile market, up from 25.6 percent year over year. Samsung grew similarly, up to 32.3 percent from 26.9 percent - but not enough to keep from slipping to second place. LG dropped to 9 percent from 13.7 percent, holding its third place spot. It should be noted that Samsung and LG both sell a variety of feature phones in addition to smartphones. Looking only at smartphones, the ranking is a little different according to NPD. Apple holds the top spot with 39 percent of the US smartphone market, while Samsung again sits at number two with 30 percent. Motorola manages to rank third with 7 percent, while HTC dropped to fourth with 6 percent. In the US smartphone market, LG is fifth with 6 percent.
Note how the percentages aren’t all that different from overall mobile phone market share - for all intents and purposes, the smartphone market is the mobile phone market in the US going forward. Still, Samsung was the top mobile phone vendor overall for 2012, and Strategy Analytics expects Samsung to be back on top soon. “Samsung had been the number one mobile phone vendor in the US since 2008, and it will surely be keen to recapture that title in 2013 by launching improved new models such as the rumored Galaxy S4”. And while Apple is the top vendor overall among smartphones, its iOS platform is still second to the Android platform overall. Samsung is the largest vendor selling Android-based smartphones, but Motorola, HTC, LG, and others also sell Android devices, giving the platform a clear advantage over iOS both domestically and globally.
Report 2: Reader’s Response (2013, Feb)
I don’t actually believe the numbers for Samsung. Ever since the debacle in early 2011, when Lenovo called into question the numbers Samsung was touting for tablet shipments, stating that Samsung had only sold 20,000 of the 1.5 million tablets they shipped into the US the last quarter of 2010, Samsung (who had no response to Lenovo) has refused to supply quarterly sales numbers for smartphones or tablets. That’s an indication that their sales aren’t what analysts are saying. We can look to several things to help understand why. In the lawsuit between Apple and Samsung here last year, both were required to supply real sales numbers for devices under contention. The phones listed turned out to have sales between one third and one half of what had been guessed by IDC and others. Tablet sales were even worse. Of the 1.5 million tablets supposedly shipped to the US during that time, only 38,000 were sold. Then we have the usage numbers. Samsung tablets have only a 1.5% usage rate, where the iPad has over 90%.
Not as much a difference with the phones but it’s still overwhelmingly in favor of iPhone. The problem is that with Apple’s sales, we have actual numbers to go by. The companies who estimate can calibrate what they do after those numbers come out. But with Samsung and many others, they can’t ever calibrate their methods, as there are no confirming numbers released from the firms. A few quarters ago, as a result, we saw iSupply estimate Samsung’s smartphone sales for the quarter at 32 million, with estimates from others all over the place up to 50 million. Each time some other company reported a higher number for that same quarter, the press dutifully used that higher number as THE ONE. But none of them was the one. Without accurate self-reporting of actual sales to the end users, none of these market share charts are worth a damn!
Report 3: Contradictory survey (Feb, 2013) iPhone5 Ranks Fifth In U.S. Customer Satisfaction Survey
The iPhone5 ranks fifth in customer satisfaction according to the results of a recent survey from OnDevice Research, a mobile device research group. In the poll, they asked 320,000 smartphone and tablet users from six different countries how satisfied they were with their devices. According to 93,825 people from the US, Motorola Atrix HD is the most satisfying and Motorola’s Droid Razr took second spot. HTC Corp (TPE: 2498)’s Rezound 4G and Samsung Galaxy Note 2 took third and fourth spots, while Apple’s iPhone5 landed in fifth spot. It appears that Apple may be lagging in consumer interest. OnDevice Research’s Sarah Quinn explained, “Although Apple created one of the most revolutionary devices of the past decade, other manufacturers have caught up, with some Android powered devices now commanding higher levels of user satisfaction.” Despite the lower rankings, things aren’t looking too bad for Apple Inc. (NASDAQ:AAPL) elsewhere. In the United Kingdom, they ranked second place, right after HTC One X.
Interestingly enough, Apple did take top spot for overall satisfaction of mobile device, whereas Google Inc. (NASDAQ:GOOG) ranked second. Motorola Mobility Holdings Inc. (NYSE:NOK) took third, fourth, and fifth places respectively, while Sony Ericsson trailed behind at sixth place. The survey sampled mobile device users in the following countries: United States, United Kingdom, France, Germany, Japan, and Indonesia. Although OnDevice didn’t share the full list of devices mentioned in the survey, it does show some insight into what customers want. Unfortunately, there were still many questions regarding the survey that were left unanswered. Everyone wants to know why Google Inc. (NASDAQ:GOOG) was on the list when they are not an actual smartphone maker, and why Samsung Electronics Co., Ltd. (LON:BC94) was on the bottom of the satisfaction list when the brand is leading elsewhere. Source: 92,825 US mobile users, July 2012 - January 2013. Fortunately, those questions were answered by OnDevice Research’s representative. He explained that the survey was conducted on the mobile web, where the survey software could detect the taker’s device, and since users rate their satisfaction levels on a 1 to 10 scale, thanks to the Nexus device, Google was included. If you analyze the three reports above, which of the following statements would be the best inference?
 ....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off. In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today’s electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome. Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, DC, exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field — a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called “giant” magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor, a reservoir of electrical charge that is either empty or full, to represent a zero or a one. In the NRL’s magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element’s resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory’s elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line. Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM’s research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions. Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one. To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr.
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs. Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor — a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain. It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out “the target is to go after the logic circuits.” Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch — such as optical, biological and quantum computing — remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own.In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
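The MTJ read mechanism described in the passage above (parallel layers conduct more tunnelling current than anti-parallel ones, and the orientation survives power-off) can be sketched as a toy model. The resistance values and class names below are illustrative assumptions, not device data.

```python
# Toy model of reading a magnetic tunnel-junction bit, per the passage:
# tunnelling current is higher when the two magnetic layers are polarised
# in the same direction (parallel) than when they are opposite. The
# resistance values below are arbitrary illustration, not device data.

R_PARALLEL = 1.0       # low resistance: layers aligned  -> stores a 0
R_ANTIPARALLEL = 2.0   # high resistance: layers opposed -> stores a 1

class MTJCell:
    def __init__(self):
        self.fixed = +1          # lower layer: polarisation pinned
        self.free = +1           # upper layer: set by the write matrix

    def write(self, bit: int):
        # Writing sets the free layer with, or against, the fixed layer.
        self.free = self.fixed if bit == 0 else -self.fixed

    def read(self, volts: float = 1.0) -> int:
        # Measure current through the sandwich and infer the alignment.
        r = R_PARALLEL if self.free == self.fixed else R_ANTIPARALLEL
        current = volts / r
        # High current means parallel layers, i.e. a stored zero.
        return 0 if current > volts / R_ANTIPARALLEL else 1

cell = MTJCell()
cell.write(1)
print(cell.read())  # 1 -- and the state survives "power off" (non-volatile)
```

Note how the read is non-destructive: measuring the current never flips the free layer, which is one reason the passage contrasts MTJs with capacitors that need constant topping up.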