1. The first Operating System used in Microprocessor based system was:

Answer: CP/M

Similar Questions and Answers:
QA-> The first Operating System used in Microprocessor based system was:
QA-> A system call in Linux operating system to create a new child process, which is a copy of the parent process:
QA-> ....... is generally regarded as the first microprocessor?
QA-> Which was the first microprocessor in the world?
QA-> Name the first microprocessor?
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off.

In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today’s electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome.

Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, D.C., exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field, a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called “giant” magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor, a reservoir of electrical charge that is either empty or full, to represent a zero or a one. In the NRL’s magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element’s resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory’s elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line.

Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM’s research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions.

Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one.

To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr. Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs.

Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor: a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain.

It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out “the target is to go after the logic circuits.” Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch, such as optical, biological and quantum computing, remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own.

In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ...
MCQ-> Answer questions on the basis of information given in the following case.

The MBA entrance examination comprises two types of problems: formula-based problems and application-based problems. From the analysis of past data, Interesting School of Management (ISM) observes that students good at solving application-based problems are entrepreneurial in nature. Coaching institutes for MBA entrance exams train students to spot formula-based problems and answer them correctly, so as to obtain the required overall cut-off percentile. Thus students, in general, shy away from application-based problems, and even those with an entrepreneurial mind-set target formula-based problems. Half a mark is deducted for every wrong answer.

ISM wants more students with an entrepreneurial mind-set in the next batch. To achieve this, ISM is considering the following proposals:

I. Preparing a question paper of two parts, Part A and Part B, of one hour's duration each. Part A and Part B would consist of formula-based problems and application-based problems, respectively. After taking away Part A, Part B would be distributed. The qualifying cut-off percentile would be calculated on the combined scores of the two parts.
II. Preparing a question paper comprising Part A and Part B. While Part A would comprise formula-based problems, Part B would comprise application-based problems, each having a separate qualifying cut-off percentile.
III. Assigning one mark for formula-based problems and two marks for application-based problems as an incentive for attempting application-based problems.
IV. Allotting one mark for formula-based problems and three marks for application-based problems, without mentioning this in the question paper.

Which of the following proposals (or combination of proposals) is likely to identify students with the best entrepreneurial mind-set?
MCQ-> Read the following discussion/passage and provide an appropriate answer for the questions that follow.

Of the several features of the Toyota Production System that have been widely studied, the most important is the mode of governance of the shop-floor at Toyota. Work and inter-relations between workers are highly scripted in extremely detailed ‘operating procedures’ that have to be followed rigidly, without any deviation, at Toyota. Despite such rule-bound rigidity, however, Toyota does not become a ‘command-control system’. It is able to retain the character of a learning organization. In fact, many observers characterize it as a community of scientists carrying out several small experiments simultaneously. The design of the operating procedure is the key. Every principle must find an expression in the operating procedure – that is how it has an effect in the domain of action. Workers on the shop-floor, often in teams, design the ‘operating procedure’ jointly with the supervisor through a series of hypotheses that are proposed and validated or refuted through experiments in action. The rigid and detailed ‘operating procedure’ specification throws up problems of the very minute kind, while its resolution leads to a reframing of the procedure and specifications. This inter-temporal change (or flexibility) of the specification (or operating procedure) is done at the lowest level of the organization, i.e. closest to the site of action. One implication of this arrangement is that system design can no longer be rationally optimal and standardized across the organization. It is quite common to find different work norms in contiguous assembly lines, because each might have faced a different set of problems and devised different counter-measures to tackle them.
Design of the coordinating process that essentially imposes the discipline required in large-scale complex manufacturing systems is therefore customized to variations in the man-machine context of the site of action. It evolves through numerous points of negotiation throughout the organization. It implies then that the higher levels of the hierarchy do not exercise the power of fiat in setting work rules, for such work rules are no longer a standard set across the whole organization.

It might be interesting to go through the basic Toyota philosophy that underlies its system-designing practices. The notion of the ideal production system in Toyota embraces the following: ‘the ability to deliver just-in-time (or on demand) a customer order in the exact specification demanded, in a batch size of one (and hence an infinite proliferation of variants, models and specifications), defect-free, without wastage of material, labour, energy or motion, in a safe and (physically and emotionally) fulfilling production environment’. It did not embrace the concept of a standardized product that can be made cheap by giving up variations. Preserving consumption variety was seen, in fact, as one mode of serving society. It is interesting to note that the articulation of the Toyota philosophy was made around roughly the same time that the Fordist system was establishing itself in the US automotive industry.

What can best be defended as the asset that the Toyota model of production leverages to deliver its vast range of models in a defect-free fashion?
 ...
MCQ-> DIRECTIONS for questions 24 to 50: Each of the five passages given below is followed by questions. For each question, choose the best answer.

The World Trade Organisation (WTO) was created in the early 1990s as a component of the Uruguay Round negotiation. However, it could have been negotiated as part of the Tokyo Round of the 1970s, since that negotiation was an attempt at a 'constitutional reform' of the General Agreement on Tariffs and Trade (GATT). Or it could have been put off to the future, as the US government wanted. What factors led to the creation of the WTO in the early 1990s?

One factor was the pattern of multilateral bargaining that developed late in the Uruguay Round. Like all complex international agreements, the WTO was a product of a series of trade-offs between principal actors and groups. For the United States, which did not want a new organisation, the dispute settlement part of the WTO package achieved its longstanding goal of a more effective and more legal dispute settlement system. For the Europeans, who by the 1990s had come to view GATT dispute settlement less in political terms and more as a regime of legal obligations, the WTO package was acceptable as a means to discipline the resort to unilateral measures by the United States. Countries like Canada and other middle and smaller trading partners were attracted by the expansion of a rules-based system and by the symbolic value of a trade organisation, both of which inherently support the weak against the strong. The developing countries were attracted due to the provisions banning unilateral measures. Finally, and perhaps most important, many countries at the Uruguay Round came to put a higher priority on the export gains than on the import losses that the negotiation would produce, and they came to associate the WTO and a rules-based system with those gains. This reasoning - replicated in many countries - was contained in U.S. Ambassador Kantor's defence of the WTO, and it amounted to a recognition that international trade and its benefits cannot be enjoyed unless trading nations accept the discipline of a negotiated rules-based environment.

A second factor in the creation of the WTO was pressure from lawyers and the legal process. The dispute settlement system of the WTO was seen as a victory of legalists over pragmatists, but the matter went deeper than that. The GATT, and the WTO, are contract organisations based on rules, and it is inevitable that an organisation created to further rules will in turn be influenced by the legal process. Robert Hudec has written of the 'momentum of legal development', but what is this precisely? Legal development can be defined as promotion of the technical legal values of consistency, clarity (or certainty) and effectiveness; these are values that those responsible for administering any legal system will seek to maximise. As it played out in the WTO, consistency meant integrating under one roof the whole lot of separate agreements signed under GATT auspices; clarity meant removing ambiguities about the powers of contracting parties to make certain decisions or to undertake waivers; and effectiveness meant eliminating exceptions arising out of grandfather rights and resolving defects in dispute settlement procedures and institutional provisions. Concern for these values is inherent in any rules-based system of co-operation, since without these values rules would be meaningless in the first place. Rules, therefore, create their own incentive for fulfilment.

The momentum of legal development has occurred in other institutions besides the GATT, most notably in the European Union (EU). Over the past two decades the European Court of Justice (ECJ) has consistently rendered decisions that have incrementally expanded the EU's internal market, in which the doctrine of 'mutual recognition' handed down in the case Cassis de Dijon in 1979 was a key turning point.
The Court is now widely recognised as a major player in European integration, even though arguably such a strong role was not originally envisaged in the Treaty of Rome, which initiated the current European Union. One means the Court used to expand integration was the 'teleological method of interpretation', whereby the actions of member states were evaluated against 'the accomplishment of the most elementary community goals set forth in the Preamble to the [Rome] treaty'. The teleological method represents an effort to keep current policies consistent with stated goals, and it is analogous to the effort in GATT to keep contracting party trade practices consistent with stated rules. In both cases legal concerns and procedures are an independent force for further cooperation.

In large part the WTO was an exercise in consolidation. In the context of a trade negotiation that created a near-revolutionary expansion of international trade rules, the formation of the WTO was a deeply conservative act needed to ensure that the benefits of the new rules would not be lost. The WTO was all about institutional structure and dispute settlement: these are the concerns of conservatives and not revolutionaries, which is why lawyers and legalists took the lead on these issues. The WTO codified the GATT institutional practice that had developed by custom over three decades, and it incorporated a new dispute settlement system that was necessary to keep both old and new rules from becoming a sham. Both the institutional structure and the dispute settlement system were necessary to preserve and enhance the integrity of the multilateral trade regime that had been built incrementally from the 1940s to the 1990s.

What could be the closest reason why the WTO was not formed in the 1970s?
 ...
MCQ-> Read the following passage carefully and answer the questions given at the end.

The second issue I want to address is one that comes up frequently - that Indian banks should aim to become global. Most people who put forward this view have not thought through the costs and benefits analytically; they only see this as an aspiration consistent with India’s growing international profile.

In its 1998 report, the Narasimham (II) Committee envisaged a three-tier structure for the Indian banking sector: 3 or 4 large banks having an international presence on the top, 8-10 mid-sized banks, with a network of branches throughout the country and engaged in universal banking, in the middle, and local banks and regional rural banks operating in smaller regions forming the bottom layer. However, the Indian banking system has not consolidated in the manner envisioned by the Narasimham Committee. The current structure is that India has 81 scheduled commercial banks, of which 26 are public sector banks, 21 are private sector banks and 34 are foreign banks. Even a quick review would reveal that there is no segmentation in the banking structure along the lines of Narasimham II.

A natural sequel to this issue of the envisaged structure of the Indian banking system is the Reserve Bank’s position on bank consolidation. Our view on bank consolidation is that the process should be market-driven, based on profitability considerations and brought about through a process of mergers & amalgamations (M&As). The initiative for this has to come from the boards of the banks concerned, which have to make a decision based on a judgment of the synergies involved in the business models and the compatibility of the business cultures. The Reserve Bank’s role in the reorganisation of the banking system will normally be only that of a facilitator. It should be noted, though, that bank consolidation through mergers is not always a totally benign option.
On the positive side are a higher exposure threshold, international acceptance and recognition, improved risk management and improvement in financials due to economies of scale and scope. This can be achieved both through organic and inorganic growth. On the negative side, experience shows that consolidation would fail if there are no synergies in the business models and there is no compatibility in the business cultures and technology platforms of the merging banks.

Having given that broad brush position on bank consolidation, let me address two specific questions: (i) can Indian banks aspire to global size? and (ii) should Indian banks aspire to global size? On the first question, as per the current global league tables based on the size of assets, our largest bank, the State Bank of India (SBI), together with its subsidiaries, comes in at No. 74, followed by ICICI Bank at No. 145 and Bank of Baroda at No. 188. It is, therefore, unlikely that any of our banks will jump into the top ten of the global league even after reasonable consolidation.

Then comes the next question of whether Indian banks should become global. Opinion on this is divided. Those who argue that we must go global contend that the issue is not so much the size of our banks in global rankings but of Indian banks having a strong enough global presence. The main argument is that the increasing global size and influence of Indian corporates warrant a corresponding increase in the global footprint of Indian banks. The opposing view is that Indian banks should look inwards rather than outwards, and focus their efforts on financial deepening at home rather than aspiring to global size.

It is possible to take a middle path and argue that looking outwards towards increased global presence and looking inwards towards deeper financial penetration are not mutually exclusive; it should be possible to aim for both.
With the onset of the global financial crisis, there has definitely been a pause in the rapid overseas expansion of our banks. Nevertheless, notwithstanding the risks involved, it will be opportune for some of our larger banks to be looking out for opportunities for consolidation, both organic and inorganic. They should look out more actively in regions which hold out a promise of attractive acquisitions. The surmise, therefore, is that Indian banks should increase their global footprint opportunistically even if they do not get to the top of the league table.

Identify the correct statement from the following:
 ...