1. On the basis of memory size and performance, which type of computer is known as "big iron"?





Similar Questions and Answers
QA->A computer has 8 MB in main memory, 128 KB cache with block size of 4KB. If direct mapping scheme is used, how many different main memory blocks can map into a given physical cache block?....
QA->A computer with a 32 bit wide data bus implements its memory using 8 K x 8 static RAM chips. The smallest memory that this computer can have is:....
QA->The size of the virtual memory depends on the size of :....
QA->The maximum size of main memory of a computer is determined by:....
QA->A byte addressable computer has memory capacity of 4096 KB and can perform 64 operations. An instruction involving 3 memory operands and one operator needs:....
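The first similar question above (8 MB main memory, 128 KB direct-mapped cache, 4 KB blocks) reduces to a short calculation. A minimal sketch in Python, assuming the standard direct-mapping scheme in which memory block i maps to cache line i mod (number of cache lines):

```python
# Direct-mapped cache: how many main-memory blocks map to one cache block?
KB = 1024
MB = 1024 * KB

main_memory = 8 * MB    # 8 MB main memory
cache_size = 128 * KB   # 128 KB cache
block_size = 4 * KB     # 4 KB block size

memory_blocks = main_memory // block_size  # total blocks in main memory
cache_blocks = cache_size // block_size    # lines in the cache

# Memory block i maps to cache line (i mod cache_blocks), so the number of
# memory blocks competing for any one cache line is:
blocks_per_line = memory_blocks // cache_blocks
print(blocks_per_line)  # 64
```

Here 8 MB / 4 KB gives 2048 memory blocks, the cache holds 32 lines, and 2048 / 32 = 64 memory blocks share each cache block.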
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off.

In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome.

Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, DC, exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field, a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor, a reservoir of electrical charge that is either empty or full, to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line.

Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions.

Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one.

To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr.
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs.

Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor, a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain.

It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch, such as optical, biological and quantum computing, remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own.

In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
MCQ-> Read the following passage carefully and answer the questions given at the end.

Passage 4

Public sector banks (PSBs) are pulling back on credit disbursement to lower rated companies, as they keep a closer watch on using their own scarce capital and the banking regulator heightens its scrutiny on loans being sanctioned. Bankers say the Reserve Bank of India has started strictly monitoring how banks are utilizing their capital. Any big-ticket loan to lower rated companies is being questioned. Almost all large public sector banks that have reported their first quarter results so far have shown a contraction in credit disbursal on a year-to-date basis, as most banks have shifted to a strategy of lending largely to government-owned "Navratna" companies and highly rated private sector companies. On a sequential basis too, banks have grown their loan books at an anaemic rate.

To be sure, in the first quarter, loan demand is not quite robust. However, in the first quarter last year, banks had healthier loan growth on a sequential basis than this year. The country's largest lender, State Bank of India, grew its loan book at only 1.21% quarter-on-quarter. Meanwhile, Bank of Baroda and Punjab National Bank shrank their loan books by 1.97% and 0.66% respectively in the first quarter on a sequential basis.

Last year, State Bank of India had seen sequential loan growth of 3.37%, while Bank of Baroda had seen a smaller contraction of 0.22%. Punjab National Bank had seen a growth of 0.46% in loan book between the January-March and April-June quarters last year. On a year-to-date basis, SBI's credit growth fell more than 2%, Bank of Baroda's credit growth contracted 4.71% and Bank of India's credit growth shrank about 3%. SBI chief Arundhati Bhattacharya said the bank's year-to-date credit growth fell as the bank focused on 'A' rated customers. About 90% of the loans in the quarter were given to high-rated companies.
"Part of this was a conscious decision and part of it is because we actually did not get good fresh proposals in the quarter," Bhattacharya said.

According to bankers, while part of the credit contraction is due to the economic slowdown, capital constraints and reluctance to take on excessive risk have also played a role. "Most of the PSU banks are facing pressure on capital adequacy. It is challenging to maintain 9% core capital adequacy. The pressure on monitoring capital adequacy and maintaining capital buffer is so strict that you cannot grow aggressively," said Rupa Rege Nitsure, chief economist at Bank of Baroda.

Nitsure said capital conservation pressures will substantially cut down "irrational expansion of loans" in some smaller banks, which used to grow at a rate much higher than the industry average. The companies coming to banks, in turn, will have to make themselves more creditworthy for banks to lend. "The conservation of capital is going to inculcate a lot of discipline in both banks and borrowers," she said.

For every loan that a bank disburses, some amount of money is required to be set aside as provision. The lower the credit rating of the company, the riskier the loan is perceived to be. Thus, the bank is required to set aside more capital for a lower rated company than it otherwise would for a higher rated client. New international banking norms, known as Basel III norms, require banks to maintain higher capital and higher liquidity. They also require a bank to set aside "buffer" capital to meet contingencies. As per the norms, a bank's total capital adequacy ratio should be 12% at any time, of which tier-I, or the core capital, should be 9%. Capital adequacy is calculated by dividing total capital by risk-weighted assets.
If the loans have been given to lower rated companies, the risk weight goes up and capital adequacy falls.

According to bankers, all loan decisions are now being assessed on the basis of the capital that needs to be set aside as provision against the loan, and as a result, loans to lower rated companies are being avoided. According to a senior banker with a public sector bank, the capital adequacy situation is so precarious in some banks that if the risk weight increases a few basis points, the proposal gets cancelled. The banker did not wish to be named. One basis point is one hundredth of a percentage point. Bankers add that the Reserve Bank of India has also started strictly monitoring how banks are utilising their capital. Any big-ticket loan to lower rated companies is being questioned.

In this scenario, banks are looking for safe bets, even if it means that profitability is compromised. "About 25% of our loans this quarter was given to Navratna companies, who pay at base rate. This resulted in contraction of our net interest margin (NIM)," said Bank of India chairperson V.R. Iyer, while discussing the bank's first quarter results with the media. Bank of India's NIM, or the difference between yields on advances and cost of deposits, a key gauge of profitability, fell in the first quarter to 2.45% from 3.07% a year ago, as the bank focused on lending to highly rated customers.

Analysts, however, say the strategy being followed by banks is short-sighted. "A high rated client will take loans at base rate and will not give any fee income to a bank. A bank will never be profitable that way. Besides, there are only so many PSU companies to chase. All banks cannot be chasing them all at a time. Fact is, the banks are badly hit by NPA and are afraid to lend now to big projects.
They need capital, true, but they have become risk-averse," said a senior analyst with a local brokerage who did not wish to be named.

Various estimates suggest that Indian banks would require more than Rs. 2 trillion of additional capital to have this kind of capital adequacy ratio by 2019. The central government, which owns the majority share of these banks, has been cutting down on its commitment to recapitalise the banks. In 2013-14, the government infused Rs. 14,000 crore into its banks. However, in 2014-15, the government will infuse just Rs. 11,200 crore.

Which of the following statements is correct according to the passage?
 ....
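The passage defines capital adequacy as total capital divided by risk-weighted assets, with a 12% overall norm. A small illustrative sketch of why higher risk weights squeeze the ratio; the figures below are hypothetical, not taken from the passage:

```python
# Capital adequacy ratio (CAR) as described in the passage:
#   CAR = total capital / risk-weighted assets
def capital_adequacy(total_capital, risk_weighted_assets):
    return total_capital / risk_weighted_assets

capital = 120.0   # hypothetical total capital (crore)
loans = 1000.0    # hypothetical loan book (crore)

# Lower rated borrowers carry higher risk weights, inflating the
# risk-weighted assets in the denominator.
for risk_weight in (0.5, 1.0, 1.5):
    car = capital_adequacy(capital, loans * risk_weight)
    print(f"risk weight {risk_weight}: CAR = {car:.1%}")
```

With these numbers the ratio falls from 24% to 8% as the risk weight rises, dropping below the 12% norm the passage cites, which is why loans to lower rated companies are being avoided.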
MCQ-> Read the passage carefully and answer the questions given.

More and more companies, government agencies, educational institutions and philanthropic organisations are today in the grip of a new phenomenon: 'metric fixation'. The key components of metric fixation are the belief that it is possible - and desirable - to replace professional judgment (acquired through personal experience and talent) with numerical indicators of comparative performance based upon standardised data (metrics); and that the best way to motivate people within these organisations is by attaching rewards and penalties to their measured performance. The rewards can be monetary, in the form of pay for performance, say, or reputational, in the form of college rankings, hospital ratings, surgical report cards and so on. But the most dramatic negative effect of metric fixation is its propensity to incentivise gaming: that is, encouraging professionals to maximise the metrics in ways that are at odds with the larger purpose of the organisation. If the rate of major crimes in a district becomes the metric according to which police officers are promoted, then some officers will respond by simply not recording crimes or downgrading them from major offences to misdemeanours. Or take the case of surgeons. When the metrics of success and failure are made public - affecting their reputation and income - some surgeons will improve their metric scores by refusing to operate on patients with more complex problems, whose surgical outcomes are more likely to be negative. Who suffers? The patients who don't get operated upon.

When reward is tied to measured performance, metric fixation invites just this sort of gaming. But metric fixation also leads to a variety of more subtle unintended negative consequences.
These include goal displacement, which comes in many varieties: when performance is judged by a few measures, and the stakes are high (keeping one's job, getting a pay rise or raising the stock price at the time that stock options are vested), people focus on satisfying those measures - often at the expense of other, more important organisational goals that are not measured. The best-known example is 'teaching to the test', a widespread phenomenon that has distorted primary and secondary education in the United States since the adoption of the No Child Left Behind Act of 2001.

Short-termism is another negative. Measured performance encourages what the US sociologist Robert K Merton in 1936 called 'the imperious immediacy of interests ... where the actor's paramount concern with the foreseen immediate consequences excludes consideration of further or other consequences'. In short, advancing short-term goals at the expense of long-range considerations. This problem is endemic to publicly traded corporations that sacrifice long-term research and development, and the development of their staff, to the perceived imperatives of the quarterly report.

To the debit side of the ledger must also be added the transactional costs of metrics: the expenditure of employee time by those tasked with compiling and processing the metrics in the first place - not to mention the time required to actually read them. ...

All of the following can be a possible feature of the No Child Left Behind Act of 2001, EXCEPT:
 ....
MCQ-> Answer questions on the basis of information given in the following case.

Bright Engineering College (BEC) has listed 20 elective courses for the next term and students have to choose any 7 of them. Simran, a student of BEC, notices that there are three categories of electives: Job-oriented (J), Quantitative-oriented (Q) and Grade-oriented (G). Among these 20 electives, some electives are both Job- and Grade-oriented but are not Quantitative-oriented (JG type). QJ type electives are both Job- and Quantitative-oriented but are not Grade-oriented, and QG type electives are both Quantitative- and Grade-oriented but are not Job-oriented. Simran also notes that the total number of QJ type electives is 2 less than QG type electives. Similarly, the total number of QG type electives is 2 less than JG type, and there is only 1 common elective (JQG) across the three categories. Furthermore, the number of only Quantitative-oriented electives is the same as only Job-oriented electives, but less than the number of only Grade-oriented electives. Each elective has at least one registration and there is at least one elective in each category, or combination of categories.

On her way back Simran met her friend Raj and shared the above information. Raj is preparing for XAT and is only interested in Grade-oriented (G) electives. He wanted to know the number of G-type electives being offered. Simran replied, "You have all the information. Calculate the number of G-type electives yourself. It would help your XAT preparation." Raj calculates correctly and says that there can be _______ possible answers. Which of the following options would best fit the blank above?
 ....
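Raj's count can be checked by brute force. A sketch under one reading of the constraints (each of the seven J/Q/G regions holds at least one elective, and the seven region sizes sum to 20):

```python
# Enumerate feasible Venn-diagram region sizes for the electives puzzle:
#   QJ = QG - 2, QG = JG - 2, JQG = 1,
#   onlyQ == onlyJ < onlyG, every region >= 1, total = 20.
g_counts = set()
for qg in range(3, 21):            # qg >= 3 keeps qj = qg - 2 >= 1
    qj, jg = qg - 2, qg + 2
    for only_j in range(1, 21):
        only_q = only_j            # onlyQ equals onlyJ
        for only_g in range(only_j + 1, 21):  # onlyG strictly larger
            total = only_j + only_q + only_g + qj + qg + jg + 1
            if total == 20:
                # G-type electives: anything Grade-oriented
                g_counts.add(qg + jg + 1 + only_g)

print(sorted(g_counts))  # [13, 14, 15, 16, 17]
print(len(g_counts))     # 5
```

Under these assumptions the feasible G totals are 13 through 17, i.e. five possible answers; a different reading of the "at least one elective in each category" condition could change the count.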
© 2002-2017 Omega Education PVT LTD