1. Which one of the given responses would be a meaningful order of the following? (A) substance (B) atom (C) molecule (D) proton






Comments

  • By: anil on 05 May 2019 01.55 am
    The proton is present inside the atom, the atom inside the molecule, and a cluster of molecules makes up a substance. Hence the meaningful order of the given items is: proton --> atom --> molecule --> substance (D - B - C - A).
Similar Questions and Answers
QA->One molecule of hemoglobin can carry how many atoms of oxygen?....
QA->How many protons are there in the nucleus of a hydrogen atom?....
QA->A hydrogen atom is paramagnetic. What is a hydrogen molecule?....
QA->Anamika said, ‘God spared our lives because he wants us to do something meaningful’. (Reported Speech)....
QA->A pigment responsible for photomorphogenetic responses is....
MCQ->Which one of the given responses would be a meaningful order of the following? (A) substance (B) atom (C) molecule (D) proton....
MCQ-> Cells are the ultimate multi-taskers: they can switch on genes and carry out their orders, talk to each other, divide in two, and much more, all at the same time. But they couldn’t do any of these tricks without a power source to generate movement. The inside of a cell bustles with more traffic than Delhi roads, and, like all vehicles, the cell’s moving parts need engines. Physicists and biologists have looked ‘under the hood’ of the cell and laid out the nuts and bolts of molecular engines. The ability of such engines to convert chemical energy into motion is the envy of nanotechnology researchers looking for ways to power molecule-sized devices. Medical researchers also want to understand how these engines work. Because these molecules are essential for cell division, scientists hope to shut down the rampant growth of cancer cells by deactivating certain motors. Improving motor-driven transport in nerve cells may also be helpful for treating diseases such as Alzheimer’s, Parkinson’s or ALS, also known as Lou Gehrig’s disease. We wouldn’t make it far in life without motor proteins. Our muscles wouldn’t contract. We couldn’t grow, because the growth process requires cells to duplicate their machinery and pull the copies apart. And our genes would be silent without the services of messenger RNA, which carries genetic instructions over to the cell’s protein-making factories. The movements that make these cellular activities possible occur along a complex network of threadlike fibers, or polymers, along which bundles of molecules travel like trams. The engines that power the cell’s freight are three families of proteins, called myosin, kinesin and dynein. For fuel, these proteins burn molecules of ATP, which cells make when they break down the carbohydrates and fats from the foods we eat. The energy from burning ATP causes changes in the proteins’ shape that allow them to heave themselves along the polymer track.
The results are impressive: In one second, these molecules can travel between 50 and 100 times their own diameter. If a car with a five-foot-wide engine were as efficient, it would travel 170 to 340 miles per hour. Ronald Vale, a researcher at the Howard Hughes Medical Institute and the University of California at San Francisco, and Ronald Milligan of the Scripps Research Institute have realized a long-awaited goal by reconstructing the process by which myosin and kinesin move, almost down to the atom. The dynein motor, on the other hand, is still poorly understood. Myosin molecules, best known for their role in muscle contraction, form chains that lie between filaments of another protein called actin. Each myosin molecule has a tiny head that pokes out from the chain like oars from a canoe. Just as rowers propel their boat by stroking their oars through the water, the myosin molecules stick their heads into the actin and hoist themselves forward along the filament. While myosin moves along in short strokes, its cousin kinesin walks steadily along a different type of filament called a microtubule. Instead of using a projecting head as a lever, kinesin walks on two ‘legs’. Based on these differences, researchers used to think that myosin and kinesin were virtually unrelated. But newly discovered similarities in the motors’ ATP-processing machinery now suggest that they share a common ancestor molecule. At this point, scientists can only speculate as to what type of primitive cell-like structure this ancestor occupied as it learned to burn ATP and use the energy to change shape. “We’ll never really know, because we can’t dig up the remains of ancient proteins, but that was probably a big evolutionary leap,” says Vale. On a slightly larger scale, loner cells like sperm or infectious bacteria are prime movers that resolutely push their way through to other cells. As L.
Mahadevan and Paul Matsudaira of the Massachusetts Institute of Technology explain, the engines in this case are springs or ratchets that are clusters of molecules, rather than single proteins like myosin and kinesin. Researchers don’t yet fully understand these engines’ fueling process or the details of how they move, but the result is a force to be reckoned with. For example, one such engine is a spring-like stalk connecting a single-celled organism called a vorticellid to the leaf fragment it calls home. When exposed to calcium, the spring contracts, yanking the vorticellid down at speeds approaching three inches (eight centimetres) per second. Springs like this are coiled bundles of filaments that expand or contract in response to chemical cues. A wave of positively charged calcium ions, for example, neutralizes the negative charges that keep the filaments extended. Some sperm use spring-like engines made of actin filaments to shoot out a barb that penetrates the layers that surround an egg. And certain viruses use a similar apparatus to shoot their DNA into the host’s cell. Ratchets are also useful for moving whole cells, including some other sperm and pathogens. These engines are filaments that simply grow at one end, attracting chemical building blocks from nearby. Because the other end is anchored in place, the growing end pushes against any barrier that gets in its way. Both springs and ratchets are made up of small units that each move just slightly, but collectively produce a powerful movement. Ultimately, Mahadevan and Matsudaira hope to better understand just how these particles create an effect that seems to be so much more than the sum of its parts. Might such an understanding provide inspiration for ways to power artificial nano-sized devices in the future? “The short answer is absolutely,” says Mahadevan. “Biology has had a lot more time to evolve enormous richness in design for different organisms.
Hopefully, studying these structures will not only improve our understanding of the biological world, it will also enable us to copy them, take apart their components and recreate them for other purposes.”
According to the author, research on the power source of movement in cells can contribute to
 ....
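The scaling claim in the passage above can be checked with a little arithmetic (ours, not the article's): at 50-100 body lengths per second, a hypothetical five-foot-wide engine covers roughly 170-340 mph, which is about 270-550 km/h.

```python
# Arithmetic sketch (ours, not from the article): scale a motor protein's
# speed of 50-100 body lengths per second up to a hypothetical 5-foot engine.
FOOT_IN_METERS = 0.3048
engine_width_m = 5 * FOOT_IN_METERS            # 1.524 m

for ratio in (50, 100):
    speed_m_per_s = ratio * engine_width_m     # body lengths/s -> m/s
    kmh = speed_m_per_s * 3.6
    mph = kmh / 1.609344
    print(f"{ratio}x body length/s -> {kmh:.0f} km/h ({mph:.0f} mph)")
```

This prints roughly 274 km/h (170 mph) and 549 km/h (341 mph), so the passage's 170-340 figure matches miles per hour rather than kilometres per hour.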
MCQ-> People are continually enticed by such "hot" performance, even if it lasts for brief periods. Because of this susceptibility, brokers or analysts who have had one or two stocks move up sharply, or technicians who call one turn correctly, are believed to have established a credible record and can readily find market followings. Likewise, an advisory service that is right for a brief time can beat its drums loudly. Elaine Garzarelli gained near immortality when she purportedly "called" the 1987 crash. Although, as the market strategist for Shearson Lehman, her forecast was never published in a research report, nor indeed communicated to its clients, she still received widespread recognition and publicity for this call, which was made in a short TV interview on CNBC. Still, her remark on CNBC that the Dow could drop sharply from its then 5300 level rocked an already nervous market on July 23, 1996. What had been a 40-point gain for the Dow turned into a 40-point loss, a good deal of which was attributed to her comments. The truth is, market-letter writers have been wrong in their judgments far more often than they would like to remember. However, advisors understand that the public considers short-term results meaningful when they are, more often than not, simply chance. Those in the public eye usually gain large numbers of new subscribers for being right by random luck. Which brings us to another important probability error that falls under the broad rubric of representativeness. Amos Tversky and Daniel Kahneman call this one the "law of small numbers". The statistically valid "law of large numbers" states that large samples will usually be highly representative of the population from which they are drawn; for example, public opinion polls are fairly accurate because they draw on large and representative groups. The smaller the sample used, however (or the shorter the record), the more likely the findings are chance rather than meaningful.
Yet the Tversky and Kahneman study showed that typical psychological or educational experimenters gamble their research theories on samples so small that the results have a very high probability of being chance. This is the same as gambling on the single good call of an advisor. The psychologists and educators are far too confident in the significance of results based on a few observations or a short period of time, even though they are trained in statistical techniques and are aware of the dangers. Note how readily people overgeneralize the meaning of a small number of supporting facts. Limited statistical evidence seems to satisfy our intuition no matter how inadequate the depiction of reality. Sometimes the evidence we accept runs to the absurd. A good example of the major overemphasis on small numbers is the almost blind faith investors place in governmental economic releases on employment, industrial production, the consumer price index, the money supply, the leading economic indicators, etc. These statistics frequently trigger major stock- and bond-market reactions, particularly if the news is bad. Flash statistics, more often than not, are near worthless. Initial economic and Fed figures are revised significantly for weeks or months after their release, as new and "better" information flows in. Thus, an increase in the money supply can turn into a decrease, or a large drop in the leading indicators can change to a moderate increase. These revisions occur with such regularity you would think that investors, particularly pros, would treat them with the skepticism they deserve. Alas, the real world refuses to follow the textbooks. Experience notwithstanding, investors treat as gospel all authoritative-sounding releases that they think pinpoint the development of important trends. An example of how instant news threw investors into a tailspin occurred in July of 1996. Preliminary statistics indicated the economy was beginning to gain steam.
The flash figures showed that GDP (gross domestic product) would rise at a 3% rate in the next several quarters, a rate higher than expected. Many people, convinced by these statistics that rising interest rates were imminent, bailed out of the stock market that month. By the end of that year, the GDP growth figures had been revised down significantly (unofficially, a minimum of a dozen times, and officially at least twice). The market rocketed ahead to new highs through August 1997, but a lot of investors had retreated to the sidelines on the preliminary bad news. Recall the advice of a world champion chess player when asked how to avoid making a bad move: "Sit on your hands." But professional investors don't sit on their hands; they dance on tiptoe, ready to flit after the least particle of information as if it were a strongly documented trend. The law of small numbers, in such cases, results in decisions sometimes bordering on the inane. Tversky and Kahneman's findings, which have been repeatedly confirmed, are particularly important to our understanding of some stock market errors and lead to another rule that investors should follow. Which statement does not reflect the true essence of the passage? I. Tversky and Kahneman understood that small representative groups bias research theories toward generalized results that are categorized as meaningful, and that people overstate the real impact of a passable portrayal of reality built on a small number of supporting facts. II. Governmental economic releases on macroeconomic indicators fetch blind faith from investors who appropriately discount these announcements, which are ideally reflected in stock and bond market prices. III. Investors take into consideration myopic gain, make it a meaningful investment choice, and fail to see it as a chance occurrence. IV. Irrational overreaction to key regulators' expressions is the same as an intuitive statistician stumbling disastrously when unable to sustain spectacular performance.....
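The "law of small numbers" point in the passage above can be made concrete with a minimal simulation (our sketch, not from the passage): if many advisors call the market direction purely at random, a noticeable fraction will still open with an impressive-looking winning streak.

```python
# Minimal simulation (our illustration): coin-flipping "advisors" whose short
# records look like skill. With a 50% chance per call, about 1 in 8 advisors
# opens with 3 straight correct calls by luck alone.
import random

random.seed(1)  # fixed seed so the run is reproducible

def streak_fraction(n_advisors, n_calls, streak_len):
    """Fraction of random advisors whose first `streak_len` calls are all
    correct -- a 'hot hand' produced entirely by chance."""
    hot = 0
    for _ in range(n_advisors):
        calls = [random.random() < 0.5 for _ in range(n_calls)]
        if all(calls[:streak_len]):
            hot += 1
    return hot / n_advisors

print(streak_fraction(10_000, 10, 3))  # close to the theoretical 0.5**3 = 0.125
```

The larger the sample of calls demanded before crediting skill, the smaller this chance-driven fraction becomes, which is exactly the law of large numbers the passage contrasts it with.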
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off. In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today’s electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome. Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, DC, exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field, a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called “giant” magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor.
This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor, a reservoir of electrical charge that is either empty or full, to represent a zero or a one. In the NRL’s magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element’s resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory’s elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line. Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard.
IBM’s research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers then point in either the same or opposite directions. Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one. To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr.
Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs. Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor: a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain. It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out “the target is to go after the logic circuits.” Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch, such as optical, biological and quantum computing, remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies.
But computing with magnetism evidently has an attraction all its own.
In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ....
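The MTJ read-out scheme described in the passage above can be sketched as a toy model (our illustration with assumed resistance values, not the researchers' code): tunnelling resistance is lower when the free layer is polarised parallel to the fixed layer, so comparing the measured resistance against a midpoint threshold recovers the stored bit.

```python
# Toy model (ours, not from the passage) of reading bits from magnetic
# tunnel junctions. The resistance values and the "parallel = 1" encoding
# are arbitrary assumptions chosen for illustration.
R_PARALLEL = 1_000.0      # ohms: layers aligned, tunnelling easier (bit = 1)
R_ANTIPARALLEL = 1_500.0  # ohms: layers opposed, tunnelling harder (bit = 0)

def write_bit(bit):
    """'Polarise' the free layer: return the junction's resulting resistance."""
    return R_PARALLEL if bit == 1 else R_ANTIPARALLEL

def read_bit(resistance):
    """Compare measured resistance to the midpoint threshold to recover the bit."""
    threshold = (R_PARALLEL + R_ANTIPARALLEL) / 2
    return 1 if resistance < threshold else 0

stored = [1, 0, 1, 1, 0]
cells = [write_bit(b) for b in stored]       # a row of junctions
recovered = [read_bit(r) for r in cells]     # non-destructive read-out
print(recovered)
```

The non-volatility the passage emphasises corresponds to the fact that the "resistance" persists without power; reading only measures it, it does not erase it.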
MCQ-> Read the following passage carefully and answer the questions given below it. Certain words/phrases have been printed in ‘’bold’’ to help you locate them while answering some of the questions. As increasing dependence on information systems develops, the need for such systems to be reliable and secure also becomes more essential. As growing numbers of ordinary citizens use computer networks for banking, shopping, etc., network security is potentially a ‘’massive’’ problem. Over the last few years, the need for computer and information security systems has become increasingly evident, as web sites are being defaced with greater frequency, more and more denial-of-service attacks are being reported, credit card information is being stolen, there is increased sophistication of hacking tools that are openly available to the public on the Internet, and there is increasing damage being caused by viruses and worms to critical information system resources. At the organizational level, institutional mechanisms have to be designed in order to review e-security policies, practices, measures and procedures regularly and assess whether these are appropriate to their environment. It would be helpful if organizations shared information about threats and vulnerabilities, and implemented procedures of rapid and effective cooperation to prevent, detect and respond to security incidents. As new threats and vulnerabilities are continuously discovered there is a strong need for co-operation among organizations and, if necessary, we could also consider cross-border information sharing. We need to understand the threats and dangers that we could be ‘’vulnerable’’ to and the steps that need to be taken to ‘’mitigate’’ these vulnerabilities. We need to understand access control systems and methodology, telecommunications and network security, and security management practice.
We should be well versed in the area of application and systems development security, cryptography, operations security and physical security. The banking sector is ‘’poised’’ for more challenges in the near future. Customers of banks can now look forward to a large array of new offerings by banks. From an ‘’era’’ of mere competition, banks are now cooperating among themselves so that the synergistic benefits are shared among all the players. This would result in the formation of shared payment networks (a few shared ATM networks have already been commissioned by banks), offering payment services beyond the existing time zones. The Reserve Bank is also facilitating new projects such as the Multi Application Smart Card Project which, when implemented, would facilitate transfer of funds using electronic means and in a safe and secure manner across the length and breadth of the country, with reduced dependence on paper currency. The opportunities of e-banking, or e-power in general, need to be harnessed so that banking is available to all customers in the manner they find most convenient and, if required, without having to visit a branch of a bank. All these will have to be accompanied by a high level of comfort, which again boils down to the issue of e-security. One of the biggest advantages accruing to banks in the future would be the benefits that arise from the introduction of Real Time Gross Settlement (RTGS). Funds management by treasuries of banks would be helped greatly by RTGS. With almost 70 banks having joined the RTGS system, more large-value funds transfers are taking place through this system. The implementation of Core Banking solutions by the banks is closely related to RTGS too. Core Banking will make anywhere banking a reality for customers of each bank, while RTGS bridges the need for inter-bank funds movement.
Thus, the days of depositing a cheque for collection and a long wait for its realization would soon be a thing of the past for those customers who would opt for electronic movement of funds, using the RTGS system, where the settlement would be on an almost ‘’instantaneous’’ basis. Core Banking is already in vogue in many private sector and foreign banks, while its implementation is at different stages amongst the public sector banks. IT would also facilitate better and more scientific decision-making within banks. Information systems now provide decision-makers in banks with a great deal of information which, along with historical data and trend analysis, helps in the building up of efficient Management Information Systems. This, in turn, would help in better Asset Liability Management (ALM) which, in today’s world of hairline margins, is a key requirement for the success of banks in their operational activities. Another benefit which e-banking could provide relates to Customer Relationship Management (CRM). CRM helps in stratification of customers and evaluating customer needs on a holistic basis, which could pave the way for a competitive edge for banks and complete customer care for customers of banks. The content of the passage ‘’mainly’’ emphasizes----
 ....