1. Static balancing is satisfactory for low-speed rotors, but with increasing speeds dynamic balancing becomes necessary. This is because the ...
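For context, the standard balancing conditions (this formulation is textbook convention, not part of the question itself): a rotor carrying point masses $m_i$ at radial positions $\vec{r}_i$ and axial distances $l_i$ from a reference plane is statically balanced when the net centrifugal force vanishes, and dynamically balanced when the net couple also vanishes:

$$\sum_i m_i \vec{r}_i\,\omega^2 = 0 \qquad \text{(static balance: no net force)}$$

$$\sum_i m_i \vec{r}_i\, l_i\,\omega^2 = 0 \qquad \text{(dynamic balance: no net couple)}$$

A rotor can meet the first condition while masses lying in different transverse planes still produce a residual couple; both effects grow as $\omega^2$, so the unbalanced couple becomes troublesome only at high speed.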





Similar Questions and Answers
QA->A car during its journey travels for 30 minutes at a speed of 40 km/hr, another 45 minutes at a speed of 60 km/hr, and two hours at a speed of 70 km/hr. Find the average speed of the car.....
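A worked computation for this entry, using the standard definition of average speed as total distance over total time (the working below is shown for reference, not taken from the source):

$$\bar{v} = \frac{0.5 \times 40 + 0.75 \times 60 + 2 \times 70}{0.5 + 0.75 + 2} = \frac{20 + 45 + 140}{3.25} = \frac{205}{3.25} \approx 63.1\ \text{km/hr}$$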
QA->The Indian government plans to construct a new corridor for high-speed trains with a speed range of 300-350 kmph. What is the present maximum speed of a long-distance train in India?....
QA->Synonym of Dynamic (adj.)....
QA->An aerodynamic device mounted at the rear of an automobile to reduce lift at high speeds is:....
QA->A static member function can have access to:....
MCQ-> Analyse the following passage and provide appropriate answers for the questions that follow:

Each piece, or part, of the whole of nature is always merely an approximation to the complete truth, or the complete truth so far as we know it. In fact, everything we know is only some kind of approximation, because we know that we do not know all the laws as yet. Therefore, things must be learned only to be unlearned again or, more likely, to be corrected.

The principle of science, the definition, almost, is the following: The test of all knowledge is experiment. Experiment is the sole judge of scientific "truth." But what is the source of knowledge? Where do the laws that are to be tested come from? Experiment, itself, helps to produce these laws, in the sense that it gives us hints. But also needed is imagination to create from these hints the great generalizations: to guess at the wonderful, simple, but very strange patterns beneath them all, and then to experiment to check again whether we have made the right guess. This imagining process is so difficult that there is a division of labour in physics: there are theoretical physicists who imagine, deduce, and guess at new laws, but do not experiment; and then there are experimental physicists who experiment, imagine, deduce, and guess.

We said that the laws of nature are approximate: that we first find the "wrong" ones, and then we find the "right" ones. Now, how can an experiment be "wrong"? First, in a trivial way: the apparatus can be faulty and you did not notice. But these things are easily fixed and checked back and forth. So without snatching at such minor things, how can the results of an experiment be wrong? Only by being inaccurate. For example, the mass of an object never seems to change; a spinning top has the same weight as a still one. So a "law" was invented: mass is constant, independent of speed. That "law" is now found to be incorrect. Mass is found to increase with velocity, but appreciable increase requires velocities near that of light. A true law is: if an object moves with a speed of less than one hundred miles a second the mass is constant to within one part in a million. In some such approximate form this is a correct law. So in practice one might think that the new law makes no significant difference. Well, yes and no. For ordinary speeds we can certainly forget it and use the simple constant-mass law as a good approximation. But for high speeds we are wrong, and the higher the speed, the more wrong we are.

Finally, and most interesting, philosophically we are completely wrong with the approximate law. Our entire picture of the world has to be altered even though the mass changes only by a little bit. This is a very peculiar thing about the philosophy, or the ideas, behind the laws. Even a very small effect sometimes requires profound changes to our ideas.

Which of the following options is DEFINITELY NOT an approximation to the complete truth?....
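The passage's "one part in a million" claim can be checked against the relativistic mass formula; the low-speed expansion below is standard physics, not text from the passage:

$$m = \frac{m_0}{\sqrt{1 - v^2/c^2}} \approx m_0\left(1 + \frac{v^2}{2c^2}\right) \qquad (v \ll c)$$

At $v = 100$ miles/s $\approx 1.61 \times 10^5$ m/s, $v/c \approx 5.4 \times 10^{-4}$, so the fractional mass increase is about $\tfrac{1}{2}(5.4 \times 10^{-4})^2 \approx 1.4 \times 10^{-7}$, comfortably within one part in a million.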
MCQ->Which of the following statements are correct about static functions? 1. Static functions can access only static data. 2. Static functions cannot call instance functions. 3. It is necessary to initialize static data. 4. Instance functions can call static functions and access static data. 5. The this reference is passed to static functions.....
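The statements in this MCQ are easiest to check against a compiler. Below is a minimal sketch; the question is phrased in C# terms, but the same rules hold in C++, which is used here, and the class and member names are invented for illustration:

```cpp
#include <iostream>

class Counter {
    static int count;   // static data: one copy shared by all objects
    int id;             // instance data: one copy per object
public:
    Counter() : id(++count) {}

    // A static member function receives no `this` pointer, so it can
    // directly access only static members.
    static int total() { return count; }

    // Instance functions may freely call static functions and use static data.
    int describe() const { return id * 100 + total(); }
};

int Counter::count = 0;  // static data is defined once, outside the class

int main() {
    Counter a, b;
    std::cout << Counter::total() << "\n";  // prints 2; no object needed
    std::cout << b.describe() << "\n";      // prints 202
}
```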
MCQ-> The painter is now free to paint anything he chooses. There are scarcely any forbidden subjects, and today everybody is prepared to admit that a painting of some fruit can be as important as a painting of a hero dying. The Impressionists did as much as anybody to win this previously unheard-of freedom for the artist. Yet, by the next generation, painters began to abandon the subject altogether, and began to paint abstract pictures. Today the majority of pictures painted are abstract.

Is there a connection between these two developments? Has art gone abstract because the artist is embarrassed by his freedom? Is it that, because he is free to paint anything, he doesn't know what to paint? Apologists for abstract art often talk of it as the art of maximum freedom. But could this be the freedom of the desert island? It would take too long to answer these questions properly. I believe there is a connection. Many things have encouraged the development of abstract art. Among them has been the artists' wish to avoid the difficulties of finding subjects when all subjects are equally possible.

I raise the matter now because I want to draw attention to the fact that the painter's choice of a subject is a far more complicated question than it would at first seem. A subject does not start with what is put in front of the easel or with something which the painter happens to remember. A subject starts with the painter deciding he would like to paint such-and-such because for some reason or other he finds it meaningful. A subject begins when the artist selects something for special mention. (What makes it special or meaningful may seem to the artist to be purely visual: its colours or its form.) When the subject has been selected, the function of the painting itself is to communicate and justify the significance of that selection.

It is often said today that subject matter is unimportant. But this is only a reaction against the excessively literary and moralistic interpretation of subject matter in the nineteenth century. In truth the subject is literally the beginning and end of a painting. The painting begins with a selection (I will paint this and not everything else in the world); it is finished when that selection is justified (now you can see all that I saw and felt in this and how it is more than merely itself).

Thus, for a painting to succeed it is essential that the painter and his public agree about what is significant. The subject may have a personal meaning for the painter or individual spectator; but there must also be the possibility of their agreement on its general meaning. It is at this point that the culture of the society and period in question precedes the artist and his art. Renaissance art would have meant nothing to the Aztecs, and vice versa. If, to some extent, a few intellectuals can appreciate them both today it is because their culture is an historical one: its inspiration is history and therefore it can include within itself, in principle if not in every particular, all known developments to date.

When culture is secure and certain of its values, it presents its artists with subjects. The general agreement about what is significant is so well established that the significance of a particular subject accrues and becomes traditional. This is true, for instance, of reeds and water in China, of the nude body in the Renaissance, of the animal in Africa. Furthermore, in such cultures the artist is unlikely to be a free agent: he will be employed for the sake of particular subjects, and the problem, as we have just described it, will not occur to him.

When a culture is in a state of disintegration or transition, the freedom of the artist increases, but the question of subject matter becomes problematic for him: he, himself, has to choose for society. This was at the basis of all the increasing crises in European art during the nineteenth century. It is too often forgotten how many of the art scandals of that time were provoked by the choice of subject (Gericault, Courbet, Daumier, Degas, Lautrec, Van Gogh, etc.).

By the end of the nineteenth century there were, roughly speaking, two ways in which the painter could meet this challenge of deciding what to paint and so choosing for society. Either he identified himself with the people and so allowed their lives to dictate his subjects to him, or he had to find his subjects within himself as painter. By people I mean everybody except the bourgeoisie. Many painters did of course work for the bourgeoisie according to their copy-book of approved subjects, but all of them, filling the Salon and the Royal Academy year after year, are now forgotten, buried under the hypocrisy of those they served so sincerely.

When a culture is insecure, the painter chooses his subject on the basis of:....
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off.

In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music-players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome.

Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, DC, exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field, a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor. This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive.

Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor (a reservoir of electrical charge) that is either empty or full, to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques. But it will be some years before the first chips roll off the production line.

Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard. IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions.

Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one.

To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr. Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs.

Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor, a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain.

It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented. Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with other contenders that are jostling to knock electronics off its perch, such as optical, biological and quantum computing, remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies. But computing with magnetism evidently has an attraction all its own.

In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:....
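As a reading aid, here is a toy model of the MTJ read/write scheme the passage describes: the stored bit is the relative polarisation of a free layer against a fixed reference layer, and a read infers the bit from tunnelling resistance (parallel = low, antiparallel = high). The names and resistance values are invented; this is a sketch of the idea, not of any real device:

```cpp
#include <iostream>

// Toy model of one magnetic tunnel-junction (MTJ) memory element.
struct MtjElement {
    bool fixedLayer = true;   // reference layer: polarisation never changes
    bool freeLayer  = true;   // storage layer: set by the write current

    // Writing 1 aligns the free layer with the fixed layer (parallel);
    // writing 0 sets it opposite (antiparallel).
    void write(bool bit) { freeLayer = bit ? fixedLayer : !fixedLayer; }

    // Tunnelling is easier through parallel layers, so parallel alignment
    // shows low resistance; units here are arbitrary.
    double resistance() const { return freeLayer == fixedLayer ? 1.0 : 2.0; }

    // A read measures resistance and infers the stored bit.
    bool read() const { return resistance() < 1.5; }
};

int main() {
    MtjElement cell;
    cell.write(false);
    std::cout << cell.read() << "\n";  // prints 0; the state persists without power
}
```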