1. When the phase velocity of an EM wave depends on frequency in any medium, the phenomenon is called





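The answer is dispersion: in a dispersive medium the phase velocity v_p = ω/k varies with frequency. As a minimal numerical sketch, consider an EM wave in a cold plasma, whose dispersion relation ω² = ω_p² + c²k² makes v_p frequency-dependent (the plasma frequency below is an assumed, illustrative value):

```python
import math

C = 3.0e8      # speed of light in vacuum, m/s
F_P = 9.0e6    # assumed plasma frequency, Hz (illustrative value)

def phase_velocity(f_hz):
    """Phase velocity of an EM wave in a cold plasma:
    v_p = c / sqrt(1 - (f_p / f)^2), valid for f > f_p."""
    return C / math.sqrt(1.0 - (F_P / f_hz) ** 2)

# v_p changes with frequency, so the medium is dispersive
for f in (20e6, 50e6, 100e6):
    print(f"{f/1e6:5.0f} MHz -> v_p = {phase_velocity(f):.4e} m/s")
```

Because the computed v_p differs at 20 MHz and 100 MHz, a pulse containing both frequencies spreads as it propagates; a frequency-independent v_p would mean no dispersion.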
Similar Questions and Answers
QA->When a ray of light is going from one medium to another, how is its frequency?....
QA->What is the product of the time period of a wave and its frequency?....
QA->What is the frequency of a wave whose time-period is 0.5 second?....
QA->What is the typical frequency of an ultrasound wave?....
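Two of the questions above hinge on the same relation: the time period T and the frequency f of a wave are reciprocals, so T · f = 1. A minimal sketch of the arithmetic:

```python
def frequency_hz(period_s):
    """Frequency in Hz from the time period in seconds: f = 1 / T."""
    return 1.0 / period_s

t = 0.5                      # a time period of 0.5 s
f = frequency_hz(t)
print(f)                     # 2.0 Hz
print(t * f)                 # the product T * f is always 1.0
```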
MCQ->When the phase velocity of an EM wave depends on frequency in any medium, the phenomenon is called....
MCQ->In a phase discriminator, fc is the IF to which the primary circuit is tuned. Consider the following statements: the primary and secondary voltages are (1) exactly 90° out of phase when the input frequency is fc, (2) more than 90° out of phase when the input frequency is higher than fc, and (3) less than 90° out of phase when the input frequency is lower than fc. Which of the above statements is correct?....
MCQ-> Modern science, exclusive of geometry, is a comparatively recent creation and can be said to have originated with Galileo and Newton. Galileo was the first scientist to recognize clearly that the only way to further our understanding of the physical world was to resort to experiment. However obvious Galileo’s contention may appear in the light of our present knowledge, it remains a fact that the Greeks, in spite of their proficiency in geometry, never seem to have realized the importance of experiment. To a certain extent this may be attributed to the crudeness of their instruments of measurement. Still an excuse of this sort can scarcely be put forward when the elementary nature of Galileo’s experiments and observations is recalled. Watching a lamp oscillate in the cathedral of Pisa, dropping bodies from the leaning tower of Pisa, rolling balls down inclined planes, noticing the magnifying effect of water in a spherical glass vase, such was the nature of Galileo’s experiments and observations. As can be seen, they might just as well have been performed by the Greeks. At any rate, it was thanks to such experiments that Galileo discovered the fundamental law of dynamics, according to which the acceleration imparted to a body is proportional to the force acting upon it.The next advance was due to Newton, the greatest scientist of all time if account be taken of his joint contributions to mathematics and physics. As a physicist, he was of course an ardent adherent of the empirical method, but his greatest title to fame lies in another direction. Prior to Newton, mathematics, chiefly in the form of geometry, had been studied as a fine art without any view to its physical applications other than in very trivial cases. But with Newton all the resources of mathematics were turned to advantage in the solution of physical problems. 
Thenceforth mathematics appeared as an instrument of discovery, the most powerful one known to man, multiplying the power of thought just as in the mechanical domain the lever multiplied our physical action. It is this application of mathematics to the solution of physical problems, this combination of two separate fields of investigation, which constitutes the essential characteristic of the Newtonian method. Thus problems of physics were metamorphosed into problems of mathematics. But in Newton’s day the mathematical instrument was still in a very backward state of development. In this field again Newton showed the mark of genius by inventing the integral calculus. As a result of this remarkable discovery, problems, which would have baffled Archimedes, were solved with ease. We know that in Newton’s hands this new departure in scientific method led to the discovery of the law of gravitation. But here again the real significance of Newton’s achievement lay not so much in the exact quantitative formulation of the law of attraction, as in his having established the presence of law and order at least in one important realm of nature, namely, in the motions of heavenly bodies. Nature thus exhibited rationality and was not mere blind chaos and uncertainty.
To be sure, Newton’s investigations had been concerned with but a small group of natural phenomena, but it appeared unlikely that this mathematical law and order should turn out to be restricted to certain special phenomena; and the feeling was general that all the physical processes of nature would prove to be unfolding themselves according to rigorous mathematical laws. When Einstein, in 1905, published his celebrated paper on the electrodynamics of moving bodies, he remarked that the difficulties, which surrounded the equations of electrodynamics, together with the negative experiments of Michelson and others, would be obviated if we extended the validity of the Newtonian principle of the relativity of Galilean motion, which applies solely to mechanical phenomena, so as to include all manner of phenomena: electrodynamic, optical, etc. When extended in this way the Newtonian principle of relativity became Einstein’s special principle of relativity. Its significance lay in its assertion that absolute Galilean motion or absolute velocity must ever escape all experimental detection. Henceforth absolute velocity should be conceived of as physically meaningless, not only in the particular realm of mechanics, as in Newton’s day, but in the entire realm of physical phenomena. Einstein’s special principle, by adding increased emphasis to this relativity of velocity, making absolute velocity metaphysically meaningless, created a still more profound distinction between velocity and accelerated or rotational motion. This latter type of motion remained absolute and real as before. It is most important to understand this point and to realize that Einstein’s special principle is merely an extension of the validity of the classical Newtonian principle to all classes of phenomena.
According to the author, why did the Greeks NOT conduct experiments to understand the physical world?
 ....
MCQ-> Read the passage carefully and answer the questions given at the end of each passage:Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors. In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies. Traditionally, a long chain of partners was involved in getting a product to the customer. Let’s say you have a factory building a PC we’ll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, “I’ve got model #4000. Come and buy it.” If the consumer says, “But I want model #8000,” the dealer replies, “Sorry, I only have model #4000.” Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can’t sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling price). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older products. This dangerous and inefficient practice is called “channel stuffing”.
Worst of all, the customer ends up paying for it by purchasing systems that are already out of date. Because we were building directly to fill our customers’ orders, we didn’t have finished goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors. The direct model turns conventional manufacturing inside out. Conventional manufacturing dictates that you keep building, because your plant can’t keep stopping and starting. But if you don’t know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information. The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information – that is, you know exactly what people want and how much – you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let’s say that Dell has six days of inventory. Compare that to an indirect competitor who has twenty-five days of inventory with another thirty in their distribution channel. That’s a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent.
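The inventory-gap arithmetic quoted in the passage can be sketched directly (the per-day decline is an assumed linear spreading of the quoted ~6 percent over the 49-day gap, for illustration only):

```python
# Days of inventory from the passage: Dell vs. an indirect competitor.
dell_days = 6
competitor_days = 25 + 30        # 25 days on hand + 30 days in the channel

gap_days = competitor_days - dell_days
print(gap_days)                  # the 49-day advantage cited in the passage

# Component costs fall about 6% over that gap (assumed linear for illustration).
total_decline = 0.06
decline_per_day = total_decline / gap_days
print(f"{decline_per_day:.4%} per day")
```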
Then there’s the threat of getting stuck with obsolete inventory if you’re caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don’t have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins. Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don’t need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system. We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results.
We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like “flying low to the ground at 300 knots.” He was worried that we wouldn’t see the trees. In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $233 million. We’re now down to six days of inventory and we’re starting to measure it in hours instead of days. Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O (short for “excess and obsolete”) became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you’re down in the cents range, you’re approaching stellar performance.
Find out the TRUE statement:
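The quoted figures imply the improvement directly. A rough sketch of the calculation follows; note two assumptions: days of inventory is normally computed on cost of goods sold, but sales is used here as a proxy because the passage quotes only sales, and the 1997 inventory is taken as $233 million, which is consistent with the roughly six days the passage cites.

```python
def days_of_inventory(inventory, annual_throughput, days_per_year=365):
    """Approximate days of inventory = inventory / (annual throughput per day)."""
    return inventory / (annual_throughput / days_per_year)

# 1993: $2.9B sales, $220M inventory; 1997: $12.3B sales, $233M inventory (assumed figure)
print(round(days_of_inventory(220e6, 2.9e9), 1))    # ~27.7 days in 1993
print(round(days_of_inventory(233e6, 12.3e9), 1))   # ~6.9 days four years later
```

The four-fold drop in days of inventory, despite sales more than quadrupling, is the point the passage is making.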
 ....
© 2002-2017 Omega Education PVT LTD