1. Information concerning the probability distribution of a profit rate can be generated by using:
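The answer such questions usually expect is simulation (Monte Carlo analysis): drawing repeated samples of the uncertain inputs and observing the resulting distribution of the profit rate. A minimal sketch, where the revenue and cost distributions (means and standard deviations) are illustrative assumptions, not values from the question:

```python
import random
import statistics

def simulate_profit_rate(n_trials=10_000, seed=42):
    """Monte Carlo simulation: draw uncertain revenue and cost,
    and record the profit rate (profit / revenue) for each trial."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_trials):
        revenue = rng.gauss(100.0, 10.0)  # assumed: mean 100, sd 10
        cost = rng.gauss(80.0, 8.0)       # assumed: mean 80, sd 8
        rates.append((revenue - cost) / revenue)
    return rates

rates = simulate_profit_rate()
print(round(statistics.mean(rates), 3))  # approximate mean profit rate
```

The collected `rates` list approximates the probability distribution of the profit rate; its mean, spread, and percentiles can then be read off directly.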






Show Similar Question And Answers
QA->A farmer has 50 kg wheat in hand, part of which he sells at 8% profit and the rest at 18% profit. He gains 14% altogether. What is the quantity of wheat sold by him at 18% profit?....
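This mixture question is a weighted-average (alligation) problem: if x kg is sold at 18% and the rest at 8%, then 8(50 − x) + 18x = 14 × 50, giving x = 30 kg. A small sketch of that calculation (the function name is my own):

```python
def quantity_at_higher_profit(total_kg, low_pct, high_pct, overall_pct):
    """Solve low_pct*(total - x) + high_pct*x = overall_pct*total for x,
    the quantity sold at the higher profit rate."""
    return total_kg * (overall_pct - low_pct) / (high_pct - low_pct)

print(quantity_at_higher_profit(50, 8, 18, 14))  # → 30.0
```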
QA->As per Co-operative societies rule ‘Net profit’ means net profit as certified by :....
QA->Assume there are 4 file servers, each with a 95% chance of being up at any instant. Probability of at least one being available is :....
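With independent servers, the complement rule applies: P(at least one up) = 1 − P(all down) = 1 − (0.05)^4. A one-function sketch:

```python
def p_at_least_one_up(p_up, n_servers):
    """P(at least one up) = 1 - P(all down), assuming independence."""
    return 1 - (1 - p_up) ** n_servers

print(p_at_least_one_up(0.95, 4))  # ≈ 0.99999375
```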
QA->With which can Radio wave of constant amplitude be generated?....
QA->By which Radio waves of constant amplitude can be generated?....
MCQ-> Analyze the following passage and provide appropriate answers for the questions that follow. Ideas involving the theory of probability play a decisive part in modern physics. Yet we still lack a satisfactory, consistent definition of probability; or, what amounts to much the same, we still lack a satisfactory axiomatic system for the calculus of probability. The relations between probability and experience are also still in need of clarification. In investigating this problem we shall discover what will at first seem an almost insuperable objection to my methodological views. For although probability statements play such a vitally important role in empirical science, they turn out to be in principle impervious to strict falsification. Yet this very stumbling block will become a touchstone upon which to test my theory, in order to find out what it is worth. Thus we are confronted with two tasks. The first is to provide new foundations for the calculus of probability. This I shall try to do by developing the theory of probability as a frequency theory, along the lines followed by Richard von Mises, but without the use of what he calls the 'axiom of convergence' (or 'limit axiom') and with a somewhat weakened 'axiom of randomness'. The second task is to elucidate the relations between probability and experience. This means solving what I call the problem of the decidability of probability statements. My hope is that these investigations will help to relieve the present unsatisfactory situation in which physicists make much use of probabilities without being able to say, consistently, what they mean by 'probability'. The statement, "The relations between probability and experience are still in need of clarification" implies that:
 ....
MCQ-> Analyse the following passage and provide an appropriate answer for the questions that follow. When we speak of the "probability of death", the exact meaning of this expression can be defined in the following way only. We must not think of an individual, but of a certain class as a whole, e.g., "all insured men forty-one years old living in a given country and not engaged in certain dangerous occupations." A probability of death is attached to the class of men or to another class that can be defined in a similar way. We can say nothing about the probability of death of an individual even if we know his conditions of life and health in detail. The phrase "probability of death", when it refers to a single person, has no meaning at all. Which of the following conclusions can be drawn from the passage? 1. Singular, non-replicable events can be assigned a numerical probability value. 2. Probability calculation requires data about the class of people or of events. 3. The data about a class of events can be used to predict the future of any specific event.....
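The passage states the frequency interpretation: a probability attaches to a well-defined class of cases, not to an individual, and is estimated as a relative frequency within that class. A minimal sketch, with purely illustrative mortality data (the counts below are invented for the example):

```python
def relative_frequency(outcomes, event):
    """Frequency-theory estimate: the probability of `event` is its
    relative frequency within a well-defined class of cases."""
    matches = sum(1 for o in outcomes if o == event)
    return matches / len(outcomes)

# Hypothetical class: one year of outcomes for insured 41-year-old men
# (1 = died within the year, 0 = survived) -- illustrative data only.
class_data = [0] * 9950 + [1] * 50
print(relative_frequency(class_data, 1))  # → 0.005
```

Note that the 0.005 describes the class as a whole; as the passage argues, it says nothing about any particular member of it.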
MCQ-> Read the passage carefully and answer the questions given at the end of each passage: Turning the business around involved more than segmenting and pulling out of retail. It also meant maximizing every strength we had in order to boost our profit margins. In re-examining the direct model, we realized that inventory management was not just a core strength; it could be an incredible opportunity for us, and one that had not yet been discovered by any of our competitors. In Version 1.0 of the direct model, we eliminated the reseller, thereby eliminating the mark-up and the cost of maintaining a store. In Version 1.1, we went one step further to reduce inventory inefficiencies. Traditionally, a long chain of partners was involved in getting a product to the customer. Let's say you have a factory building a PC we'll call model #4000. The system is then sent to the distributor, which sends it to the warehouse, which sends it to the dealer, who eventually pushes it on to the consumer by advertising, "I've got model #4000. Come and buy it." If the consumer says, "But I want model #8000," the dealer replies, "Sorry, I only have model #4000." Meanwhile, the factory keeps building model #4000s and pushing the inventory into the channel. The result is a glut of model #4000s that nobody wants. Inevitably, someone ends up with too much inventory, and you see big price corrections. The retailer can't sell it at the suggested retail price, so the manufacturer loses money on price protection (a practice common in our industry of compensating dealers for reductions in suggested selling prices). Companies with long, multi-step distribution systems will often fill their distribution channels with products in an attempt to clear out older products. This dangerous and inefficient practice is called "channel stuffing".
Worst of all, the customer ends up paying for it by purchasing systems that are already out of date. Because we were building directly to fill our customers' orders, we didn't have finished goods inventory devaluing on a daily basis. Because we aligned our suppliers to deliver components as we used them, we were able to minimize raw material inventory. Reductions in component costs could be passed on to our customers quickly, which made them happier and improved our competitive advantage. It also allowed us to deliver the latest technology to our customers faster than our competitors. The direct model turns conventional manufacturing inside out. Conventional manufacturing dictates that you keep building, because your plant can't keep going otherwise. But if you don't know what you need to build because of dramatic changes in demand, you run the risk of ending up with terrific amounts of excess and obsolete inventory. That is not the goal. The concept behind the direct model has nothing to do with stockpiling and everything to do with information. The quality of your information is inversely proportional to the amount of assets required, in this case excess inventory. With less information about customer needs, you need massive amounts of inventory. So, if you have great information – that is, you know exactly what people want and how much – you need that much less inventory. Less inventory, of course, corresponds to less inventory depreciation. In the computer industry, component prices are always falling as suppliers introduce faster chips, bigger disk drives and modems with ever-greater bandwidth. Let's say that Dell has six days of inventory. Compare that to an indirect competitor who has twenty-five days of inventory with another thirty in their distribution channel. That's a difference of forty-nine days, and in forty-nine days, the cost of materials will decline about 6 percent.
Then there's the threat of getting stuck with obsolete inventory if you're caught in a transition to a next-generation product, as we were with those memory chips in 1989. As the product approaches the end of its life, the manufacturer has to worry about whether it has too much in the channel and whether a competitor will dump products, destroying profit margins for everyone. This is a perpetual problem in the computer industry, but with the direct model, we have virtually eliminated it. We know when our customers are ready to move on technologically, and we can get out of the market before its most precarious time. We don't have to subsidize our losses by charging higher prices for other products. And ultimately, our customer wins. Optimal inventory management really starts with the design process. You want to design the product so that the entire product supply chain, as well as the manufacturing process, is oriented not just for speed but for what we call velocity. Speed means being fast in the first place. Velocity means squeezing time out of every step in the process. Inventory velocity has become a passion for us. To achieve maximum velocity, you have to design your products in a way that covers the largest part of the market with the fewest number of parts. For example, you don't need nine different disk drives when you can serve 98 percent of the market with only four. We also learned to take into account the variability of the low-cost and high-cost components. Systems were reconfigured to allow for a greater variety of low-cost parts and a limited variety of expensive parts. The goal was to decrease the number of components to manage, which increased the velocity, which decreased the risk of inventory depreciation, which increased the overall health of our business system. We were also able to reduce inventory well below the levels anyone thought possible by constantly challenging and surprising ourselves with the results.
We had our internal skeptics when we first started pushing for ever-lower levels of inventory. I remember the head of our procurement group telling me that this was like "flying low to the ground at 300 knots." He was worried that we wouldn't see the trees. In 1993, we had $2.9 billion in sales and $220 million in inventory. Four years later, we posted $12.3 billion in sales and had inventory of $33 million. We're now down to six days of inventory and we're starting to measure it in hours instead of days. Once you reduce your inventory while maintaining your growth rate, a significant amount of risk comes from the transition from one generation of product to the next. Without traditional stockpiles of inventory, it is critical to precisely time the discontinuance of the older product line with the ramp-up in customer demand for the newer one. Since we were introducing new products all the time, it became imperative to avoid the huge drag effect from mistakes made during transitions. E&O – short for "excess and obsolete" – became taboo at Dell. We would debate about whether our E&O was 30 or 50 cents per PC. Since anything less than $20 per PC is not bad, when you're down in the cents range, you're approaching stellar performance. Find out the TRUE statement:
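The passage's inventory arithmetic can be sketched directly: the competitor carries 25 + 30 = 55 days of inventory against Dell's 6, a 49-day gap, over which component costs fall "about 6 percent." The daily decline rate below is an assumption back-solved from that figure, not a number given in the passage:

```python
def cost_decline(days, daily_decline_rate):
    """Compound decline in component cost over `days`,
    assuming a constant daily rate of decline."""
    return 1 - (1 - daily_decline_rate) ** days

# Assumed daily rate (~0.126%/day) consistent with the passage's
# "about 6 percent" decline over forty-nine days.
gap_days = (25 + 30) - 6  # competitor's 55 days vs Dell's 6
print(round(cost_decline(gap_days, 0.00126), 3))  # ≈ 0.06
```

The point of the example is the compounding: every extra day of inventory locks in component costs that the market is already discounting.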
 ....
MCQ->The respective ratio of advertisement revenues generated from printed version by magazine P to advertisement revenues generated from online version by the same magazine in July is same as the respective ratio of advertisement revenues generated from printed version by Magazine Q to advertisement revenues generated from online version by the same magazine in March. If the advertisement revenue generated from online version by Magazine P in July was INR 1,08,000/-, what was the advertisement revenue generated from printed version by the same magazine in July ?....
© 2002-2017 Omega Education PVT LTD