1. Which element is produced on a large scale by the Claus Process?

Answer: Sulfur
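
For reference, the chemistry behind the answer: the Claus process recovers elemental sulfur from hydrogen sulphide gas (typically from natural-gas and refinery streams), first burning part of the H2S to SO2 in a thermal stage and then reacting the two over a catalyst. A minimal sketch of the standard textbook stoichiometry:

\[
\begin{aligned}
2\,\mathrm{H_2S} + 3\,\mathrm{O_2} &\longrightarrow 2\,\mathrm{SO_2} + 2\,\mathrm{H_2O} && \text{(thermal stage)}\\
2\,\mathrm{H_2S} + \mathrm{SO_2} &\longrightarrow 3\,\mathrm{S} + 2\,\mathrm{H_2O} && \text{(catalytic stage)}\\
2\,\mathrm{H_2S} + \mathrm{O_2} &\longrightarrow 2\,\mathrm{S} + 2\,\mathrm{H_2O} && \text{(overall)}
\end{aligned}
\]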

Similar Questions and Answers
QA->Which element is produced on a large scale by the Claus Process?....
QA->Which element is industrially produced through the Claus Process?....
QA->Which element is industrially manufactured through the Claus Process?....
QA->The first large-scale human trials of experimental Ebola vaccines have begun in which country?....
QA->A large shop selling a large variety of goods?....
MCQ-> Read the following passage carefully and answer the questions given below it. Certain words/phrases have been given in bold to help you locate them while answering some of the questions: In every religion, culture and civilization, feeding the poor and hungry is considered one of the most noble deeds. However, such large-scale feeding will require huge investment in both resources and time. A better alternative is to create conditions by which proper wholesome food is available to all the rural poor at an affordable price. Getting this done will be the biggest charity. Our work with the rural poor in villages of Western Maharashtra has shown that most of these people are landless laborers. After working the whole day in the fields in the scorching sun, they come home in the evening and have to cook for the whole family. The cooking is done on the most primitive chulha (wood stove), which results in tremendous indoor air pollution. Many of them also have no electricity, so they use primitive and polluting kerosene lamps. World Health Organization (WHO) data has shown that about 300,000 deaths per year in India can be directly attributed to indoor air pollution in such huts. At the same time this pollution results in many respiratory ailments, and these people spend close to Rs. 200-400 per month on medical bills. Besides the pollution, the rural poor also eat a very poor diet. They eat whatever is available daily at Public Distribution System (PDS) shops, and most of the time these shops are out of rations. Thus they cook whatever is available. The hard work together with poor eating takes a heavy toll on their health. Besides this, malnutrition also affects the physical and mental health of their children and may lead to the creation of a whole generation of mentally challenged citizens. So I feel that the best way to provide adequate food for the rural poor is by setting up rural restaurants on a large scale. These restaurants will be similar to regular ones, but for people below the poverty line (BPL) they will provide meals at subsidized rates. These citizens will pay only Rs. 10 per meal and the rest, which is expected to be quite small, will come as part of a Government subsidy. With existing open-market prices of vegetables and groceries, the average cost of a simple meal for a family of four comes to Rs. 50 per meal, or Rs. 12.50 per person per meal. If the PDS prices are taken for the groceries, then the average cost will be Rs. 7.50 per person per meal. This makes the subsidy approximately Rs. 2.50 per person per meal only, and hence quite small. Meals could be bought by the rural poor using their UID (Aadhar) cards. The total cost should be Rs. 30 per day for three vegetarian meals of breakfast, lunch and dinner. The rural poor will get better nutrition and tasty food by eating in these restaurants. Besides, the time saved can be used for resting and other gainful activities like teaching children. Since the food will not be cooked in huts, this strategy will result in less pollution in rural households. This will be beneficial for their health. Besides, women's chores will be reduced drastically. Another advantage of eating in these restaurants will be increased social interaction of the rural poor, since these could also become meeting places. Eating in restaurants will also require fewer utensils in the house and hence less expenditure. For other things like hot water for baths, making tea, boiling milk and cooking on holidays, some utensils and fuel will be required.
Our Institute NARI has developed an extremely efficient and environment-friendly stove which provides both light and heat for cooking simultaneously, and hence may provide the necessary functions. Providing reasonably priced wholesome food is a basic aim and program of the Government of India (GOI). This is the basis of their much-touted food security program. However, in 65 years they have not been able to do so. Thus I feel a public-private partnership can help in this. To help the restaurant owners, the GOI or state Governments should provide them with soft loans and other lines of credit for setting up such facilities. The corporate world can take this up as part of its corporate social responsibility activity. Their participation will help ensure good quality restaurants and services. Besides being charitable work, this will also make good business sense. McDonald's-type restaurant systems for rural areas can be a good model for quality control, both in terms of hygiene and in terms of the quality of food material. However, the focus will be on the availability of wholesome, simple vegetarian food in these restaurants. More clientele (volumes) will make these restaurants economical. Existing models of dhabas, Udipi-type restaurants, etc. can be used in this scheme. These restaurants may also be able to provide midday meals in rural schools. At present the midday meal program is faltering for various reasons. Food coupons in western countries provide cheap food for the poor. However, quite a number of fast food restaurants in the US do not accept them. Besides, these coupons are very often used for non-food items. It will therefore be mandatory for rural restaurants to accept payment via UID cards from BPL citizens. Existing soup kitchens, langars and temple food are based on charity. For large-scale rural use the scheme should be based on a good social-enterprise business model. Cooking food in these restaurants will also result in much more efficient use of energy, since the energy per kg of food cooked in households is greater than that in restaurants. The main thing, however, will be to reduce food wastage in these restaurants drastically. Rural restaurants can also be forced to use clean fuels like LPG or locally produced biomass-based liquid fuels. This strategy is very difficult to enforce for individual households. Large-scale employment generation in rural areas may also result from this activity. With an average norm of 30 people employed per 100-chair restaurant, this program has the potential of generating about 20 million permanent jobs in rural areas. Besides, the infrastructure development in setting up restaurants and establishing the food chain etc. will help local farmers and will create huge wealth generation in these areas. In the long run this strategy may provide better food security for the rural poor than the existing one, which is based on cheap food availability in the PDS - a system that is prone to corruption and leakage. In accordance with the view expressed by the writer of this article, what is the biggest charity?
 ...
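The meal-cost arithmetic in the passage above can be checked directly. The short Python sketch below simply recomputes the passage's own figures (Rs. 50 per family meal at market prices, Rs. 10 paid by a BPL cardholder, 30 staff per 100-chair restaurant); none of the numbers is independent data:

# Recompute the figures quoted in the rural-restaurant passage.
MARKET_COST_FAMILY_MEAL = 50.0  # Rs. per simple meal, family of four, open-market prices
FAMILY_SIZE = 4
BPL_PRICE = 10.0                # Rs. paid per meal by a below-poverty-line cardholder
PDS_COST_PER_PERSON = 7.50      # Rs. per person per meal at PDS grocery prices

market_cost_per_person = MARKET_COST_FAMILY_MEAL / FAMILY_SIZE  # Rs. 12.50
subsidy_per_person = market_cost_per_person - BPL_PRICE         # Rs. 2.50, as the passage says

print(f"market cost/person/meal: Rs. {market_cost_per_person:.2f}")
print(f"subsidy/person/meal:     Rs. {subsidy_per_person:.2f}")
print(f"BPL outlay, 3 meals/day: Rs. {3 * BPL_PRICE:.0f}")

# Employment claim: ~20 million jobs at 30 staff per 100-chair restaurant
restaurants_implied = 20_000_000 / 30
print(f"restaurants implied: ~{restaurants_implied:,.0f}")      # ~666,667
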
MCQ-> In a modern computer, electronic and magnetic storage technologies play complementary roles. Electronic memory chips are fast but volatile (their contents are lost when the computer is unplugged). Magnetic tapes and hard disks are slower, but have the advantage that they are non-volatile, so that they can be used to store software and documents even when the power is off. In laboratories around the world, however, researchers are hoping to achieve the best of both worlds. They are trying to build magnetic memory chips that could be used in place of today's electronics. These magnetic memories would be non-volatile; but they would also be faster, would consume less power, and would be able to stand up to hazardous environments more easily. Such chips would have obvious applications in storage cards for digital cameras and music players; they would enable handheld and laptop computers to boot up more quickly and to operate for longer; they would allow desktop computers to run faster; they would doubtless have military and space-faring advantages too. But although the theory behind them looks solid, there are tricky practical problems that need to be overcome. Two different approaches, based on different magnetic phenomena, are being pursued. The first, being investigated by Gary Prinz and his colleagues at the Naval Research Laboratory (NRL) in Washington, D.C., exploits the fact that the electrical resistance of some materials changes in the presence of a magnetic field — a phenomenon known as magneto-resistance. For some multi-layered materials this effect is particularly powerful and is, accordingly, called "giant" magneto-resistance (GMR). Since 1997, the exploitation of GMR has made cheap multi-gigabyte hard disks commonplace. The magnetic orientations of the magnetised spots on the surface of a spinning disk are detected by measuring the changes they induce in the resistance of a tiny sensor. This technique is so sensitive that it means the spots can be made smaller and packed closer together than was previously possible, thus increasing the capacity and reducing the size and cost of a disk drive. Dr. Prinz and his colleagues are now exploiting the same phenomenon on the surface of memory chips, rather than spinning disks. In a conventional memory chip, each binary digit (bit) of data is represented using a capacitor (a reservoir of electrical charge) that is either empty or full, to represent a zero or a one. In the NRL's magnetic design, by contrast, each bit is stored in a magnetic element in the form of a vertical pillar of magnetisable material. A matrix of wires passing above and below the elements allows each to be magnetised, either clockwise or anti-clockwise, to represent zero or one. Another set of wires allows current to pass through any particular element. By measuring an element's resistance you can determine its magnetic orientation, and hence whether it is storing a zero or a one. Since the elements retain their magnetic orientation even when the power is off, the result is non-volatile memory. Unlike the elements of an electronic memory, a magnetic memory's elements are not easily disrupted by radiation. And compared with electronic memories, whose capacitors need constant topping up, magnetic memories are simpler and consume less power. The NRL researchers plan to commercialise their device through a company called Non-Volatile Electronics, which recently began work on the necessary processing and fabrication techniques.
But it will be some years before the first chips roll off the production line. Most attention in the field is focused on an alternative approach based on magnetic tunnel-junctions (MTJs), which are being investigated by researchers at chipmakers such as IBM, Motorola, Siemens and Hewlett-Packard. IBM's research team, led by Stuart Parkin, has already created a 500-element working prototype that operates at 20 times the speed of conventional memory chips and consumes 1% of the power. Each element consists of a sandwich of two layers of magnetisable material separated by a barrier of aluminium oxide just four or five atoms thick. The polarisation of the lower magnetisable layer is fixed in one direction, but that of the upper layer can be set (again, by passing a current through a matrix of control wires) either to the left or to the right, to store a zero or a one. The polarisations of the two layers are then either in the same or in opposite directions. Although the aluminium-oxide barrier is an electrical insulator, it is so thin that electrons are able to jump across it via a quantum-mechanical effect called tunnelling. It turns out that such tunnelling is easier when the two magnetic layers are polarised in the same direction than when they are polarised in opposite directions. So, by measuring the current that flows through the sandwich, it is possible to determine the alignment of the topmost layer, and hence whether it is storing a zero or a one. To build a full-scale memory chip based on MTJs is, however, no easy matter. According to Paulo Freitas, an expert on chip manufacturing at the Technical University of Lisbon, magnetic memory elements will have to become far smaller and more reliable than current prototypes if they are to compete with electronic memory. At the same time, they will have to be sensitive enough to respond when the appropriate wires in the control matrix are switched on, but not so sensitive that they respond when a neighbouring element is changed. Despite these difficulties, the general consensus is that MTJs are the more promising idea. Dr. Parkin says his group evaluated the GMR approach and decided not to pursue it, despite the fact that IBM pioneered GMR in hard disks. Dr. Prinz, however, contends that his plan will eventually offer higher storage densities and lower production costs. Not content with shaking up the multi-billion-dollar market for computer memory, some researchers have even more ambitious plans for magnetic computing. In a paper published last month in Science, Russell Cowburn and Mark Welland of Cambridge University outlined research that could form the basis of a magnetic microprocessor — a chip capable of manipulating (rather than merely storing) information magnetically. In place of conducting wires, a magnetic processor would have rows of magnetic dots, each of which could be polarised in one of two directions. Individual bits of information would travel down the rows as magnetic pulses, changing the orientation of the dots as they went. Dr. Cowburn and Dr. Welland have demonstrated how a logic gate (the basic element of a microprocessor) could work in such a scheme. In their experiment, they fed a signal in at one end of the chain of dots and used a second signal to control whether it propagated along the chain. It is, admittedly, a long way from a single logic gate to a full microprocessor, but this was true also when the transistor was first invented.
Dr. Cowburn, who is now searching for backers to help commercialise the technology, says he believes it will be at least ten years before the first magnetic microprocessor is constructed. But other researchers in the field agree that such a chip is the next logical step. Dr. Prinz says that once magnetic memory is sorted out "the target is to go after the logic circuits." Whether all-magnetic computers will ever be able to compete with the other contenders that are jostling to knock electronics off its perch — such as optical, biological and quantum computing — remains to be seen. Dr. Cowburn suggests that the future lies with hybrid machines that use different technologies. But computing with magnetism evidently has an attraction all its own. In developing magnetic memory chips to replace the electronic ones, two alternative research paths are being pursued. These are approaches based on:
 ...
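Since the passage explains the MTJ read mechanism (parallel layers tunnel more easily, so measuring resistance distinguishes a stored zero from a one), here is a toy Python model of that write/read logic. It only illustrates the principle described above; the resistance values, the threshold, and the 0/1 convention are invented for the sketch, not device data:

# Toy model of one magnetic tunnel-junction (MTJ) cell as described above:
# the lower layer's polarisation is fixed; writing sets the upper (free)
# layer; reading measures resistance through the sandwich and thresholds it.
R_PARALLEL = 1000.0      # ohms: layers aligned, tunnelling is easier (placeholder value)
R_ANTIPARALLEL = 1400.0  # ohms: layers opposed, tunnelling is harder (placeholder value)
THRESHOLD = (R_PARALLEL + R_ANTIPARALLEL) / 2
FIXED_LAYER = "left"     # the lower layer's polarisation never changes

def write(cell: dict, direction: str) -> None:
    """Set the free layer's polarisation ('left' or 'right')."""
    cell["free_layer"] = direction

def read(cell: dict) -> int:
    """Measure resistance through the sandwich and recover the stored bit."""
    resistance = R_PARALLEL if cell["free_layer"] == FIXED_LAYER else R_ANTIPARALLEL
    return 0 if resistance < THRESHOLD else 1

cell = {}
write(cell, "right")
print(read(cell))  # 1: antiparallel, high resistance
write(cell, "left")
print(read(cell))  # 0: parallel, low resistance
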
MCQ-> India is rushing headlong toward economic success and modernisation, counting on high-tech industries such as information technology and biotechnology to propel the nation to prosperity. India's recent announcement that it would no longer produce unlicensed inexpensive generic pharmaceuticals bowed to the realities of the World Trade Organisation while at the same time challenging the domestic drug industry to compete with the multinational firms. Unfortunately, its weak higher education sector constitutes the Achilles' Heel of this strategy. Its systematic disinvestment in higher education in recent years has yielded neither world-class research nor very many highly trained scholars, scientists, or managers to sustain high-tech development. India's main competitors, especially China but also Singapore, Taiwan, and South Korea, are investing in large and differentiated higher education systems. They are providing access to large numbers of students at the bottom of the academic system while at the same time building some research-based universities that are able to compete with the world's best institutions. The recent London Times Higher Education Supplement ranking of the world's top 200 universities included three in China, three in Hong Kong, three in South Korea, one in Taiwan, and one in India (an Indian Institute of Technology at number 41 — the specific campus was not specified). These countries are positioning themselves for leadership in the knowledge-based economies of the coming era. There was a time when countries could achieve economic success with cheap labour and low-tech manufacturing. Low wages still help, but contemporary large-scale development requires a sophisticated and at least partly knowledge-based economy. India has chosen that path, but will find a major stumbling block in its university system. India has significant advantages in the 21st century knowledge race. It has a large higher education sector — the third largest in the world in student numbers, after China and the United States. It uses English as a primary language of higher education and research. It has a long academic tradition. Academic freedom is respected. There are a small number of high quality institutions, departments, and centres that can form the basis of a quality sector in higher education. The fact that the States, rather than the Central Government, exercise major responsibility for higher education creates a rather cumbersome structure, but the system allows for a variety of policies and approaches. Yet the weaknesses far outweigh the strengths. India educates approximately 10 per cent of its young people in higher education, compared with more than half in the major industrialised countries and 15 per cent in China. Almost all of the world's academic systems resemble a pyramid, with a small high quality tier at the top and a massive sector at the bottom. India has a tiny top tier. None of its universities occupies a solid position at the top. A few of the best universities have some excellent departments and centres, and there is a small number of outstanding undergraduate colleges. The University Grants Commission's recent major support of five universities to build on their recognised strength is a step toward recognising a differentiated academic system and fostering excellence.
At present, the world-class institutions are mainly limited to the Indian Institutes of Technology (IITs), the Indian Institutes of Management (IIMs) and perhaps a few others such as the All India Institute of Medical Sciences and the Tata Institute of Fundamental Research. These institutions, combined, enroll well under 1 per cent of the student population. India's colleges and universities, with just a few exceptions, have become large, under-funded, ungovernable institutions. At many of them, politics has intruded into campus life, influencing academic appointments and decisions across levels. Under-investment in libraries, information technology, laboratories, and classrooms makes it very difficult to provide top-quality instruction or engage in cutting-edge research. The rise in the number of part-time teachers and the freeze on new full-time appointments in many places have affected morale in the academic profession. The lack of accountability means that teaching and research performance is seldom measured. The system provides few incentives to perform. Bureaucratic inertia hampers change. Student unrest and occasional faculty agitation disrupt operations. Nevertheless, with a semblance of normality, faculty and administrators are able to provide teaching, coordinate examinations, and award degrees. Even the small top tier of higher education faces serious problems. Many IIT graduates, well trained in technology, have chosen not to contribute their skills to the burgeoning technology sector in India. Perhaps half leave the country immediately upon graduation to pursue advanced study abroad — and most do not return. A stunning 86 per cent of students in science and technology fields from India who obtain degrees in the United States do not return home immediately following their study. Another significant group, of about 30 per cent, decides to earn MBAs in India because local salaries are higher — and these too are lost to science and technology. A corps of dedicated and able teachers works at the IITs and IIMs, but the lure of jobs abroad and in the private sector makes it increasingly difficult to attract the best and brightest to the academic profession. Few in India are thinking creatively about higher education. There is no field of higher education research. Those in government as well as academic leaders seem content to do the "same old thing." Academic institutions and systems have become large and complex. They need good data, careful analysis, and creative ideas. In China, more than two dozen higher education research centers and several government agencies are involved in higher education policy. India has survived with an increasingly mediocre higher education system for decades. Now, as India strives to compete in a globalized economy in areas that require highly trained professionals, the quality of higher education becomes increasingly important. India cannot build internationally recognized research-oriented universities overnight, but the country has the key elements in place to begin and sustain the process. India will need to create a dozen or more universities that can compete internationally to fully participate in the new world economy. Without these universities, India is destined to remain a scientific backwater. Which of the following statement(s) is/are correct in the context of the given passage? I. India has the third largest higher education sector in the world in student numbers. II.
India is moving rapidly toward economic success and modernisation through high-tech industries such as information technology and biotechnology to propel the nation to prosperity. III. India's systematic disinvestment in higher education in recent years has yielded world-class research and many world-class trained scholars and scientists to sustain high-tech development....
MCQ-> The broad scientific understanding today is that our planet is experiencing a warming trend over and above natural and normal variations that is almost certainly due to human activities associated with large-scale manufacturing. The process began in the late 1700s with the Industrial Revolution, when manual labor, horsepower, and water power began to be replaced by or enhanced by machines. This revolution, over time, shifted Britain, Europe, and eventually North America from largely agricultural and trading societies to manufacturing ones, relying on machinery and engines rather than tools and animals. The Industrial Revolution was at heart a revolution in the use of energy and power. Its beginning is usually dated to the advent of the steam engine, which was based on the conversion of chemical energy in wood or coal to thermal energy and then to mechanical work, primarily the powering of industrial machinery and steam locomotives. Coal eventually supplanted wood because, pound for pound, coal contains twice as much energy as wood (measured in BTUs, or British thermal units, per pound) and because its use helped to save what was left of the world's temperate forests. Coal was used to produce heat that went directly into industrial processes, including metallurgy, and to warm buildings, as well as to power steam engines. When crude oil came along in the mid-1800s, still a couple of decades before electricity, it was burned, in the form of kerosene, in lamps to make light, replacing whale oil. It was also used to provide heat for buildings and in manufacturing processes, and as a fuel for engines used in industry and propulsion. In short, one can say that the main forms in which humans need and use energy are light, heat, mechanical work and motive power, and electricity, which can be used to provide any of the other three, as well as to do things that none of those three can do, such as electronic communications and information processing. Since the Industrial Revolution, all these energy functions have been powered primarily, but not exclusively, by fossil fuels that emit carbon dioxide (CO2). To put it another way, the Industrial Revolution gave a whole new prominence to what Rochelle Lefkowitz, president of Pro-Media Communications and an energy buff, calls "fuels from hell" - coal, oil, and natural gas. All these fuels from hell come from underground, are exhaustible, and emit CO2 and other pollutants when they are burned for transportation, heating, and industrial use. These fuels are in contrast to what Lefkowitz calls "fuels from heaven" - wind, hydroelectric, tidal, biomass, and solar power. These all come from above ground, are endlessly renewable, and produce no harmful emissions. Meanwhile, industrialization promoted urbanization, and urbanization eventually gave birth to suburbanization. This trend, which was repeated across America, nurtured the development of the American car culture, the building of a national highway system, and a mushrooming of suburbs around American cities, which rewove the fabric of American life. Many other developed and developing countries followed the American model, with all its upsides and downsides. The result is that today we have suburbs and ribbons of highways that run in, out, and around not only America's major cities, but China's, India's, and South America's as well.
And as these urban areas attract more people, the sprawl extends in every direction. All the coal, oil, and natural gas inputs for this new economic model seemed relatively cheap, relatively inexhaustible, and relatively harmless - or at least relatively easy to clean up afterward. So there wasn't much to stop the juggernaut of more people and more development and more concrete and more buildings and more cars and more coal, oil, and gas needed to build and power them. Summing it all up, Andy Karsner, the Department of Energy's assistant secretary for energy efficiency and renewable energy, once said to me: "We built a really inefficient environment with the greatest efficiency ever known to man." Beginning in the second half of the twentieth century, a scientific understanding began to emerge that an excessive accumulation of largely invisible pollutants - called greenhouse gases - was affecting the climate. The buildup of these greenhouse gases had been under way since the start of the Industrial Revolution in a place we could not see and in a form we could not touch or smell. These greenhouse gases, primarily carbon dioxide emitted from human industrial, residential, and transportation sources, were not piling up along roadsides or in rivers, in cans or empty bottles, but, rather, above our heads, in the earth's atmosphere. If the earth's atmosphere was like a blanket that helped to regulate the planet's temperature, the CO2 buildup was having the effect of thickening that blanket and making the globe warmer. Those bags of CO2 from our cars float up and stay in the atmosphere, along with bags of CO2 from power plants burning coal, oil, and gas, and bags of CO2 released from the burning and clearing of forests, which releases all the carbon stored in trees, plants, and soil. In fact, many people don't realize that deforestation in places like Indonesia and Brazil is responsible for more CO2 than all the world's cars, trucks, planes, ships, and trains combined - that is, about 20 percent of all global emissions. And when we're not tossing bags of carbon dioxide into the atmosphere, we're throwing up other greenhouse gases, like methane (CH4) released from rice farming, petroleum drilling, coal mining, animal defecation, solid waste landfill sites, and, yes, even from cattle belching. Cattle belching? That's right - the striking thing about greenhouse gases is the diversity of sources that emit them. A herd of cattle belching can be worse than a highway full of Hummers. Livestock gas is very high in methane, which, like CO2, is colorless and odorless. And like CO2, methane is one of those greenhouse gases that, once released into the atmosphere, also absorb heat radiating from the earth's surface. "Molecule for molecule, methane's heat-trapping power in the atmosphere is twenty-one times stronger than carbon dioxide, the most abundant greenhouse gas," reported Science World (January 21, 2002). "With 1.3 billion cows belching almost constantly around the world (100 million in the United States alone), it's no surprise that methane released by livestock is one of the chief global sources of the gas, according to the U.S. Environmental Protection Agency ... 'It's part of their normal digestion process,' says Tom Wirth of the EPA. 'When they chew their cud, they regurgitate [spit up] some food to rechew it, and all this gas comes out.' The average cow expels 600 liters of methane a day, climate researchers report."
What is the precise scientific relationship between these expanded greenhouse gas emissions and global warming? Experts at the Pew Center on Climate Change offer a handy summary in their report "Climate Change 101." Global average temperatures, notes the Pew study, "have experienced natural shifts throughout human history. For example, the climate of the Northern Hemisphere varied from a relatively warm period between the eleventh and fifteenth centuries to a period of cooler temperatures between the seventeenth century and the middle of the nineteenth century. However, scientists studying the rapid rise in global temperatures during the late twentieth century say that natural variability cannot account for what is happening now." The new factor is the human factor - our vastly increased emissions of carbon dioxide and other greenhouse gases from the burning of fossil fuels such as coal and oil, as well as from deforestation, large-scale cattle-grazing, agriculture, and industrialization. "Scientists refer to what has been happening in the earth's atmosphere over the past century as the 'enhanced greenhouse effect'," notes the Pew study. By pumping man-made greenhouse gases into the atmosphere, humans are altering the process by which naturally occurring greenhouse gases, because of their unique molecular structure, trap the sun's heat near the earth's surface before that heat radiates back into space. "The greenhouse effect keeps the earth warm and habitable; without it, the earth's surface would be about 60 degrees Fahrenheit colder on average. Since the average temperature of the earth is about 45 degrees Fahrenheit, the natural greenhouse effect is clearly a good thing. But the enhanced greenhouse effect means even more of the sun's heat is trapped, causing global temperatures to rise. Among the many scientific studies providing clear evidence that an enhanced greenhouse effect is under way was a 2005 report from NASA's Goddard Institute for Space Studies. Using satellites, data from buoys, and computer models to study the earth's oceans, scientists concluded that more energy is being absorbed from the sun than is emitted back to space, throwing the earth's energy out of balance and warming the globe." Which of the following statements is correct? (I) Greenhouse gases are responsible for global warming. They should be eliminated to save the planet. (II) CO2 is the most dangerous of the greenhouse gases. Reduction in the release of CO2 would surely bring down the temperature. (III) The greenhouse effect can be traced back to the Industrial Revolution, but current development and patterns of life have enhanced the emissions. (IV) Deforestation has been one of the biggest factors contributing to the emission of greenhouse gases. Choose the correct option:...
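The cattle-methane numbers in the passage above lend themselves to a quick back-of-envelope check. The sketch below multiplies the quoted figures (1.3 billion cows, roughly 600 litres of belched methane per cow per day) and converts litres to tonnes assuming ideal-gas volume at STP, an approximation introduced here rather than taken from the passage:

# Back-of-envelope check of the livestock methane figures quoted above.
COWS = 1.3e9                  # cows worldwide (passage figure)
LITRES_PER_COW_PER_DAY = 600  # belched methane per cow per day (passage figure)
MOLAR_VOLUME_L = 22.4         # litres per mole at STP (ideal-gas assumption)
MOLAR_MASS_CH4_G = 16.0       # grams per mole of CH4

litres_per_day = COWS * LITRES_PER_COW_PER_DAY                      # ~7.8e11 L/day
tonnes_per_day = litres_per_day / MOLAR_VOLUME_L * MOLAR_MASS_CH4_G / 1e6

print(f"{litres_per_day:.2e} litres of methane per day")
print(f"~{tonnes_per_day:,.0f} tonnes of CH4 per day")              # ~557,000 t/day
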
MCQ-> Directions: Read the following passage carefully and answer the questions given below it. Certain words have been printed in bold to help you locate them while answering some of the questions. Financial Inclusion (FI) is an emerging priority for banks that have nowhere else to go to achieve business growth. The viability of the FI business is under question because, while banks and their delivery partners continue to make investments, they haven't seen commensurate returns. In markets like India, most programmes are focussed on customer onboarding, an expensive process involving the issuance of smart cards to customers, which people often find difficult to afford. However, large-scale customer acquisition hasn't translated into large-scale business, with many accounts lying dormant and therefore yielding no return on the bank's investment. For the same reason, Business Correspondent Agents, who constitute the primary channel for financial inclusion, are unable to pursue their activity as a full-time job. One major reason for this state of events is that the customer onboarding process is often delayed after the submission of documents (required to validate the details of the concerned applicant) by the applicant and might take as long as two weeks. By this time the initial enthusiasm of applicants fades away. Moreover, the delivery partners don't have the knowledge and skill to propose anything other than the most basic financial products to the customer and hence do not serve their banks' goal of expanding the offering in unbanked markets. Contrary to popular perception, the inclusion segment is not a singular, impoverished, undifferentiated mass, and it is important to navigate its diversity to identify the right target customers for various programmes. Rural markets do have their share of rich people who do not use banking services simply because they are inconvenient to access or have low perceived value. At the same time, urban markets, despite a high branch density, have a multitude of low wage earners outside the financial net. Moreover, the branch timings of banks rarely coincide with the off-work hours of the labour class. Creating affordability is crucial in tapping the unbanked market. No doubt pricing is a tool, but banks also need to be innovative in right-sizing their proposition to convince customers that they can derive big value even from small amounts. One way of doing this is to show the target audience that a bank account is actually a lifestyle enabler, a convenient and safe means to send money to family or make a variety of purchases. Once banks succeed in hooking customers with this value proposition, they must sustain their interest by introducing a simple and intuitive user application, ubiquitous access over mobile and other touch points, and a banking mechanism which is not only secure but also reassuring to the customer. Technology is the most important element of a financial inclusion strategy and an enabler of all the others. The choice of technology is, therefore, a crucial decision which could make or mar the agenda. Of the various selection criteria, cost is perhaps the most important. This certainly does not mean buying the cheapest package, but rather choosing the solution which, by scaling transactions to huge volumes, reduces the per-unit operating cost. An optimal mix of these strategies would no doubt offer an innovative means of expansion in the unbanked market. Which of the following facts is true as per the passage?
 ...
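The passage's closing point about technology cost, that the right package is the one whose per-unit cost falls as transaction volumes scale, is fixed-versus-variable cost arithmetic. A small sketch with invented rupee figures, purely for illustration:

# Per-unit operating cost = fixed platform cost spread over volume + variable
# cost per transaction. All figures here are invented for illustration only.
def unit_cost(fixed_cost: float, variable_cost: float, volume: int) -> float:
    return fixed_cost / volume + variable_cost

for volume in (10_000, 1_000_000, 100_000_000):
    print(f"{volume:>11,} txns -> Rs. {unit_cost(5_000_000, 0.50, volume):.2f} per txn")
# 10,000 -> Rs. 500.50; 1,000,000 -> Rs. 5.50; 100,000,000 -> Rs. 0.55
# The fixed cost washes out at scale, which is the passage's argument.
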