The Life and Death of Primary Health Care: Presented to the UN ‘Global Summit’ (1995)

by David Werner

In the Alma Ata Declaration of 1978, the world's nations affirmed that both health and health care are basic human rights. Yet in recent years, for growing numbers of destitute people the possibility of receiving adequate health services has grown more remote. The reason is in large part financial. Since the early 1980s the income gap between rich and poor has been widening, both between countries and within them. Today over one billion persons, or one in five of the world's people, survive (or try to survive) on less than one dollar per day. [^1] In many countries, while unemployment rises, minimum wages have fallen so low that they no longer cover a family's basic food needs. Compounding these hardships, at the same time that poverty is deepening in most Third World nations, the costs of basic health services are being systematically shifted from the public sector to the individual consumer.

The Colonial and Neocolonial Medical Model

Ever since it was introduced into Southern countries in colonial times, the Western medical model has been a two-edged sword. Its urban-based, doctor-controlled, expensively equipped "disease palaces" have always catered to the privileged. [^2] The few health services directed at the 'natives' were designed mainly to keep them working productively on the plantations and in the factories owned by the rich. Thus in the colonial era, health for the poor was thought of not as a right, but as a requirement for the well-managed performance of labor. (As David Legge will subsequently discuss, the World Bank's new mandate for Investing in Health, with its emphasis on cost-effectiveness for productive contribution to the national and international economy, is a regression to this colonial mind-set. Critics accurately speak of it as neo-colonial.)

In the post-World War II era, there was a gradually evolving social consciousness that both health and health care were fundamental human rights. As part of the new basic needs approach, development planners looked for ways to make the Western medical/health care model more widely accessible. Health auxiliaries were trained to provide rudimentary services through rural dispensaries. While this "rural penetration" brought some benefits, it also introduced new problems. One was added cost to consumers. Even where services were subsidized, travel to the dispensaries, which tended to be few and far between, usually involved considerable time and expense. In addition, the medicines, when available, were often costly; or sick persons were referred to a distant city hospital at still greater cost to the poor family. Under the influence of new Western wonder drugs, people's faith in traditional home remedies began to decline. "A pill for every ill" became standard treatment, and injections acquired an almost magical aura. Burgeoning multinational pharmaceutical companies eagerly capitalized on this growing drug dependency. The pharmaceuticalization and commercialization of health care, together with an erosion of traditional forms of self-care, became a growing obstacle to health.

By the late 1970s, wide recognition that the Western medical model in the Third World had largely failed to reach those in greatest need led to a growing demand for reform. So in 1978 the World Health Organization (WHO) and UNICEF convened the renowned global conference in Alma Ata, in what was then Soviet Kazakhstan.

The Birth and Death of Primary Health Care

In terms of what Third World citizens could hope for and expect from official health policy, the Alma Ata Declaration of 1978 was a major watershed. Endorsed by virtually all governments, it declared that both health and health care were basic human rights. To advance toward the ambitious goal of Health for All by the Year 2000, it called for a potentially revolutionary approach. Primary Health Care (PHC) was conceived as a comprehensive strategy that would not only include an equitable, consumer-centered approach to health services, but would address the underlying social and political factors that determine the health of populations. It called for accountability of health workers and health ministries to the common people, and for social guarantees to make sure that the basic needs, including food needs, of all people are met. In recognition that socially progressive change comes only from organized demand, it called for strong popular participation.

Unhappily, these high expectations of Alma Ata have not been met. Today, 17 years later, it is painfully evident that the goal of Health for All is in many ways growing more distant, not just for the poor, but for humanity. Some critics say that Primary Health Care has failed. Others protest that it has never really been tried.

The latter observation comes closer to the truth. But there are notable exceptions. Although for most health ministries PHC was a radical new concept, many of its practices had long been implemented by non-government community-based health programs in several Third World countries, and by a few more socially-oriented governments that gave high priority to people’s basic needs. China’s revolutionary approach to community-based (or commune-based) health care, featuring barefoot doctors, had provided much of the basis for the Alma Ata design of Primary Health Care.

However, for most governments and health professionals, the comprehensive, equity-oriented approach of Primary Health Care was too revolutionary. To those in positions of power, the Alma Ata Declaration's call to give ordinary people more control over their health and lives sounded dangerously leftist and subversive. So, soon after Alma Ata, high-level health planners in the North joined with health ministries in the South to methodically disembowel Primary Health Care of its most progressive elements. The model was converted, at best, into a means of extending fairly conventional, top-down health services into underserved areas. At worst, and most recently, Primary Health Care has been redesigned to fit into the prevailing development model that is deepening global poverty, ill health, and environmental decline.

Strategically, three major events have in effect sabotaged the revolutionary essence of Primary Health Care: 1) the introduction of Selective Primary Health Care at the end of the 1970s, 2) Structural Adjustment Programs and the push for Cost Recovery (user-financed health services), introduced in the 1980s, and 3) the takeover of Third World health care policy-making by the World Bank in the 1990s. All three of these monumental assaults on Primary Health Care reflect the prevailing regressive socio-political and economic trends. I will now briefly describe the first of these assaults on PHC. The second and third will be discussed by Michel Chossudovsky and David Legge, respectively.

1. Selective Primary Health Care

No sooner had the dust settled from the Alma Ata Conference in 1978 than top-ranking health experts in the North began to clip the wings of Primary Health Care. They asserted that, in view of the economic recession and the shrinking health budgets of poor countries, a comprehensive or holistic approach would be impractical and too costly. If any health statistics were to be improved, they argued, high-risk groups must be "targeted" with a few carefully selected, cost-effective interventions. This new, politically sanitized version of PHC was dubbed Selective Primary Health Care.

UNICEF had been one of the strongest advocates of Comprehensive Primary Health Care as declared at Alma Ata. But frustrated by the unwillingness of major donor agencies and health ministries to seriously promote such a radical model, and confronted by the socially regressive political climate of the 80s, UNICEF compromised. It began to advocate Selective PHC as more "realistic." Through its so-called Child Survival Revolution (which some critics have called a counter-revolution), UNICEF promoted four selected interventions known as GOBI: Growth monitoring, Oral rehydration therapy (ORT), Breast feeding, and Immunization. UNICEF later attempted to broaden its limited package of health technologies to GOBI-FFF, adding Food supplements, Female education, and Family planning. But in practice, in most countries PHC was reduced even more selectively to the twin engines of Child Survival: ORT and immunization.

The global Child Survival Campaign in its most limited and vertical form quickly won high-level support. For those in positions of privilege and power, it was safe and politically useful. It held the promise of improving a widely accepted health indicator, the child mortality rate, while prudently avoiding confrontation (except in rhetoric) with the social and economic causes of poor health. Not surprisingly, many health professionals, governments, and USAID quickly jumped on the Child Survival bandwagon. Even the World Bank, which had previously invested little in health, began to lend its support.

But while technological solutions are sometimes helpful, they can only go so far in combating health problems whose roots are social and political. Predictably, the Child Survival initiative has had less impact than was hoped. An estimated 13 million children still die each year, roughly the same number as 15 years ago, although the rate has fallen somewhat. Most of these deaths are still related to poverty and undernutrition.

It has become increasingly clear that reducing child mortality through selected technological interventions does not necessarily improve children’s health or quality of life (or reduce population growth rates, as has been speculated). Improvement is even less likely if such interventions are introduced in dependency-creating or disempowering ways that do little to combat poverty or improve living standards.

During the 1980s a disturbing pattern began to emerge in the health indicators of children in some of the poorest countries: while infant and child mortality rates dropped, undernutrition and morbidity rates increased. Such a pattern carries an ominous forecast for any sustained decrease in mortality. And sure enough, in the late 80s and early 90s the decline in child mortality rates has slowed or halted in many countries, and in several (especially in sub-Saharan Africa) mortality has actually begun to rise. [^3]

Equally disturbing, the two most heavily promoted technologies for reducing child mortality are proving difficult to sustain. Since the beginning of the 90s there has been a backsliding in the Third World in both Oral Rehydration Therapy usage and immunization coverage. [^4] The recent decline in immunization and the increase in measles cases are shown in Figures 1 and 2, two graphs from UNICEF's State of the World's Children report, 1994. As for oral rehydration, even Egypt's national program, until recently celebrated as a great success story, has in the 90s experienced a precipitous decline in ORS usage rates: from more than 50% to 23%. [^5]

The disappointing, and in some countries diminishing, impact of Oral Rehydration Therapy can in part be explained by structural adjustment policies that have shifted more of the costs of health services and products onto the poor. But it is also partly due to the dependency-creating, disempowering way the intervention was introduced.

There are two basic approaches to ORT: (1) manufactured packets of Oral Rehydration Salts (ORS), and (2) "home fluids." Promotion of appropriate home fluids permits greater self-reliance and control over ORT at the family and community level. Home fluids also tend to be much more economical for poor families. And when home-made rehydration drinks are prepared with cereals or starches rather than sugars, they can be safer and more effective than the sugar-based ORS formula.
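
(By way of illustration, drawing on village health care manuals of the period rather than on anything stated above: a typical sugar-based home drink is one liter of clean water with half a teaspoon of salt and eight level teaspoons of sugar, while a cereal-based drink replaces the sugar with roughly eight heaping teaspoons of powdered rice or maize, boiled for a few minutes. The starch releases glucose gradually and adds some nourishment, which is part of why cereal-based drinks can be safer and more effective.)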

But from the start, WHO, UNICEF, and USAID put their biggest investment into promoting factory-made packets, thus pharmaceuticalizing a "simple solution" and creating dependency on a manufactured product whose price and availability are outside family and community control. At first ORS packets were widely distributed to health posts free of charge. But when health budgets were slashed by adjustment policies, pressure was put on health ministries to privatize the production and distribution of ORS packets. This commercialization of a potentially "life saving technology" means that today in some countries poor families must spend up to one fourth of their daily wages for a single packet of ORS. Since undernutrition is the predisposing condition leading to death from diarrhea, it is easy to see how social marketing campaigns that induce poor families to spend their limited food money on commercial ORS packets may be counterproductive in terms of lowering child mortality. Yet virtually no studies have been done to determine how family expenditure on ORS may negatively affect child nutrition and survival.

Wisely, in the last few years UNICEF and WHO have begun to place more emphasis on increased home fluids and continued feeding (including breast feeding), and less on ORS packets. But after a decade of marketing the packets as a wonder drug, it is proving difficult to reeducate people (and especially health practitioners) that they can save both money and lives by using appropriate home drinks.

Zimbabwe is one country that has taken a courageous stand in favor of home solutions. Its Health Ministry firmly refuses to let ORS packets be used even in health centers, on the grounds that their presence would make people think home drinks are a second-best substitute. Instead, nurses teach mothers how to prepare and give the same drinks they are encouraged to use at home.

Many other 'people-centered' approaches to ORT have been promoted in different countries, mostly by non-government and community-based programs. In Child-to-Child initiatives, school children are taught about dehydration and rehydration through discovery-based learning, and carry out participatory epidemiology in their villages to observe the correlations between undernutrition, frequency of diarrhea, and breast feeding versus bottle feeding. In this way they not only master problem-solving skills; schooling also becomes more relevant to their day-to-day lives and needs. [^6] And this can be the start of a big step forward.

In closing, I would like to say that the comprehensive approach to meeting health needs espoused in the Alma Ata Declaration has been shown to be effective, especially when introduced within an overall development paradigm based on meeting all people's basic needs rather than on lop-sided economic growth. This was confirmed in a study sponsored by the Rockefeller Foundation in 1985. The study, called "Good Health at Low Cost," explored why a few countries or regions (China, Costa Rica, Sri Lanka, and Kerala State in India) had achieved relatively high levels of health and child survival despite low economic status. [^7] The researchers found that each of these societies had a strong social and political commitment to equity, demonstrated by 1) comprehensive health services for all, 2) universal primary education, and 3) assurance of adequate food at all levels of society. Cuba, which has followed a similar path of equity-oriented development, has in turn achieved remarkable levels of health, in some ways superior to those of the United States, whose GNP per capita is 20 times that of Cuba. [^8] All these countries, although their successes have begun to falter in an increasingly conservative global environment, demonstrate that a comprehensive approach to health care backed by a strong commitment to equity, as espoused in the Alma Ata Declaration, holds great promise for advancing toward Health for All.

As we have seen, Primary Health Care as promoted at Alma Ata has liberating potential. It embraces the concepts of equity and of people's participation in the decisions that shape their health and their lives. At its best, it can be introduced in ways that sow the seeds of critical analysis, organized action, and changes that may some day lead to healthier social structures. As such, from the perspective of national elites and the global power structure, it is potentially subversive. This comprehensive, user-friendly approach has therefore been replaced by Selective Primary Health Care, which tends to be top-down and restrictive, not only in its narrow choice of vertical interventions but also in the way they are implemented. Unlike Comprehensive Primary Health Care as envisioned at Alma Ata, Selective PHC leaves the inequities and injustices of the status quo firmly in place. So we see in poor countries today that the conditions of poverty, hunger, poor health, and early death remain much the same as before, and in some ways are worse.

But Selective Primary Health Care is only the first assault. As subsequent speakers will explain, the potentially liberating health strategy of Alma Ata has been still further undermined by the global power structure, which would turn both health care and working-class people into cost-effective commodities on the global market.

References

1 Brown LR, editor. State of the World, 1994. New York and London: W. W. Norton, 1994. Foreword.

2 The term "disease palaces" was first used by primary health care pioneer David Morley in his classic book Paediatric Priorities in the Developing World.

3 Grant JP. The State of the World's Children, 1994. UNICEF, p. 80; and Durning AB. "Ending Poverty." In: Brown LR, editor. State of the World, 1990, p. 138.

4 Grant JP. The State of the World's Children, 1994. UNICEF, pp. 3, 6-7.

5 Grant JP. The State of the World's Children, 1994. UNICEF, p. 6.

6 Werner D, Bower B. Helping Health Workers Learn. Palo Alto, CA: Hesperian Foundation, 1982, pp. 24-17 to 24-30.

7 Halstead SB, Walsh JA, Warren KS, editors. Good Health at Low Cost. New York: Rockefeller Foundation, 1985.

8 Grant JP. The State of the World's Children, 1994. UNICEF, p. 65.

Publication Information

Part of a presentation for the International People's Health Council at the NGO Forum of the United Nations "Global Summit," March 7, 1995.