Social welfare is an encompassing and imprecise term, but most often it is defined in terms of “organized activities,” “interventions,” or some other element that suggests policy and programs to respond to recognized social problems or to improve the well-being of those at risk. To define social welfare in terms of programs or problems alone, however, is to miss a larger and more enduring element. Titmuss (1958) observed that social welfare is concerned with the “right order” of relationships in society; that is, it is some ideal of the way in which a society works and fits together to form a suitable place for human habitation and development. From a different perspective, Murray (1984) referred to social welfare as establishing the “rules of the game,” with the “game” being the system of distributing valued resources, such as money; jobs; housing; and educational, health, and social services. Both Murray and Titmuss—and, therefore, both what is called the “Right” and the “Left”—have some vision of the good society. Social welfare, then, is perhaps best understood as an idea, that idea being one of a decent society that provides opportunities for work and human meaning, provides reasonable security from want and assault, promotes fairness and evaluation based on individual merit, and is economically productive and stable. This idea of social welfare is based on the assumption that human society can be organized and governed to produce and provide these things, and because it is feasible to do so, the society has a moral obligation to bring it to fruition.
This entry describes the history and development of social welfare in the United States with a focus on understanding the character and sources of the current idea of social welfare. It was written in the context of the 1990s, however, a time when much of what is commonly thought of as making up the modern welfare state is under an apparent cloud of doubt.
The beginning of the 20th century was fertile ground for the development and expansion of broad governmental responsibility for social problems in the United States. Industrialization, immigration, the growth of cities, the rapid increase in capital and wealth, and labor unrest all contributed to a dramatic change in the role of government. There was a prevailing sense that through political will, effective professional service, and adequate supports, a myriad of social difficulties could be solved. Social work, born of “scientific charity” and christened in American Progressivism, reflected this nearly boundless hope and the easy acceptance of state responsibility for society and its people.
However, in the 1990s, things do not seem so clear. The century has not been altogether kind to the welfare state, and there is talk of a new paradigm. The post–World War II economic boom is long past, and postindustrial restructuring has left the welfare state vulnerable to a new conservatism that has eroded some of the programs and much of the ideology of welfare. Liberal optimism is not in vogue and seems to have given way to a less generous appraisal of human potential. What for most of this century seemed so hopeful, and just two decades ago seemed inevitable, currently seems to be teetering on some historical edge.
The development of social welfare in the modern sense of the term depended on the assumption of state responsibility for the provision of social assistance. This assumption established a context in which policy and programs could develop in a uniform and visible fashion and in response to social and economic circumstances reflected in political pressures and processes. From the U.S. perspective, the emergence of a central role for the state in social provision had occurred before American colonization, and so the story of the transition from “private,” sacred, charitable aid to “public,” secular, citizenship-based aid begins in England.
Elizabethan Poor Laws
The development of a clearly defined governmental role for the provision of aid to those in need is typically associated with the Elizabethan Poor Law of 1601. As de Schweinitz (1943) observed, the statute of 43 Elizabeth represents the culmination of a two-century process of the state's progressive attempts to control aid to poor people, first through “repressive” measures and later through the establishment of a “positive obligation.” This positive approach involved a system in which local parishes publicly administered locally derived tax funds that were used to provide direct grants to unemployable people, work for able-bodied individuals, and apprenticeship or some form of foster care for neglected children. The elements of the Elizabethan Poor Laws remained the basis for English and American provision to the poor for 300 years and continue to have a great influence.
Feudal system. The Poor Law statutes, beginning in 1349 with the Statutes of Laborers and culminating in 43 Elizabeth, represent an effort by the state to deal with the decline and fall of the feudal system and the transition to a modern, wage-based economic and social order. The feudal system was land based, with a hereditary hierarchy controlling all property. The vast majority of people were landless serfs, who were tied to property and were required to labor for the lords in their fields or in small-scale manufacture. The serfs paid rents and taxes and were obligated to the land-controlling lords for life. In return, they were provided protection and some small measure of security. This agrarian system produced little surplus, much of which went to support the well born. Every household gave to the church (which also had a hierarchy to support), and some of the surplus was provided in the form of charitable aid. The feudal system was one of fixed social classes and minimal geographic or social mobility of individuals.
Centers of trade. By the 13th and 14th centuries, however, things began to change. Sufficient political stability gave rise to larger areas of trade, which allowed for higher degrees of specialization in agriculture and manufacturing. The result was the emergence of towns as centers of craft and trade and the beginnings of a “middle class” that was not composed of serfs, nobles, or clergy. As more and more rural agricultural workers migrated to the towns in search of greater employment opportunities, the consequent labor shortages in the countryside were a source of vexation to the landholders. The Black Plague, beginning in the mid-1300s and continuing for at least 100 years, substantially reduced the population and contributed to greater labor shortages, but the underlying process was a wholesale economic and social transformation that would ultimately produce a political transformation as well. This transformation involved the emergence of large-scale markets, a surge of technology, and population shifts that would create capitalism, great cities, industrialization, and the middle classes, as well as a “working” class, and would ultimately shift democratic political power to these latter groups and away from the old aristocracy (Polanyi, 1957).
State protection of public interests. At the time of the Poor Laws, the state represented the interests of powerful and landed individuals, and it could not allow the church, monasteries, and various foundations, guilds, and private donors to give aid in ways that were not consistent with what it construed to be the “public interest.” The public interest, of course, was what maintained the system of economic and social relations that this landed aristocracy was used to and benefited from and that advanced the control of the government. It seemed only reasonable to have policies that would standardize aid and prevent multiple suppliers of social aid from providing benefits to whom they liked, when and where they liked, through persons not accountable to public authority. So over time, starting with the Statutes of Laborers in 1349 through the acts of Henry VII, Edward VI, and Elizabeth, the government established laws determining who could be given aid, where they could be given aid, how funds were to be raised, who should administer the aid, and what punishments should befall the poor and the providers of aid for failure to comply with the demands of law. By the mid-1500s, the church had been essentially chased out of any important role in fundraising, in determining eligibility, or in administering the assistance. In its place was a system of taxes, eligibility laws, official lists of the needy, and overseers of the poor—all part of a policy designed to provide assistance to the poor in various categories while preserving local standards and ensuring no harm to the social, economic, and political order through large-scale subsidy of a begging, criminal, and dependent class.
Changes in the Poor Laws
1662 Law of Settlement. The Poor Laws were modified twice to add major features of policy. The first, the Law of Settlement of 1662, allowed the return of people to their former parish of residence if they had meager resources and appeared to be likely recipients of public aid. Thus, it clearly established a residency requirement and, not for the last time in Poor Law history, attempted to remove any incentives for geographic mobility that public aid might create.
1834 Poor Law reforms. The second major change came in the 1834 Poor Law Reforms; these measures were designed to reduce perceived work-disincentive aspects of poor relief. The reforms sought to reduce “outdoor” relief by restricting aid to able-bodied individuals to the workhouse, providing for tougher and more centralized administration, and seeking to eliminate the overlap of assistance levels and available wages. This last reform was to be accomplished through applying the principles of “less eligibility,” by which benefit levels were fixed below the wages of the “laborer of the lowest class.” These reforms were due, in part, but only in part, to the consequences of the Speenhamland system that was adopted in some parts of the country in 1795. Speenhamland was a system of wage subsidy that effectively created a guaranteed income of sorts for persons in some agricultural districts. It was an updated expression of the old pre–wage-relationship days of “noblesse oblige” in which workers were provided a specified level of security, regardless of employment or productivity and were tied to a particular district. However, the days of obligation to landowners had passed, and so the predictable occurred: Wages declined because they could be shifted to the public treasury; unemployment increased because it was “insured”; normal mobility to growing urban areas ceased; and agricultural productivity declined. The framers of Speenhamland had depended on the social forces of tradition and obligation to keep things intact, but these traditional forces had been destroyed by social and economic development. By 1834, the English government wanted nothing to do with wage subsidy and sought to extricate poor relief entirely from the labor system (de Schweinitz, 1943).
Poor Laws in American Colonies
In the American colonies, the Poor Laws became the basis for poor relief, and the same issues of indoor and outdoor relief, the centrality of administration, and problems of dependence arose. In general, however, the colonists did not pursue the administration of the Poor Laws with the same vigor or anxiety as did their English counterparts. As Rothman (1971) pointed out, the colonial pattern of poor relief was based on a fixed idea of social relations, in which the poor were seen as a permanent order, “integral” to society and not a danger to it. The probable explanation for this view was that the New World was not changing social relations or geographic mobility in ways that threatened any old order or established economy. There was no old order, and the economy was agrarian and trade based. There was no fear of migration to urban areas and no view of poor relief as a threat to the good order of society. There were certainly those who questioned whether poor relief might threaten the good character of those who got it, particularly in the more Calvinistic colonies, but in general there was toleration for the poor and no particular urge to reform them or the society in which they lived. This attitude of benign near-neglect of the poor did not survive the American Revolution and its aftermath. By the 1820s, the United States entered into the first of many episodes of searching for a solution to the problem of dependence and developing its own American version of reform.
PHILOSOPHICAL AND CULTURAL CONTEXT
To understand the particular character of social welfare development in the United States (and England) in the 19th century and much of the 20th, it is important to reflect on the ideas that guided such development and the cultural, political, and social forces that created and sustained them. These ideas have many variations, but are well represented by the works of John Locke, Adam Smith, and Jeremy Bentham. To these ideas should perhaps be added the contributions of Martin Luther, John Calvin, John Wesley, and the other Protestant reformers who sought to bring religion out of the monastery and to see God's hand in everyday work and the marketplace and who railed against elaborate and nearly royal church hierarchy. Together they, and many others, established an influential “liberal” construct involving economic liberty, political freedom, and utilitarian philosophy. These ideas have nourished and sustained the ideas of individualism, personal responsibility, the moral importance of work, and distrust of collectivism and centralized government (Leiby, 1978).
Collapse of Feudalism
All these developments in political philosophy, moral philosophy, and theology are the consequence, at least in part, of the collapse of feudalism and the emergence of a social class that was not defined by birth or knowledge or bound by law to land or master, but defined by usefulness. In short, it was not a matter of who the members of this class were, but what they did. The economic order that developed put the tradesman and the merchant in central roles, and they came to define themselves by their utility and to apply this standard to others. As the economic and social influence of this middle class increased, ideas of political liberty and equality developed, set against aristocratic claims to power and wealth. The emergence of a middle class has left a powerful cultural legacy and a political order that seeks to eliminate the useless. To some, the poor are suspect; others condemn the church or the corporation, but the basic value is the same: A person has moral worth only to the extent that he or she is identifiably useful to others and in so doing contributes to the welfare of others. This is why work is seen as a moral act, not simply as an economic act. It is the way in which an individual demonstrates his or her value to others. This system of evaluation of persons and things by their consequences has profound implications for both social welfare policy and operations.
Culture of Capitalism
In the United States, with its strong cultural commitment to individualism and personal freedom, the search for worthiness has been particularly intense. No element has contributed to the architecture of the U.S. social welfare system to an equal degree. Thus, social welfare has developed in the context of what Wilensky and Lebeaux (1965) referred to as the “culture of capitalism.” That culture is not, of course, devoid of social value placed on family, community, and humanity. Therefore, the American response to human need has been a continuous compromise between values of security and humanitarianism, on the one hand, and self-reliance and competitiveness, on the other hand. This compromise has given the system of social welfare an odd and uneven quality and has produced a patchwork of programs with little apparent coherence or articulation. However, beneath the complex program level resides perhaps a more enduring and coherent system of social thought reflecting the ideals of personal responsibility, individual utility, and equity that, combined with the U.S. multilevel government structure, gives U.S. social welfare history a consistency and character that is uniquely American.
SOCIAL WELFARE IN THE NEW REPUBLIC
The American Revolution, like many subsequent revolutions, gave rise to high social expectations. England had been blamed for many things, and once English “oppression” was lifted, the social order was expected to become more nearly perfect. America, after all, had vast and available resources and none of the burdensome elements of European social order and government. Perhaps because of these rising expectations, the new government forms in the United States, the nature of politics, or all these elements combined, the issue of poor people and dependence would become prominent. As a result, the colonial pattern of providing relief within a largely unquestioned community responsibility and with a typically casual administrative system gave way to a new and less tolerant view. This new view saw poverty as a social problem; as a potential source of crime, social unrest, and long-term dependence; and, therefore, as a proper target of reform. By the 1820s and 1830s, the social and political concern with poverty had forged a new direction for policy and a “new paradigm” for treating the problem.
Indoor Relief as Institutions of Reform
The new approach was based on the assumption that the existence of poor people was evidence that the social order needed repair. This was the age of engineering, and once the problem was identified, it was to be fixed through the application of human analytical skills and an intervention designed to fit the problem. Analysis and design often occurred through governmental commissions, such as the Yates Commission in New York and the Quincy Commission in Massachusetts. After surveying the problem and hearing much testimony, these and other similar groups came to remarkably similar conclusions: The poor have been ruined by the Poor Laws, specifically by outdoor relief, and the answer is to end these community temptations to become permanent denizens of the public dole (Rothman, 1971).
In place of the old system would be the perfect institution: a grand almshouse, where, through order, cleanliness, discipline, and routine, the poor could be transformed into useful and productive members of society. In many American communities and most large cities, almshouses of great size and expense were constructed, often becoming the most imposing public buildings in these communities. The almshouses represented not only a growing intolerance for poor people and a disregard for what would currently be considered basic civil rights, but an American optimism that anyone could be changed and reformed in the right environment. The European pessimism and belief in a fixed order had not survived the trip across the Atlantic, and the American policy was based on the belief that people are corrupted by social arrangements and can be reformed by good influence properly applied. This policy was to be applied not only to poor people, but to criminals and mentally ill individuals, and thus a similar increase in the building of asylums and penitentiaries occurred during the same period. This increase in such institutions was to work out badly, of course, and within a short time almshouses and other institutions that had been built with such pride and optimism were being described as places of routine abuse and despair. Dorothea Dix led a movement for institutional reform for mentally ill people, and by the 1850s outdoor relief for the poor was reestablished as the norm (Trattner, 1989).
Precursors of Scientific Charity
The return of relief to the poor at home was partly the result of the decline of institutions and partly the result of continued and growing immigration and periods of economic distress that made institutional relief impractical. Private organizations, such as the Society for the Prevention of Pauperism and the Association for the Improvement of the Condition of the Poor, influenced the community administration of aid and sometimes played a direct hand in such assistance. These groups generally believed that aid to the poor was the duty of society and must be provided, but that the causes of pauperism were generally personal. Therefore, it was important to provide aid cautiously and to seek to reform the individual away from intemperance, foolish spending, laziness, or whatever aspect of personal character or circumstance was believed to have led to the problem. The emphasis on social structure as a cause of poverty was minimal and confined mainly to the community regulation of alcohol and the “ruinous” effects of outdoor relief.
This approach led, within 25 years, to the development of “scientific charity” and ultimately to the emergence of casework and the profession of social work. However, its development would be interrupted by the Civil War, which would fundamentally reorder the relations of the federal government and the states, create new agencies and new responsibilities at the national level, and raise the issue of opportunities and rights for African Americans to an enduring and high level. This latter issue would influence American social policy and social welfare in the most basic and profound way.
CIVIL WAR AND POST–CIVIL WAR ERA
The Civil War changed everything. Before the war, the United States was a collection of states; after the war, it was a genuine nation. To say that the relations between the federal government and the states would never be the same is a vast understatement. The emergence of a central government, the economic and industrial growth spurred by the war, the tremendous rate of immigration and migration west, and technological change all combined to create a new United States, with many of the social, demographic, and cultural features associated with modern times.
For social welfare, the changes came gradually but inexorably. During the Civil War, the U.S. Sanitary Commission had been organized by the War Department and, despite its quasi-voluntary character, it contributed to the later development of veterans' and public health programs. The Freedmen's Bureau, established in 1865 just before the end of the war, was the first federal welfare agency. Despite its short life of four years and its limited appropriations, it managed to aid a large number of displaced people and to contribute to the establishment of a new nonslave economic role for African Americans in the South. Its promise of land reform and the spread of landownership to former slaves was never realized, but its contributions in relief and education were substantial (Trattner, 1989).
Impacts of Industrialization
The federal role began to shift in other ways as well. Land grants to states for educational and other institutions increased, creating, among other things, the current system of land-grant colleges. By the 1880s, the problems of industrial growth and labor had led to the establishment of the Bureau of Labor Statistics, the Interstate Commerce Commission, and such legislation as the Sherman Anti-Trust Act. Labor unions were consolidating during this period; the militant Knights of Labor, for example, increased membership from 50,000 in 1880 to more than 700,000 in the mid-1880s. The Knights later gave way to the far more moderate American Federation of Labor, but organized labor would have an important political role, both in the industrial states and at the national level. It would, as Ehrenreich (1985) noted, not only have a specific political impact, but would create a climate of disorder and a sense that American industrial life must be “stabilized.” In a real sense, American social policy in the 20th century was the product of this desire for an orderly and decent society. Child labor, sweatshops, miserable wages, industrial accidents, urban slums, disease, and conflict all came with the grander aspects of social change and opportunity. The streets surely were not paved with gold for most individuals, and the realities of this new industrial order were harsh, indeed. This situation would eventually offend a large number of Americans who had a different vision of the American ideal.
Charity Organization Societies and Scientific Charity
Social welfare with regard to poor people in the late 19th century was not equipped to deal with such rapid change of people and things. Charitable agencies proliferated, but with little common purpose and little ideological change from their prewar philosophy. Many states established boards of charities, but the leadership for a new movement came from the private Charity Organization Societies (COS), first established in the United States in the late 1870s. The COS role in shaping American social welfare and the social work profession was extraordinary. This was a period of radical American labor and agrarian populists, on the one hand, and social Darwinism, laissez-faire, and the Gilded Age, on the other hand. Among most government and business leaders, the ideas of unfettered competition were in vogue, and the prevailing attitude toward social benefits for the poor was largely a hostile one until after the turn of the century. The COS, in this context, managed to organize disparate charitable groups and become a dominant presence in all the major cities of the country. Most important, the COS invented “scientific charity” and in so doing established a rationale, a method, and a system of training that would lead directly to the social work profession as it is currently known (Katz, 1986).
Social Darwinism. Scientific charity was a concept based in social Darwinism. Sociologists Herbert Spencer, in England, and William Graham Sumner, in the United States, popularized the application of Darwinistic evolutionary theory to society. Social Darwinists saw in competition and the “shouldering aside of the weak” a process of social evolution that would reward the productive and able and punish the incapable and those who lacked the virtues of thrift, hard work, and farsightedness. This was a compelling and influential idea, but one so unfriendly to social intervention or even simple humanitarianism that it seemed to be an unlikely basis for the administration of social benefits to poor people; however, this is the remarkable feat that the COS accomplished.
Development of casework. In practice, scientific charity was a matter of creating a common registry of the needy and ensuring that the applicants for aid were both “worthy,” in the sense of an absence of personal aberration or depravity, and subject to a process that would soon be called social history and casework. The emphasis was on the individual, not the social environment. The process was not simply the application of 19th-century moralism, but something closer to an assessment of potentials and barriers that would be roughly recognizable today. Josephine Shaw Lowell was the often-stern ideologue of the early COS, and her warnings about unwise philanthropy that would undermine human character and will were strong and frequent (Stewart, 1911). This was the 19th century after all, and character and will played a major role in the understanding of human life. Still, the COS provided sustained relief to poor people in an inhospitable social and political climate, consolidated many social welfare interests and organizations, pioneered in record keeping and “social research,” developed training programs, and sought to establish links with the universities of the time (which were seeking a new relevance through professional education). They also created a circumstance in which the social work profession was social policy with regard to the poor and child welfare for nearly the first third of the 20th century. If the COS can be faulted for a lack of interest in social reform, this fault would be corrected by Jane Addams and others who were active in the settlement movement and the larger Progressive movement.
THE PROGRESSIVE ERA
Multifaceted Middle-Class Movement
Oddly enough, considering the social war between labor and industrialists and the competing politics and ideologies involved, Progressivism was a middle-class movement that emphasized parks and beautification, education, the “Americanization” of new citizens, and professionalization, as well as social insurance and regulatory controls. Immigration and the complexities of urban and industrial life did drive Progressivism, but as Hofstadter (1955) noted, the typical Progressive and the typical immigrant were “vastly different.” Herbert Croly's (1909) The Promise of American Life would become a Progressive handbook, but one has the impression that the Progressives, who were mostly Anglo-Saxon, had to remind themselves constantly that the immigrant was a hard-working, brave individual with a cultural heritage and great potential as an American. Without such reminders and much mental discipline, the old prejudices might not be held at bay (Hofstadter, 1955).
Influential leaders. The Progressives counted among their ranks many prominent persons, including Jane Addams, George Herbert Mead, Robert Park, Richard Ely, Paul Douglas, and John Dewey. This was a group of people, all influential within academia and politics, who shared a common commitment to an active, morally responsible government (as opposed to a laissez-faire government) and a view that economics and politics were corrupted by forces that must be constrained by rule making. They also believed that although industrial capitalism might well be a great engine of wealth and spreading prosperity, it was at best a polluting engine that needed to be finely tuned and its waste products taken care of.
Enduring cultural impact. The Progressives gave rise to a political party that would later nominate former Republican President Theodore Roosevelt as its presidential candidate in 1912 (and come in a strong third); however, this political party was not the essence of Progressivism, and Progressivism did not die with the decline of that party. It is best to think of Progressivism as a cultural movement that would show itself in many forms and in many venues. The National Consumer League, the Urban League, the National Child Labor Committee, the American Association for Labor Legislation (AALL), and the National Association for the Advancement of Colored People were all Progressive to a large extent, and they had at least as much influence in the classroom as in the legislature and with greater and more enduring effect (Crunden, 1982).
Social Welfare Outcomes
The social welfare consequence of the Progressive movement was great in three areas: prevention through “social insurance,” the use of government regulation, and the role of professions in society.
Social insurance. With regard to social insurance, the Progressives, through such organizations as AALL, sought to popularize the idea that the collectivization of the normal risks of life was a superior form of provision to charity based on need. Their basic thesis was that charity is demeaning and corrupting to both the recipient and the giver and in any case is likely to be sporadic and meager. A better system is one in which citizens “contribute” to a common fund that would provide benefits on a “membership” basis without demeaning tests of character or need and without the presumption of deviance. Addams, for one, detested charity, and this new construct, derived from Europe, seemed to fit the multiethnic American democracy. It would eliminate the charitable worker and the consequent attitude of superiority of well-to-do individuals, it would create a common bond between diverse people, and it would be regularly budgeted and administered through public auspices.
The greatest success in applying this idea was with workers' compensation, and between 1910 and 1921 the majority of states passed such provisions. Health and unemployment insurance were rejected at that time, under pressure from organized medicine and industrial interests, but mothers' aid laws were commonplace by 1920, and old age assistance spread through the states during the 1920s. In the case of these last two programs, conformance to the social insurance ideal was minimal. Mothers' aid and old age “pensions” were not social insurance except, perhaps, in spirit. Both had a means test, and eligibility had nothing to do with contributions. Although old age assistance was provided on the basis of some presumed previous social role and therefore carried a somewhat lower stigma (and higher benefits), mothers' aid (the state precursor of Aid to Dependent Children [ADC]) was, from the beginning, fraught with tests of “worthiness” of one kind or another. Nevertheless, the programs put into place through Progressive pressure established a social welfare presence in every state and set into motion the steady expansion of public welfare programs, benefits, and recipients—an expansion that would set the stage for the federal government's assumption of funding for state programs and the development of a genuine social insurance system nationally (Skocpol, 1992).
Regulations. Progressives supported the concept of regulation and spearheaded many efforts on the state and national levels to use regulation for social reform. At the federal level, these efforts included the Interstate Commerce Commission, antitrust regulation, civil service and merit system requirements for employment, the Federal Trade Commission, banking regulation, and the Food and Drug Administration. At the state and local level, regulatory advances included the areas of child and women's labor, wages, housing and fire codes, public health, food processing, merit employment requirements, property zoning, and many political reforms, including referendum and recall.
Child labor, women's suffrage, immigration, and temperance were all major national issues and attracted the involvement of many prominent social workers. Congress passed the women's suffrage amendment in 1919 (it was ratified in 1920), and the Eighteenth Amendment, ratified in 1919, prohibited the manufacture, sale, transportation, and importation of alcohol. Prohibition was promoted by its supporters as a social welfare policy that would protect women and children and promote employment and productivity. Likewise, immigration controls passed in 1921 were promoted not as nativism, but as rational planning for improved wages and working conditions and the stabilization of cities.
Professionalization. No aspect of the Progressive mind was more important than the strong belief in education and professionalization. The idea of a profession—a group of people with knowledge and skills dedicated to the “public interest” and to whom responsibility for major social and human problems could be delegated—was a critical part of the Progressive strategy. Middle-class, educated individuals were rational problem solvers, planners who would harmonize and stabilize society. Medicine, by the turn of the century, was establishing itself as a model profession and successfully fighting off the efforts of the insurance industry and local governments to render physicians employees.
The University of Chicago and Johns Hopkins were model universities, with schools in engineering, medicine, law, education, and other professional areas. The COS had established training programs associated with universities, and by 1920 several of these programs had developed into schools of social work. By 1930, there were many more schools of social work and the beginnings of a national system of curricular standards. The COS developed into what was to become family services; the public agencies expanded, especially in child welfare areas; and the mental hygiene movement produced professional opportunities in the 1920s. These factors, together with the founding of the American Association of Social Workers in 1921 and other more specialized associations; support from the Russell Sage Foundation and other foundations; the centrality of such books as Richmond's Social Diagnosis (1917); and the publication of journals like The Compass, The Survey, and The Family, combined to create an active and visible profession of social work.
This new profession was concerned mostly with individual problems and adjustment and developed as its primary method a casework method that was strongly influenced by the COS and scientific charity. Social work had been admonished by Abraham Flexner in 1915 for not being adequately professional, and it had sought diligently to heed his words and develop something equivalent to a medical model for social work (Flexner, 1915). Casework met this criterion and it reflected the American commitment to individual responsibility and practical problem solving. This is not to say that the social reform influence on the Progressives, as embodied in the settlement houses and espoused by Addams, Florence Kelley, and Lillian Wald, was chased out. Far from it, for despite the profession's method of casework and its individual orientation, social work continued to represent a Progressive-style social environmentalism, preferring explanations of human problems in terms of the deficiencies of families, communities, and social structures. The predominant professional view was that these deficiencies could be compensated for by the development of well-staffed social services, as opposed to wholesale social reform, but it is the nature of professionals to think in such terms.
The high point of Progressivism was certainly before World War I. The war and the ensuing political and economic developments cooled the Progressive passion in the 1920s. However, Progressives had accomplished a great deal. They had put on the social policy and political agenda virtually every important social problem of the 20th century: poverty, immigration, slums, child welfare, mental health, public health, and, of course, gender and race (if not exactly class). Furthermore, they had provided the means to deal with these problems: a government concerned with social welfare; groups mobilized for social action; professionals trained to work with such problems; and model programs of social insurance, social aid, and social services. By the late 1920s, they had created a context that would serve as a vessel for federal money and federal administration efforts in the 1930s and thereafter. They established the intellectual, political, and governmental context that would allow the United States to move toward the sort of welfare-state models that would have seemed unlikely a quarter of a century earlier.
DEPRESSION AND THE AMERICAN WELFARE STATE
The 1920s was a period of general economic prosperity and extravagance. This prosperity had benefited the social services and social welfare generally, even though it played a role in diminishing the political demand for social reform on any scale. The Republican party, which was the original home of Progressivism, was the predominant national party, giving the country the presidencies of Warren G. Harding, Calvin Coolidge, and Herbert Hoover. These presidents were, to some degree, in a Republican tradition that had been greatly influenced by Progressivism; Hoover especially represented a commitment to what has been termed “social engineering.” Social engineering, as opposed to reform, seeks to intervene in specific ways to solve particular problems. In this spirit, the 1920s saw the continued expansion of child welfare, the development of a public health service and vocational education and rehabilitation, and the continuation of White House conferences on social matters that generated and focused support for social programs. Hoover was poorly equipped to deal with a massive economic downturn, however, and despite some efforts, such as the Reconstruction Finance Corporation Act of 1932, lost to Franklin Delano Roosevelt in the election of 1932.
The Roosevelt administration came to power promising a “New Deal.” The “old” deal had surely turned sour for millions of Americans who were facing unemployment, the loss of their homes, and poverty at unprecedented levels. The Great Depression was worldwide and would cause the United States and Europe to make some fundamental reassessments of politics and economics. Some of these reassessments would have disastrous results, but in the United States, the basic elements of the economy and governmental order would remain intact. Even so, the New Deal brought to the United States a version of the welfare state and established a pattern in social welfare that is still present (Trattner, 1989).
Roosevelt had been elected in 1932 largely because he was neither a Republican nor a Hoover. Both had worn out their welcome with the majority of Americans who were desperate for a return to some measure of normalcy and security. Manufacturing output had declined nearly 40 percent by 1932, average wages were down 25 percent, unemployment was approaching 25 percent, the banking system was near collapse, and there was organized disorder in the cities and factories and on the farms. Whatever the causes of the depression, it was clearly a dangerous thing by 1932, and Roosevelt was elected to deal with it. His campaign did little to suggest much beyond business as usual (promising a balanced budget, for example), but Roosevelt had a pragmatic and authoritative character and a strong desire to succeed. He observed that if he were to fail at being president in this time of crisis, he might well be the last president.
New Legislation and Federal Agencies
Roosevelt got off to a fast start with a rash of legislation and a great deal of activity and planning at the White House. To deal with the immediate effects of the crisis, he established in 1933 the Federal Emergency Relief Administration (FERA), headed by Harry Hopkins, a social worker whom he had used in New York State in a similar capacity. FERA provided funds, and no small amount of administrative direction, to states for the purpose of providing relief. Whatever state programs had been in place had largely gone bankrupt, and the demand to provide aid to the unemployed had long since exceeded the capacity of private agencies. Within a short time, FERA gave rise to work programs such as the Works Progress Administration and the Civilian Conservation Corps, which became the principal means of providing assistance to unemployed poor workers in the mid-1930s. Economic reform was also part of the overall strategy; it included the National Industrial Recovery Act, with its codes for industry that sought to establish wage and price controls and to ensure labor rights in a “planned economy,” and the Agricultural Adjustment Act, which sought to reform the agricultural side of the economy through allotments for farm production and the stabilization of market prices. Both acts would succumb to the Supreme Court's determination that such far-reaching legislation went beyond constitutional limits, but both would have a substantial political effect and would shape later New Deal social policies.
Social security. With the congressional elections of 1934, Roosevelt substantially increased his party's political strength. The elections brought many liberals into Congress. Outside Congress, Huey Long, Father Coughlin, and others were proposing radical solutions and gaining national attention, adding a new dimension to Roosevelt's political challenge. To these developments were added the Supreme Court decisions, the increasing militancy of labor, the sluggish economic recovery, and defections of business support. As a result, the administration moved to develop a more dramatic and long-term program for social reform, including a new program for individual economic security. The latter program was begun by the appointment of the Committee on Economic Security in June 1934, and the Social Security Act became law in August 1935. The Social Security Act is the basic document of the American social welfare system, establishing a federal social insurance system for old age and unemployment and a state-federal public assistance system, including aid for dependent children and for needy elderly and blind persons. In addition, the act established a system of federal grants to states in related social services areas. To the social security system must be added the Wagner Act, which substantially increased the rights of organized labor; the Fair Labor Standards Act, which regulated wages and hours; and a collection of programs in vocational rehabilitation, public health, housing, and child welfare.
Welfare state. More than any specific piece of legislation, the New Deal brought to American life an unprecedented federal-level focus, a new progressive coalition, and a permanent strengthening of the federal government that would place it at the center of responsibility for the character of American society and the welfare of its citizens. If the welfare state was not present in the United States in every programmatic piece (there was no health care, for example), it was nevertheless present in spirit and intention. The welfare state idea that would be well established in England and Europe after World War II would be based on three areas of government commitment: full employment, the prevention and relief of poverty, and universal services for basic needs. Although these commitments were organized in various ways in Western countries, what emerged is often referred to as the Keynesian-Beveridge welfare state, noting the contributions of the economist John Maynard Keynes to policies of full employment and economic stabilization and of Lord Beveridge to social insurance and services. The United States lacked national health insurance, it allowed considerable variation by state in its federal system, and it did not replace all the “residual” with “institutional” social programs. However, it took its place among the welfare states, and within a few decades it would be allotting nearly 20 percent of its gross national product (GNP) to social welfare.
Separation of Insurance and Assistance
Earnings versus need. The social policies that emerged from the New Deal and were embodied in the Social Security Act contained a distinction between social insurance and public assistance that was at once practical and troublesome. It was practical because a policy based on work and contribution created both a mechanism for funding and a powerful link to an important social value. It was troublesome because those in uncovered work or out of the labor force had to have their needs met otherwise, which meant the continuation of a means-tested, charitylike system, albeit one that was publicly funded and organized. This distinction created, on the one hand, a popular, politically acceptable, non-means-tested system of social insurance providing benefits to those who had “earned” them by virtue of work and, on the other hand, a much smaller, state-based system of public aid providing benefits on the basis of need. Thus, the worthiness problem was solved on the social insurance side by the device of work-based contribution, but it continued for those who relied on public assistance. Although other countries developed more varied means to provide nearly universal aid on a presumed-worthiness basis (children's allowances, for example), the United States created a system that would prove to be a philosophical, administrative, legal, and programmatic problem. Social insurance grew in every sense to become a permanent feature of the American political and social landscape. Public assistance, particularly Aid to Families with Dependent Children, drew criticism and various welfare “reform” efforts from the beginning.
Impact on citizens. The lessons of this separation of public aid and social insurance seemed clear enough: Social policy must either effectively incorporate social and cultural values that relate to the evaluation of individuals or use programs to create common social and political interests among large groups of presumably “worthy,” dissimilar people. The separation of social insurance and public aid in the American model segregated the very poor and marginal people in the labor force from mainstream social programs and created a vulnerable class that was dependent on programs that were less than generously funded and always seemed to be at the center of controversy. The losers in this American policy model, of course, were women, children, and people of color, all of whom had a lower probability of having their needs well met by work-based social insurance.
Throughout this period, women continued to play a prominent role in social welfare policy and program development. They had achieved suffrage in 1920 and had leadership roles in many organizations that promoted social services and social reform, but the policies that developed from the depression emphasized the traditional role of women in social and economic life and therefore tied the interests of women to family, specifically to maternity, child care, and marriage. Some groups, particularly African Americans, had a more visible role in the New Deal than in any previous administration, but the New Deal coalition depended heavily on southern Democrats and thus was constrained from advocating full equality. This was a time of legal segregation in the American South, and although organizations such as the National Association for the Advancement of Colored People and the Brotherhood of Sleeping Car Porters and Maids had some visibility, civil rights gains were hard to come by. The employment programs of the New Deal did try to reduce discrimination, and Roosevelt ultimately created a Fair Employment Practices Committee in 1941, but despite these modest gains, programs like the New Deal's Agricultural Adjustment Act and later agricultural programs displaced thousands of minority farmworkers while compensating farm owners, and programs like Aid to Dependent Children would come to be seen as both inhibiting the economic participation of women and contributing to family problems.
WORLD WAR II, 1940S, AND 1950S
The New Deal lost some of its zeal with the elections of 1938 and the shift of focus to the instability of Europe. The impending war stimulated the American economy; by 1941, the unemployment rate dropped below 10 percent and by 1944, it was 1.2 percent, a low for the century. Roosevelt died just before the war ended in 1945, and he was succeeded by Vice President Harry S Truman.
The administration had been concerned that the end of the war would bring an inevitable economic downturn and a possible return to the depression. This concern had been the partial basis for the 1941 report of the National Resources Planning Board concerning the basic needs and rights of citizens—a report that was similar to the Beveridge report in England. Roosevelt had proposed the Economic Bill of Rights in 1944, seeking to establish full employment as a national goal and to engage in the sort of Keynesian fiscal policies that would support it. After much deliberation and compromise, the product of this proposal was the Employment Act of 1946, which declared maximum employment a primary goal of the government and established the Council of Economic Advisers and a system of economic indicators used to measure the country's national economic health. The war, and the draft used to support it, had revealed common health and mental health problems among inductees, and the administration was successful in expanding the Veterans Administration and in passing the National Mental Health Act of 1946, which created the National Institute of Mental Health and ultimately ended the state institutional monopoly on public treatment programs and ushered in the era of community-based services.
The shift to the left and the reformist wave that occurred in Western Europe after the war and institutionalized much of the welfare state in Great Britain, West Germany, Belgium, and elsewhere did not occur in the United States. Despite the Truman administration's good intentions, there was little sense in the United States of a country being remade, as there was in Europe, and the Congress remained conservative and increasingly divided over civil rights. As a result, although there was no wholesale attack on the American social welfare system, there was little political support for expanding it.
Social and Cultural Developments
The 1950s was a period of little social welfare programmatic development, but of much social and cultural development that would influence the subsequent decade. The GI Bill expanded opportunities for Americans to obtain a higher education, general prosperity created new economic opportunities, and mobility around the country was high. The Supreme Court ruled against segregated public education in 1954, and Little Rock, Arkansas, became the site of the test of the federal government's commitment in 1957.
President Dwight D. Eisenhower presided over an administration that witnessed the rapid rise in East–West tension that would become the full-blown cold war. Despite Eisenhower's interest in lowering military spending, cutting taxes, and allowing the economy to grow its own way, commitments to European defense, costly missile development, and crises over Berlin and Cuba frustrated his plans.
Eisenhower had few social welfare interests beyond education, but his administration had a working relationship with the Democratic leadership in Congress, especially Lyndon B. Johnson, and supported the shoring up and expansion of social security. The administration also reorganized federal programs into a new cabinet-level Department of Health, Education, and Welfare in 1953. Later, when states sought to “reform” welfare through “suitable home” or other provisions, this department intervened. The most notable of these interventions occurred in 1961, when Secretary Arthur Flemming stipulated that Louisiana could not remove some 23,000 children from the Aid to Dependent Children rolls without making some arrangement for their welfare. The administration did support both the Federal Housing Administration and urban renewal housing programs and, however reluctantly, also supported school desegregation orders (with federal troops in the case of Little Rock) and sought the continued desegregation of the armed forces.
Although the 1950s is viewed as a time of political conservatism, American academics and intellectuals who were concerned with American social life often emphasized poverty, racism, urban decay, delinquency, and the like. Social scientists such as William Whyte, Albert Cohen, Lloyd Ohlin, Kenneth Clark, and Robert Merton all represented a structuralist view, which emphasized that human behavior is a product of social structure and the social roles created by structure. From the structuralist perspective, juvenile delinquency, for example, was not so much a matter of pathological character as a matter of the “structure of opportunity” that would create a “subculture” of the gang. John Kenneth Galbraith, who wrote The Affluent Society in 1958, and Michael Harrington, who followed with The Other America in 1962, influenced a new and larger generation of college students and, combined with important demographic and political changes that occurred in the decades after the New Deal, produced a new politics and a new policy direction in the 1960s (Diggins, 1992).
KENNEDY, JOHNSON, AND THE NEW WELFARE
John F. Kennedy was narrowly elected president in 1960 over Richard M. Nixon, who had been Eisenhower's vice president. Kennedy, a moderate Democratic senator from Massachusetts, defeated the much more liberal Hubert H. Humphrey to gain the Democratic party nomination. He came to office promising to “get the country moving again,” by which he meant cutting taxes to stimulate the economy and closing the “missile gap” with the Soviet Union. However, a movement more powerful than politics would pervade the Kennedy administration and, indeed, American politics in general and forge a new vision of American social policy. This movement was the civil rights movement.
Civil Rights Movement
The civil rights movement had been a continuous element in American life, but it took on a new character after World War II. Beginning primarily with the Montgomery bus boycott and the response to the murder of Emmett Till, it led in a few short years to widespread protests and often violent response by southern public authorities and segregationist individuals and groups. Martin Luther King, Jr., a young Baptist minister, became a national figure; the Student Nonviolent Coordinating Committee brought thousands of college students to work for voter registration in the Deep South; and many Americans heard of Selma, Alabama, for the first time.
The Kennedy administration was pressed to support a variety of measures that were introduced into Congress, and after some period of reluctance, the administration proposed its own civil rights bill. In June 1963, President Kennedy gave an impassioned television speech on behalf of the bill and the civil rights of African Americans, a speech that was seen as courageous and a position that came to symbolize the Kennedy presidency for many (Jansson, 1988).
The civil rights bill passed Congress in 1964 and became one of the hallmarks of American social policy, but not before Kennedy was assassinated in Dallas and Lyndon B. Johnson, an accomplished legislator but an unhappy vice president, assumed the presidency.
Shift to development. Johnson inherited from Kennedy not only the pending civil rights bill, but some notable social welfare accomplishments. Kennedy had succeeded in getting the Manpower Development and Training Act passed in 1962 and establishing the Area Redevelopment Administration in 1961. The former created the first jobs program since the New Deal, and the latter led to the Appalachian Regional Commission and other programs for “depressed areas.” Both represented a shift from the “old welfare” of economic security through transfer payments to the “new welfare” of opportunity and development. This shift was accompanied by the rapid increase in public assistance rolls in the early 1960s and the perceived failure of the “services strategy” of the 1962 amendments to the Social Security Act, which had increased federal support for social services in the hope of lowering public welfare caseloads. This shift also reflected a new commitment to opportunity, not the least of which was to be for people of color. The prevailing view was that economic and political opportunity structures had been closed to many Americans and that this situation had to end. Such a change could not be accomplished through a reliance on traditional social welfare programs that tended either to reflect or reinforce limitations on opportunities.
Mental health. Optimism and reformism were manifested in any number of programs, including the administration's interest in promoting alternative models of mental health care. The result was the passage of the Community Mental Health Centers Act in 1963, which emphasized a preventive public health model and created the context for rapid deinstitutionalization over the following decade.
Poverty and economic opportunity. The Johnson administration continued this expression of reformist interest and sought to expand this emphasis on opportunity and the elimination of old barriers in a number of ways. President Johnson had continued a Kennedy task force on poverty and in 1964 proposed the Economic Opportunity Act (EOA), an ambitious program that called on communities to organize themselves to fight poverty and to design programs that would be suited to their particular needs. Drawing on the Ford Foundation's Gray Areas projects and on Mobilization for Youth in New York City, the EOA allowed cities to establish Community Action Boards as nonprofit agencies and allowed these boards to submit proposals that would be funded by the Office of Economic Opportunity. The innovations were numerous, but most important were the bypassing of city governments in favor of direct funding of citizens' groups and the requirements for “maximum feasible participation” of poor people, meaning representation on agency boards, the hiring of poor people for staff positions, and attempts to “mobilize” them for community action. The community action programs were controversial for all these reasons, and by 1966 were tamed by new requirements for the involvement of public officials and oversight by the state governments. Nevertheless, the idea that poverty was a product of limited opportunity and inadequate representation of poor citizens in community decision making was influential and led to requirements for citizen participation in governmental programs in many areas of public services and policy.
Other programs. In addition to the EOA, the Johnson administration successfully supported many other programs of consequence. The Food Stamp Act was passed in 1964; the Voting Rights Act of 1965 extended voting rights; and the Civil Rights Act of 1964 created the Equal Employment Opportunity Commission. Medicare and Medicaid (Titles XVIII and XIX, respectively, of the Social Security Act) passed in 1965, as did the Elementary and Secondary Education Act, which extended federal aid to schools with a high proportion of low-income children, and the Older Americans Act, which established the Administration on Aging. The Johnson administration was the most active in social welfare since Roosevelt's New Deal (Katz, 1986).
Income Maintenance and Work Incentives
During this time of activism and the strong interaction between the civil rights and antipoverty movements, there were those who believed in a more overt, citizens' rights-based system of income support. This belief influenced many proposals, from establishing more clearly the legal rights of welfare recipients (which the Supreme Court did by 1970, partly as a result of the Legal Services Program initiated under the EOA) to programs for a guaranteed income with few questions asked. President Johnson established a number of task forces to make recommendations on income maintenance. These task forces led both to research studies of income maintenance experiments and to the appointment of the Heineman Commission on income maintenance, which delivered its report in 1969 to the next president, Richard M. Nixon.
In 1967, Congress passed amendments to the public assistance titles of the Social Security Act, referred to as the Work Incentive Program (WIN) amendments. These amendments represented a dramatic shift in policy and a recognition that public assistance benefits overlapped with low-end wages and, consequently, that public assistance carried some work-disincentive effect. Before 1967, there was no provision in the public assistance programs for work incentives. There were provisions for discounting work expenses, but there was no policy that held that a recipient of Aid to Families with Dependent Children (AFDC) or other public aid would be better off by working. The 1967 amendments sought to establish this very principle through a rule that allowed the setting aside of a small portion of earnings before subtracting income from needs to determine the state benefits. The WIN amendments resulted in little discernible increase in labor force participation among recipients of public welfare, but they did begin a long search in American social policy for a “work”-based public aid system. The Johnson administration supported this reform of the “old” welfare system, but further pursuit of this idea was left to subsequent, mostly Republican, administrations.
NIXON, FORD, AND CARTER
President Johnson declined to run in 1968, chased by the Vietnam War into retirement. His vice president, Hubert Humphrey, distanced himself from the administration's war policy after the violent Democratic convention in Chicago, but lost a close election to Richard M. Nixon, who was completing a remarkable political recovery.
Despite his California upbringing, Nixon was an “eastern” Republican, as opposed to the “western” Republican, Barry Goldwater, who had lost badly in 1964. Nixon always incorporated conservative political themes in his public statements, but the administration's policy positions were not typical of conservatives at the time. With its policy of “détente,” determination to open relations with China, and several notable social policy proposals, the Nixon administration seemed determined to break new ground and to operate without rigid ideological constraints.
The Nixon administration came to an unhappy end over the Watergate scandal in 1974, but not before it had supported the separation of the adult categories from public aid, creating the Supplemental Security Income program; a major expansion of the Food Stamp program; increased funding for social services under Title XX of the Social Security Act; the reinstitution of federal support for job training through the Comprehensive Employment and Training Act; the establishment of the Occupational Safety and Health Administration; automatic cost-of-living adjustments in social security benefits; and the Child Abuse Prevention Act of 1973. However, the administration seemed to have little interest or faith in the social services and impounded, for budgetary purposes, funds that would have benefited social services, including community mental health centers. In retrospect, this latter action seems especially unfortunate because it occurred in the context of court decisions that rapidly reduced the institutional population and increased demands on providers of community services.
Family Assistance Plan
The Nixon administration believed that the policies of Kennedy and Johnson in the War on Poverty had been little more than "heat and air" and had succeeded only in raising expectations and urban tempers. Nixon strongly believed in work and opposed public welfare, especially AFDC. He appointed a Democrat, Daniel Patrick Moynihan, as his urban affairs adviser, and although Moynihan had characterized the War on Poverty in a critical analysis as the Maximum Feasible Misunderstanding (1969), he convinced the president that a new approach to support for the poor was needed. The result was the Family Assistance Plan (FAP). FAP would have effectively nationalized the administration and benefit structure of public assistance, provided benefits to two-parent families, and incorporated work incentives. It would have subsidized poor people who were working, as well as those who were out of the labor market, simultaneously expanding eligibility and changing the mix of "welfare" recipients. However, it would have also effectively reduced benefits in some states and produced a somewhat less "eligibilitylike" benefit structure. Although the plan was passed by the House of Representatives, it failed in the Senate, owing in substantial part to an odd coalition between conservatives, who feared a major expansion of welfare provision at the federal level, and Democratic liberals, who were influenced by the National Welfare Rights Organization and others, including NASW, who believed the benefits were too low and the work requirements too stringent. Nevertheless, the FAP debate seemed to establish an enduring agenda for welfare reform and produced one concrete development: the federalization of the old-age assistance, aid to the blind, and aid to the disabled provisions of the public assistance titles of the Social Security Act.
Social Welfare Revolution
The years after Nixon were ones of declining federal budgetary strength and weak leadership from the White House. The administration of Gerald R. Ford exercised budgetary control through the veto, and the administration of Jimmy Carter, although a source of some interesting and comprehensive proposals, including an expanded version of FAP, was largely incapable of moving Congress and thus produced no lasting accomplishments in social welfare.
The lack of new initiatives should not suggest stagnation, however, because the 1970s witnessed what Patterson (1981) has called a social welfare “revolution.” It was not a revolution of policy and program, but of the effect of past policy decisions combined with social developments. The revolution was composed of a startling decrease in the number and proportion of poor people and an equally surprising expansion of existing social welfare programs and expenditures.
Reduction in poverty. The United States had officially been counting the number of Americans in poverty since 1961, when the Social Security Administration established $3,000 as the poverty line for an urban family of four. The poverty line had been adjusted yearly for cost-of-living increases, and by 1976, the poverty standard had increased to $5,500. In 1961, nearly 40 million Americans (22 percent of the population) fell below the standard; by 1976, there were 24.6 million Americans in poverty, or about 12 percent of the total population (U.S. Bureau of the Census, 1978). Optimists also liked to point to the fact that the poverty measure did not take into account “income” from in-kind programs, such as food stamps and Medicaid, both of which grew substantially during this period. If the benefits of such programs were counted, they contended, the poverty rate could be reduced by an additional one-third or more. Such an impressive decline in poverty may have occurred earlier in the century—and there was good reason to believe it had—but there was no official census counting the number of Americans in poverty in the earlier periods. However, this apparent success of American capitalism and government policy and programs was greeted with few cheers of joy and accomplishment.
The problem was that much of the reduction in poverty was traceable to increases in social spending and consequent income transfers. The “old welfare” had finally struck, and although it was effective, it was costly and not soul satisfying. The United States wanted to reduce poverty through social opportunities and individual initiative; because of its work-ethic culture, reducing poverty through social welfare, especially public assistance, was not what its citizens had in mind. Instead, the country wanted something more akin to the “new” welfare that 1960s-era leaders had promised: a reduction in poverty through increased opportunities, education, and social services. The old welfare had reduced poverty to a level that would have been unimaginable to earlier reformers, but it submitted a sizable bill as well: Expenditures in social welfare increased at an annual rate of 7.2 percent between 1964 and 1976, a rate twice as high as that in the previous decade; social welfare as a percentage of the GNP had increased from a low (by western European standards) of 7 percent in 1960 to a more respectable 17 percent by 1976; and overall spending, which had been only $53 billion in 1960, had become $340 billion by 1976 and more than $500 billion by 1980. Much of this increase had occurred in social security and Medicare, owing to both the “graying” of the population and the expansion of benefits during the 1960s and early 1970s: Medicaid, food stamps, AFDC, and unemployment insurance all increased three to five times (in constant dollars) over the period (U.S. Social Security Administration, 1992).
Increases in active interest groups. In addition to the growth in social welfare programs and expenditures in the 1970s, there was also a dramatic increase in the number and types of active and visible interest groups that were vying for social policy consideration. President Kennedy had established the Commission on the Status of Women in 1961, Betty Friedan had published The Feminine Mystique in 1963, and the National Organization for Women had been established in 1966, beginning a new phase of the movement for social equality for women. Likewise a “gray” lobby was developing through such organizations as the American Association of Retired Persons, and people with disabilities, who were more visible because of the Vietnam War, were represented by organizations like the American Coalition of Citizens with Disabilities. Furthermore, the United States had come to recognize that there were Puerto Ricans in New York City, Cubans in Miami, and Mexican Americans in California, and the image of the country as a European country with an African American component was slowly giving way to a more genuinely multicultural one. In 1969 the gay rights movement in New York City was galvanized by the so-called Stonewall riot and later spread across the country to press for an end to persecution and discrimination. There emerged new conservative groups, including the “Christian Right,” which were concerned with abortion, prayer in schools, and other cultural issues. In short, the 1970s was a decade of the proliferation of political and social interests and a “rights” revolution in politics and law. The context of social welfare policy-making became more complex, more participatory, and more overtly and specifically ideological. 
These ideologies were not the old ones of class and economic interest, but new ones of gender, ethnicity, religion, and sexual preference that diluted the politics of class and thus altered the character of support for the welfare state in its traditional form.
THE REAGAN ERA AND BUSH PRESIDENCY
Repackaging of American Conservatism
The conservative ideology that was vying for American political support was certainly not new, but it, too, would undergo something of a transformation. American conservatism had been relatively dormant for decades, and, indeed, there was much talk of the end of ideology and the irrelevance of the "Left" and "Right" in American politics. American conservatism in the 1960s and 1970s had been associated with segregationists, rabid anticommunists, isolationists, protectionists, and assorted organizations largely on the political fringe. Conservatism in the Republican party had suffered from Senator Barry Goldwater's vote against the 1964 Civil Rights Act and his poor showing in the presidential election of 1964.
However, conservatism, in the sense of a commitment to limited government, free markets, and individual responsibility, was old, and it only took some effort to repackage these ideas, dissociate them from overt racism and reactionism, and develop an effective spokesperson to give them some renewed influence. Robert Nisbet, Milton Friedman, and William Buckley had been gaining some followers on American campuses for many years, and with the decline of the Republican party under Nixon, the time was ripe for a new voice and new leadership. This leadership would be found in the person of Ronald Reagan, who won the presidential election of 1980 over a beleaguered Jimmy Carter and went on to be something of a national hero, at least for a time.
Reagan was part of what appeared to be a large-scale political phenomenon. Margaret Thatcher had been chosen as prime minister in the United Kingdom in 1979, and most European elections were showing a declining influence of leftist parties and a deterioration of the labor base on which they had been built. This decline of the Left was due partly to changing "postindustrial" economies that simply had a smaller traditional labor class and partly to the high costs of welfare state programs. High taxes, low productivity, and no-growth economies were all blamed on welfare state policies, and many electorates were convinced that something else was needed. The United States did not have European-level tax rates, but it had generally higher levels of unemployment (especially before 1975) and higher rates of growth. It also had lower levels of social protection and services, but the political themes used against welfare states in Europe also worked well in the United States.
Tax Cuts and Block Grants
Public bureaucracy, economic stagnation, suffocating taxes, the undermining of self-reliance, the "starvation" of the military, and excessive regulation were the common themes of the conservative attack on social welfare as it had been defined. President Reagan came into office intent on making a difference and reordering the priorities of the U.S. government. The first of the administration's victories came in the form of the Omnibus Budget Reconciliation Act (OBRA) of 1981, a comprehensive piece of legislation that substantially cut funding for social services, generally tightened eligibility requirements to focus on the "truly needy," and created seven block grants to the states, greatly increasing the latitude of state governments. OBRA was an indication of things to come, in that it suggested a strategy of creating budgetary shortages that would force Congress to make cuts, rather than going after particular programs and doing battle with their often well-organized constituents. It was better to fight on the high ground of lowering taxes and controlling expenditures, things that nearly everyone claimed to support. The first of the Reagan tax cuts came shortly after OBRA was passed, and the administration made changes in the administration of taxes, regulatory processes and rules, and the enforcement of civil rights legislation. The administration appeared sympathetic to the "New Right" of the "Moral Majority," but did little substantively to implement its agenda (Anderson, 1988).
Social Security Changes
Despite the hostility expressed toward social welfare in general, the Reagan administration recognized certain realities and did not, in the end, make a frontal assault on the basic structure of social welfare, despite its opportunity to do so when the social security system faced dire projections of the income and payouts of the various trust funds in the social insurance system. Older people had increased as a percentage of the population, the post–World War II baby boom wave was still under way, and low birthrates suggested that in the future proportionately far fewer workers would contribute to social security. The social security system was said to be facing bankruptcy, and President Reagan, who had earlier characterized social security as a "pyramid scheme," seemed to have a unique political opportunity. However, a recession was in progress, the Republican party counted on older voters, and preliminary efforts to cut benefits in social security and Medicare had been rebuffed. Therefore, in 1982, the president appointed a bipartisan commission to study social security.
The commission recommended that the retirement age should be increased gradually to 67, that the benefits of people with incomes above a certain specified level should be taxed, and that cost-of-living adjustments of social security benefits should be delayed. These changes, adopted in 1983, substantially shored up the system, producing high levels of surplus in the retirement trust account by the early 1990s. President Reagan, perhaps the most ideologically conservative president in the 20th century, had strengthened the centerpiece of the American welfare system. However, one of these reforms in social security that was adopted would prove to be the first “chink” in the social insurance armor: The taxation of benefits and the suggestion that benefits need not go to people of higher income is in effect a means test, a step toward making social security operate less “universally” and more like “welfare.”
With regard to "welfare," which at this point consisted entirely of AFDC, President Reagan pursued a policy that, on the one hand, abandoned much of what previous Republican administrations had sought and, on the other hand, greatly expanded the role of state governments, consistent with the Republican emphasis on the "new" federalism. The White House Policy Office concluded that work incentives had little impact on the labor force participation of adult recipients of AFDC and thus sought to encourage the states to implement "workfare" requirements that would exchange work, however modest in hours and type, for benefits. In addition, the administration pursued a policy of "let a thousand flowers bloom" and promoted experimentation by the states by granting waivers from federal regulations. This last policy gathered momentum slowly but had an impact in the following administration of George Bush. The Reagan administration also strongly supported the enforcement of child-support regulations, increasing federal support and cooperation with states in the collection of child-support payments. It also implemented a private-sector alternative to the work and training program established by the Comprehensive Employment and Training Act—the Job Training Partnership Act of 1982.
Total federal outlays for social welfare continued to increase under President Reagan, but the annual rate of increase was down to 5 percent to 6 percent, as opposed to the 10 percent typical of the previous decade. As federal expenditures slowed, however, state and local government costs continued to rise at near-1970s levels. Many states and localities found that their share of spending for various social programs, often in the form of federally mandated expenditures, required substantial increases in taxes and reduced budgetary flexibility.
The overall human and social service sector grew substantially in the 1980s, primarily in the voluntary and for-profit sectors. Title XX of the Social Security Act had allowed for the rapid expansion of contracts with private providers for services in the 1970s, and despite caps on spending and effective cuts in allocations during the Reagan era, the number of agencies and organizations, employees, and budgets in the private sector increased. On the for-profit side, human service corporations, most evident in hospitals, nursing homes, and home health care and day care agencies, emerged as major actors in the delivery and management of services. The Reagan administration appointed a "privatization" commission and supported privatization with considerable rhetoric, as well as some operational and policy decisions. Privatization in its purest sense involves the divestment of public resources and the substitution of private ownership, management, and accountability. In the United States, however, it usually refers to the use of market forces to deliver publicly funded services in a way that provides for choice, competition, and cost constraint and diminishes the government's investment in permanent facilities, services, and personnel. The postal service, garbage collection, park management, data management, air traffic control, weather information, housing finance, and other services have all been the target of privatization plans. Conservatives, having a great deal of faith in the ability of economic markets and little faith in planning and administration, have proposed vouchers in education, health care, housing, and social services. Such vouchers would allow individual consumers to select among competing providers. In some areas—notably food programs, special education, and housing—vouchers are more widely used, and, as was noted earlier, the private provision of health care services is the norm.
Reduced Sensitivity to Social Issues
The Reagan years, then, witnessed a political attack on the softer side of the welfare state and a significant reduction in federal and, to a degree, state funds allocated to the social services. However, the "core" of the American social welfare system not only survived, but was enhanced in some ways. In addition, there was greater diversity in the options for funding and managing services, with a consequent growth in the size and importance of the voluntary and for-profit sectors. Nevertheless, the general sensitivity to social issues seemed to decline, especially in the first half of the Reagan years.
Homelessness. Perhaps no issue symbolized this decline better than homelessness. Homelessness was apparent in major American cities in the late 1970s, but the early 1980s brought reports of substantial increases in the number of homeless people. The recession of the early 1980s, the cumulative effects of the deinstitutionalization of treatment for mental illness and substance abuse, the breakdown of families, and the limited supply at the lower end of the housing market all contributed to the growing homeless population. Estimates of the number of homeless individuals varied widely, from a few hundred thousand to many millions, and advocacy organizations gained national attention. Although it supported the McKinney Homeless Assistance Act of 1987, the Reagan administration responded with little sympathy. Its lukewarm policy proposals and the White House's observation that the homeless were mentally ill and chose to live on the street seemed to many to capture the administration's lack of heart.
George Bush was elected president in 1988, after eight years as vice president, promising a "kinder and gentler" America, which many understood to be a mild criticism of the previous administration. President Bush had a personal history of involvement with education and the voluntary social welfare sector that suggested a more generous role for government, but the realities of budgetary politics allowed few additional resources to be spent on domestic programs. The tax reductions of the mid-1980s, combined with Gramm–Rudman–Hollings deficit controls and President Bush's campaign pledge of no new taxes, kept a firm grip on even the most modest of program intentions.
The administration did agree to the Civil Rights Act of 1991, which expanded the rights of those who were discriminated against to seek relief, but the legislation did not go as far as many civil rights advocates would have liked in reducing the complainant's burden of proof. The administration also supported the Americans with Disabilities Act of 1990, which dramatically increased protection from discrimination in housing, work, and public accommodation for those with disabilities. Overall, the decline of the Cold War, the collapse of the Soviet Union, and developments in Panama and Kuwait gave the Bush administration a far stronger reputation for foreign policy than for domestic policy. In the meantime, the economy slipped into a recession, and the Los Angeles riots in 1992 further reminded everyone that the problems of race and poverty were still real.
THE DEMOCRATS RETURN
President Bush was defeated after one term by Democrat Bill Clinton. Bush was regarded by many as the president of “gridlock,” presiding over a divided and ineffective legislative system and incapable of controlling either the budget or the nation's economy. President Clinton promised to get the economy moving again, to end gridlock, and to address some long-neglected social issues. He also promised to reform the nation's health care system and to “end welfare as we know it.” He entered the White House with an apparent personal interest in social welfare and experience with many social welfare programs as governor of Arkansas.
Emphasis on Domestic Issues
President Clinton appointed a more varied and decidedly more domestically oriented cabinet and sought to tackle a number of social matters. In the first year of the Clinton administration, he moved to support a family leave provision, reduce federal regulatory controls on abortion services, and end discrimination against gay men and lesbians in the U.S. military. The major policy thrust, however, would come in the area of health care. By appointing a task force chaired by Hillary Clinton, the White House sought both to dramatize the high costs of health care in the United States (the highest per capita in the world) and to point out the millions of citizens who do not have health coverage or who lose it because of health problems. Emphasizing basic principles, including universality of coverage and cost control, the administration proposed a far-reaching health scheme that would enroll all Americans in a “health alliance” that would seek bids from providers on a basic insurance package. Although hardly a federalized national health service plan, it would have nevertheless altered dramatically the administrative role of the federal government and created, for the first time, a genuinely national health care structure in the American government. Congress adjourned, however, without adopting any significant health care legislation. The dramatic resurgence of Republicans in the 1994 general election and the consequent control of both houses in Congress may support the passage of some sort of smaller-scale health care legislation in the next Congress.
The Clinton administration's plans in relation to welfare (that is, AFDC) remain less clear at this writing, although the basic outlines of a proposal have been drawn. President Clinton has proposed the time-limited receipt of AFDC, somewhat like unemployment insurance, that would require an individual to work or prepare for work at the end of two years. One question is, of course, where will these individuals work? Because gainful employment for people who are less skilled, less experienced, and less educated has not been easy to come by in recent decades, the administration seems to be considering various proposals for subsidizing the employment of former AFDC recipients. The cost of doing so and the associated need for child care services, as well as the feasibility and fairness of such a plan, are currently under consideration. As with health care, the Republican control of the House and Senate promises attention to welfare reform, although the character of the reform may well go beyond what the White House would prefer.
The budgetary constraints of the 1980s are still present, despite the Clinton administration's success with a budget agreement that included major tax increases. Therefore, it is too early to speculate on the administration's ultimate accomplishments in the social policy area. However, it is clear that the election of President Clinton and the simultaneous Democratic control of both houses of Congress did not unleash a wave of social reform legislation or reinvigorate the “old” welfare state idea. With Republicans in control of the Congress, the Clinton administration faces a different sort of activism, and the social policy agenda may shift dramatically to the right. There is, however, some reason to believe that some sort of national health plan will emerge. Regardless of its form, it will represent the one element of the welfare state package of benefits and services that has been missing in the American social welfare system, but it will be organized on the basis of a public–private mix and decentralization that certainly does not reflect the welfare state tradition (Mishra, 1990).
We approach the 21st century with a social welfare system that has worked only partially well. Poverty has been substantially reduced over the century in terms of income, but the remaining poverty rate, which is still quite high for women and many racial–ethnic groups, seems particularly dangerous. Considerable income security for elderly people has been achieved, but in ways that may be difficult to sustain in the future. The U.S. system of public assistance has suffered a collapse of political support, and some sort of welfare reform that ties benefits to work seems both inevitable and long overdue. Unemployment has remained a problem even during periods of sustained economic growth, and the ability of the U.S. economy in its early postindustrial state to absorb all its people is in question. Public education, once the great pride of American public accomplishment, is widely said to be ill suited to educate the work force of the 21st century. Health care, although apparently effective, is costly and uneven and demands some sort of reorganization. In short, the five great “specters” identified by Lord Beveridge—want, disease, ignorance, squalor, and idleness—have been tamed but not eradicated by 20th-century developments (Kaus, 1992).
It is tempting to observe that the United States approaches the next century much as it approached the last: with a rapidly changing economy, a rapidly changing demography, high rates of immigration and family compositional change, challenges of urban poverty and subculture, a policy in search of a consensus, and a widespread desire for things to be made right. In the first part of this century, there emerged a major social reform movement, based in the professions and the universities, that would provide much of the social agenda for the century—social work as we know it. This movement produced a uniquely American social welfare paradigm, embodying the complexities of a federal governmental structure; a cultural emphasis on equality, individual merit, and responsibility; and a mix of private and public provision. All these elements remain. Indeed, the view that social needs are met through a multitude of sources—the state, the market, the voluntary sector, and the family—is stronger and more pertinent than ever. The vastness of these resources, combined with the largely innovative, pragmatic, and less ideological character of American social welfare development, suggests much hope for the next century of social welfare.
Anderson, M. (1988). Revolution. San Diego: Harcourt Brace Jovanovich.
Croly, H. (1909). The promise of American life. New York: Macmillan.
Crunden, R. (1982). Ministers of reform: The Progressives' achievement in American civilization, 1889–1920. New York: Basic Books.
de Schweinitz, K. (1943). England's road to social security. Philadelphia: University of Pennsylvania Press.
Diggins, J. (1992). The rise and fall of the American left. New York: W. W. Norton.
Ehrenreich, J. (1985). The altruistic imagination: A history of social work and social policy in the U.S. Ithaca, NY: Cornell University Press.
Flexner, A. (1915). Is social work a profession? In Studies in social work (Vol. 4, pp. 2–24). New York: New York School of Philanthropy.
Friedan, B. (1963). The feminine mystique. New York: W. W. Norton.
Galbraith, J. K. (1958). The affluent society. Boston: Houghton Mifflin.
Harrington, M. (1962). The other America. New York: Macmillan.
Hofstadter, R. (1955). The age of reform. New York: Random House.
Jansson, B. (1988). The reluctant welfare state: A history of American social welfare policies. Belmont, CA: Wadsworth.
Katz, M. (1986). In the shadow of the poorhouse: A social history of welfare in America. New York: Basic Books.
Kaus, M. (1992). The end of equality. New York: Basic Books.
Leiby, J. (1978). A history of social welfare and social work in the U.S. New York: Columbia University Press.
Mishra, R. (1990). The welfare state in capitalist society. Toronto: University of Toronto Press.
Moynihan, D. P. (1969). Maximum feasible misunderstanding: Community action in the war on poverty. New York: Free Press.
Murray, C. (1984). Losing ground: American social policy, 1950–1980. New York: Basic Books.
Patterson, J. (1981). America's struggle against poverty, 1900–1980. Cambridge, MA: Harvard University Press.
Polanyi, K. (1957). The great transformation. Boston: Beacon Press.
Richmond, M. (1917). Social diagnosis. New York: Russell Sage Foundation.
Rothman, D. (1971). The discovery of the asylum: Order and disorder in the new republic. Boston: Little, Brown.
Skocpol, T. (1992). Protecting soldiers and mothers: The origins of social policy in the U.S. Cambridge, MA: Harvard University Press.
Stewart, R. (1911). The philanthropic work of J. S. Lowell. New York: Macmillan.
Titmuss, R. M. (1958). Essays on the welfare state. London: Allen & Unwin.
Trattner, W. (1989). From poor law to welfare state: A history of social welfare in America (2nd ed.). New York: Free Press.
U.S. Bureau of the Census. (1978). Current population reports (Series P-23, No. 28). Washington, DC: U.S. Government Printing Office.
U.S. Social Security Administration. (1992, Winter). Social Security Bulletin [Entire issue].
Wilensky, H. L., & Lebeaux, C. N. (1965). Industrial society and social welfare. New York: Free Press.
P. Nelson Reid, PhD, is professor, North Carolina State University, Social Work Department, Raleigh, NC 27695.
For further information see
Advocacy; Archives of Social Welfare; Child Welfare Overview; Direct Practice Overview; Ethics and Values; Families Overview; Historiography; International and Comparative Social Welfare; National Association of Social Workers; Poverty; Public Social Services; Social Welfare Policy; Social Work Practice: History and Evolution; Social Work Profession: History.
Key Words: social welfare history; social welfare policy; social work profession