Labor in America: The Worker's Role
The American labor force has changed profoundly during the nation's
evolution from an agrarian society into a modern industrial state.
The United States remained a largely
agricultural nation until late in the 19th century. Unskilled workers
fared poorly in the early U.S. economy, receiving as little as half the
pay of skilled craftsmen, artisans, and mechanics. About 40 percent of
the workers in the cities were low-wage laborers and seamstresses in
clothing factories, often living in dismal circumstances. With the rise
of factories, children, women, and poor immigrants were commonly
employed to run machines.
The late 19th century and the 20th century
brought substantial industrial growth. Many Americans left farms and
small towns to work in factories, which were organized for mass
production and characterized by steep hierarchy, a reliance on
relatively unskilled labor, and low wages. In this environment, labor
unions gradually developed clout. Eventually, they won substantial
improvements in working conditions. They also changed American politics;
often aligned with the Democratic Party, unions represented a key
constituency for much of the social legislation enacted from the time of
President Franklin D. Roosevelt's New Deal in the 1930s through the
Kennedy and Johnson administrations of the 1960s.
Organized labor continues to be an
important political and economic force today, but its influence has
waned markedly. Manufacturing has declined in relative importance, and
the service sector has grown. More and more workers hold white-collar
office jobs rather than unskilled, blue-collar factory jobs. Newer
industries, meanwhile, have sought highly skilled workers who can adapt
to continuous changes produced by computers and other new technologies.
A growing emphasis on customization and a need to change products
frequently in response to market demands has prompted some employers to
reduce hierarchy and to rely instead on self-directed, interdisciplinary
teams of workers.
Organized labor, rooted in industries such
as steel and heavy machinery, has had trouble responding to these
changes. Unions prospered in the years immediately following World War
II, but in later years, as the number of workers employed in the
traditional manufacturing industries has declined, union membership has
dropped. Employers, facing mounting challenges from low-wage, foreign
competitors, have begun seeking greater flexibility in their employment
policies, making more use of temporary and part-time employees and
putting less emphasis on pay and benefit plans designed to cultivate
long-term relationships with employees. They also have fought union
organizing campaigns and strikes more aggressively. Politicians, once
reluctant to buck union power, have passed legislation that cut further
into the unions' base. Meanwhile, many younger, skilled workers have
come to see unions as anachronisms that restrict their independence.
Only in sectors that essentially function as monopolies -- such as
government and public schools -- have unions continued to make gains.
Despite the diminished power of unions,
skilled workers in successful industries have benefited from many of the
recent changes in the workplace. But unskilled workers in more
traditional industries often have encountered difficulties. The 1980s
and 1990s saw a growing gap in the wages paid to skilled and unskilled
workers. While American workers at the end of the 1990s thus could look
back on a decade of growing prosperity born of strong economic growth
and low unemployment, many felt uncertain about what the future would
bring.
Labor Standards
Economists attribute some of America's economic success to the
flexibility of its labor markets. Employers say that their ability to
compete depends in part on having the freedom to hire or lay off workers
as market conditions change. American workers, meanwhile, traditionally
have been mobile themselves; many see job changes as a means of
improving their lives. On the other hand, employers also traditionally
have recognized that workers are more productive if they believe their
jobs offer them long-term opportunities for advancement, and workers
rate job security among their most important economic objectives.
The history of American labor involves a
tension between these two sets of values -- flexibility and long-term
commitment. Since the mid-1980s, many analysts agree, employers have put
more emphasis on flexibility. Perhaps as a result, the bonds between
employers and employees have become weaker. Still, a wide range of state
and federal laws protect the rights of workers. Some of the most
important federal labor laws include the following.
- The Fair Labor Standards Act of 1938 sets national
minimum wages and maximum hours individuals can be required to work.
It also sets rules for overtime pay and standards to prevent
child-labor abuses. In 1963, the act was amended to prohibit wage
discrimination against women. Congress adjusts the minimum wage
periodically, although the issue often is politically contentious.
In 1999, it stood at $5.15 per hour, although the demand for workers
was so great at the time that many employers -- even those who hired
low-skilled workers -- were paying wages above the minimum. Some
individual states set higher wage floors.
- The Civil Rights Act of 1964 establishes that
employers cannot discriminate in hiring or employment practices on
the basis of race, sex, religion, or national origin (the law also
prohibits discrimination in voting and housing).
- The Age Discrimination in Employment Act of 1967
protects older workers against job discrimination.
- The Occupational Safety and Health Act of 1970
requires employers to maintain safe working conditions. Under this
law, the Occupational Safety and Health Administration (OSHA)
develops workplace standards, conducts inspections to assess
compliance with them, and issues citations and imposes penalties for
noncompliance.
- The Employee Retirement Income Security Act, or ERISA,
sets standards for pension plans established by businesses or other
nonpublic organizations. It was enacted in 1974.
- The Family and Medical Leave Act of 1993 guarantees
employees unpaid time off for childbirth, for adoption, or for
caring for seriously ill relatives.
- The Americans With Disabilities Act, passed in 1990,
assures job rights for handicapped persons.
Pensions and Unemployment Insurance
In the United States, employers play a key role in helping workers
save for retirement. About half of all privately employed people and
most government employees are covered by some type of pension plan.
Employers are not required to sponsor pension plans, but the government
encourages them to do so by offering generous tax breaks if they
establish and contribute to employee pensions.
The federal government's tax collection
agency, the Internal Revenue Service, sets most rules governing pension
plans, and a Labor Department agency regulates plans to prevent abuses.
Another federal agency, the Pension Benefit Guaranty Corporation,
insures retiree benefits under traditional private pensions; a series of
laws enacted in the 1980s and 1990s boosted premium payments for this
insurance and stiffened requirements holding employers responsible for
keeping their plans financially healthy.
The nature of employer-sponsored pensions
changed substantially during the final three decades of the 20th
century. Many employers -- especially small employers -- stopped
offering traditional "defined benefit" plans, which provide
guaranteed monthly payments to retirees based on years of service and
salary. Instead, employers increasingly offer "defined
contribution" plans. In a defined contribution plan, the employer
is not responsible for how pension money is invested and does not
guarantee a certain benefit. Instead, employees control their own
pension savings (many employers also contribute, although they are not
required to do so), and workers can hold onto the savings even if they
change jobs every few years. The amount of money available to employees
upon retirement, then, depends on how much has been contributed and how
successfully the employees invest their own funds.
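The difference can be sketched in rough terms. The short Python example below uses purely hypothetical contribution amounts and investment returns (they are not drawn from any actual plan) to show that a defined contribution balance depends only on what is paid in and how the investments perform, not on a promised formula tied to salary and years of service.

    # Illustrative sketch of defined contribution accumulation.
    # The contribution amounts and return rates are hypothetical assumptions.
    def defined_contribution_balance(annual_contribution, annual_return, years):
        """Compound yearly contributions at the assumed investment return."""
        balance = 0.0
        for _ in range(years):
            balance = (balance + annual_contribution) * (1 + annual_return)
        return balance

    # A worker and employer together contributing $3,000 a year for 30 years:
    print(round(defined_contribution_balance(3000, 0.05, 30)))  # about 209,000 at a 5% return
    print(round(defined_contribution_balance(3000, 0.07, 30)))  # about 303,000 at a 7% return

Under a traditional defined benefit plan, by contrast, the employer owes a set monthly payment regardless of how the invested funds perform.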
The number of private defined benefit
plans declined from 170,000 in 1965 to 53,000 in 1997, while the number
of defined contribution plans rose from 461,000 to 647,000 -- a shift
that many people believe reflects a workplace in which employers and
employees are less likely to form long-term bonds.
The federal government administers several
types of pension plans for its employees, including members of the
military and civil service as well as disabled war veterans. But the
most important pension system run by the government is the Social
Security program, which provides full benefits to working people who
retire and apply for benefits at age 65 or older, or reduced benefits to
those retiring and applying for benefits between the ages of 62 and 65.
Although the program is run by a federal agency, the Social Security
Administration, its funds come from employers and employees through
payroll taxes. While Social Security is regarded as a valuable
"safety net" for retirees, most find that it provides only a
portion of their income needs when they stop working. Moreover, with the
post-war baby-boom generation due to retire early in the 21st century,
politicians grew concerned in the 1990s that the government would not be
able to pay all of its Social Security obligations without either
reducing benefits or raising payroll taxes. Many Americans considered
ensuring the financial health of Social Security to be one of the most
important domestic policy issues at the turn of the century.
Many people -- generally those who are
self-employed, those whose employers do not provide a pension, and those
who believe their pension plans inadequate -- also can save part of
their income in special tax-favored accounts known as Individual
Retirement Accounts (IRAs) and Keogh plans.
Unlike Social Security, unemployment
insurance, also established by the Social Security Act of 1935, is
organized as a federal-state system and provides basic income support
for unemployed workers. Wage-earners who are laid off or who otherwise
involuntarily become unemployed (for reasons other than misconduct)
receive a partial replacement of their pay for specified periods.
Each state operates its own program but
must follow certain federal rules. The amount and duration of the weekly
unemployment benefits are based on a worker's prior wages and length of
employment. Employers pay taxes into a special fund based on the
unemployment and benefits-payment experience of their own work force.
The federal government also assesses an unemployment insurance tax of
its own on employers. States hope that surplus funds built up during
prosperous times can carry them through economic downturns, but they can
borrow from the federal government or boost tax rates if their funds run
low. States must lengthen the duration of benefits when unemployment
rises and remains above a set "trigger" level. The federal
government may also permit a further extension of the benefits payment
period when unemployment climbs during a recession, paying for the
extension out of general federal revenues or levying a special tax on
employers. Whether to extend jobless-pay benefits frequently becomes a
political issue since any extension boosts federal spending and may lead
to tax increases.
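As a rough illustration of how such a benefit might be computed, the Python sketch below assumes a hypothetical state formula that replaces half of prior weekly pay up to a $300 cap for 26 weeks; actual replacement rates, caps, and durations differ from state to state.

    # Hypothetical sketch of a state-style weekly unemployment benefit.
    # The 50 percent replacement rate, $300 cap, and 26-week duration are
    # illustrative assumptions, not any particular state's rules.
    def weekly_benefit(prior_weekly_wage, replacement_rate=0.5, cap=300.0):
        """Replace a fraction of prior pay, up to a state maximum."""
        return min(prior_weekly_wage * replacement_rate, cap)

    def total_benefit(prior_weekly_wage, weeks=26):
        """Benefits run for a set period, extended when unemployment stays high."""
        return weekly_benefit(prior_weekly_wage) * weeks

    print(weekly_benefit(500.0))  # 250.0 -- half of a $500 weekly wage
    print(weekly_benefit(900.0))  # 300.0 -- capped at the state maximum
    print(total_benefit(500.0))   # 6500.0 over 26 weeks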
The Labor Movement's Early Years
Many laws and programs designed to enhance the lives of working
people in America came during several decades beginning in the 1930s,
when the American labor movement gained and consolidated its political
influence. Labor's rise did not come easily; the movement had to
struggle for more than a century and a half to establish its place in
the American economy.
Unlike labor groups in some other
countries, U.S. unions sought to operate within the existing free
enterprise system -- a strategy that made them the despair of socialists.
There was no history of feudalism in the United States, and few working
people believed they were involved in a class struggle. Instead, most
workers simply saw themselves as asserting the same rights to
advancement as others. Another factor that helped reduce class
antagonism is the fact that U.S. workers -- at least white male workers
-- were granted the right to vote sooner than workers in other
countries.
Since the early labor movement focused largely on
industrial workers in a nation that was still mostly agricultural, union
organizers had a limited pool of potential recruits.
The first significant national labor organization was the Knights of
Labor, founded among garment cutters in 1869 in Philadelphia,
Pennsylvania, and dedicated to organizing all workers for their general
welfare. By 1886, the Knights had about 700,000 members, including
blacks, women, wage-earners, merchants, and farmers. But the
interests of these groups were often in conflict, so members had little
sense of identity with the movement. The Knights won a strike against
railroads owned by American millionaire Jay Gould in the mid-1880s, but
they lost a second strike against those railroads in 1886. Membership
soon declined rapidly.
In 1881, Samuel Gompers, an English-born immigrant
cigar-maker, and other craftsmen organized a federation of trade unions
that five years later became the American Federation of Labor (AFL). Its
members included only wage-earners, and they were organized along craft
lines. Gompers was its first president. He followed a practical strategy
of seeking higher wages and better working conditions -- priorities
subsequently picked up by the entire union movement.
AFL labor organizers faced staunch
employer opposition. Managers preferred to discuss wages and other
issues with each worker individually, and they often fired or blacklisted
(agreed with other companies not to hire) workers who favored unions.
Sometimes they required workers to sign what were known as yellow-dog
contracts, prohibiting them from joining unions. Between 1880 and 1932, the
government and the courts were generally sympathetic to management or,
at best, neutral. The government, in the name of public order, often
provided federal troops to put down strikes. Violent strikes during this
era resulted in numerous deaths, as persons hired by management and
unions fought.
The labor movement suffered a setback in
1905, when the Supreme Court said the government could not limit the
number of hours a laborer worked (the court said such a regulation
restricted a worker's right to contract for employment). The principle
of the "open shop," the right of a worker not to be forced to
join a union, also caused great conflict.
The AFL's membership stood at 5 million
when World War I ended. The 1920s were not productive years for
organizers, however. Times were good, jobs were plentiful, and wages
were rising. Workers felt secure without unions and were often receptive
to management claims that generous personnel policies provided a good
alternative to unionism. The good times came to an end in 1929, however,
when the Great Depression hit.
Depression and Post-War Victories
The Great Depression of the 1930s changed Americans' view of unions.
Although AFL membership fell to fewer than 3 million amidst large-scale
unemployment, widespread economic hardship created sympathy for working
people. At the depths of the Depression, about one-third of the American
work force was unemployed, a staggering figure for a country that, in
the decade before, had enjoyed full employment. With the election of
President Franklin D. Roosevelt in 1932, government -- and eventually
the courts -- began to look more favorably on the pleas of labor. In
1932, Congress passed one of the first pro-labor laws, the Norris-La
Guardia Act, which made yellow-dog contracts unenforceable. The law also
limited the power of federal courts to stop strikes and other job
actions.
When Roosevelt took office, he sought a
number of important laws that advanced labor's cause. One of these, the
National Labor Relations Act of 1935 (also known as the Wagner Act) gave
workers the right to join unions and to bargain collectively through
union representatives. The act established the National Labor Relations
Board (NLRB) to punish unfair labor practices and to organize elections
when employees wanted to form unions. The NLRB could force employers to
provide back pay if they unjustly discharged employees for engaging in
union activities.
With such support, trade union membership
jumped to almost 9 million by 1940. Larger membership rolls did not come
without growing pains, however. In 1935, eight unions within the AFL
created the Committee for Industrial Organization (CIO) to organize
workers in such mass-production industries as automobiles and steel. Its
supporters wanted to organize all workers at a company -- skilled and
unskilled alike -- at the same time. The craft unions that controlled
the AFL opposed efforts to unionize unskilled and semiskilled workers,
preferring that workers remain organized by craft across industries. The
CIO's aggressive drives succeeded in unionizing many plants, however. In
1938, the AFL expelled the unions that had formed the CIO. The CIO
quickly established its own federation using a new name, the Congress of
Industrial Organizations, which became a full competitor with the AFL.
After the United States entered World War
II, key labor leaders promised not to interrupt the nation's defense
production with strikes. The government also put controls on wages,
stalling wage gains. But workers won significant improvements in fringe
benefits -- notably in the area of health insurance. Union membership
soared.
When the war ended in 1945, the promise
not to strike ended as well, and pent-up demand for higher wages
exploded. Strikes erupted in many industries, with the number of work
stoppages reaching a peak in 1946. The public reacted strongly to these
disruptions and to what many viewed as excessive power of unions allowed
by the Wagner Act. In 1947, Congress passed the Labor Management
Relations Act, better known as the Taft-Hartley Act, over President
Harry Truman's veto. The law prescribed standards of conduct for unions
as well as for employers. It banned "closed shops," which
required workers to join unions before starting work; it permitted
employers to sue unions for damages inflicted during strikes; it
required unions to abide by a 60-day "cooling-off" period
before striking; and it created other special rules for handling strikes
that endangered the nation's health or safety. Taft-Hartley also
required unions to disclose their finances. In light of this swing
against labor, the AFL and CIO moved away from their feuding and finally
merged in 1955, forming the AFL-CIO. George Meany, who was president of
the AFL, became president of the new organization.
Unions gained a new measure of power in
1962, when President John F. Kennedy issued an executive order giving
federal employees the right to organize and to bargain collectively (but
not to strike). States passed similar legislation, and a few even
allowed state government workers to strike. Public employee unions grew
rapidly at the federal, state, and local levels. Police, teachers, and
other government employees organized strikes in many states and cities
during the 1970s, when high inflation threatened significant erosion of
wages.
Union membership among blacks,
Mexican-Americans, and women increased in the 1960s and 1970s. Labor
leaders helped these groups, who often held the lowest-wage jobs, to
obtain higher wages. Cesar E. Chavez, a Mexican-American labor leader,
for example, worked to organize farm laborers, many of them
Mexican-Americans, in California, creating what is now the United Farm
Workers of America.
The 1980s and 1990s: The End of Paternalism
Despite occasional clashes and strikes, companies and unions
generally developed stable relationships during the 1940s, 1950s, and
1960s. Workers typically could count on employers to provide them jobs
as long as needed, to pay wages that reflected the general cost of
living, and to offer comfortable health and retirement benefits.
Such stable relationships depended on a
stable economy -- one where skills and products changed little, or at
least changed slowly enough that employers and employees could adapt
relatively easily. But relations between employers and their employees grew
testy during the 1960s and 1970s. American dominance of the world's
industrial economy began to diminish. When cheaper -- and sometimes
better -- imports began to flood into the United States, American
companies had trouble responding quickly to improve their own products.
Their top-down managerial structures did not reward innovation, and they
sometimes were stymied when they tried to reduce labor costs by
increasing efficiency or reducing wages to match what laborers were
being paid in some foreign countries.
In a few cases, American companies reacted
by simply shutting down and moving their factories elsewhere -- an
option that became increasingly easy as trade and tax laws changed in
the 1980s and 1990s. Many others continued to operate, but the
paternalistic system began to fray. Employers felt they could no longer
make lifetime commitments to their workers. To boost flexibility and
reduce costs, they made greater use of temporary and part-time workers.
Temporary-help firms supplied 417,000 employees, or 0.5 percent of
non-farm payroll employment, in 1982; by 1998, they provided 2.8 million
workers, or 2.1 percent of the non-farm work force. Changes came in
hours worked, too. Workers sometimes sought shorter work weeks, but
often companies set out to reduce hours worked in order to cut both
payroll and benefits costs. In 1968, 14 percent of employees worked less
than 35 hours a week; in 1994, that figure was 18.9 percent.
As noted, many employers shifted to
pension arrangements that placed more responsibility in the hands of
employees. Some workers welcomed these changes and the increased
flexibility they allowed. Still, for many other workers, the changes
brought only insecurity about their long-term future. Labor unions could
do little to restore the former paternalistic relationship between
employer and employee; they were left to help members try to adapt to
the changes.
Union membership generally declined
through the 1980s and 1990s, with unions achieving only modest success
in organizing new workplaces. Organizers complained that labor laws were
stacked against them, giving employers too much leeway to stall or fight
off union elections. With union membership and political power
declining, dissident leader John Sweeney, president of the Service
Employees International Union, challenged incumbent Lane Kirkland for
the AFL-CIO presidency in 1995 and won. Kirkland was widely criticized
within the labor movement as being too engrossed in union activities
abroad and too passive about challenges facing unions at home. Sweeney,
the federation's third president in its 40-plus years, sought to revive
the lagging movement by beefing up organizing and getting local unions
to help each other's organizing drives. The task proved difficult,
however.
The New Work Force
Between 1950 and late 1999, total U.S. non-farm employment grew from
45 million workers to 129.5 million workers. Most of the increase was in
computer, health, and other service sectors, as information technology
assumed an ever-growing role in the U.S. economy. In the 1980s and
1990s, jobs in the service-producing sector -- which includes services,
transportation, utilities, wholesale and retail trade, finance,
insurance, real estate, and government -- rose by 35 million, accounting
for the entire net gain in jobs during those two decades. The growth in
service sector employment absorbed labor resources freed by rising
manufacturing productivity.
Service-related industries accounted for
24.4 million jobs, or 59 percent of non-farm employment, in 1946. By
late 1999, that sector had grown to 104.3 million jobs, or 81 percent of
non-farm employment. Conversely, the goods-producing sector -- which
includes manufacturing, construction, and mining -- provided 17.2
million jobs, or 41 percent of non-farm employment in 1946, but grew to
just 25.2 million, or 19 percent of non-farm employment, in late 1999.
But many of the new service jobs did not pay as well as manufacturing
jobs, nor did they carry as many benefits. The resulting financial
squeeze on many families encouraged large numbers of women to enter the
work force.
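The percentages cited above follow directly from the job counts; a quick arithmetic check, in Python:

    # Quick check of the employment shares quoted above, using the
    # article's own job counts (millions of non-farm jobs).
    def share(sector_jobs, total_jobs):
        return 100 * sector_jobs / total_jobs

    total_1946 = 24.4 + 17.2   # 41.6 million non-farm jobs
    total_1999 = 104.3 + 25.2  # 129.5 million non-farm jobs
    print(round(share(24.4, total_1946)))   # 59 -- service share, 1946
    print(round(share(104.3, total_1999)))  # 81 -- service share, late 1999
    print(round(share(25.2, total_1999)))   # 19 -- goods share, late 1999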
In the 1980s and 1990s, many employers
developed new ways to organize their work forces. In some companies,
employees were grouped into small teams and given considerable autonomy
to accomplish tasks assigned them. While management set the goals for
the work teams and monitored their progress and results, team members
decided among themselves how to do their work and how to adjust
strategies as customer needs and conditions changed. Many other
employers balked at abandoning traditional management-directed work,
however, and others found the transition difficult. National Labor
Relations Board rulings that many work teams used by nonunion employers
amounted to illegal, management-dominated "unions" often deterred change.
Employers also had to manage increasingly
diverse work forces in the 1980s and 1990s. New ethnic groups --
especially Hispanics and immigrants from various Asian countries --
joined the labor force in growing numbers, and more and more women
entered traditionally male-dominated jobs. A growing number of employees
filed lawsuits charging that employers discriminated against them on the
basis of race, gender, age, or physical disability. The caseload at the
federal Equal Employment Opportunity Commission, where such allegations
are first lodged, climbed to more than 16,000 in 1998 from some 6,900 in
1991, and lawsuits clogged the courts. The legal actions had a mixed
track record in court. Many cases were rebuffed as frivolous, but courts
also recognized a wide range of legal protections against hiring,
promotion, demotion, and firing abuses. In 1998, for example, U.S.
Supreme Court rulings held that employers must ensure that managers are
trained to avoid sexual harassment of workers and to inform workers of
their rights.
The issue of "equal pay for equal
work" continued to dog the American workplace. While federal and
state laws prohibit different pay rates based on sex, American women
historically have been paid less than men. In part, this differential
arises because relatively more women work in jobs -- many of them in the
service sector -- that traditionally have paid less than other jobs. But
labor unions and women's rights organizations say it also reflects outright
discrimination. Complicating the issue is a phenomenon in the
white-collar workplace called the glass ceiling, an invisible barrier
that some women say holds them back from promotion to male-dominated
executive or professional ranks. In recent years, women have obtained
such jobs in growing numbers, but they still hold far fewer of them than
their share of the population would suggest.
Similar issues arise with the pay and
positions earned by members of various ethnic and racial groups, often
referred to as "minorities" since they make up a minority of
the general population. (At the end of the 20th century, the majority of
Americans were Caucasians of European descent, although their percentage
of the population was dropping.) In addition to nondiscrimination laws,
the federal government and many states adopted "affirmative
action" laws in the 1960s and 1970s that required employers to give
a preference in hiring to minorities in certain circumstances. Advocates
said minorities should be favored in order to rectify years of past
discrimination against them. But the idea proved a contentious way of
addressing racial and ethnic problems. Critics complained that
"reverse discrimination" was both unfair and
counterproductive. Some states, notably California, abandoned
affirmative action policies in the 1990s. Still, pay gaps and widely
varying unemployment rates between whites and minorities persist. Along
with issues about a woman's place in the work force, they remain some of
the most troublesome issues facing American employers and workers.
Exacerbating pay gaps between people of
different sexes, races, or ethnic backgrounds was the general tension
created in the 1980s and 1990s by cost-cutting measures at many
companies. Sizable wage increases were no longer considered a given; in
fact, workers and their unions at some large, struggling firms felt they
had to make wage concessions -- limited increases or even pay cuts -- in
hopes of increasing their job security or even saving their employers.
Two-tier wage scales, with new workers getting lower pay than older ones
for the same kind of work, appeared for a while at some airlines and
other companies. Increasingly, salaries were no longer set to reward
employees equally but rather to attract and retain types of workers who
were in short supply, such as computer software experts. This further
widened the gap in pay between highly skilled and unskilled workers. No
direct measurement of this gap exists, but
U.S. Labor Department statistics offer a good indirect gauge. In 1979,
median weekly earnings ranged from $215 for workers with less than a
secondary school education to $348 for college graduates. In 1998, that
range was $337 to $821.
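One way to see the widening gap in the Labor Department figures just cited is to express each year's range as a ratio of college-graduate earnings to the earnings of workers without a secondary-school education:

    # The widening skill gap, expressed as a ratio of the median weekly
    # earnings figures quoted above (current dollars).
    ratio_1979 = 348 / 215
    ratio_1998 = 821 / 337
    print(round(ratio_1979, 2))  # 1.62 -- college graduates earned about 62% more in 1979
    print(round(ratio_1998, 2))  # 2.44 -- and about 144% more in 1998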
Even as this gap widened, many employers
fought increases in the federally imposed minimum wage. They contended
that the wage floor actually hurt workers by increasing labor costs and
thereby making it harder for small businesses to hire new people. While
the minimum wage had increased almost annually in the 1970s, there were
few increases during the 1980s and 1990s. As a result, the minimum wage
did not keep pace with the cost of living; from 1970 to late 1999, the
minimum wage rose 255 percent (from $1.45 per hour to $5.15 per hour),
while consumer prices rose 334 percent. Employers also turned
increasingly to "pay-for-performance" compensation, basing
workers' pay increases on how particular individuals or their units
performed rather than providing uniform increases for everyone. One
survey in 1999 showed that 51 percent of employers used a
pay-for-performance formula, usually to determine wage hikes on top of
minimal basic wage increases, for at least some of their workers.
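The minimum-wage comparison in the preceding paragraph rests on simple arithmetic, shown below using the figures given there:

    # Verifying the comparison above: the minimum wage's percentage
    # increase versus the rise in consumer prices, 1970 to late 1999.
    def pct_increase(old, new):
        return 100 * (new - old) / old

    print(round(pct_increase(1.45, 5.15)))  # 255 -- percent rise in the minimum wage

    # Had the 1970 minimum of $1.45 kept pace with the 334 percent rise
    # in consumer prices, it would have reached roughly:
    print(round(1.45 * (1 + 334 / 100), 2))  # 6.29 -- versus the actual $5.15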
As the skilled-worker shortage continued
to mount, employers devoted more time and money to training employees.
They also pushed for improvements in education programs in schools to
prepare graduates better for modern high-technology workplaces. Regional
groups of employers formed to address training needs, working with
community and technical colleges to offer courses. The federal
government, meanwhile, enacted the Workforce Investment Act in 1998,
which consolidated more than 100 training programs involving federal,
state, and business entities. It attempted to link training programs to
actual employer needs and give employers more say over how the programs
are run.
Meanwhile, employers also sought to
respond to workers' desires to reduce conflicts between the demands of
their jobs and their personal lives. "Flex-time," which gives
employees greater control over the exact hours they work, became more
prevalent. Advances in communications technology enabled a growing
number of workers to "telecommute" -- that is, to work at home
at least part of the time, using computers connected to their
workplaces. In response to demands from working mothers and others
interested in working less than full time, employers introduced such
innovations as job-sharing. The government joined the trend, enacting
the Family and Medical Leave Act in 1993, which requires employers to
grant employees leaves of absence to attend to family emergencies.
The Decline of Union Power
The changing conditions of the 1980s and 1990s undermined the
position of organized labor, which now represented a shrinking share of
the work force. While more than one-third of employed people belonged to
unions in 1945, union membership fell to 24.1 percent of the U.S. work
force in 1979 and to 13.9 percent in 1998. Dues increases, continuing
union contributions to political campaigns, and union members' diligent
voter-turnout efforts kept unions' political power from ebbing as much
as their membership. But court decisions and National Labor Relations
Board rulings allowing workers to withhold the portion of their union
dues used to back or oppose political candidates undercut unions'
influence.
Management, feeling the heat of foreign
and domestic competition, is today less willing to accede to union
demands for higher wages and benefits than in earlier decades. It also
is much more aggressive about fighting unions' attempts to organize
workers. Strikes were infrequent in the 1980s and 1990s, as employers
became more willing to hire strikebreakers when unions walked out and to
keep them on the job after the strike was over. (They were emboldened in
that stance when President Ronald Reagan in 1981 fired illegally
striking air traffic controllers employed by the Federal Aviation
Administration.)
Automation is a continuing challenge for
union members. Many older factories have introduced labor-saving
automated machinery to perform tasks previously handled by workers.
Unions have sought, with limited success, a variety of measures to
protect jobs and incomes: free retraining, shorter workweeks to share
the available work among employees, and guaranteed annual incomes.
The shift to service industry employment,
where unions traditionally have been weaker, also has been a serious
problem for labor unions. Women, young people, temporary and part-time
workers -- all less receptive to union membership -- hold a large
proportion of the new jobs created in recent years. And much American
industry has migrated to the southern and western parts of the United
States, regions that have a weaker union tradition than do the northern
or the eastern regions.
As if these difficulties were not enough,
years of negative publicity about corruption in the big Teamsters Union
and other unions have hurt the labor movement. Even unions' past
successes in boosting wages and benefits and improving the work
environment have worked against further gains by making newer, younger
workers conclude they no longer need unions to press their causes. Unions'
arguments that they give workers a voice in almost all aspects of their
jobs, including work-site safety and work grievances, often go unheeded.
The kind of independent-minded young workers who sparked the dramatic
rise of high-technology computer firms have little interest in belonging
to organizations that they believe quash independence.
Perhaps the biggest reason unions faced
trouble in recruiting new members in the late 1990s, however, was the
surprising strength of the economy. By October and November 1999, the
unemployment rate had fallen to 4.1 percent. Economists said only people
who were between jobs or chronically unemployed were out of work. For
all the uncertainties economic changes had produced, the abundance of
jobs restored confidence that America was still a land of opportunity.
United States Economy
Source: U.S. Department of State