Think of it as a parable for these grim economic times. On April 19th, McDonald’s launched its first-ever national hiring day, signing up 62,000 new workers at stores throughout the country. For some context, that’s more jobs created by one company in a single day than the net job creation of the entire U.S. economy in 2009. And if that boggles the mind, consider how many workers applied to local McDonald’s franchises that day and left empty-handed: 938,000 of them. With a 6.2% acceptance rate in its spring hiring blitz, McDonald’s was more selective than the Princeton, Stanford, or Yale University admission offices.
It shouldn’t be surprising that a million souls flocked to McDonald’s hoping for a steady paycheck, when nearly 14 million Americans are out of work and nearly a million more are too discouraged even to look for a job. At this point, it apparently made no difference to them that the fast-food industry pays some of the lowest wages around: on average, $8.89 an hour, or barely half the $15.95 hourly average across all American industries.
On an annual basis, the average fast-food worker takes home $20,800, less than half the national average of $43,400. McDonald’s appears to pay even worse, at least with its newest hires. In the press release for its national hiring day, the multi-billion-dollar company said it would spend $518 million on the newest round of hires, or $8,354 a head. Hence the Oxford English Dictionary’s definition of “McJob” as “a low-paying job that requires little skill and provides little opportunity for advancement.”
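The arithmetic behind those figures follows directly from the numbers cited above; as a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the McDonald's hiring-day figures cited above.
hired = 62_000            # workers signed up on national hiring day
turned_away = 938_000     # applicants who left empty-handed
applicants = hired + turned_away          # one million total applicants

acceptance_rate = hired / applicants      # 0.062, i.e. 6.2%

spend = 518_000_000                       # announced spend on the new hires
per_head = spend / hired                  # roughly $8,354 per new worker

print(f"{acceptance_rate:.1%}")   # 6.2%
print(f"${per_head:,.2f}")        # $8,354.84
```

The 6.2% acceptance rate quoted above is simply 62,000 hires out of the one million people (62,000 plus 938,000) who applied.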
Of course, if you read only the headlines, you might think that the jobs picture was improving. The economy added 1.3 million private-sector jobs between February 2010 and January 2011, and the headline unemployment rate edged downward, from 9.8% to 8.8%, between November of last year and March. It inched upward in April, to 9%, but tempering that increase was the news that the economy added 244,000 jobs last month (not including those 62,000 McJobs), beating economists’ expectations.
Under this somewhat sunnier news, however, runs a far darker undercurrent. Yes, jobs are being created, but what kinds of jobs paying what kinds of wages? Can those jobs sustain a modest lifestyle and pay the bills? Or are we living through a McJobs recovery?
The Rise of the McWorker
The evidence points to the latter. According to a recent analysis by the National Employment Law Project (NELP), the biggest growth in private-sector job creation in the past year occurred in positions in the low-wage retail, administrative, and food service sectors of the economy. While 23% of the jobs lost in the Great Recession that followed the economic meltdown of 2008 were “low-wage” (those paying $9-$13 an hour), 49% of new jobs added in the sluggish “recovery” are in those same low-wage industries. On the other end of the spectrum, 40% of the jobs lost paid high wages ($19-$31 an hour), while a mere 14% of new jobs pay similarly high wages.
As a point of comparison, that’s much worse than in the recession of 2001 after the high-tech bubble burst. Then, higher-wage jobs made up almost a third of all new jobs in the first year after the crisis.
The hardest hit industries in terms of employment now are finance, manufacturing, and especially construction, which was decimated when the housing bubble burst in 2007 and has yet to recover. Meanwhile, NELP found that hiring for temporary administrative and waste-management jobs, health-care jobs, and of course those fast-food restaurants has surged.
Indeed in 2010, one in four jobs added by private employers was a temporary job, which usually provides workers with few benefits and even less job security. It’s not surprising that employers would first rely on temporary hires as they regained their footing after a colossal financial crisis. But this time around, companies have taken on temp workers in far greater numbers than after previous downturns. Where 26% of hires in 2010 were temporary, the figure was 11% after the early-1990s recession and only 7% after the downturn of 2001.
As many labor economists have begun to point out, the U.S. economy has grown increasingly polarized over the past three decades. More and more, we’re seeing labor growth largely at opposite ends of the skills-and-wages spectrum — among, that is, the best and the worst kinds of jobs.
At one end of job growth, you have increasing numbers of people flipping burgers, answering telephones, engaged in child care, mopping hallways, and in other low-wage lines of work. At the other end, you have increasing numbers of engineers, doctors, lawyers, and people in high-wage “creative” careers. What’s disappearing is the middle, the decent-paying jobs that helped expand the American middle class in the mid-twentieth century and that, if the present lopsided recovery is any indication, are now going the way of typewriters and landline telephones.
Because the shape of the workforce increasingly looks fat on both ends and thin in the middle, economists have begun to speak of “the barbell effect,” a nightmare for anyone clinging to a middle-class existence in bad times. For one thing, the new shape of the workforce hinders America’s once-vaunted upward mobility. It’s the downhill slope that’s largely available these days.
The barbell effect has also created staggering levels of income inequality of a sort not known since the decades before the Great Depression. From 1979 to 2007, for the middle class, average household income (after taxes) nudged upward from $44,100 to $55,300; by contrast, for the top 1%, average household income soared from $346,600 in 1979 to nearly $1.3 million in 2007. That is, super-rich families saw their earnings increase 11 times faster than middle-class families.
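That “11 times faster” figure can be reproduced from the household-income numbers cited above (treating “nearly $1.3 million” as $1.3 million):

```python
# Rough check of the 1979-2007 after-tax income-growth comparison cited above.
middle_1979, middle_2007 = 44_100, 55_300
top_1979, top_2007 = 346_600, 1_300_000   # "nearly $1.3 million"

middle_growth = middle_2007 / middle_1979 - 1   # about a 25% gain
top_growth = top_2007 / top_1979 - 1            # about a 275% gain

ratio = top_growth / middle_growth              # roughly 11x faster growth
print(f"{ratio:.1f}x")
```

The top 1% gained roughly 275% against the middle class’s roughly 25%, which works out to growth just under eleven times as fast.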
What’s causing this polarization? An obvious culprit is technology. As MIT economist David Autor notes, the tasks of “organizing, storing, retrieving, and manipulating information” that humans once performed are now computerized. And when computers can’t handle more basic clerical work, employers ship those jobs overseas where labor is cheaper and benefits nonexistent.
Another factor is education. In today’s barbell economy, degrees and diplomas have never mattered more, which means that those with only a high school education increasingly find themselves locked into the low-wage end of the labor market with little hope of anything better. Worse yet, the pay gap between the well-educated and the less-educated continues to widen: in 1979, the hourly wage of a typical college graduate was 1.5 times that of a typical high-school graduate; by 2009, it was nearly double.
Considering, then, that the percentage of men ages 25 to 34 who have gone to college is actually decreasing, it’s not surprising that wage inequality has gotten worse in the U.S. As Autor writes, advanced economies like ours “depend on their best-educated workers to develop and commercialize the innovative ideas that drive economic growth.”
The distorting effects of the barbell economy aren’t lost on ordinary Americans. In a recent Gallup poll, a majority of people agreed that the country was still in either a depression (29%) or a recession (26%). When sorted out by income, however, those making $75,000 or more a year are, not surprisingly, most likely to believe the economy is in neither a recession nor a depression, but growing. After all, they’re the ones most likely to have benefited from a soaring stock market and the return to profitability of both corporate America and Wall Street. In Gallup’s middle-income group, by contrast, 55% of respondents claim the economy is in trouble. They’re still waiting for their recovery to arrive.
The Slow Fade of Big Labor
The big-picture economic changes described by Autor and others, however, don’t tell the entire story. There’s a significant political component to the hollowing out of the American labor force and the impoverishment of the middle class: the slow fade of organized labor. Since the 1950s, the clout of unions in the public and private sectors has waned, their membership has dwindled, and their political influence has weakened considerably. Long gone are the days when powerful union bosses — the AFL-CIO’s George Meany or the UAW’s Walter Reuther — had the ear of just about any president.
As Mother Jones’ Kevin Drum has written, in the 1960s and 1970s a rift developed between big labor and the Democratic Party. Unions recoiled in disgust at what they perceived to be the “motley collection of shaggy kids, newly assertive women, and goo-goo academics” who had begun to supplant organized labor in the Party. In 1972, the influential AFL-CIO symbolically distanced itself from the Democrats by refusing to endorse their nominee for president, George McGovern.
All the while, big business was mobilizing, banding together to form massive advocacy groups such as the Business Roundtable and shaping the staid U.S. Chamber of Commerce into a ferocious lobbying machine. In the 1980s and 1990s, the Democratic Party drifted rightward and toward an increasingly powerful and financially focused business community, creating the Democratic Leadership Council, an olive branch of sorts to corporate America. “It’s not that the working class [had] abandoned Democrats,” Drum wrote. “It’s just the opposite: The Democratic Party [had] largely abandoned the working class.”
The GOP, of course, has a long history of battling organized labor, and nowhere has that been clearer than in the party’s recent assault on workers’ rights. Swept in by a tide of Republican support in 2010, new GOP majorities in state legislatures from Wisconsin to Tennessee to New Hampshire have introduced bills meant to roll back decades’ worth of collective bargaining rights for public-sector unions, the last bastion of organized labor still standing (somewhat) strong.
The political calculus behind the war on public-sector unions is obvious: kneecap them and you knock out a major pillar of support for the Democratic Party. In the 2010 midterm elections, the American Federation of State, County, and Municipal Employees (AFSCME) spent nearly $90 million on TV ads, phone banking, mailings, and other support for Democratic candidates. The anti-union legislation being pushed by Republicans would inflict serious damage on AFSCME and other public-sector unions by making it harder for them to retain members and weakening their clout at the bargaining table.
And as shown by the latest state to join the anti-union fray, it’s not just Republicans chipping away at workers’ rights anymore. In Massachusetts, a staunchly liberal state, the Democratic-led House of Representatives recently voted to curb collective bargaining rights over health-care benefits for teachers, firefighters, and a host of other public-sector employees.
Bargaining-table clout is crucial for unions, since it directly affects the wages their members take home every month. According to data from the Bureau of Labor Statistics, union workers pocket on average $200 more per week than their non-union counterparts, a 28% difference. The benefits of union representation are even greater for women and people of color: women in unions make 34% more than their non-unionized counterparts, and Latino workers nearly 51% more.
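The $200-a-week gap and the 28% premium cited above are mutually consistent; as an illustration (the implied weekly wages below are derived from those two figures, not quoted from BLS data in the text):

```python
# Illustrative back-calculation: what weekly wages would produce a $200 gap
# that amounts to a 28% union premium? (Derived figures, not BLS quotes.)
gap = 200         # union workers earn $200 more per week
premium = 0.28    # that gap is a 28% difference

nonunion_weekly = gap / premium          # ~$714 per week
union_weekly = nonunion_weekly + gap     # ~$914 per week

print(f"non-union: ${nonunion_weekly:,.0f}/wk, union: ${union_weekly:,.0f}/wk")
```

In other words, the two numbers together imply non-union weekly pay in the low $700s and union pay in the low $900s, which is in line with the BLS figures for that period.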
In other words, at precisely the moment when middle-class workers need strong bargaining rights so they can fight to preserve a living wage in a barbell economy, unions around the country face the grim prospect of losing those rights.
All of which raises the questions: Is there any way to revive the American middle class and reshape income distribution in our barbell nation? Or will this warped recovery of ours pave the way for an even more warped McEconomy, with the have-nots at one end, the have-it-alls at the other, and fewer and fewer of us in between?
Andy Kroll is a reporter in the D.C. bureau of Mother Jones magazine and an associate editor at TomDispatch, where this column originally appeared. The son of two teachers, he grew up in a firmly — and happily — middle-class household. His email is andykroll (at) motherjones (dot) com.