The Minimum Wage Revisited

In 1938, the first federal minimum wage was established.  It was set at 25 cents per hour.  Not surprisingly, business groups and industrialists protested strenuously, not only at having to pay so exorbitant a rate, but at the federal government’s naked attempt (as they saw it) to “Stalinize” the American economy.

The next year, 1939, the minimum was raised to 30 cents.  By March of 1956, it had crept up to a landmark $1.00 per hour; in May of 1974, it reached $2.00; and by the time Ronald Reagan took office, in 1981, it had risen to $3.35.  Notably, it was under President Reagan that the gap between the minimum wage and the average worker’s wage began to widen.

Today, with the federal minimum at $6.55, the gap between the minimum wage and the wage of the average worker continues to widen.  In 1968, the minimum wage represented 53 percent of the average worker’s hourly wage; by 2006 it had dropped to 31 percent—this despite the fact that the average worker’s wage, in real dollars, had itself declined significantly.

On July 24, 2009, the federal minimum will be raised by seventy cents, to $7.25 per hour.  Let’s do the math.  At the new rate, if you work eight hours a day, five days a week, fifty-two weeks a year, and never take a vacation or miss even one day due to illness or family emergency, you will earn $7.25 x 40 x 52, or $15,080.

After state and federal taxes, Social Security, Medicare, and the rest have been deducted, it’s hard to say how much actual cash you would take home, but, obviously, since what you started with was so little, what you’re left with won’t be much.  Moreover, that $15,000 pre-deduction figure could be more wishful thinking than economic reality, as many of those full-time, 40-hour-a-week jobs have dried up.  Thirty-hour-a-week jobs are becoming more common.

Some people—libertarians, hope-to-die conservatives, free market fundamentalists—believe we shouldn’t have anything remotely resembling a federal minimum wage, that the supply-and-demand dynamics of the marketplace should be the sole arbiter.

On the other side, you have progressives saying that, if you’re going to institute such a thing as a minimum wage, the least you can do is make it realistic:  Make it the minimum income on which an average person can actually live.

Arguably, a minimum wage that doesn’t supply the necessary minimum—doesn’t allow one to make the barest living—is more a mathematical construct, a “gimmick,” than a living wage.  As the Unitarian Church aptly summarized it, “the current federal minimum [wage] is a poverty wage, not an anti-poverty wage.”

But if we’re talking about a living wage, what would that minimum be?  By definition, wouldn’t it have to be the amount required for a single person to live independently at what is, more or less, a bare subsistence level:  a tiny apartment, transportation to and from work, utilities, food, toiletries and clothing?

While one might be able to pay for “luxuries” such as DVD rentals, cable TV, the Internet or telephone, it’s unlikely a minimum wage earner could afford a car, car insurance or car maintenance.  Needless to say, health insurance is out of the question.  And, if you start adding dependents to the equation—if you’re a family with kids, or a single mom requiring child-care—you can forget about it.

Some would argue that the aforementioned scenario is too bleak and despairing.  They would argue that, to be able to purchase material goods, people don’t necessarily have to be able to afford those goods.  They don’t need a commensurate income.  All they require is credit.  Unfortunately, there’s a rebuttal to the argument that free and easy credit has no downside, and it can be expressed in two words:  Brutal Recession.

Which brings us to organized labor.  People need to be reminded that America’s most prosperous period, the post-war 1950s (and into the ‘60s), happened to be the same period when the greatest share of America’s workers—approximately 35 percent—belonged to labor unions.  Was that a coincidence?

When we say “most prosperous,” we’re not speaking of the wealthiest Americans (the top 2 to 3 percent), who are doing far better today than at any time in the country’s post-war history.  Rather, we’re talking about the middle class, the vast segment of the population that was thriving in the 1950s—people who could not only buy material goods, but could actually afford them.  Unfortunately, that same middle class began shrinking under the Reagan administration and, alas, has continued to shrink.

A proposal:  Instead of relying on an artificial device called the Federal Minimum Wage (intended to ensure that low-wage workers “maintain contact” with the economy), why not keep the Feds out of it entirely?  Why not look to labor and management to reach an equilibrium?

Instead of mandating government minimums which don’t—and never will—provide an actual living wage, we should allow what conservatives themselves call the “inherent wisdom” of the marketplace to prevail.  Allow management and labor to sit down at the table and trust the “wisdom” of the collective bargaining process to lead them to the Promised Land, to a wage/benefit package suitable to both.

It goes without saying that for this arrangement to be effective we’ll need more labor union members, because what used to be a robust 35 percent now stands at a puny 12.4 percent.  The benefits of union membership should be readily apparent.  All one has to do is look around and survey the condition of our “union-deficient” landscape.

There’s always been this nagging belief out there that labor unions are somehow bad for the economy.  That’s a myth.  It’s corporate-sponsored propaganda.  Unions might be a threat to management autocracy, and harmful to management greed, but they’re certainly not harmful to commerce.  Indeed, commerce loves them.

Economists and progressive business groups (yes, business groups) have acknowledged that higher wages help the economy by increasing the purchasing power of consumers.  After all, who’s going to buy the stuff available if the number of flush consumers keeps diminishing?  As George Meany famously said, “The greatest anti-poverty program ever invented was the labor union.”

It’s also been demonstrated that higher wages (union wages) result in greater productivity and lower employee turnover.  It’s an undeniable fact:  good pay and good benefits attract a higher caliber of worker than lousy wages and lousy benefits.

So, besides supplying businesses with more qualified, more stable employees, labor unions create more personal wealth across the board.  Those bumper stickers you still occasionally see aren’t lying:  “Live Better.  Work Union.”  It’s true.

DAVID MACARAY, a Los Angeles playwright (“Borneo Bob,” “Larva Boy”) and writer, is a former labor rep.  He can be reached at dmacaray@earthlink.net