This article is from the May/June 2010 issue of Dollars & Sense: Real World Economics, available at http://www.dollarsandsense.org



W(h)ither the Dollar?

The U.S. trade deficit, the global economic crisis, and the dollar’s status as the world’s reserve currency.

By Katherine Sciacchitano

For more than half a century, the dollar was both a symbol and an instrument of U.S. economic and military power. At the height of the financial crisis in the fall of 2008, the dollar served as a safe haven for investors, and demand for U.S. Treasury bonds (“Treasuries”) spiked. More recently, the United States has faced a vacillating dollar, calls to replace the greenback as the global reserve currency, and an international consensus that it should save more and spend less.

At first glance, circumstances seem to give reason for concern. The U.S. budget deficit is over 10% of GDP. China has begun a long-anticipated move away from Treasuries, threatening to make U.S. government borrowing more expensive. And the adoption of austerity measures in Greece—with a budget deficit barely three percentage points higher than that of the United States—hovers as a reminder that the bond market can enforce wage cuts and pension freezes on developed as well as developing countries.

These pressures on the dollar and for fiscal cut-backs and austerity come at an awkward time given the level of public outlays required to deal with the crisis and the need to attract international capital to pay for them. But the pressures also highlight the central role of the dollar in the crisis. Understanding that role is critical to grasping the link between the financial recklessness we’ve been told is to blame for the crisis and the deeper causes of the crisis in the real economy: that link is the outsize U.S. trade deficit.

Trade deficits are a form of debt. For mainstream economists, the cure for the U.S. deficit is thus increased “savings”: spend less and the bottom line will improve. But the U.S. trade deficit didn’t balloon because U.S. households or the government went on a spending spree. It ballooned because, from the 1980s on, successive U.S. administrations pursued a high-dollar policy that sacrificed U.S. manufacturing for finance, and that combined low-wage, export-led growth in the Global South with low-wage, debt-driven consumption at home. From the late nineties, U.S. dollars that went out to pay for imports increasingly came back not as demand for U.S. goods, but as demand for investments that fueled U.S. housing and stock market bubbles. Understanding the history of how the dollar helped create these imbalances, and how these imbalances in turn led to the housing bubble and sub-prime crash, sheds important light on how labor and the left should respond to pressures for austerity and “saving” as the solution to the crisis.
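Behind the claim that trade deficits are a form of debt sits a standard national-income accounting identity; the article does not spell it out, but it is what mainstream economists have in mind when they prescribe more “saving”:

(S − I) + (T − G) = X − M

Here S is private saving, I is domestic investment, T − G is the government budget balance, and X − M is the trade balance. When a country runs a trade deficit (X − M is negative), private and public saving together fall short of domestic investment, and the gap has to be financed by borrowing from abroad or selling assets to foreigners; hence the mainstream prescription to spend less and save more.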

Gold, Deficits, and Austerity

A good place to start is with the charge that the Federal Reserve triggered the housing bubble by lowering interest rates after the dot-com bubble burst and plunged the country into recession in 2001.

In 2001, manufacturing was too weak to lead a recovery, and the Bush administration was ideologically opposed to fiscal stimulus other than tax cuts for the wealthy. So the real question isn’t why the Fed lowered rates; it’s why it was able to. In 2000, the U.S. trade deficit stood at 3.7% of GDP. Any other country with a deficit of this size would have had to tighten its belt and jump-start exports, not embark on stimulating domestic demand that could deepen the deficit even more.

The Fed’s ability to lower interest rates despite the U.S. trade deficit stemmed from the dollar’s role as the world’s currency, which was established during the Bretton Woods negotiations for a new international monetary system at the end of World War II. A key purpose of an international monetary system—Bretton Woods or any other—is to keep international trade and debt in balance. Trade has to be mutual. One country can’t do all the selling while another does all the buying; both must be able to buy and sell. If one or more countries develop trade deficits that persist, they won’t be able to continue to import without borrowing and going into debt. At the same time, some other country or countries will have corresponding trade surpluses. The result is a global trade imbalance. To get back “in balance,” the deficit country has to import less, export more, or both. The surplus country has to do the reverse.

In practice, economic pressure is stronger on deficit countries to adjust their trade balances by importing less, since it’s deficit countries that could run out of money to pay for imports. Importing less can be accomplished with import quotas (which block imports over a set level) or tariffs (which decrease demand for imports by imposing a tax on them). It can also be accomplished with “austerity”—squeezing demand by lowering wages.

Under the gold standard, this squeezing took place automatically. Gold was shipped out of a country to pay for a trade deficit. Since money had to be backed by gold, having less gold meant less money in domestic circulation. So prices and wages fell. Falling wages in turn lowered demand for imports and boosted exports. The deficit was corrected, but at the cost of recession, austerity, and hardship for workers. In other words, the gold standard was deflationary.
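The deflationary logic can be sketched with the classical quantity-of-money relation, a schematic illustration the article does not itself invoke:

M × V = P × Y

where M is the money supply, V the velocity of circulation, P the price level, and Y real output. With V and Y roughly fixed in the short run, a gold outflow shrinks M, so P, and with it wages, must fall until cheaper exports and dearer imports close the trade gap.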

Bretton Woods

The gold standard lasted until the Great Depression, and in fact helped to cause it. Beyond the high levels of unemployment, one of the most vivid lessons from the global catastrophe that ensued was the collapse of world trade, as country after country tried to deal with falling exports by limiting imports. After World War II, the industrialized countries wanted an international monetary system that could correct trade imbalances without imposing austerity and risking another depression. This was particularly important given the post-war levels of global debt and deficits, which could have suppressed demand and blocked trade again. Countries pursued these aims at the 1944 negotiations in Bretton Woods, New Hampshire.

John Maynard Keynes headed the British delegation. Keynes was already famous for his advocacy of government spending to bolster demand and maintain employment during recessions and depressions. England also owed large war debts to the United States and had suffered from high unemployment for over two decades. Keynes therefore had a keen interest in creating a system that prevented the build-up of global debt and avoided placing the full pressure of correcting trade imbalances on debtor countries.

His proposed solution was an international clearing union—a system of accounts kept in a fictitious unit called the “bancor.” Accounts would be tallied each year to see which countries were in deficit and which were in surplus. Countries with trade deficits would have to work to import less and export more. In the meantime, they would have the unconditional right—for a period—to an “overdraft” of bancors, the size of the overdraft to be based on the size of previous surpluses. These overdrafts would both support continued imports of necessities and guarantee uninterrupted global trade. At the same time, countries running trade surpluses would be expected to get back in balance too by importing more, and would be fined if their surpluses persisted.

Keynes was also adamant that capital controls be part of the new system. Capital controls are restrictions on the movement of capital across borders. Keynes wanted countries to be able to resort to macroeconomic tools such as deficit spending, lowering interest rates, and expanding money supplies to bolster employment and wages when needed. He worried that without capital controls, capital flight—investors taking their money and running—could veto economic policies and force countries to raise interest rates, cut spending, and lower wages instead, putting downward pressure on global demand as the gold standard had.

Keynes’s system wouldn’t have solved the problems of capitalism—in his terms, the problem of insufficient demand, and in Marx’s terms the problems of overproduction and under-consumption. But by creating incentives for surplus countries to import more, it would have supported global demand and job growth and made the kind of trade imbalances that exist today—including the U.S. trade deficit—much less likely. It would also have taken the pressure off deficit countries to adopt austerity measures. And it would have prevented surplus countries from using the power of debt to dictate economic policy to deficit countries.

The United States, however, emerged from World War II as the largest surplus country in the world, and it intended to remain so for the foreseeable future. The New Deal had lowered unemployment during the Depression, but political opposition to deficit spending had prevented full recovery until arms production for the war restored manufacturing. Many feared that without continued large U.S. trade surpluses and expanded export markets, unemployment would return to Depression-era levels.

The United States therefore blocked Keynes’s proposal. Capital controls were permitted for the time being, largely because of the danger that capital would flee war-torn Europe. But penalties for surplus countries were abandoned; pressures remained primarily on deficit countries to correct. Instead of an international clearing union with automatic rights to overdrafts, the International Monetary Fund (IMF) was established to make short-term loans to deficit countries. And instead of the neutral bancor, the dollar—backed by the U.S. pledge to redeem dollars with gold at $35 an ounce—would be the world currency.

Limits of the System

The system worked for just over twenty-five years, not because trade was balanced, but because the United States was able and willing to recycle its huge trade surpluses. U.S. military spending stayed high because of the U.S. cold-war role as “global cop.” And massive aid was given to Europe to rebuild. Dollars went out as foreign aid and military spending (both closely coordinated). They came back as demand for U.S. goods.

At the same time, memory of the Depression created a kind of Keynesian consensus in the advanced industrial democracies to use fiscal and monetary policy to maintain full employment. Labor movements, strengthened by both the war and the post-war boom, pushed wage settlements and welfare spending higher. Global demand was high.

Two problems doomed the system. First, the IMF retained the power to impose conditions on debtor countries, and the United States retained the power to control the IMF.

Second, the United States stood outside the rules of the game: the larger the world economy grew, the more dollars would be needed in circulation, and U.S. trade deficits would eventually have to provide them. Other countries would have to correct their trade deficits by tightening their belts to import less, by devaluing their currencies to push down prices and export more, or by drawing on savings from past trade surpluses denominated in dollars (known as “reserves”) to pay for their excess of imports over exports. But precisely because countries needed dollar reserves to pay for international transactions and to provide cushions against periods of deficits, they would hold the U.S. dollars they earned by investing them in U.S. assets. This meant that U.S. dollars that went out for imports would come back and be reinvested in the United States. Once there, these dollars could be used to finance continued spending on imports—and a larger U.S. trade deficit. At that point, sustaining world trade would depend not on recycling U.S. surpluses, but on recycling U.S. deficits. The ultimate result would be large, destabilizing global capital flows.

The Crisis of the ’Seventies

The turning point came in the early ’seventies. Europe and Japan had rebuilt from the war and were now export powers in their own right. The U.S. trade surplus was turning into a deficit. And the global rate of profit in manufacturing was falling. The United States had also embarked on its “War on Poverty” just as it increased spending on its real war in Vietnam, and this “guns and butter” strategy—an attempt to quell domestic opposition from the civil rights and anti-war movements while maintaining global military dominance—led to high inflation.

The result was global economic crisis: the purchasing power of the dollar fell, just as more and more dollars were flowing out of the United States and being held by foreigners.

What had kept the United States from overspending up to this point was its Bretton Woods commitment to exchange dollars for gold at the rate of $35 an ounce. Now countries and investors that didn’t want to stand by and watch as the purchasing power of their dollar holdings fell—as well as countries that objected to the Vietnam War—held the United States to its pledge.

There wasn’t enough gold in Ft. Knox. The United States would have to retrench its global military role, rein in domestic spending, or change the rules of the game. It changed the rules of the game. In August 1971, Nixon closed the gold window; the United States would no longer redeem dollars for gold. Countries and individuals would have to hold dollars, or dump them and find another currency that was more certain to hold its value. There was none.

The result was that the dollar remained the global reserve currency. But the world moved from a system where the United States could spend only if it could back its spending with gold, to a system where its spending was limited only by the quantity of dollars the rest of the world was willing to hold. The value of the dollar would fluctuate with the level of global demand for U.S. products and investment. The value of other currencies would fluctuate with the dollar.

Trading Manufacturing for Finance

The result of this newfound freedom to spend was a decade of global inflation and crises of the dollar. As inflation grew, each dollar purchased less. As each dollar purchased less, the global demand to hold dollars dropped—and with it the dollar’s exchange rate. As the exchange rate fell, imports became even more expensive, and inflation ratcheted up again. The cycle intensified when OPEC—which priced its oil in dollars—raised its prices to compensate for the falling dollar.

Owners of finance capital were unhappy because inflation was eroding the value of dollar assets. Owners of manufacturing capital were unhappy because the global rate of profit in manufacturing was dropping. And both U.S. politicians and elites were unhappy because the falling dollar was eroding U.S. military power by making it more expensive.

The response of the Reagan administration was to unleash neoliberalism on both the national and global levels—the so-called Reagan revolution. On the domestic front, inflation was quelled, and the labor movement was put in its place, with high interest rates and the worst recession since the Depression. Corporate profits were boosted directly through deregulation, privatization, and tax cuts, and indirectly by attacks on unions, unemployment insurance, and social spending.

When it was over, profits were up, inflation and wages were down, and the dollar had changed direction. High interest rates attracted a stream of investment capital into the United States, pushing up demand for the currency, and with it the exchange rate. The inflows paid for the growing trade and budget deficits—Reagan had cut domestic spending, but increased military spending. And they provided abundant capital for finance and overseas investment. But the high dollar also made U.S. exports more expensive for the rest of the world. The United States had effectively traded manufacturing for finance and debt.

Simultaneously, debt was used as a hammer to impose neoliberalism on the Third World. As the price of oil rose in the seventies, OPEC countries deposited their growing trade surpluses—so-called petro-dollars—in U.S. banks, which in turn loaned them to poor countries to pay for the soaring price of oil. The loans were initially made at very low interest rates, but payments skyrocketed when the United States jacked up its rates to deal with inflation. Third World countries began defaulting, starting with Mexico in 1982. In response, and in exchange for more loans, the U.S.-controlled IMF imposed austerity programs, also known as “structural adjustment programs.”

The programs were similar to the policies in the United States, but much more severe, and they operated in reverse. Instead of pushing up exchange rates to attract finance capital as the United States had done, Third World countries were told to devalue their currencies to attract foreign direct investment and export their way out of debt. Capital controls were dismantled to enable transnational corporations to enter and exit at will. Governments were forced to slash spending on social programs and infrastructure to push down wages and demand for imports. Services were privatized to create opportunities for private capital, and finance was deregulated.

Policies dovetailed perfectly. As the high dollar hollowed out U.S. manufacturing, countries in the Global South were turned into low-wage export platforms. As U.S. wages stagnated or fell, imports became cheaper, masking the pain. Meanwhile, the high dollar lowered the cost of overseas production. Interest payments on Third World debt—which continued to grow—swelled the already large capital flows into the United States and provided even more funds for overseas investment.

The view from the heights of finance looked promising. But Latin America was entering what became known as “the lost decade.” And the United States was shifting from exporting goods to exporting demand, and from recycling its trade surplus to recycling its deficit. The world was becoming dependent on the United States as the “consumer of last resort.” The United States was becoming dependent on finance and debt.

Consolidating Neoliberalism

The growth of finance in the eighties magnified its political clout in the nineties. With the bond market threatening to charge higher rates for government debt, Clinton abandoned campaign pledges to invest in U.S. infrastructure, education, and industry. Instead, he balanced the budget; he adopted his own high-dollar policy, based on the theory that global competition would keep imports cheap, inflation low, and living standards high—regardless of sluggish wage growth; and he continued deregulation of the finance industry—repealing Glass-Steagall and refusing to regulate derivatives. By the end of Clinton’s second term, the U.S. trade deficit had hit a record 3.7% of GDP; household debt had soared to nearly 69% of GDP; and financial profits had risen to 30% of total corporate profits, almost twice as high as they had been at any time up to the mid-1980s.

Internationally, Clinton consolidated IMF-style structural adjustment policies under the rubric of “the Washington Consensus,” initiated a new era of trade agreements modeled on the North American Free Trade Agreement, and led the charge to consolidate the elimination of capital controls.

The elimination of capital controls deepened global economic instability in several ways.

First, eliminating restrictions on capital mobility made it easier for capital to go in search of the lowest wages. This expanded the globalization of production, intensifying downward pressure on wages and global demand.

Second, removing capital controls increased the political power of capital by enabling it to “vote with its feet.” This accelerated the deregulation of global finance and—as Keynes predicted—limited countries’ abilities to run full-employment policies. Regulation of business was punished, as was deficit spending, regardless of its purpose. Low inflation and deregulation of labor markets—weakening unions and making wages more “flexible”—were rewarded.

Finally, capital mobility fed asset bubbles and increased financial speculation and exchange rate volatility. As speculative capital rushed into countries, exchange rates rose; as it fled, they fell. Speculators began betting more and more on currencies themselves, further magnifying rate swings. Rising exchange rates made exports uncompetitive, hurting employment and wages. Falling exchange rates increased the competitiveness of exports, but made imports and foreign borrowing more expensive, except for the United States, which borrows in its own currency. Countries could try to prevent capital flight by raising interest rates, but only at the cost of dampened growth and lost jobs. Without capital controls, there was little countries could do to prevent excessive inflows and bubbles.

Prelude to a Crash

This increased capital mobility, deregulation, and speculation weakened the real economy, further depressed global demand, and greatly magnified economic instability. From the eighties onward, international financial crises broke out approximately every five years, in countries ranging from Mexico to the former Soviet Union.

By far the largest crisis prior to the sub-prime meltdown took place in East Asia. Speculative capital began flowing into the region in the mid-nineties; in 1997, the bubble burst. By the summer of 1998, stock markets around the world were crashing from the ripple effects. The IMF stepped in with $40 billion in loans, bailing out investors but imposing harsh conditions on workers and governments. Millions were left unemployed as Asia plunged into depression.

When the dust settled, Asian countries said “never again.” Their solution was to build up large dollar reserves—savings cushions—so they would never have to turn to the IMF for another loan. To build up reserves, countries had to run large trade surpluses. This meant selling even more to the United States, the only market in the world able and willing to run ever-larger trade deficits to absorb their exports.

In addition to further weakening U.S. manufacturing, the Asia crisis set the stage for the sub-prime crisis in several ways.

First, as capital initially fled Asia, it sought out the United States as a “safe haven,” igniting the U.S. stock market and nascent housing bubbles.

Second, the longer-term recycling of burgeoning Asian surpluses ensured an abundant and ongoing source of capital to finance not only the mounting trade deficit, but also the billowing U.S. consumer debt more generally.

Third, preventing their exchange rates from rising with their trade surpluses and making their exports uncompetitive required Asian central banks to print money, swelling global capital flows even more.

Between 1998 and 2007, when the U.S. housing bubble burst, many policy makers and mainstream economists came to believe this inflow of dollars and debt would never stop. It simply seemed too mutually beneficial to end. By financing the U.S. trade deficit, Asian countries guaranteed U.S. consumers would continue to purchase their goods. The United States in turn got cheap imports, cheap money for consumer finance, and inflated stock and real estate markets that appeared to be self-financing and to compensate for stagnating wages. At the same time, foreign holders of dollars bought increasing quantities of U.S. Treasuries, saving the U.S. government from having to raise interest rates to attract purchasers, and giving the United States cheap financing for its budget deficit as well.

It was this ability to keep interest rates low—in particular, the Fed’s ability to lower rates after the stock market bubble collapsed in 2000—that set off the last and most destructive stage of the housing bubble. Lower interest rates simultaneously increased the demand for housing (since lower interest rates made mortgages cheaper) and decreased the returns to foreign holders of U.S. Treasuries. These lower returns forced investors to look for other “safe” investments with higher yields. Investors believed they found what they needed in U.S. mortgage securities.

As Wall Street realized what a lucrative international market it had, the big banks purposefully set out to increase the number of mortgages that could be repackaged and sold to investors by lowering lending standards. They also entered into complicated systems of private bets, known as credit default swaps, to insure against the risk of defaults. These credit default swaps created a chain of debt that exponentially magnified risk. When the bubble finally burst, only massive stimulus spending and infusions of capital by the industrialized countries into their banking systems kept the world from falling into another depression.

Deficit Politics

The political establishment—right and center—is now licking its chops, attacking fiscal deficits as if ending them were a solution to the crisis. The underlying theory harks back to the deflationary operation of the gold standard and the conditions imposed by the IMF: Government spending causes trade deficits and inflation by increasing demand. Cutting spending will cut deficits by diminishing demand.

Like Clinton before him, Obama is now caving in to the bond market, fearful that international lenders will raise interest rates on U.S. borrowing. He has created a bi-partisan debt commission to focus on long-term fiscal balance—read: cutting Social Security and Medicare—and revived “PAYGO,” which requires either cuts or increases in revenue to pay for all new outlays, even as unemployment hovers just under 10%.

By acquiescing, the U.S. public is implicitly blaming itself for the crisis and offering to pay for it twice: first with the millions of jobs lost to the recession, and again by weakening the safety net. But the recent growth of the U.S. budget deficit principally reflects the cost of cleaning up the crisis and of the wars in Iraq and Afghanistan. Assumptions of future deficits are rooted in projected health-care costs in the absence of meaningful reform. And the U.S. trade deficit is driven mainly by the continued high dollar.

The economic crisis won’t be resolved by increasing personal savings or enforcing fiscal discipline, because its origins aren’t greedy consumers or profligate governments. The real origins of the crisis lie in the neoliberal response to the crisis of the 1970s—the shift from manufacturing to finance in the United States, and the transformation of the Global South into a low-wage export platform for transnational capital to bolster sagging profit rates. The U.S. trade and budget deficits may symbolize this transformation. But the systemic problem is a global economic model that separates consumption from production and that has balanced world demand—not just the U.S. economy—on debt and speculation.

Forging an alternative will be the work of generations. As for the present, premature tightening of fiscal policy as countries try to “exit” from the crisis will simply drain global demand and endanger recovery. Demonizing government spending will erode the social wage and undermine democratic debate about the public investment needed for a transition to an environmentally sustainable global economy.

In the United States, where labor market and financial deregulation have garnered the most attention in popular critiques of neoliberalism, painting a bulls-eye on government spending also obscures the role of the dollar and U.S. policy in the crisis. For several decades after World War II, U.S. workers benefited materially as the special status of the dollar helped expand export markets for U.S. goods. But as other labor movements throughout the world know from bitter experience, it’s the dollar as the world’s currency, together with U.S. control of the IMF, that ultimately provided leverage for the United States to create the low-wage export model of growth and financial deregulation that has so unbalanced the global economy and hurt “first” and “third” world workers alike.

Looking Ahead

At the end of World War II, John Maynard Keynes proposed an international monetary system with the bancor at its core; the system would have helped balance trade and avoid the debt and deflation inherent in the gold standard that preceded the Great Depression. Instead, Bretton Woods was negotiated, with the dollar as the world’s currency. What’s left of that system has now come full circle and created the very problems it was intended to avoid: large trade imbalances and deflationary economic conditions.

For the past two and a half decades, the dollar enabled the United States to run increasing trade deficits while systematically draining capital from some of the poorest countries in the world. This money could have been used for development in the Global South, to replace aging infrastructure in the United States, or to prepare for and prevent climate change. Instead, it paid for U.S. military interventions, outsourcing, tax cuts for the wealthy, and massive stock market and housing bubbles.

This mismanagement of the dollar hasn’t served the long-term interests of workers in the United States any more than it has those in the developing world. In domestic terms, it has been particularly damaging to U.S. manufacturing over the last three decades, and state budgets and workers are now being hit hard by the crisis. Yet even manufacturing workers in the United States cling to the high dollar as if it were a life raft. Many public sector workers advocate cutting back on government spending. And most people in the United States would blame bankers’ compensation packages for the sub-prime mess before pointing to the dismantling of capital controls.

After suffering through the worst unemployment since the Depression and paying for the bailout of finance, U.S. unions and the left are right to be angry. On the global scale, there is increased space for activism. Since the summer of 2007, at least 17 countries have imposed or tightened capital controls. Greek workers have been in the streets protesting pension cuts and pay freezes for months now. And a global campaign has been launched for a financial transactions tax that would slow down speculation and provide needed revenue for governments. Together, global labor and the left are actively rethinking and advocating reform of the global financial system, the neoliberal trade agreements, and the role and governance of the International Monetary Fund. And there is increasing discussion of a replacement for the dollar that won’t breed deficits, suck capital out of the developing world, impose austerity on deficit countries—or blow bubbles.

All these reforms are critical. All will require more grassroots education. None will come without a struggle.

Katherine Sciacchitano is a former labor lawyer and organizer. She teaches political economy at the National Labor College.

Sources: C. Fred Bergsten (Peterson Institute for International Economics), “The Dollar and the Deficits: How Washington Can Prevent the Next Crisis,” Foreign Affairs, Vol. 88, No. 6, November 2009; Dean Baker, “The Budget, the Deficit, and the Dollar,” Center for Economic and Policy Research (www.cepr.net); Martin Wolf, “Give us fiscal austerity, but not quite yet,” Financial Times blogs, November 24, 2009; Tom Palley, “Domestic Demand-led Growth: A New Paradigm for Development,” paper presented at the Alternatives to Neoliberalism Conference sponsored by the New Rules for Global Finance Coalition, May 21-24, 2002 (www.economicswebinstitute.org); Sarah Anderson, “Policy Handcuffs in the Financial Crisis: How U.S. Government and Trade Policy Limit Government Power to Control Capital Flows,” Institute for Policy Studies, February 2009; Susan George, “The World Trade Organisation We Could Have Had,” Le Monde Diplomatique, January 2007.
