Dating the Recession

Alarmed by the coronavirus-induced economic collapse, the NBER declares the economy in a recession in record time.

By John Miller

My wife Ellen and I got married in 2013 after living together for 15 years. The Justice of the Peace who married us told our twelve-year-old son Sam that we were already married, and all she was doing was helping us fill out the paperwork to make our marriage official.

On June 8 of this year, the National Bureau of Economic Research (NBER), the nation's official arbiter of the business cycle, finished its paperwork and made what we already knew official: The COVID-19 economic collapse is a recession, and a damn bad one. After reviewing data on the calamitous drop in employment and consumer spending and the deterioration of other economic variables, the NBER declared that the recession began in February 2020.

The depth and diffusion across the economy of the downturn convinced the NBER to announce the onset of the recession far more quickly than it usually does. The Business Cycle Dating Committee waited a full year into the recession to declare that the Great Recession had begun in December 2007. This time, the NBER declared the onset of the recession just four months after it had begun. The downturn was so pronounced that the dating committee didn't bother waiting for data to confirm that the economic contraction would meet the economists' shorthand definition of a recession: two consecutive quarters of negative real (corrected for inflation) GDP growth.

Identifying Business Cycles

To understand what economists call a “recession,” we need to look more closely at the method used by the NBER dating committee to date a business cycle, and its two phases–economic expansions and economic contractions (also called “recessions”).

The NBER tracks the waves of economic activity that economists call "business cycles." A business cycle runs its course from the trough of a recession to the peak of an expansion and back down into a trough. In the first phase of the cycle–the expansion–the economy grows as companies produce more goods and services and hire workers. When the economy begins contracting, its second phase, companies produce fewer goods and workers lose their jobs. The NBER has identified ten complete business cycles in the U.S. economy since World War II. The current task of the NBER was to decide when the expansion of the business cycle that began in June 2009 ended and entered its recession phase.

The NBER's Dating Committee, a group of eight economists, has no rigid rules for determining the start or finish of a business cycle. Instead, the committee looks for "a significant decline in economic activity that is spread across the economy and lasts more than a few months" to identify a recession. The committee considers a broad array of macroeconomic indicators but pays particular attention to two broad monthly measures: personal income less transfer payments, in real terms, and payroll employment from the Bureau of Labor Statistics' establishment survey. That is just what it did in dating the onset of the current recession.

In short, the committee eyeballs the data, guided by its malleable definition of an economic contraction, to identify a recession. Dating a recession using the economists' shorthand definition (two consecutive quarters of negative real GDP growth) would usually assign similar starting and ending points to a recession, but not always, particularly when a downturn is interrupted by a quarter of slow but positive economic growth. In addition, GDP data are available only after a considerable lag and are often subject to revision.
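The shorthand rule is mechanical enough to state in a few lines of code. The sketch below is purely illustrative, not the NBER's methodology; the function name and the quarterly growth figures are invented for the example. Note how a single quarter of positive growth in the middle of a downturn keeps the rule from flagging a recession until later.

```python
# A minimal sketch of the economists' shorthand rule: flag a recession
# when real GDP growth is negative for two consecutive quarters.
# The growth figures used below are illustrative, not actual data.

def shorthand_recession_quarters(growth_rates):
    """Return the indices of quarters at which the two-consecutive-quarters
    rule would flag a recession (i.e., the second negative quarter)."""
    flagged = []
    for i in range(1, len(growth_rates)):
        if growth_rates[i - 1] < 0 and growth_rates[i] < 0:
            flagged.append(i)
    return flagged

# Illustrative path: growth turns negative, pauses with a weakly positive
# quarter, then falls again. The rule fires only at the final quarter.
growth = [2.1, -0.3, 0.2, -1.5, -4.8]
print(shorthand_recession_quarters(growth))  # [4]
```

The interrupted downturn in the example (quarter 2 is weakly positive) is exactly the case the committee's judgment-based approach handles better than the mechanical rule.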

End of the Expansion

The NBER announcement also closed the books on the economic expansion that began in June 2009 and lasted 128 months, making it the longest expansion on record. The expansion, which spanned the Obama and Trump presidencies, may have been historically long, but it was also slow, and by historical standards it did little to improve the lot of most people. "Long but limp growth" was The Financial Times' far-from-flattering description of U.S. economic performance during the decade-long expansion. Its 2.3% economic growth rate was the slowest of any U.S. economic expansion since 1949. It failed even to match the 2.9% average posted by the sluggish expansion of the previous decade that led up to the Great Recession, and it was nowhere close to the 4.3% average growth of the ten previous expansions since 1949.

The employment record of the expansion was also a mixed bag. The expansion created fewer jobs per month than any economic expansion in the last five decades, with the exception of the jobless expansion from 2002 through 2007. But 113 straight months of positive job growth was enough to push the unemployment rate down to 3.5%, the lowest rate since 1969. Still, falling unemployment rates did little to improve workers' wages. Average hourly earnings of production and nonsupervisory workers, corrected for inflation, rose just 0.7% per year, slower than the 1.1% per year rate during the 120-month-long expansion of the 1990s and less than half the 1.7% per year rate during the 106-month-long expansion of the 1960s. Only the dismal wage growth during the expansion of the previous decade did worse.

All told, working people were tightening their economic belts even when the economy was expanding.  Now that the COVID-19 economy is contracting at an alarming rate, we are in real trouble.  But you probably didn’t need the NBER to tell you that.

John Miller is a professor of economics at Wheaton College, a member of the Dollars & Sense collective, and author of the “Up Against the Wall Street Journal” column in D&S. 

Sources: "NBER Determination of the February 2020 Peak in Economic Activity," National Bureau of Economic Research, June 8, 2020; Robin Wigglesworth and Keith Fray, "The record-breaking US economic recovery in charts," The Financial Times, July 4, 2019; Bureau of Labor Statistics, Total Private: Average Hourly Earnings of Production and Nonsupervisory Employees, 1982–84 Dollars, Seasonally Adjusted; Federal Reserve Bank of St. Louis, Federal Reserve Economic Data (FRED): Real Gross Domestic Product, Billions of Chained 2012 Dollars, Quarterly, Seasonally Adjusted; and All Employees: Total Nonfarm Payrolls, Thousands of Persons, Monthly, Seasonally Adjusted.


Murder of George Floyd and My Segregated Youth

By John Weeks

John Weeks is a London-based member of the Union for Radical Political Economics (URPE), one of the founders of the UK-based Economists for Rational Economic Policies, part of the European Research Network on Social and Economic Policy, and a frequent contributor to Dollars & Sense and the D&S blog.

Memories of White Supremacy

I was born in 1941 in Austin, Texas, a city that claims a benign "southwestern" character of cultural diversity. My public primary school was segregated, as was all my schooling until 1957, when the town authorities had no choice but to enforce the 1954 U.S. Supreme Court decision to include African-American Austinites. When I graduated in 1959, my secondary school had about 2,500 students, one of whom was black.

Austin's 1960 population of 190,000 included—excluded is perhaps the better word—28,000 black citizens and 21,000 citizens of Mexican descent, yet so effective was the formal and informal system of segregation that I rarely encountered either. The University of Texas, where I became an undergraduate in 1959, boasted over 20,000 students. Perhaps a dozen of those students were African American.

We whites occasionally called black citizens "negroes," though more frequently used the term "colored," which is why my skin crawls when I read the now-common term "people of color," with its implication now as then that whites have no color. Until the mid-1950s the few white-owned businesses that allowed black shoppers had segregated toilets with the signs "White" and "Colored." I can recall one of my university professors provoking campus controversy by repeatedly using the term "light-skinned" rather than white, implying that we the majority were also "colored." He hailed from Massachusetts, an ideologically suspect "Yankee."

Only now, 60 years later, do I partially, but not fully, appreciate the profound racism of my upbringing with its many layers. What appeared to me as a benignly tolerant Austin was in practice a continual reign of white supremacy with recurrent violence to maintain the subjugation of 15% of the population that I hardly ever saw. That thinly-disguised reign of white supremacy provides a common context for almost all white Americans, day-to-day benign normality overlaying a reign of terror that sustains our privileges.

That context is essential for whites to understand the murder of George Floyd. I frequently hear critics of U.S. racism say that killing blacks is not a crime in America. The truth is more unpleasant—killing black citizens serves the functional purpose of maintaining white rule and rolling back what gains African Americans have achieved.

The Civil War and Counter-Revolution

Americans of my generation learned a specific interpretation of the country's history that implicitly and occasionally explicitly extolled white supremacy, more explicit than implicit in the 11 ex-Confederate states. However, in the North and South alike, U.S. school children learned an accepted wisdom: 1) the failed attempt to remove Abraham Lincoln's vice-president and successor Andrew Johnson through impeachment represented a triumph of the constitutional separation of powers; and 2) Ulysses S. Grant, Union commander and first elected president after Lincoln, was a drunk, an incompetent general, and a corrupt president.

If anything, Johnson was the boozer, notorious for giving his inaugural address drunk. John Kennedy in his ghost-written Profiles in Courage lauds as a hero the senator whose vote allegedly prevented Johnson's removal from office, Edmund Ross of Kansas. Whether or not Ross's vote was pivotal, Johnson's acquittal represented a clear victory for white supremacy. Lincoln's successor opposed the 14th Amendment to the U.S. Constitution, which granted citizenship to ex-slaves, and facilitated the return to power of white supremacists and Confederate leaders in the secessionist states.

The denigration of Grant also had its roots in support for southern racism and white supremacy. Elected president in 1868, Grant supported the radical measures to transform the secessionist states into functioning democracies. He maintained the military occupation of those states, enforced political rights of the freed slaves, and aggressively crushed the Ku Klux Klan, the terrorist wing of white supremacy. During Grant’s administration multi-racial governments took power in the secessionist states.

Dismissed by generations of historians as a corrupt drunk, Grant in practice became a champion of southern democracy, though less so in his second term. For his support of democracy, whites throughout the South, including my mother (from Alabama), loathed Grant. Had subsequent presidents maintained even a mild version of “radical reconstruction” policies the United States would today be a very different country.

I have used the word “citizen” to refer to African Americans. While strictly correct, that term has limited meaning. The counter-revolution that ended the progressive democratization of the South has a precise date, 1877, with the compromise that brought Grant’s successor to the presidency, though he gained less than 48% of the popular vote, compared to the loser’s 51%. The Compromise of 1877, “the Great Betrayal,” granted the Republican candidate the presidency in exchange for the end of progressive reform in the South.

White Supremacy Re-established

In May of this year, U.S. President Donald Trump threatened to use the U.S. Army to quell the protests of the murder of George Floyd. The legality would come from invoking the Insurrection Act of 1807, passed by Congress to prevent a conspiracy by Aaron Burr to create a country west of the Mississippi.

The main legal obstacle to use of the military against civil unrest comes from the Posse Comitatus Act of 1878. This law restricts the use of the U.S. military to intervene in civil unrest, and would otherwise block Trump from mobilizing the armed forces against protestors. Congress passed the Posse Comitatus Act as part of ending southern Reconstruction. Its motivation was formally and legally to end the military occupation of the secessionist states. As such it played an essential role in facilitating the return of the plantation owners to power and re-establishing white supremacy.

The role of the Posse Comitatus Act in the Great Betrayal of African Americans does not argue against invoking it to prevent irresponsible and racist actions by Trump. Rather, it demonstrates how almost every aspect of the U.S. legal system has a racist undercurrent. White supremacy in the south resulted from conscious policies applied by the federal government over decades that reversed political reforms in the secessionist states, facilitating the institutionalization of white supremacy by state and local governments.

Along with Abraham Lincoln, Franklin D. Roosevelt was, in my opinion, the greatest U.S. president. Notwithstanding his accomplishments, many of FDR's programs reinforced white supremacy, the most obvious case being his failure to support the anti-lynching bill of 1937, which southern Senators, all Democrats, filibustered to death. Less obvious, FDR's original version of the Social Security Act of 1935 covered agricultural workers and domestic servants.

However, African Americans made up the agricultural labor force of the south, and white southern Congressmen controlled the committees through which New Deal legislation had to pass. Roosevelt decided that excluding agricultural workers represented a necessary condition for the passage of Social Security. More grotesque still, the famous G.I. Bill of Rights (officially the Servicemen's Readjustment Act of 1944) in practice excluded 1.2 million African-American veterans from its benefits.

In 1944, southern racists forced Roosevelt to abandon his progressive vice president Henry Wallace and replace him with Harry Truman, a Missouri Senator with a racist history (his family had owned slaves). When Truman became president after the death of Roosevelt in April 1945, few expected him to become the first chief executive since Grant to initiate a policy explicitly intended to reduce white supremacy. In 1948 he ended formal segregation of the military and integrated employment in the federal government.

By contrast, President Dwight Eisenhower lobbied the Chief Justice of the Supreme Court to support the southern racists opposing the path-breaking decision to end "separate but equal" schools, Brown v. Board of Education. Eisenhower's successor, John Kennedy, had at best an ambiguous record on white racism during his brief tenure as president. His vice president Lyndon Johnson, like Truman a southerner (from Texas), became the first president since Grant to introduce substantial legislation aimed at undermining white supremacy—the Civil Rights Act of 1964 and the Voting Rights Act of 1965. No subsequent president would take any major step towards reducing white supremacy.

Looking Backwards

The view that U.S. society has made continuous or even sustained progress reducing white supremacy since the 1950s is false. Reaction and resurgence of white supremacy have followed the few moments of substantial progress. Reconstruction in the south, which fundamentally challenged white supremacy in the secessionist states, ended abruptly, followed by a complete reversal with little gained in the south beyond the formal end of slavery. Since the 1960s, Congress, state governments, and the Supreme Court have weakened the civil rights laws passed under Lyndon Johnson.

Gains would have been non-existent and the setbacks even greater were it not for the struggles led by the civil rights movement in the 1950s and subsequently. That movement has persisted despite continuous repression over the decades.

Physically, the Austin in which I grew to adulthood has changed almost beyond recognition. The racism and white rule of the 1950s and 1960s remain, altered in form but not in essence. The reign of terror that enforces white rule continues despite the statue of an African-American legislator, Barbara Jordan, in the airport and the renaming of a major street as Martin Luther King Boulevard. With the exception of a very few university professors and professionals, Austin remains as segregated and white-ruled as it was in my teenage years.

Two years ago when I walked the streets near the university campus and the central city south of the state capitol, I encountered few African Americans. Housing integration has meant gentrification that has forced African Americans into ever more distant suburbs. Far from weird, Austin adheres to the nature of U.S. society, idyllic for middle class whites, a repressive apartheid for African Americans.