Notwithstanding its many failings, capitalism has been a formidable engine of wealth creation and economic development over the last three centuries.
Yet, what classical economists and revolutionary theorists like Karl Marx called capital was in fact what financiers label “equity.” Retained earnings sit in the shareholders’ equity section of a company’s balance sheet. Technically speaking, most of the capital accumulated in the 18th and 19th centuries was equity.
Traditional Capitalism or “Equityism”
This is not to say that all the equity generated over time was internally produced or that corporations were entirely self-financed. The 1840s Railway Mania in the United Kingdom, for example, was a classic stock market bubble that was fed through the intermediation of banks using their depositors’ money, but also directly by small public investors.
Since then, other people’s money has underwritten growth, although the “paid-in” capital coming from public offerings and rights issues was also part of the shareholders’ equity stake in a corporation.
It is only from the midpoint of the 19th century onwards that debt, in the form of bank loans and public bonds, systematically helped finance businesses. That led Max Weber to observe:
“In modern economic life, the issue of credit instruments is a means for the rational assembly of capital.”
Yet until the early decades of the last century, interest-bearing debt played an ancillary role in corporate finance and an even lesser role in the lives of consumers. Except in occasional speculative cycles, such as the frenzied demand for US railroad bonds after the Civil War or the overabundance of household credit in the 1920s, equity and individual savings were the primary sources of private sector funding in capitalism’s first 250 years.
This state of affairs first changed gradually after World War II and then more briskly in the past half-century.
Financial Deregulation and Innovation
President Richard Nixon’s decision to end the Bretton Woods international monetary system in the early 1970s opened a Pandora’s Box of mobile cross-border finance. Deregulation, spearheaded by the creation of structured derivatives, quickly took center stage. The following decade, under President Ronald Reagan in the United States and Prime Minister Margaret Thatcher in the United Kingdom, a wave of product innovation ensured that the “Box” could never again be shut.
This colossal credit creation inspired the junk bond mania and savings and loan failures of the Roaring ’80s, emerging market crises in Mexico, Southeast Asia, and Russia in the 1990s, and the proliferation of leveraged buyouts (LBOs) as well as the subprime mortgage lending frenzy both before and after the turn of the millennium.
Private credit supply has been particularly pronounced in recent years after a hiatus during the 2008 to 2010 credit crunch when financial stimulus took over. Every debt product — sovereign, emerging markets, financial and non-financial corporate, housing, consumer, student, and health care — is at or near all-time highs. Total debt measured 150% of US GDP in 1980; today it hovers at 400%. During the worst stages of the Great Depression it was 300%.
Nowadays, debt plays a larger role than equity. Last year, bond markets totaled $130 trillion worldwide, up 30% in the past three years. Various sources put the total capitalization of equity-backed securities at between 75% and 80% of that amount, and that is mostly due to unprecedented quantitative easing (QE), which fueled a rally in stock valuations.
This is only part of the story. Even before the pandemic, credit was expanding at a much faster pace than stock offerings. In 2019, the securities industry collected $21.5 trillion globally. About $21 trillion of that capital was raised in the form of fixed income. Only $540 billion came from common and preferred shares.
There is a strong underlying driver behind credit’s modern popularity.
According to the traditional rules of capitalism, a debt is contractually due before or upon maturity. From 30% of gross national product (GNP) following the Revolutionary War, US government debt was entirely repaid by the 1840s. After surging to 30% during the Civil War, it was brought down to 5% by the end of the 19th century. It climbed back up to nearly 30% in 1917 due to World War I and then shrank to 15% by the time the Great Depression hit.
The combination of the New Deal and World War II pushed total government debt beyond 100% of gross domestic product (GDP), a new metric introduced in 1934. By the 1970s, successive administrations, no matter their political leaning, had reduced this ratio to 30%.
Until then, governments had demonstrated exemplary behavior simple enough to emulate for citizens and businesses alike: Debts eventually had to be settled. As the economic sociologist Wolfgang Streeck points out, under the Keynesian blueprint:
“Debt is supposed to be paid off as the economy returns to an adequate level of growth and public budgets generate a surplus of reserves over expenditure.”
That all changed when Reaganomics substituted quasi-permanent government borrowing for tax revenues. The model has gained acceptance not just in the United States or among right-of-center political parties, but across the world and across the political spectrum. On Reagan’s watch, the US national debt almost tripled from $700 billion in 1980 to nearly $2 trillion in 1988, rising from 26% to 41% of US GDP.
Since the 1980s, public debt has risen across all OECD countries. Save for a brief period under US president Bill Clinton, nations have rarely adopted the Keynesian principle of disciplined reduction, or what Streeck calls a “consolidation state,” in contrast to today’s “debt state” in which governments make little real effort to curtail spending. US federal debt now exceeds 100% of GDP.
Corporations and consumers followed in their governments’ footsteps and employed credit on a massive scale. The risk is that overuse of debt could cause bankruptcies, financial distress, and recessions. That was indeed the common scenario in past economic cycles. Downturns would compel borrowers to stop spending and look for ways to shrink their liabilities. Banks would cease lending and work out solutions for their existing distressed loan portfolios.
Liability in Perpetuity
This storyline is no longer in vogue. Debt is actually so pervasive that the term capitalism has become a misnomer. We now live in the age of leverage, or debtism. This model dictates that, in a crisis, borrowers and lenders renegotiate, amend, and extend, that is, convert and reschedule, their loans. Debt contracts are becoming ever more flexible.
For all the intrinsic instability that leverage provokes, governments encourage private lenders to keep lending to avoid a recession and to kick the can down the road until the economy recovers. Lenders agree because they make money not from interest charged on loans — in a debtist system, interest rates stay low — but from arrangement, prepayment, penalty, consent, and advisory fees, as well as syndication fees derived from the distribution of the default risk across the financial system.
Historically, governments incurred debt to pay for wars and counteract recessions, while the private sector — businesses, homebuyers, and consumers — did so during times of prosperity. But as Alan Greenspan explained, the period of relative economic stability between 1983 and 2007 — known as the Great Moderation — was “precisely the tinder that ignites bubbles.” Two-and-a-half decades of shallow recessions and financialization encouraged everyone to take risks.
In the face of stubbornly lethargic demand, it is more than likely that we will not be able to grow into our debt burden. But despite the Biden administration’s commitment to student loan forgiveness, the ongoing debate about applying this policy to our collective loan book may be missing the point. Few have raised the prospect of never redeeming this evergreen debt, but, instead, just continuously rolling it over in the face of adversity.
Although a permanent debt overhang adds chronic stress to the economy and may eventually require some form of financial catharsis, unless governments across the world collaborate to engineer the Great Deleveraging or the Great Write-Off, the age of perpetual and extreme leverage is here to stay.
Aside from moral hazard, such a system elicits a philosophical question:
Should a loan that one neither intends nor is required to repay be considered debt or equity?
All posts are the opinion of the author. As such, they should not be construed as investment advice, nor do the opinions expressed necessarily reflect the views of CFA Institute or the author’s employer.
Image credit: ©Getty Images / bobloblaw