On February 24, 2022, Vladimir Putin ordered the Russian military to initiate a full-scale invasion of Ukraine. The Russian people, outside of a few thousand brave and quickly-punished protestors, had no way to prevent their government from going to war. It was the decision of a dictator.
Because there are no structural domestic checks and balances on Putin’s power, he was able to unilaterally push forward with an invasion that appears deeply unpopular with the Russian public. Within a few short hours, his decision wiped out one-third of the Russian stock market’s value, sank the ruble to record lows and evaporated the value of Russian bonds, sending some to zero. Some of the harshest sanctions in history have now been put in place against Moscow, preventing its banks from settling in dollars. Virtually all Russians — whether they are on the frontlines or back home — will suffer as a result of Putin’s decision.
One of the hallmark features of democracy is that citizens should, in theory, have a way to prevent their government from waging and prolonging unpopular wars. The argument goes that, through elected representatives, a free media and debate over public spending, citizens of democracies are more directly involved in warmaking. And if more countries become democracies, there will be less war, as democracies do not historically fight each other.
The problem is that this concept, known as “democratic peace theory,” is in danger of failing. As a result of the current dollar framework — in which America’s post-9/11 wars in Iraq, Afghanistan and beyond have effectively been paid for by borrowing — the U.S. may have already lost one of the greatest benefits of democracy: its promise of peace.
This essay advances three arguments:
- The post-1971 fiat standard, in which money is issued by central banks and backed by nothing but government decree, enables even elected governments to fight wars without public consent, presenting a terminal risk for democratic peace theory and thus for liberal democracy.
- Expensive and unpopular U.S. military operations like the Iraq War would not be possible to sustain for decades without zero interest rate policy (ZIRP) and quantitative easing (QE), which carry significant negative externalities for the average citizen.
- An eventual shift from the fiat standard to a Bitcoin standard (where BTC acts as the global reserve currency) could help bring warmaking back into the hands of the public and away from unelected bureaucrats.
The goal of this essay is to spark broader public debate about how we pay for wars. Many Americans — and of course, many individuals in countries like Iraq, Afghanistan, Yemen and elsewhere — have found America’s post-9/11 conflicts abhorrent. But few discuss the dimension of price.
For example, the U.S. government’s “final report” on the 2007 to 2008 Great Financial Crisis (GFC) does not mention Iraq, Afghanistan or the War On Terror: as if these items had zero impact on the state of the U.S. economy in the decade between 2001 and the publication of the report in 2011.
Marcia Stigum’s “Money Market” — a hugely important textbook on the dollar-dominated global economy, likely handed out to any money-market trader on the first day of the job or to any student of banking on day one of class — does not include the word “war,” or any other related military topic, in its otherwise sprawling index.
“The Deficit Myth,” a popular and influential 300-page book by Modern Monetary Theorist Stephanie Kelton, also lacks any mention of the words “Iraq,” “Afghanistan” or “War On Terror.”
Time after time in modern economic discourse, expansionary foreign policy is divorced as a concept from expansionary domestic fiscal and monetary policy. War — the single-largest discretionary expense of the U.S. government — is simply left out of the discussion. It becomes invisible.
I. The End Of Democratic Peace Theory
In her sobering book, “Taxing Wars,” U.S. Air Force veteran and law scholar Sarah Kreps writes that a key assumed difference between democracies and non-democracies is that “a democratic populace bears the direct costs of war in blood and treasure.”
“The more directly [citizens] bear those costs,” she writes, the more incentives they have to pressure their leaders to keep wars short, cheap or to not wage them in the first place. Dictatorships have very few checks on their warmaking. But democracies, so the theory goes, are less likely to fight without a clear, narrow and popular mission.
Democratic peace theory is not without critics, but is widely popular in political science, and remains one of the strongest arguments for a liberal democratic system. However, in “Taxing Wars,” Kreps advances a thesis concerned with a potentially fatal flaw of this theory:
“If individuals no longer saw the costs of war, would they be less politically-engaged with the cost, duration, and outcome?”
Her research, she writes, “suggests that the answer is yes.”
Kreps says that democratic peace theory is grounded in several assumptions: “that the direct, visible costs of war are passed along to the citizenry in a democracy; that bearing the costs of war is generally unpopular and will make the people judicious about the use of force; and that they have electoral recourse.”
But since the Vietnam War, the U.S. has been increasingly engaged in what Kreps calls “Hide-And-Seek” wars, where “leaders have shied away from asking the populace for fiscal sacrifice, thereby anticipating and sidestepping public constraints on their conduct of war by avoiding war taxes and seeking less obvious forms of war finance, especially borrowing.”
“Taxation is onerous,” Kreps writes, “and when citizens bear the burden of war in taxation, this creates tighter institutional linkages between the public and leaders’ conduct of war, as taxpayers have more incentives to hold leaders accountable for how the resources are being used.”
“In contrast,” she writes, “borrowing shields the public from the direct costs and insulates leaders from heavy scrutiny.”
Kreps’s book relies on historical tax, bond and spending data, as well as public opinion polling about war going back a century. One major takeaway, though seemingly obvious, is that taxed wars are less popular than untaxed wars.
“A war financed through higher taxes,” she observes, “decreases support by about 20% compared to the baseline scenario without taxes.”
American elected leaders know this, and since Vietnam have sought other ways to pay for wars. This was on display during the peak of the Iraq War in 2007, when Congressmen John Murtha and Jim McGovern proposed a war tax to finance the surge. It was based on a sliding scale, something that columnist E.J. Dionne called the “rare Democratic proposal that does not put the entire burden of taxation on the rich.”
But House Speaker Nancy Pelosi rejected the war tax, saying it was “not a Democratic proposal,” and hinted that the Democrats would suffer at the ballot box if they tried to push it through. As Kreps notes, “debate was perfunctory and questions about the potential effect of a war tax on support for the war were glossed over.”
Taxation was dismissed in favor of borrowing in a strong show of bipartisanship.
In another example, in 2014 President Obama launched Operation Inherent Resolve, a now nearly eight-year war against the Islamic State in Syria, Iraq and Libya. The American public has been largely unaware of the scale and price of these operations. Kreps says legislators were “relatively silent on each of these fronts because their constituents [were] silent. The constituents [were] silent because they are shielded from the costs of war.”
One trend that aids the U.S. government’s ability to conduct its “invisible” post-9/11 wars is that they are waged in a way in which fewer American soldiers die. Long gone are the days of conscription.
“Leaders have shifted away from a labor-intensive military,” Kreps writes, “in favor of a capital-intensive military that is financially costlier but poses a lower risk of casualties. While the Vietnam War incurred over 58,000 fatalities at a financial cost of about $750 billion in 2010 dollars, the combined wars in Iraq and Afghanistan — also roughly a decade in duration — resulted in around 6,000 fatalities but at a cost of about $1.5 trillion.”
These are not uniquely American trends. Kreps points out that Israel, for instance, has not fought a war that required full mobilization of reserve units or instituted a war tax since the early 1980s. European countries and even India have shown similar behavior. Democracies worldwide are increasingly choosing to subject fewer of their citizens to the physical cost of war, instead using easy money and advanced technology to quietly impose the price on future generations.
The Austrian economist Joseph Schumpeter thought that “the liberal state, one where individuals bear the burdens of war and have levers for registering disapproval,” would exercise “powerful restraint in its foreign policy.”
Yes — but only if the accountability mechanism of citizens holding control over government spending is maintained.
It seems, however, that under the fiat standard a pattern plays out over time: citizens grow weary of war, politicians borrow instead of tax, the public becomes unaware of and disengaged from war, arms dealers grow larger and more powerful, and democratic peace theory breaks.
II. The Credit Card Wars
Today, Americans live in an age of “credit card” wars, putting the costs of military action on the national tab, deferring payment today in exchange for owing interest and principal tomorrow. But this has not always been the case.
Between 1900 and 1960, the United States used its military largely with the consent of its people, financing war efforts in significant part with taxation and by selling war (or “liberty”) bonds.
But as the gold standard came to a close in the 1960s, paving the way for the post-1971 fiat standard, the mechanism for war finance changed permanently.
In the last few decades, America has paid for its military operations in Afghanistan, Iraq and beyond entirely through borrowing.
As of 2020, a total of $2.02 trillion had been borrowed and spent by the U.S. government on the post-9/11 wars. Americans have by now paid roughly an additional $1 trillion in interest alone for the privilege of borrowing to wage conflicts that have become increasingly distant from public discourse.
The War On Terror’s global operations have been detached from the average American’s life, in part by the end of national service and the dawn of military drones and robotics, and in part because the actual cost of these conflicts has been hidden from the people through debt financing.
In a 2017 testimony to U.S. Congress, Harvard University scholar Linda Bilmes called the wartime budgetary process for post-9/11 military operations “the largest single deviation from standard budgetary practice in U.S. history.”
“In every previous extended U.S. conflict,” she notes, “including the War of 1812, the Spanish-American War, Civil War, World War I, World War II, Korea and Vietnam — we increased taxes and cut non-war spending. We raised taxes on the wealthy.”
By contrast, she says, in 2001 and 2003 Congress cut taxes, and the invasions of Iraq and Afghanistan were paid for “by piling up debt on the national credit card.”
As political scientist Rosella Cappella Zielinski observed, “rather than raising taxes or shifting funds from other parts of the federal budget, the Bush administration cut taxes while increasing war spending, moving the nation out of budgetary surplus and into deficit spending, which in turn increased the national debt and the interest that must be paid on that debt.” Washington, of course, spends more on war if it borrows than if it simply pays as it goes.
In “The Cost Of Debt-Financed War,” military economist Heidi Peltier sums up the American predicament:
“Part of the problem with funding war through debt is that American voters and taxpayers don’t feel the cost of war. Unless they have a service-member in their family or among their close friends or relations, in which case they might experience the human toll of war, war poses little burden and is in some ways invisible. Its costs are hidden because we are not, as citizens and taxpayers, being asked to shoulder the financial burden of war in any visible or noticeable way. We are not patriotically buying war bonds (as in World War II) or having war taxes levied upon us that make the costs of war feel immediate and tangible. The costs are borne in a less noticeable and more general way as we pay our regular (peacetime) taxes, and will be borne in greater measure by future generations who will have to face increased taxes or reduced public spending in order to pay the cost of rising public debt and interest.”
In 20 congressional fiscal hearings between 2001 and 2017 regarding America’s conflicts abroad, war funding strategy was discussed only once. Compare this, for example, to the Vietnam era, when war financing was debated at 70% of such meetings.
The concept of an “invisible” war is something that appeared to weigh heavily on President Obama when he left office. Famously awarded the Nobel Peace Prize early in his first term, only to mire the U.S. in even more wars than his predecessor, Obama revealed in an exit interview that he was worried about “a president who can carry on perpetual wars all over the world, a lot of them covert, without any accountability or democratic debate.”
His worry, sadly, is our current reality. In her Congressional testimony, Bilmes pointed out that the post-9/11 wars have been funded by emergency bills exempt from spending caps and without requirements to offset cuts anywhere else in the budget. More than 90% of spending for Iraq and Afghanistan was paid for in this way, compared to 35% for Korea or 32% for Vietnam. American public polling in the 21st century indicates that war comes up less and less in conversations, and increasingly does not impact people’s lives.
Bilmes concluded her testimony by telling Congress that relying exclusively on borrowing for the post-9/11 wars had:
- Reduced transparency over spending
- Lowered accountability for war expenditures
- Weakened fiscal discipline over the defense budget
- Triggered less public debate on war
- Pushed the cost to future generations
- Failed to properly plan for funds promised to veterans who actually fought the wars
- Made it easier to engage in and prolong war
According to the Costs Of War project at Brown University, interest payments on money borrowed to fight the post-9/11 wars may one day eclipse the actual spending for those wars. The project’s authors project that even if spending ceased today, total interest payments would rise from the $1 trillion already paid to $2 trillion by 2030 and to $6.5 trillion by 2050.
For context, the current U.S. fiscal budget for 2022 is approximately $6 trillion, mostly earmarked for entitlements. Military spending makes up the largest discretionary expenditure, at $750 billion, while support for veterans makes up another $270 billion. Annual interest payments account for around $300 billion in today’s near-zero interest rate environment, a good portion of that going to pay back war borrowing. In total, more than $1 trillion (close to 20%) of the annual U.S. fiscal budget is military-related.
Washington projects its 2022 revenue through taxes and other streams at only around $4 trillion, meaning this year more than $2 trillion of new debt will add to the existing $30 trillion pile.
As of the publication of this article in March 2022, the federal funds rate (the bedrock interest rate for the global economy, referring to the rate at which banks borrow and lend excess reserves to each other overnight) is 0.08%. The Federal Reserve influences this rate by adjusting how much interest it offers financial institutions when it borrows their cash or holds their reserves.
If the Fed hikes rates to 3% — low by modern historical standards, but seemingly steep by today’s — then roughly one-quarter of the $4 trillion in government revenue for 2022 would need to go toward interest payments.
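The arithmetic behind this constraint is simple enough to sketch. Below is a rough back-of-the-envelope calculation in Python, using only the approximate figures cited above (roughly $30 trillion of debt and $4 trillion of revenue) and assuming, unrealistically, that the entire debt stock reprices the moment rates change; in reality the burden builds more slowly as old debt rolls over.

```python
# Back-of-the-envelope sketch of the interest burden at different rates,
# using the approximate figures cited in this essay: ~$30 trillion of debt
# and ~$4 trillion of annual federal revenue. For simplicity (and
# unrealistically), it assumes the whole debt stock reprices immediately.

DEBT = 30e12      # approx. outstanding federal debt, in dollars
REVENUE = 4e12    # approx. projected 2022 federal revenue, in dollars

for rate in (0.0008, 0.01, 0.03, 0.05):
    interest = DEBT * rate
    share = interest / REVENUE
    print(f"rate {rate:6.2%}: interest ~ ${interest / 1e9:,.0f}B "
          f"({share:.0%} of revenue)")
```

At today’s 0.08% this works out to a rounding error; at 1% it is roughly $300 billion a year, and at 3% it approaches a quarter of federal revenue.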
In order to stop rates from going up, the U.S. government has undertaken unprecedented intervention in the bond markets, with the Federal Reserve purchasing nearly $9 trillion of government debt and mortgage-backed securities since 2008, providing liquidity for assets that would otherwise have no equivalent buyer. Since March 2020, America’s central bank has bought approximately $4.7 million of assets per minute.
With the U.S. debt-to-GDP ratio (a commonly-used metric to determine national indebtedness) now moving past World War II territory, many question how long American policymakers can sustain this activity while keeping a bull bond market going. Eventually, U.S. debt — even though it remains easily the most in-demand financial collateral in the world — could become discredited. The U.S., after all, has defaulted twice in the past 100 years, in 1933 and 1971, each time devaluing the dollar and betraying a promise it had made to the international system.
Perhaps the greatest perk of dollar hegemony is that foreign nations are compelled or incentivized to buy U.S. debt and therefore (often unwillingly) finance America’s wars. But this is starting to change: countries like China and Japan reached their peak holdings of treasuries in 2013 and 2014, and have since been gradually reducing them. Since low interest rates are so critical to America’s spending, including its wars abroad, the Federal Reserve has countered this trend, becoming the single-largest buyer of U.S. debt since 2008 and pushing its share of the treasury market to 20% as of September 2020. Zooming out, the Fed’s share of net new U.S. debt issuance has increased from 15% in the 2002 to 2019 era to 64% in the 2020 to 2021 era, while the foreign share has declined over the same timeframe from 33% to 14%.
As macroeconomic analyst Alfonso Peccatiello writes, “long-term real yields must remain super low for the system not to collapse, as we become more and more leveraged over time.” In other words, high interest rates would likely force the U.S. government to shrink its warmaking activity, as it would be deterred from additional spending by popular frustration with rising inflation or by strong public reluctance toward additional taxes.
Writing 250 years ago, Adam Smith argued that “relying mainly on borrowing was a mistake: it hid the cost of war from the public” and encouraged war “by hiding the true costs.” Seventy-five years later, John Stuart Mill argued that borrowing for war was perhaps justified, but only as long as interest rates did not rise.
What Smith and Mill could not have known is that modern governments would figure out a powerful trick: how to borrow massively for war without causing a rise in interest rates.
III. The Evolution Of American War Finance
In “War And Inflation In The United States From The Revolution To The First Iraq War,” economist Hugh Rockoff gives a detailed history of American war finance.
Before the 20th century, the American state was so vastly different in structure from today’s that it is hard to make comparisons, but it is still helpful to look at how the early wars were conducted.
The Revolutionary War was famously financed — at times entirely — by the printing press. The phrase “not worth a continental” described the hyperinflation that wreaked monetary havoc on eastern North America as the revolutionaries tried to break away from the British Empire.
The War Of 1812 introduced more borrowing concepts, including for example a $16 million war loan. In that case, however, the promissory notes could not be sold at par and incurred high rates, resulting in the government being forced to raise more taxes.
Smaller wars like the Mexican-American War were minor enough to be paid for entirely by borrowing without fear of interest rates rising. But when it came to the Civil War, both sides needed to print money.
Rockoff gives an overview of Civil War finance: the North issued “$500 million in 5-20s: six percent bonds with interest payable in gold, callable after 5 years maturing at 20… the 5-20s could be exchanged at par for greenbacks, so essentially the government was printing money to buy bonds; economically the same as the Federal Reserve open market operations undertaken during World War II.”
“Eventually,” he writes, “the right to convert greenbacks into interest bearing gold bonds was terminated, so the greenbacks became a pure fiat money.”
He notes that the National Banking Act’s goals were to monetize part of the federal debt while trying to keep nominal interest rates low.
During the first half of the 20th century, things changed as the modern state was built and the American people became strongly connected to warmaking through taxes. Taxation financed 30% of the cost of World War I, 50% of the cost of World War II, and 100% of the cost of the Korean War. Americans were largely in favor of these wars (based on historic opinion polling) and were willing to sacrifice blood and treasure for the causes.
As America entered World War I in 1917, Rockoff explains that “Alcohol, tobacco, jewelry, cameras, cosmetics, chewing gum, and many others came in for new or increased taxes. Income taxes, now possible because of the sixteenth amendment, were raised. The highest rate was 67%.”
“The Treasury,” he writes, “also made efforts to encourage people to buy bonds through a national campaign based on patriotism. Giant rallies were held in which celebrities, including Hollywood stars, urged people to support the war effort by buying the bonds.”
To help cover costs that taxes and liberty bond sales could not match right away — presaging future tactics — the Treasury sold short-term liabilities directly to the newly-created Federal Reserve, monetizing part of the debt. This mirrored events across the pond.
As detailed in Saifedean Ammous’s “The Fiat Standard,” in November 1914 the British government “issued the first war bond, aiming to raise 350 million pounds from private investors at an interest rate of 4.1% and a maturity of ten years. Surprisingly, the bond issue was undersubscribed, and the British public purchased less than a third of the targeted sum. To avoid publicizing this failure, the Bank of England granted funds to its chief cashier and his deputy to purchase the bonds under their own names.”
This was one of the more prominent early examples of government bond market intervention to finance war, and it would provide a blueprint for America to follow for decades to come.
Regarding World War II, Rockoff notes that “the attack on Pearl Harbor created deep and long-lasting support for the war, making it possible for the Roosevelt administration to increase taxes without worrying about adverse political effects.”
Liberty bonds also remained effective. To give one example, in 1943 employees of the New York Fed teamed up to purchase $87,000 of war bonds. They were informed that their funds helped the army acquire a 105 mm howitzer and a P-51 Mustang fighter plane. Contrast this to today, when most Americans do not even know how war is funded, let alone exactly where the funds are spent.
To fight the Axis powers, the U.S. Treasury paired liberty bond revenue and enormous new tax increases with more bond market intervention. The Fed set a floor for the price of government securities — fixing the interest rates for long-term bonds at 2.5% — and bought “whatever amount of bonds was necessary to prevent the price from falling below that level.”
The government continued to intervene in the bond markets until 1953, through the Korean War, but in a diminishing way. The spending for Korea — the first major Cold War conflict — was covered entirely by aggressive income, corporate, sin and luxury taxes. It’s worth noting how nationally popular these taxes, and thus the public’s willingness to pay a price to fight, really were: the measure passed the House by a vote of 328 to 7.
The broader tax-based, consensual relationship between the American government and its people with regard to warmaking ended during the Vietnam War. In a deeply unpopular move, President Johnson announced new taxes for war spending in 1967, the last time taxes would be raised during Vietnam operations. One year later, amid enormous political pressure, Johnson announced he would not seek re-election.
Two noteworthy trends were apparent across these 20th-century wars. First, the Federal Reserve acted as a counter to populist presidents. For example, Truman opposed increases in nominal interest rates with an eye on winning the next election. Johnson later did the same, fighting the Fed’s mid-1960s rate hikes. But in both cases, the Fed raised rates anyway, putting somewhat of a check on borrowing. It is not clear, to put it mildly, if this kind of independence still exists today.
A second trend Rockoff notices is that, whereas before World War II, emergency wartime economics — “high levels of government spending financed in part by borrowing from the public and in part by money creation” — were temporary, in the post-1971 era, they became the “peacetime norm.”
Rockoff concludes:
“The natural reaction when faced with a major war was for governments to borrow the sums needed. But large scale borrowing raised the prospect of substantial increases in interest rates. For a variety of reasons war governments were loath to see interest rates rise above prewar norms. For one thing, higher rates would be a signal to the public and to friends and foes abroad that the government’s decision to wage war was undermining the economy. Increasing taxes at least to a level that promised to be sufficient to pay interest and principal on war debt was an obvious necessity for keeping interest rates in check.”
This conclusion was shared by none other than John Maynard Keynes, who argued that the British state should fund its World War II operations through taxation, and not borrowing.
The problem is, war taxes are no longer a solution in the 21st century. Americans do not want to pay for wars they do not care about. So Washington had to figure out a way to borrow for exotic wars without interest rates going up.
IV. War Spending In The Post-9/11 Era
Unlike pre-Vietnam era wars, which mainly had narrow and clear missions and strong public support, America’s invasions of Iraq and Afghanistan morphed into “forever wars.”
This mission creep was only possible because their staggering costs were hidden from the public by the way they were financed.
As political scientist Neta Crawford writes, “if we hadn’t had such low interest rates, and Congress had moved, for example, to raise taxes instead of cut them, the public would have paid attention to these wars in a different way.”
Indeed, there was mass public objection to the Iraq War (with some of the largest protests in the United States since the Vietnam War), but partly because the public was not asked to pay for the war, dissent eventually dwindled instead of intensifying. Ten years after it began, Iraq was a topic barely mentioned in everyday conversation among Americans.
This is because American legislators decided to borrow to pay for these wars, choosing to defer costs to future generations. But how exactly does it work, to pay for a war without taxes or war-specific bonds?
First, the U.S. government needs to create some money for war, so it holds an auction through its Treasury Department. U.S. debt instruments of different maturities (20- to 30-year bonds, two- to 10-year notes, and short-term bills) are sold — to finance many activities, of course, not just war — to a network of primary dealer banks (the most senior and trusted global financial institutions), who in turn sell those securities to the secondary global market.
As a cumulative consequence of World War I, British decline, the Bretton Woods system, American economic growth, the petrodollar system, and the eurodollar system, U.S. government debt became the premium financial collateral in world markets. Treasuries are the “risk-free” asset, treated as money by large institutions that cannot simply hold millions or billions in a bank account. Despite large deficits run up by Washington, U.S. debt remains extremely liquid and in high demand.
That being said, it is important to keep in mind that some of this demand is forced: primary dealers are obligated to buy treasuries and bid on every auction, and various financial institutions are mandated to hold treasuries.
As Peccatiello notes, since 2013, banks worldwide have been required to keep around 10% to 15% of their assets in bank reserves and bonds.
“Effectively,” he says, “banks were asked to own a large amount of liquid assets and were told government bonds were the most obvious choice — they are essentially risk-free and often yield more than a simple overnight deposit at the domestic central bank. A huge, relatively price-inelastic demand for bonds was created by a mere regulatory change.”
The rules of the system influence global demand, and today there are plenty of customers lining up to buy the U.S. Treasury’s promises to pay. Once the auction is complete, the bank deposits of the bond purchasers get drawn down, reserves are deducted from their commercial bank, and the U.S. government’s Treasury General Account (TGA) at the Fed gets filled up.
Next, the U.S. government’s department for war — now euphemistically named the Department of Defense, or the Pentagon for short — uses this new money to buy guns, tanks, planes, ships and missiles. So, it will place an order for this weaponry from the private sector. As a method of payment, the Fed will draw down the TGA’s balance and add reserves to the arms dealer’s commercial bank. The bank will then extend the arms dealer’s deposit account by that same amount.
And voila, the U.S. government has purchased military equipment with nothing more than a promise to pay — a promise highly dependent on interest rates.
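To make that flow concrete, here is a deliberately stylized toy ledger in Python. It is not how Treasury auctions or Fedwire actually settle; the account names and amounts are hypothetical and many intermediate steps are collapsed. It simply traces the movement of deposits, reserves and the TGA described above.

```python
# A stylized toy ledger of the flow described above: a Treasury auction
# drains investor deposits and bank reserves into the TGA, and defense
# spending later pushes balances out to an arms maker's bank. Account
# names and amounts are hypothetical; real settlement has many more steps.

ledger = {
    "tga": 0,                     # Treasury General Account at the Fed
    "investor_deposits": 500,     # deposits at the bond buyers' bank
    "buyers_bank_reserves": 500,  # that bank's reserves at the Fed
    "arms_maker_deposits": 0,     # contractor's deposit account
    "arms_bank_reserves": 0,      # contractor's bank's reserves at the Fed
}

def auction(amount):
    """Investors buy newly issued treasuries: their deposits and their
    bank's reserves fall, and the government's TGA balance rises."""
    ledger["investor_deposits"] -= amount
    ledger["buyers_bank_reserves"] -= amount
    ledger["tga"] += amount

def pay_arms_maker(amount):
    """The Pentagon pays for weaponry: the TGA falls, and the contractor's
    bank gains reserves and credits the contractor's deposit account."""
    ledger["tga"] -= amount
    ledger["arms_bank_reserves"] += amount
    ledger["arms_maker_deposits"] += amount

auction(100)         # borrow 100 (arbitrary units) at auction
pay_arms_maker(100)  # spend it on military equipment
print(ledger)
```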
It is worth pondering what would happen if war bonds were labeled as such instead of being hidden among general securities. Would they trade at a discount on Wall Street? Would they be boycotted by ESG funds or social impact investors? We may never know.
V. The Age Of Quantitative Easing
Once primary dealers sell treasuries to secondary markets, the American government exerts additional buying pressure on the global marketplace for U.S. debt through the Fed’s purchases of short-term and — with the advent of a new trick — long-term government securities.
According to Stigum’s “Money Market,” “Few factors move the bond market more than the Federal Reserve. The Federal Reserve’s ability to alter short-term interest rates and the impact that this has on the bond market and the financial markets in general is immense.”
Government purchasing of short-term treasuries has been common practice in the post-1971 financial system, with the Fed’s trading desk regularly buying and selling millions of dollars in securities to “make markets.” This process, however, was supercharged in 2008 in response to the Great Financial Crisis.
As the GFC exploded, the Fed used its trading desk and “forward guidance” to drop interest rates to zero, but this still did not have the desired stimulative effect. Investors were still hiding in longer-duration treasuries, and subprime mortgages were cratering, destroying stunning amounts of value in derivative exposure across the shadow-banking system, with devastating effects for the global economy. So, to try and take the 10-year and longer-duration bonds off the market, the Fed — inspired by similar wartime programs during World War I and World War II — began buying them through a process known as “quantitative easing,” or QE for short.
In QE, the Fed will purchase any amount of not just bills and notes but also long-term bonds from primary dealer banks and in return fill up their accounts at the Fed with bank reserves. Since 2008, the Fed has purchased an astronomical amount of U.S. government securities, totaling nearly $9 trillion, becoming the world’s single-largest buyer.
Technically, the Federal Reserve cannot, as it once did in wartime, simply buy U.S. government debt outright. But since the private sector is obligated to buy the debt, and also obligated to sell to the Fed, this technicality is easily overcome.
In reality, the U.S. government has monetized trillions of dollars of debt by printing promises to pay with one hand at the Treasury and buying them up with the other hand at the Fed with no intention of selling them all back to the market. QE seems like it should be a controversial program, but public interest has been muted compared to other large-scale government programs, especially since “Fedspeak” has been employed to make sure the process sounds complicated and so that people do not ask too many questions. As investment analyst Mohamed El-Erian has observed, QE “would trigger a much bigger societal reaction were it broadly understood.”
Let’s look at the mechanics behind this process:
When the Fed buys U.S. treasuries of varying maturities, it reduces the supply of those bonds on the open market, increasing the value of the outstanding bonds held by the private sector. When a bond’s value goes up, its yield goes down. And so the Fed puts downward pressure on treasury interest rates through this process, known as “open market operations.”
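A small numerical illustration of that inverse relationship, using a hypothetical 10-year bond paying a 2% annual coupon on $100 of face value (the bond and the yields are invented purely for illustration):

```python
# Price of a hypothetical 10-year, 2%-coupon bond at different yields.
# When buying pressure (for example, from the Fed) pushes the price up,
# the implied yield falls, and vice versa.

def bond_price(face, coupon_rate, ytm, years):
    """Present value of the annual coupons plus the principal,
    discounted at the yield to maturity (ytm)."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + ytm) ** t for t in range(1, years + 1))
    pv_principal = face / (1 + ytm) ** years
    return pv_coupons + pv_principal

for ytm in (0.03, 0.02, 0.01, 0.005):
    print(f"yield {ytm:.2%} -> price ${bond_price(100, 0.02, ytm, 10):.2f}")
```

At a 3% yield the bond trades below par, at 2% it trades at par, and as buying pushes the price above par the implied yield falls below the coupon.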
The key link to warmaking is that with lower interest rates, the U.S. government pays less on its debt, and can take on more debt than otherwise possible with higher interest rates. In the pre-1971 era, policymakers were constrained by high interest rates, and were forced to tax for war.
For example, for each 1% hike to the federal funds rate — which today, in March 2022, would be from around 0% to around 1% — the U.S. government would need to pay an additional $300 billion in interest, around 5% of the 2022 federal budget. No bueno.
But in the age of QE, policymakers are unconstrained. They can finance the forever wars without worrying too much about the interest rate on debt going up.
According to former Federal Reserve trader Joseph Wang, before the Great Financial Crisis, the Fed had no control over medium- to long-term debt, which was priced by the bond market. If the bond market felt that the U.S. government was being irresponsible, then it could punish Washington with higher interest rates by selling its debt. Today, Wang says, the Fed has taken away this restraint on political power.
The bond market is an intelligent organism of sorts. For example, it sensed the outbreak of a global pandemic in early March 2020, and naturally started to shrink in response to expected deflation. But the Fed intervened, buying more bonds each day in late March 2020 than it did during the entire QE event of 2008, keeping the bond market much larger than it would have been otherwise.
The big question is: What would have happened if the Fed had never bought any bonds during the last 15 years, if those nearly $9 trillion of securities were floating in the open market, with no buyer of last resort?
What kind of interest rates would we see on short- and long-term American debt? And what kind of constraints on warmaking would the American government face?
VI. Modern Monetary Theory And War
Over the past few years, Modern Monetary Theorists have gained power and influence on the following claim: that countries which issue the currency their liabilities are denominated in cannot run out of money and should not worry about a deficit. They can simply print as much of it as they need in a quest for full employment, stopping only when they see inflation.
This leads MMT torchbearer Kelton to give an alternative narrative of how war spending works:
“Once Congress authorizes the spending,” she writes, “agencies like the Department of Defense are given permission to enter into contracts with companies like Boeing, Lockheed Martin, and so on. To provision itself with F-35 fighters, the U.S. Treasury instructs its bank, the Federal Reserve, to carry out the payment on its behalf. This is done by marking up numbers in Lockheed’s bank account. Congress doesn’t need to ‘find the money’ to spend it. It needs to find the votes! Once it has the votes, it can authorize the spending. The rest is just accounting. As the checks go out, the Federal Reserve clears the payments by crediting the seller’s account with the appropriate number of digital dollars, known as bank reserves. That’s why MMT sometimes describes the Fed as the scorekeeper for the dollar. The scorekeeper can’t run out of points.”
Kelton continues:
“America cannot run out of dollars because it can print them. It will therefore always be able to pay its debts. Further, Uncle Sam does not actually need to borrow money or raise taxes to increase public spending; the government can simply finance new outlays through money printing if the Federal Reserve is willing to let it. Thus, neither the absolute size of America’s debt load nor the threat of ‘bond vigilantes’ refusing to buy U.S. treasuries at affordable interest rates constrain Congress’s spending power.”
The problem is, what happens when no one else except the U.S. government wants to buy those securities? This is why, as Kelton and other MMTers admit, only “reserve currency” nations with significant foreign demand for their fiat can conduct MMT. If emerging market countries try this, they will literally run out of “hard” money (dollars), and extreme currency devaluation will ensue.
Kelton remarks that “even as multi trillion-dollar COVID-relief bills pushed the national debt past $30 trillion, America’s borrowing costs have remained historically low. This is in part because the Federal Reserve bought up much of the debt that stimulus spending generated, effectively financing public spending through money printing.”
Here she is telling us that if the Fed did not do QE, then interest rates would be higher. Of course, this has a major impact on foreign policy, but it is undiscussed in her book.
It is hard to see how an MMT approach could ever constrain the Credit Card Wars. In an age where Congress does not exert much influence on wars, and where politicians would prefer to borrow than to tax, restraint fades away.
Kelton concludes her book with the following: “What matters is not the size of the so-called debt (or who holds it) but whether we can look back with pride, knowing that our stockpile of treasuries exists because of the many (mostly) positive interventions that were taken on behalf of our democracy.”
The hubris of Kelton’s book — which reads like a stenography of a late imperial power, in denial about its global decline — is only matched by its complete disregard for the costs of war.
Not all Modern Monetary Theorists are neoconservatives. But all neoconservatives are, in some form, Modern Monetary Theorists. The purest expression of fiat money — MMT theory — allows governments to fight wars without consent of the people, hiding their true costs and representing a terminal risk to democracy.
As Cicero concluded 2,000 years ago, “nervi belli pecunia infinita” — the sinews of war are infinite money.
VII. QE And Asset Inflation
One major externality of keeping interest rates at zero to allow expansionary spending is asset inflation.
As documented in investigative journalist Christopher Leonard’s new book, “The Lords Of Easy Money,” the Federal Reserve has followed a clear blueprint since the early 1990s and the days of chairman Alan Greenspan:
- Fight price inflation
- Ignore asset inflation
- Bail out the economy when it collapses
The chosen tactic to achieve this has been to continually, over time, use the Fed’s power to depress interest rates. This can be seen simply by looking at the federal funds rate over time, which was close to 10% in the late 1980s, and is now essentially 0%.
With these low rates, Leonard writes, “the state can finance its debt cheaply, and sustain the equity markets boom. The cost is in QE, which drives banks to lever up and find alternate sources of investment beyond treasuries, which haven’t yielded enough interest since the Great Financial Crisis.”
No longer can one save safely for the future in a long-term U.S. treasury delivering 5% per year. That was a model that pension funds, insurance companies and other trillion-dollar industries could once rely on.
BitMEX founder Arthur Hayes recently gave his take on the transformation at hand: “QE is designed to starve the market of yield across all durations (by reducing the supply of safe bonds), and force investors into riskier assets, pushing up the prices of those assets.”
As the Bank of England explains:
“We buy UK government bonds or corporate bonds from other financial companies and pension funds. When we do this, the price of these bonds tend to increase which means that the bond yield, or ‘interest rate’ that holders of these bonds get, goes down. The lower interest rate on UK government and corporate bonds then feeds through to lower interest rates on loans for households and businesses… Say we buy £1 million of government bonds from a pension fund. In place of those bonds, the pension fund now has £1 million in cash. Rather than hold on to that cash, it will normally invest it in other financial assets, such as shares, that give it a higher return. In turn, that tends to push up on the value of shares, making households and businesses holding those shares wealthier. That makes them likely to spend more, boosting economic activity.”
Curiously, even though the Bank of England seems to be open about the fact that QE aims to create asset inflation, it denies that low rates are its goal.
“QE lowers the cost of borrowing throughout the economy, including for the government,” it writes. “That’s because one of the ways that QE works is by lowering the bond yield or ‘interest rate’ on UK government bonds. But that’s not why we do QE. We do it to keep inflation low and stable and support the economy.”
The St. Louis Fed once claimed that the U.S. government would eventually sell all the assets it bought post-GFC back to the private sector, making it clear that the Fed would not use “money creation as a permanent source for financing government spending.”
But as macroeconomic analyst Lyn Alden notes, this never happened: “A decade later, the Fed’s holdings of Treasury securities and other assets, both in absolute terms and as a percentage of GDP, are far higher now than they were then, and are rising. So, it became clear that it was and is debt monetization.”
Alden then provides a key insight: “Things like Medicare, Social Security, military spending, crisis stimulus checks, and so forth, would likely have to be reduced if the Treasury was limited to only borrowing from real lenders rather than borrowing from newly-created pools of dollars from the Federal Reserve.”
In fact, in September 2019, the money market system broke and, as Alden writes, “the U.S. government ran out of lenders. Foreigners, pensions, insurance companies, retail investors, and finally large banks and hedge funds, simply weren’t buying enough Treasuries at that point compared to how many Treasuries the government was issuing… [so] the Federal Reserve stepped in with newly-printed dollars out of thin air, and started buying Treasury securities, due to a lack of any more real buyers at those low rates.”
According to Alden, the Fed “basically nationalized the repo market to reduce the interest rate… the Federal Reserve allowed the U.S. government to keep funding its domestic spending plans at current interest rates, without finding new real lenders for their rising deficits.”
The same, of course, goes for foreign and military spending.
In sum, the U.S. government has shown — through the GFC, the repo spike crisis in 2019 and the pandemic crisis in 2020 — that it is willing to do anything to keep interest rates down: start an experimental QE program to buy long-dated treasuries and mortgage-backed securities; nationalize the repo markets; and even nationalize the corporate debt markets.
In March and April of 2020 the Fed essentially nationalized the private credit markets by creating a “special purpose vehicle” that could buy corporate debt. The Fed only ended up buying $8.7 billion of this type of security, but it saved the market with a psychological effect: Everyone knows there is now a buyer of last resort for corporate debt, too.
The Fed has not quite employed “yield curve control” — where the government guarantees the price of longer-dated securities — as the Japanese and Australian central banks have started to do in the past few years, but the subject has become increasingly discussed.
Typically, central banks might determine short-term interest rates, but the market determines long-term rates. Yield-curve control is a central bank program to try and control both. The U.S. government did, of course, at one time employ yield curve control, in the 1940s, to support World War II.
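For illustration, here is a stylized sketch of how such a yield-curve-control rule operates. The 2.5% ceiling and the quantities are hypothetical, but the logic mirrors the 1940s peg described earlier: the central bank announces a ceiling on long-term yields and stands ready to buy whatever quantity of bonds is needed to hold the line.

```python
# Stylized yield-curve-control rule with a hypothetical 2.5% ceiling on
# long-term yields (echoing the World War II-era peg). If the market yield
# drifts above the ceiling (prices falling), the central bank absorbs the
# offered supply with newly created reserves; otherwise it stays out.

YIELD_CEILING = 0.025

def central_bank_purchase(market_yield, bonds_offered):
    """Return how many of the offered bonds the central bank buys."""
    if market_yield > YIELD_CEILING:
        return bonds_offered  # buy everything on offer to defend the peg
    return 0

for observed_yield in (0.020, 0.024, 0.027, 0.031):
    bought = central_bank_purchase(observed_yield, bonds_offered=1_000)
    print(f"market yield {observed_yield:.1%}: buys {bought} bonds")
```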
VIII. QE And Inequality
The Federal Reserve lists five key functions on its website — including, for example, maximum employment and financial stability — but nowhere does it list a sixth function: to create and sustain asset bubbles to exponentially enrich the American elite.
In a country where the top 10% of the population own 88.9% of the stocks and mutual funds, asset inflation is a highly redistributionary phenomenon. According to Joseph Wang, who saw the process from the inside during years of trading at the Fed, “QE appears to lift financial asset prices but not necessarily economic activity.”
“The value of stocks,” Christopher Leonard writes, “rose steadily during the decade after 2010, in spite of the weak overall economic growth, the broad-based wage stagnation, and the host of international financial problems that the Fed cited as justification for its interventions.”
Leading up to the GFC, prestigious institutions worldwide went deep into subprime mortgage securities and credit default swaps, taking out staggering amounts of insurance on increasingly risky investments. After the crash, with interest rates at the zero lower bound, companies had to look even further out on the yield curve for profits. Most recently, ZIRP has led to an explosion of corporate leverage and stock buybacks, which have accounted for 40% of the S&P 500’s total return since 2011.
As Wang writes in his new book “Central Banking 101,” “quantitative easing has helped push longer-term interest rates to record lows. Corporations have taken advantage of the record low interest rates and issued record amounts of debt that they use to buy back stock.”
Easy monetary policy has resulted in increased corporate power over wage earners and small businesses, a conclusion strongly backed by Shimshon Bichler and Jonathan Nitzan’s capital as power theory. In this environment, companies are able to make even more money by borrowing and then re-packaging and selling their debt, than by focusing on actual products. They are also able to exploit stock buybacks, which amplify returns to the shareholding elite, as opposed to advancing innovation and growth.
In 1990, the 1% held 23% of all American household wealth. Today, after more than 30 years of easy monetary policy, they hold 32%. As Bichler and Nitzan write, “inflation is always and everywhere a monetary phenomenon; but it is also always and everywhere a redistributional phenomenon.”
As Rosella Cappella Zielinski puts it, middle- and low-income households typically cannot lend to the government and collect interest payments, but they are taxed all the same. So, when the government finances war through borrowing, we see a “huge redistribution of wealth from the middle and low-income classes to the wealthy.”
Alden notes similar trends: “In the 1990’s, the top 10% richest households owned about 60% of the country’s household net worth. By 2006, it had increased to 65%. By the end of 2019, it was over 70%. Meanwhile, the share of wealth held by the bottom 90% of households decreased from 40% of the country’s household net worth in the 1990’s to 35% in 2006 to less than 30% at the end of 2019.”
This redistributionary effect has become even more magnified in the last two years of pandemic fiscal policy. According to a January 2022 Oxfam report, “the wealth of the world’s 10 richest men has doubled since the pandemic began,” while “the incomes of 99% of humanity are worse off.”
Fed critics like Jeff Snider say the central bank is “bad at its job” — but what if its job is to enrich the American elite and keep borrowing costs for spending on activities like war low? Then we might say it has done pretty well.
According to Alden, one reason the U.S. has a much higher wealth concentration than the rest of the developed world is that it spends more on the military as a percentage of GDP, which is not generally the most constructive use of spending for domestic human flourishing. She says the U.S. could, for example, have instead put the $1 trillion spent on war borrowing toward payroll tax cuts for workers or infrastructure, or simply held a lower debt-to-GDP ratio. She points to Japan as a society whose very high debt is spent almost entirely domestically, on keeping healthcare cheap and maintaining the social contract, and which as a result has less populism, less polarization, higher median wealth and so on.
But America is not Japan. Its easy monetary policy is not reducing inequality, it is exacerbating it. And one of the biggest factors is war spending.
“By 2030,” according to Heidi Peltier, “Americans will have spent over $2 trillion on [war] interest alone, not for anything productive or even any military action that could ostensibly make us safer and more secure. The costs to the country are thus more than simply the funds used on war versus on peaceful activities, but they are even more importantly the funds wasted on interest payments rather than on productive investments, useful programs, or lower taxes. Rather than spending 2.4 percent of our GDP on interest payments, how else could we productively be using those funds?”
In sum, a significant externality of keeping interest rates low for national security reasons is increased inequality in the U.S., a rich-get-richer scenario above and beyond even what was seen in the 1920s.
If America’s political system were not built on a mix of debt monetization and unaccountable war — paying for military expenditures without public consent — one wonders what the U.S. warfare state would look like.
One imagines a more limited operation, more focused on defending the homeland from real threats, and only undertaking actions that are popular with the public, lest they get defunded.
IX. Financial Crashes And Debt-Financed War
In his provocative book, “The Political Economy Of American Hegemony,” political economist Thomas Oatley argues that the debt-financed U.S. military buildups of the 1960s, 1980s and 2000s led respectively to currency collapse, banking collapse and real estate collapse.
Oatley argues that debt-financed military buildups in the fiat currency age actually end up causing recessions. He looks at the Vietnam buildup in the late 1960s, followed by dollar devaluation and the end of the gold standard; the anti-Soviet buildup in the 1980s, triggered by the Soviet invasion of Afghanistan and followed by the Savings And Loan Crisis and Black Monday; and the War On Terror, triggered by 9/11 and followed by the Great Financial Crisis.
His conclusion is that when the United States borrows to fight wars, the economy goes into a deficit, overheats and crashes: another externality of debt-financing military conflict to add to increased inequality.
According to Oatley, “postwar military buildups have constituted large economic events — they have increased government spending on average by roughly 2% of GDP for four or more consecutive years. To put this in context, consider that the American Recovery and Reinvestment Act (ARRA), enacted in February 2009 as an economic stimulus package to combat the Great Recession, increased government spending by $230 billion, or approximately 1.5% of GDP, in 2009 and 2010… The typical postwar military buildup thus has had a proportionately larger and more sustained impact on government expenditures than the fiscal stimulus enacted to combat America’s deepest postwar recession.”
In essence, Oatley argues that military spending leads to cyclical economic crashes, hurting the average American. He says the U.S. has not had a “run on the dollar” since the 1970s because of the rise of global demand for the dollar. Any other country might have collapsed, but since the dollar is the reserve currency, it is protected. Instead of manifesting in the form of currency devaluation, such pressures, Oatley argues, have come in the form of market collapse.
“America’s financial power,” Oatley writes, “allows the U.S. government to increase military spending sharply in response to foreign military challenges without needing to resolve political conflict over how to pay for it. Because the United States can import capital in large volumes at low cost for extended periods, policymakers face little diffuse market pressure to agree on deficit-reduction measures. And the ease with which the U.S. attracts foreign capital implies that the private sector is not facing higher borrowing costs as a result of government borrowing either. Hence, the corporate sector has little reason to pressure the government to balance the budget and the financial sector profits from intermediating the larger volume of funds flowing into the American economy. Financial power enables the U.S. government to increase military spending without having to cut social welfare programs, without having to reduce private consumption, and without having to reduce private sector investment.”
And so, just like the Fed removed the bond market as a check on power against military spending, dollar hegemony also removes the debt burden as a check on power against military spending.
“American policymakers,” Oatley writes, “discovered they lived in a world in which capital was available in potentially limitless supply. Access to global financial markets would allow the state to defer indefinitely the difficult political choices [as it struggled] to allocate capital between competing social priorities.”
It is a potent combination: dollar supremacy and QE. But it is not sustainable.
X. Declining Foreign Demand For U.S. Debt
The recent expansion of the Federal Reserve’s intervention in the U.S. bond market comes at an important geopolitical moment.
U.S. public debt is reaching a danger zone. America’s debt-to-GDP ratio is now at an all-time high of more than 130%, even higher than its peak during World War II. The Congressional Budget Office is projecting $112 trillion in new deficits over the next three decades, which would push the debt past 200% of GDP. In that future world, interest payments on debt would be the largest federal expenditure, consuming nearly half of all tax revenues.
“When a country starts getting to about 100% debt-to-GDP, the situation becomes nearly unrecoverable,” writes Alden. “There is a vanishingly small probability that the bonds will be able to avoid default and pay interest rates that are higher than the prevailing rate of inflation. In other words, those bonds will most likely begin to lose a meaningful amount of purchasing power for those creditors who lent money to those governments, one way or another.”
Alden goes on to write that “out of 51 cases of govt. debt breaking above 130% of GDP since 1800, 50 governments have defaulted.” The only exception, she notes, is Japan, which — unlike the U.S. — is the largest creditor nation in the world.
She assesses the debt-to-revenue ratio of the U.S. government today as around “$32.5 trillion divided by $4.25 trillion, or about 7.6x.”
If America was a company, she says, “it would be junk bond status.” She points out that each 1% increase on interest rates for $30 trillion of debt is an additional $300 billion per year in expenses. Alden calls the post-9/11 wars the “event horizon” for U.S. fiscal policy, “since they added trillions to the national debt without much of an increase to GDP.”
As Manhattan Institute senior fellow Brian Riedl writes, “If Washington finds that mounting debt is putting its fiscal sustainability at the mercy of interest rates, there is little doubt that presidents, Treasury secretaries, and Congress will pressure the Federal Reserve to pledge artificially low interest rates, including monetizing much of the debt, if necessary.”
This, of course, can only happen if America can keep the buying spree of its debt going.
A major trend in the fiat standard era is the U.S. government trying to find buyers of its debt. For many decades, it was successful, often by coercion.
In the late 1960s, when the U.S. balance-of-payments deficit first became a major concern, and as America began its permanent shift into the role of debtor nation, the issue was addressed partly through Germany: President Johnson used threats to force the West Germans to buy more U.S. treasuries than they otherwise would have.
Next were the Organization of the Petroleum Exporting Countries (OPEC) states. With the creation of the petrodollar system in 1974, the newly-rich OPEC states led by Saudi Arabia agreed to price oil in dollars and recycle their dollar windfalls back into U.S. debt in exchange for weapons and protection. In the 1980s, Japan was next, forced to buy U.S. debt as a result of the Plaza Accord and other international agreements.
In the 2000s, the U.S. government spent enormous resources pursuing a policy that would result in China stockpiling U.S. debt, including pushing it into the World Trade Organization, which helped Beijing earn dollars that it recycled into more than $1 trillion of treasuries.
From the era of Vietnam to that of Iraq, the Germans, Japanese, OPEC nations and finally the Chinese provided marginal buying pressure for U.S. debt, allowing America to keep expanding its warfare state even as its manufacturing base shrank.
As the dollar became the world reserve currency, there were of course free-market reasons why investors flocked to U.S. debt: America has, after all, the world’s most powerful economy and is the least likely to default. But the coercive tactics described above produced a system with even more demand, and even lower rates, than would otherwise have been possible.
With the post-9/11 wars relying on the bond market, foreigners — including the Germans, Japanese, Saudis and Chinese — initially helped finance U.S. military operations, funding up to 40% of all war spending between 2001 and 2020. But now things are changing.
Financial analyst Luke Gromen has pointed out that over the last decade, major countries have stopped or slowed their buying of U.S. treasuries. The dynamic began to shift with the Great Financial Crisis: shocked by the U.S. effort to bail out financial markets, the Chinese government began to question the credibility of U.S. debt. In 2013, it made the shift official, sharply reducing its purchases. Many other countries followed suit, and the percentage of foreign ownership of U.S. debt has declined significantly over the past decade.
China, which quadrupled its holdings to $1.3 trillion between 2004 and 2012, actually reduced its net holdings in the past decade, as did Japan and Germany. Partly because of the post-9/11 wars, and partly because of the Great Financial Crisis, trust in the dollar system has started to wane. U.S. bonds have lost around 4% of their value in the first few months of 2022.
As Gromen points out, pre-GFC, foreigners owned around 60% of U.S. debt. Today, their holdings are down below 40%. The gap has been made up by the Fed outright and a market that knows the Fed will be the buyer of last resort.
The key point is that without QE and ZIRP, which have taken trillions of dollars of the U.S. government’s own debt off the global marketplace, yields on treasuries would be higher and the forever wars would have to be cut short.
There are those who say that the Federal Reserve has very little power over interest rates, and that the age of low rates is the product not of Fed policy but of growing global demand for U.S. debt in the eurodollar system amid an era of deflation and dollar shortages. The implication is that the world, voluntarily and out of self-interest, is pumping up the U.S. military state, even against its own broader interests, because it wants treasuries — the international base money of the last 50 years.
There is more than a grain of truth to this. Governments, private firms and individuals worldwide do need and want dollars, especially in times of crisis.
But would treasuries be as valuable, as in demand, and therefore as cheap for the U.S. government to pay back, if there were no bond market intervention by the American central bank? When an actor buys nearly $9 trillion of something, it has an impact on the market.
In the end, what is quite clear is that the current American global warfare state relies on QE-driven domestic demand for treasuries. Few Americans would accept the end result of less savings, more wars and less citizen control over state policy if they knew what was going on.
XI. The Rise Of Bitcoin Peace Theory
To recap this essay so far:
- America’s post-9/11 wars have been paid for entirely through borrowing, and have become increasingly distant from daily life and public discourse
- The U.S. government has engaged in unprecedented intervention in the bond markets, which has helped keep the price of borrowing for war low
- Negative externalities of debt-financed war include a rise in inequality due to asset inflation, as well as cyclical economic crises
- The only way to keep this system going is more debt monetization through issuing new bonds and QE, given that foreign demand for U.S. debt has peaked
- Financing for war through borrowing makes conflict more likely, endangers democratic peace theory, and ultimately erodes democracy itself
Are debt-based monetary systems more belligerent than commodity-based monetary systems? One thing is for sure: The former allows wars to be extended far beyond what would otherwise be possible.
Consider Putin. After invading Ukraine, he is now largely cut off from the international financial system. He cannot easily borrow from the international markets. Yes, he has strategic reserves, a low level of government debt, a balanced budget and a flow of cash coming in for oil and gas. But war is extremely expensive — this one in particular costing him $20 billion per day — and his regime has other costs. If his Ukraine operation is not immediately successful, Putin must draw down his reserves — which will run out in a matter of months — or devalue the ruble to fund the war. He can of course do some QE, but is not in a position to do unlimited bond buying.
Putin cannot keep a war going forever without imposing real costs on his citizens, who may eventually push back. And that is a good thing. The United States, by contrast, found, at the apex of its power, a magical way to finance wars without restraint.
By abusing its privilege as the issuer of the world’s reserve currency, America has imposed war costs on future populations, making it possible to fight prolonged conflicts on several continents without the consent or even knowledge of the public.
This is the endgame of fiat central banking: an ostensibly “democratic” government that spends in an unaccountable way, ultimately enriching a tiny few at the expense of the rest. This final state may be further amplified by the rise of central bank digital currencies, which are designed to replace banknotes and coinage as “cash” in the real economy, and which would give governments the ability to easily hand out helicopter money, impose negative interest rates, set expiration dates on savings, operate political blacklists, install a total financial surveillance state and further accumulate power over private industry.
As part of the research process for this essay, I spoke to a securities trader who dealt extensively with the Fed and primary dealer system over the past decade.
He explained that the power of the Federal Reserve enables a lot of Congressional spending, and that without Fed intervention, rates would be higher, leading to higher taxes, which would prompt greater public interest in how money is spent. In short, Fed intervention has helped hide spending from the public and given the state unchecked power.
Americans find themselves in a situation today where war creditors buying treasuries do not necessarily know that they are paying for war, and where the fiat currency-powered central banking system behind it all is propping up a bloated, inefficient and undemocratic warfare state.
There are three potential solutions to this problem.
First would be conscription. If every American had to enroll in military service, citizens might debate warfare far more than they do today. Existential or just wars would still be fought, but there would be serious hesitancy to send family and friends abroad for anything short of a threat with the gravity of Pearl Harbor.
Second would be a reinstatement of war taxes and liberty bonds. A new flat tax on the American people, explicitly labeled as paying for the global War On Terror and accompanied by clear instructions on what it would fund, would help. So would a robust liberty bond effort, in which the U.S. government would have to sell a percentage of its treasuries marked as such and allow them to trade on the free market. Perhaps the Fed would even be prohibited from buying them and inflating their value.
Conscription and aggressive new war taxation are not only morally debatable, but politically impossible. That leaves a third alternative to the status quo: a change in the monetary system to a Bitcoin standard.
Now, of course, no central bank would ever choose to give up its control over money. No group of bureaucrats would ever put restraints on themselves. But Bitcoin may force their hand. In its first decade, it has grown from a mysterious post on a cypherpunk message board to a trillion-dollar asset, and given global macroeconomic policy — in which extreme inflation, financial censorship, onerous sanctions, intrusive surveillance and exploitative payment companies are the new norm — it has considerable upside for global adoption.
As the only digital currency in the world with a credibly-predictable monetary policy, Bitcoin could very well continue to grow, and eat into the store-of-value roles currently held by gold, real estate, stocks and negative-yielding government bonds. It is not out of the question that one day, bitcoin could become the global reserve currency, and an asset that governments compete to attain through mining, taxation, incentives or confiscation.
Beyond this, there is a possibility that Bitcoin also becomes the globally-desired medium of exchange for citizens everywhere. While this may seem far-fetched today, consider Thiers’ law, an economic trend observed in dollarizing countries, where the local fiat becomes so poor that good money drives out the bad. Similarly, over time, merchants may want your bitcoin, not your fiat, driving government-created currencies out of circulation, or at least significantly reducing their use.
This would be the Bitcoin standard, and in that timeline, the Fed, operating with a reserve account of BTC, could not simply buy infinite assets. Once it ran low on its bitcoin reserve, it would necessarily have to tax or sell bonds at unsubsidized rates to buy more. America’s economic calculations would look more similar to those made by most countries around the world today, which have to think carefully about saving, and make hard choices about spending, to avoid drawing down their reserves.
This may sound like the gold standard — which was killed by governments, who were able to seize, centralize and demonetize the precious metal — but it is different in two critical aspects.
Unlike gold, the production of which is held tightly by a handful of megacorporations, bitcoin operates from software scattered pseudonymously across the globe on tens of thousands of privately-operated servers. Its users are strongly incentivized to never download and run a new version of the software with more than 21 million bitcoin. And unlike gold, bitcoin is primarily held by individuals, not governments or corporations. This makes it much harder to depress its fair market price over long periods of time.
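The 21-million cap referenced above is not a promise but an arithmetic consequence of Bitcoin’s issuance schedule, which every node enforces. Below is a minimal sketch of that schedule, loosely mirroring the halving logic in Bitcoin Core’s GetBlockSubsidy, written here as standalone Python for illustration:

```python
# Sum Bitcoin's block subsidies to show why total issuance converges just under 21M BTC.
COIN = 100_000_000           # satoshis per bitcoin
HALVING_INTERVAL = 210_000   # blocks between subsidy halvings
INITIAL_SUBSIDY = 50 * COIN  # starting block reward, in satoshis

total_sats = 0
subsidy = INITIAL_SUBSIDY
while subsidy > 0:
    total_sats += subsidy * HALVING_INTERVAL
    subsidy >>= 1            # integer halving, as in the consensus rules

print(f"Maximum supply: {total_sats / COIN:,.4f} BTC")  # ~20,999,999.9769 BTC
```

Because the subsidy is halved in whole satoshis every 210,000 blocks, the series eventually reaches zero, and total issuance converges just below 21 million BTC. Changing that number would require users to voluntarily run software with different rules, which is exactly the incentive problem described above.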
Looming global macro conditions make further Bitcoin adoption even more probable. A decade of low interest rates and high inflation likely awaits. This financial repression will continue to drive individuals towards money which cannot be debased.
Under a Bitcoin standard, governments would be more constrained. They would still be able to borrow to pay for expenses, issue fiat currency and wage popular wars. But they would have to be much more transparent with the public about spending, as states would depend more tightly on the people’s consent and cooperation for revenue, and interest rates on sovereign bonds could not be as easily manipulated.
Yes, all spending would come under a more watchful eye in a Bitcoin standard. But consider what would get cut first in such a scenario: spending on forever wars in faraway lands that only tend to enrich military contractors, or spending on upgrading domestic infrastructure, education and healthcare? The American system, which already tends to finance social entitlements with taxes and foreign military action with borrowing, might be telling us the answer.
A March 1, 2022 poll from Rasmussen suggested that 53% of Democrats and 49% of Republicans thought the U.S. military should join a wider war, if one broke out in Europe. One wonders what the level of support would be if the questions were based on cost, and not just sentiment: Would you support a war tax? Buy liberty bonds? Endorse a return to conscription?
Perhaps the American people, in the tradition of World War II, would view such a war as existential for democracy, and would push for U.S. involvement with their own blood and treasure. Maybe they would wait to engage until directly attacked, as they were at Pearl Harbor. Either way, a broadly popular war can be fought under any monetary standard. But forever wars in the Middle East and Asia, disconnected from the lives of average Americans, are only possible under the fiat standard. A Bitcoin standard would reject them.
The cost of war might be dangerously invisible in democracies today. But it does not have to be forever.
This is a guest post by Alex Gladstein. Opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc or Bitcoin Magazine.