Sunday, May 31, 2020

Will There Be A New War In Asia?

Saturday, May 30, 2020

The Spanish Flu Didn't Wreck the Global Economy

The St. Louis Red Cross Motor Corps in 1918
Library of Congress / The New York Times / Redux

In October 1918, the Spanish flu descended on Stanford University. Residents donned facemasks, football games were canceled, and students were asked to quarantine on campus. But classes and assemblies continued to meet. And in addition to fulfilling their regular academic obligations, male students trained to combat German machine guns and poison gas in World War I. Over a tenth of all students fell ill, and a dozen died—roughly in line with the 45,000 cases and 3,000 fatalities recorded in nearby San Francisco. Yet faculty and students started to abandon face coverings just a month after the initial outbreak. Football returned to campus shortly thereafter, even as the disease lingered throughout the winter.

The contrast with the current coronavirus pandemic is striking. I cannot enter my office at Stanford without special permission from the dean. Almost all undergraduates have left campus, and everyone who can is required to work online. The university hospital, recently rebuilt to the tune of $2 billion, had to cut pay by a fifth for all of its 14,000 employees as anxious patients put off treatment. San Francisco County, now almost twice as populous as a century ago, has reported 2,400 infections and 40 deaths—a per capita fatality rate 99.2 percent lower than that of the 1918–19 pandemic. But two full months after California Governor Gavin Newsom ordered residents to “shelter in place,” the prospect of even a gradual return to normalcy remains elusive at best.

Scaling up California’s experience by several orders of magnitude gives a good sense of the state of the world right now. Several hundred million workers have lost their jobs. Global GDP is set to decline by a greater percentage than at any time since the Great Depression. One and a half billion students—some 90 percent of the world’s total—have been affected by school shutdowns. Most societies now face a prolonged economic slump that will derail and blight countless lives.

The economic fallout from the Spanish flu was far less dramatic. In the United States, industrial output fell sharply but rebounded within a few months. Retail was barely affected, and businesses did not declare bankruptcy at higher rates than usual. According to the latest econometric analysis, the pandemic of 1918–19 cut the United States’ real GDP and consumption by no more than two percent. The same appears to have been true for most advanced Western economies.

Yet the Spanish flu may turn out to have been far deadlier than the novel coronavirus. It killed at least 550,000 Americans—0.5 percent of the population. Adjusted for population growth over the last century, this would work out to a little under two million deaths today, close to the number predicted in the worst-case, zero-distancing scenario for the coronavirus that Imperial College London published in March. Death rates in 1918–19 were far higher outside of the industrialized world. Worldwide, the Spanish flu carried off 40 million people, or two percent of humanity, equivalent to more than 150 million people today. Even worse, it stalked not only the elderly and infirm but also infants and those in their twenties and thirties. This squeezed the workforce and snuffed out the lives of many who had just started families, leaving behind spouses and children to fend for themselves in a sink-or-swim society.
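
The population scaling in the paragraph above is simple arithmetic; the sketch below reproduces it with rough population figures that are my own assumptions, since the article does not state them.

# A hedged back-of-the-envelope check of the mortality scaling above.
# The population figures are approximations, not numbers from the article.
us_pop_1918 = 103e6       # approximate U.S. population in 1918
us_pop_today = 330e6      # approximate U.S. population in 2020
world_pop_1918 = 1.9e9    # approximate world population in 1918
world_pop_today = 7.8e9   # approximate world population in 2020

us_share = 550_000 / us_pop_1918        # roughly 0.5 percent of the population
world_share = 40e6 / world_pop_1918     # roughly two percent of humanity

print(f"U.S. equivalent today: {us_share * us_pop_today / 1e6:.1f} million deaths")      # a little under two million
print(f"World equivalent today: {world_share * world_pop_today / 1e6:.0f} million deaths")  # more than 150 million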

So why did this ferocious pandemic fail to wreck the economy? The answer is deceptively simple: for the most part, whether by necessity or choice, people barreled through.

Authorities in many countries recommended hand washing and the use of handkerchiefs as face coverings. In the United States, measures varied widely from city to city and state to state, but across the country, local officials closed many schools and large public venues. For the most part, however, nonessential businesses remained open, and customer demand was sufficiently robust to keep them afloat without the help of costly stimulus packages.

Were the lives of Americans back then worth less than they are today? Only in the most technical sense. In recent years, various U.S. government agencies have set the value of a human life at around $10 million. Estimates in other high-income societies are not far behind. A century ago, no one would have thought of putting similarly hefty price tags on human beings. More to the point, life was shorter overall. In the mid-1910s, mean life expectancy at birth in the United States was only two-thirds of what it is now. Worldwide, it has doubled since.

What is more, a century ago Americans still inhabited a physical and mental universe that had not yet been sanitized by modern science. The older generation would have remembered catastrophic outbreaks of cholera and yellow fever. There were no vaccines for influenza, tuberculosis, tetanus, diphtheria, typhus, measles, or polio, no antibacterial sulfonamide drugs, no penicillin, no antiviral drugs, and no chemotherapy. Wealth offered limited protection at best: more often than not, the rich and poor were in it together.

Over the last hundred years, peace, medicine, and prosperity have steered humanity toward greater comfort, safety, and predictability. For the first time in history, the residents of the developed world have good reason to expect science to shield and heal them. To varying degrees, these expectations have also taken hold in developing countries as income and education have expanded, hunger and premature death have receded, and conscription has gone out of fashion. People expect more from life and behave accordingly.

It may be tempting to take the collective embrace of lockdowns and social-distancing measures as signs that higher expectations have made people kinder, ready to shoulder economic burdens in order to protect the elderly, the immunocompromised, and the plain unlucky in their midst. But diligent citizens under lockdown ought to be wary of congratulating themselves for letting the better angels of their nature take flight. Empathy remains in short supply: if Americans really cared about refugees or those affected by their foreign wars, their politics would look quite different. Their kindness does not extend even to their fellow citizens—witness the endless plight of the un- or underinsured and those doomed in so many ways by their ZIP codes.

Viewed against this unflattering background, the response of many Americans to the pandemic can be more plausibly explained by the fear—unfamiliar in these times of prosperity and science—that the next victim could be a vulnerable spouse, a devoted parent, or a beloved grandparent. It is these personal anxieties and tribal empathies that have sucked the oxygen out of the economy and put lives on hold.

For the first time in history, many in the developed world can afford to give free rein to their anxieties. Even 20 years ago, hardly anybody could have worked or studied from home. Technology alone has made sustained distancing feasible, even tolerable. But not for all. The days when Stanford students braved the same risks to life and limb as today’s cops and cashiers are long gone. Expectations of life have grown across the board, yet more for some than for others.

Today, the selective empathy of privilege amplifies existing inequalities. Thanks to Social Security and Medicare, Americans have long been in the habit of transferring wealth from young to old. But now they have taken the more radical step of destroying resources—by shrinking the economy—to safeguard the often few remaining years of those most at risk from COVID-19, the disease caused by the novel coronavirus. Technology renders this gambit least painful for the most protected, those who can hope to ride out the storm from the relative security of their home offices and higher-paying work.

Meanwhile, a large part of society is left behind, mired in unemployment and precarity or stuck in face-to-face jobs that promise ongoing exposure. The young and the poor, already held down by inequality, debt, and fading prospects of social mobility, are bound to pay the heaviest price.

Pundits have yet to tire of predicting how this crisis will change everything. But will the unnerving experience of this pandemic also inspire humanity to review some of the loftier expectations we have nurtured? We must face up to the tradeoffs we are rushing to accept with scant regard for those who can least afford them.

  • WALTER SCHEIDEL is Professor of Classics and History at Stanford University.

Friday, May 29, 2020

The Age of Magic Money: Can Endless Spending Prevent Economic Calamity?

Jerome Powell, the Chair of the U.S. Federal Reserve, at a press conference in Washington, D.C., January 2020
Liu Jie / Xinhua / eyevine / Redux
Crises can drive change, but sometimes it takes two crises to cement a transformation. Alone, the Great Depression ushered in the New Deal, roughly tripling U.S. federal spending as a share of output. But it took World War II to push federal spending much higher, solidifying the role of the state in the U.S. economy. If federal interventions such as the creation of the interstate highway system felt natural by the mid-1950s, it was the result of two compounding shocks, not a single one.
American history offers many such examples. Alone, the Vietnam War might have triggered a decline of trust in the government. It took the compounding shock of Watergate to make that decline precipitous. Alone, the collapse of the Soviet Union would have enhanced U.S. power. It took the strong performance of the U.S. economy in the 1990s to spark talk of a “unipolar moment.” Alone, technological advances would have fueled inequality in the first decade of this century. Globalization reinforced that fracturing.
Today, the United States and other advanced countries are experiencing the second wave of an especially powerful twin shock. Taken individually, either the global financial crisis of 2008 or the global pandemic of 2020 would have been enough to change public finances, driving governments to create and borrow money freely. Combined, these two crises are set to transform the spending power of the state. A new era of assertive and expansive government beckons. Call it the age of magic money.
The twin shocks will change the balance of power in the world, because their effects will vary across countries, depending on the credibility and cohesion of each country’s economic institutions. Japan, with a long history of low inflation and a competent national central bank, has already shown that it can borrow and spend far more than one might have predicted given its already high levels of public debt. The United Kingdom, which has a worrisome trade deficit but strong traditions of public finance, should be able to manage an expansion of government spending without adverse consequences. The eurozone, an ungainly cross between an economic federation and a bickering assemblage of proud nation-states, will be slower to exploit the new opportunities. Meanwhile, emerging economies, which weathered the 2008 crisis, will enter a hard phase. Weaker states will succumb to debt crises.
The new era will present the biggest potential rewards—and also the greatest risks—to the United States. As the issuer of the world’s most trusted financial assets, the United States will be able to use (and maybe abuse) the new financial powers most ambitiously. Thanks partly to the dollar’s entrenched position as the world’s reserve currency, the United States will be able to sustain an expansion in government spending on priorities as varied as scientific research, education, and national security. At the same time, the U.S. national debt will swell, and its management will depend crucially on the credibility of the Federal Reserve. In times of high national debt, U.S. presidents since Harry Truman have tried to subjugate the central bank. If the Fed loses its independence, the age of magic money could end in catastrophe.

“WHATEVER IT TAKES”

The financial crisis of 2008 left its mark on the world by magnifying the power of central banks in the advanced economies. In the days immediately after Lehman Brothers filed for bankruptcy, in September of that year, Ben Bernanke, the U.S. Federal Reserve chair, offered an early glimpse of the economy’s new rules by pumping $85 billion of public funds into the American International Group (AIG), an insurer. When Representative Barney Frank, Democrat of Massachusetts, was informed of this plan, he skeptically inquired whether the Fed had as much as $85 billion on hand. “We have $800 billion,” Bernanke answered simply. Armed with the nation’s printing press, Bernanke was saying, the Fed can conjure as many dollars as it wants. The iron law of scarcity need not apply to central bankers.
The AIG rescue was only the beginning. The Fed scooped toxic assets off the balance sheets of a long list of failing lenders in order to stabilize them. It embraced the new tool of “quantitative easing,” which involves creating money to buy long-term bonds, thus suppressing long-term interest rates and stimulating the economy. By the end of 2008, the Fed had pumped $1.3 trillion into the economy, a sum equivalent to one-third of the annual federal budget. The central bank’s traditional toolkit, involving the manipulation of short-term interest rates, had been dramatically expanded.
These ambitious moves were mirrored in other advanced economies. The Bank of England also embraced quantitative easing, buying bonds on the same scale as the Fed (adjusting for the size of the British economy). The Bank of Japan had experimented with quantitative easing since 2001, but following the financial crisis, it redoubled those efforts; since 2013, it has created more money relative to GDP than any other mature economy. The European Central Bank’s response was halting for many years, owing to resistance from Germany and other northern member states, but in 2015, it joined the party. Combined, these “big four” central banks injected about $13 trillion into their economies in the decade after the financial crisis.
The crisis brought on by the novel coronavirus has emboldened central banks still further. Before the pandemic, economists worried that quantitative easing would soon cease to be effective or politically acceptable. There were additional concerns that post-2008 legislation had constrained the power of the Fed to conduct rescues. “The government enjoys even less emergency authority than it did before the crisis,” former Treasury Secretary Timothy Geithner wrote in these pages in 2017. But as soon as the pandemic hit, such fears were dispelled. “I was among many who were worried a month ago about the limited scope of the Fed arsenal,” the respected investor Howard Marks confessed recently. “Now we see the vast extent of the Fed’s potential toolkit.”
The Fed rode into battle in March, promising that the range of its actions would be effectively limitless. “When it comes to lending, we are not going to run out of ammunition,” declared Jerome Powell, the Fed chair. Whereas the Fed’s first two rounds of quantitative easing, launched in 2008 and 2010, had involved a preannounced quantity of purchases, Powell’s stance was deliberately open ended. In this, he was following the precedent set in 2012 by Mario Draghi, then the president of the European Central Bank, who pledged to do “whatever it takes” to contain Europe’s debt crisis. But Draghi’s promise was an inspired bluff, since the willingness of northern European states to support limitless intervention was uncertain. In contrast, nobody today doubts that the Fed has the backing of the U.S. president and Congress to deliver on its maximalist rhetoric. This is “whatever it takes” on steroids.
The Fed’s muscular promises have been matched with immediate actions. During March and the first half of April, the Fed pumped more than $2 trillion into the economy, an intervention almost twice as vigorous as it delivered in the six weeks after the fall of Lehman Brothers. Meanwhile, market economists project that the central bank will buy more than $5 trillion of additional debt by the end of 2021, dwarfing its combined purchases from 2008 to 2015. Other central banks are following the same path, albeit not on the same scale. As of the end of April, the European Central Bank was on track for $3.4 trillion of easing, and Japan and the United Kingdom had promised a combined $1.5 trillion.
The design of the Fed’s programs is leading it into new territory. After Lehman’s failure, the Fed was leery of bailing out nonfinancial companies whose stability was marginal to the functioning of the financial system. Today, the Fed is buying corporate bonds—including risky junk bonds—to ensure that companies can borrow. It is also working with the Treasury Department and Congress to get loans to small and medium-sized businesses. The Fed has emerged as the lender of last resort not just to Wall Street but also to Main Street.
As the Fed expands its reach, it is jeopardizing its traditional claim to be a narrow, technocratic agency standing outside politics. In the past, the Fed steered clear of Main Street lending precisely because it had no wish to decide which companies deserved bailouts and which should hit the wall. Such invidious choices were best left to democratically elected politicians, who had a mandate to set social priorities. But the old demarcation between monetary technicians and budgetary politics has blurred. The Fed has emerged as the biggest agent of big government, a sort of economics superministry.

MONEY FOR NOTHING

This leads to the second expansion of governments’ financial power resulting from the coronavirus crisis. The pandemic has shown that central banks are not the only ones that can conjure money out of thin air; finance ministries can also perform a derivative magic of their own. If authorized by lawmakers and backed by central banks, national treasuries can borrow and spend without practical limit, mocking the normal laws of economic gravity.
The key to this new power lies in the strange disappearance of inflation. Since the 2008 crisis, prices in the advanced economies have risen by less than the desired target of about two percent annually. As a result, one of the main risks of budget deficits has vanished, at least for the moment. In the pre-2008 world, governments that spent more than they collected in taxes were creating a risk of inflation, which often forced central banks to raise interest rates: as a form of stimulus, budget deficits were therefore viewed as self-defeating. But in the post-2008 world, with inflation quiescent, budget authorities can deliver stimulatory deficits without fear that central banks will counteract them. Increased inequality has moved wealth into the hands of citizens who are more likely to save than to spend. Reduced competition has allowed companies with market power to get away with spending less on investments and wages. Cloud computing and digital marketplaces have made it possible to spend less on equipment and hiring when launching companies. Thanks to these factors and perhaps others, demand has not outgrown supply, so inflation has been minimal.
Whatever the precise reasons, the disappearance of inflation has allowed central banks to not merely tolerate budget deficits but also facilitate them. Governments are cutting taxes and boosting spending, financing the resulting deficits by issuing bonds. Those bonds are then bought from market investors by central banks as part of their quantitative easing. Because of these central bank purchases, the interest rate governments must pay to borrow goes down. Moreover, because central banks generally remit their profits back to government treasuries, these low interest payments are even lower than they seem, since they will be partially rebated. A finance ministry that sells debt to its national central bank is, roughly speaking, borrowing from itself. Just as central bankers are blurring the line between monetary policy and budgetary policy, so, too, are budgetary authorities acquiring some of the alchemical power of central bankers.
If low inflation and quantitative easing have made budget deficits cheap, the legacy of 2008 has also made them more desirable. In the wake of the financial crisis, quantitative easing helped the economy recover, but it also had drawbacks. Holding down long-term interest rates has the effect of boosting equity and bond prices, which makes it cheaper for companies to raise capital to invest. But it also delivers a handout to holders of financial assets—hardly the most deserving recipients of government assistance. It would therefore be better to rouse the economy with lower taxes and additional budgetary spending, since these can be targeted at citizens who need the help. The rise of populism since 2008 underscores the case for stimulus tools that are sensitive to inequality.
Outside the New York Stock Exchange, May 2020
Lucas Jackson / Reuters
Because budget deficits appear less costly and more desirable than before, governments in the advanced economies have embraced them with gusto. Again, the United States has led the way. In the wake of the financial crisis, in 2009, the country ran a federal budget deficit of 9.8 percent of GDP. Today, that number has roughly doubled. Other countries have followed the United States’ “don’t tax, just spend” policies, but less aggressively. At the end of April, Morgan Stanley estimated that Japan will run a deficit of 8.5 percent of GDP this year, less than half the U.S. ratio. The eurozone will be at 9.5 percent, and the United Kingdom, at 11.5 percent. China’s government, which led the world in the size of its stimulus after 2008, will not come close to rivaling the United States this time. It is likely to end up with a 2020 deficit of 12.3 percent, according to Morgan Stanley.
As the world’s strong economies borrow heavily to combat the coronavirus slump, fragile ones are finding that this option is off-limits. Far from increasing their borrowing, they have difficulty in maintaining their existing levels of debt, because their creditors refuse to roll over their loans at the first hint of a crisis. During the first two months of the pandemic, $100 billion of investment capital fled developing countries, according to the International Monetary Fund, and more than 90 countries have petitioned the IMF for assistance. In much of the developing world, there is no magic, only austerity.

AMERICA’S ADVANTAGE

Since the start of the pandemic, the United States has unleashed the world’s biggest monetary stimulus and the world’s biggest budgetary stimulus. Miraculously, it has been able to do this at virtually no cost. The pandemic has stimulated a flight to the relative safety of U.S. assets, and the Fed’s purchases have bid up the price of U.S. Treasury bonds. As the price of Treasuries rises, their interest yield goes down—in the first four months of this year, the yield on the ten-year bond fell by more than a full percentage point, dropping below one percent for the first time ever. Consequently, even though the stimulus has caused U.S. government debt to soar, the cost of servicing that debt has remained stable. Projections suggest that federal debt payments as a share of GDP will be the same as they would have been without the crisis. This may be the closest thing to a free lunch in economics.
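The inverse relation between Treasury prices and yields that underlies this windfall is easy to make concrete. A minimal sketch follows, using a hypothetical ten-year note with a 1.5 percent coupon rather than any actual Treasury issue: discounting the same fixed payments at a lower rate raises their present value, so a higher price and a lower yield are two descriptions of the same move.

def bond_price(coupon, face, years, yield_rate):
    # Present value of an annual-coupon bond at a given yield.
    pv_coupons = sum(coupon / (1 + yield_rate) ** t for t in range(1, years + 1))
    pv_face = face / (1 + yield_rate) ** years
    return pv_coupons + pv_face

# Hypothetical ten-year note: 1.5% annual coupon on a $100 face value.
for y in (0.020, 0.015, 0.008):
    print(f"yield {y:.1%} -> price {bond_price(1.5, 100, 10, y):.2f}")
# Prices rise as yields fall: investors bidding the price up are,
# equivalently, accepting a lower yield.
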
The world’s top economies have all enjoyed some version of this windfall, but the U.S. experience remains distinctive. Nominal ten-year government interest rates are lower in Canada, France, Germany, Japan, and the United Kingdom than in the United States, but only Germany’s is lower after adjusting for inflation. Moreover, the rate in the United States has adjusted the most since the pandemic began. Germany’s ten-year government rate, to cite one contrasting example, is negative but has come down only marginally since the start of February—and has actually risen since last September. Likewise, China’s ten-year bond rate has come down since the start of this year but by half as much as the U.S. rate. Meanwhile, some emerging economies have seen their borrowing costs move in the opposite direction. Between mid-February and the end of April, Indonesia’s rate rose from around 6.5 percent to just under eight percent, and South Africa’s jumped from under nine percent to over 12 percent, although that increase has since subsided.
The United States’ ability to borrow safely and cheaply from global savers reflects the dollar’s status as the world’s reserve currency. In the wake of the 2008 crisis, when the failures of U.S. financial regulation and monetary policy destabilized the world, there was much talk that the dollar’s dominance might end, and China made a concerted effort to spread the use of the yuan beyond its borders. A decade or so later, China has built up its government-bond market, making it the second largest in the world. But foreigners must still contend with China’s capital controls, and the offshore market for yuan-denominated bonds, which Beijing promoted with much fanfare a decade ago, has failed to gain traction. As a result, the yuan accounts for just two percent of global central bank reserves. Private savers are starting to hold Chinese bonds, but these still represent a tiny fraction of their portfolios.
As China struggles to internationalize the yuan, the dollar remains the currency that savers covet. Despite the financial crisis and the widespread perception that U.S. influence in the world has declined, almost two-thirds of central bank reserves are still composed of dollars. Nor has the frequent U.S. resort to financial sanctions changed the picture, even though such sanctions create an incentive for countries such as Iran to develop ways around the dollar-based financial system. Issuing the global reserve currency turns out to be a highly sustainable source of power. The dollar continues to rally in times of uncertainty, even when erratic U.S. policies add to that uncertainty—hence the appreciation of the dollar since the start of the pandemic.
The dollar’s preeminence endures because of powerful network effects. Savers all over the world want dollars for the same reason that schoolchildren all over the world learn English: a currency or a language is useful to the extent that others choose it. Just under half of all international debt securities are denominated in dollars, so savers need dollars to buy these financial instruments. The converse is also true: because savers are accustomed to transacting in dollars, issuers of securities find it attractive to sell equities or bonds into the dollar market. So long as global capital markets operate mainly in dollars, the dollar will be at the center of financial crises—failing banks and businesses will have to be rescued with dollars, since that will be the currency in which they have borrowed. As a result, prudent central banks will hold large dollar reserves. These network effects are likely to protect the status of the dollar for the foreseeable future.

OUR CURRENCY, YOUR PROBLEM

In the age of magic money, this advantage will prove potent. At moments of stress, the United States will experience capital inflows even as the Federal Reserve pushes dollar interest rates down, rendering capital plentiful and inexpensive. Meanwhile, other countries will be treated less generously by the bond markets, and some will be penalized by borrowing costs that rise at the least opportune moment.
A strong financial system has always given great powers an edge: a bit over two centuries ago, the United Kingdom’s superior access to loans helped it defeat Napoleon. Today, finance has more sway over countries and people than ever before. But even as it bolsters U.S. power, finance has become riskier. The risk is evident in the ballooning U.S. federal debt burden. As recently as 2001, the federal debt held by the public amounted to just 31 percent of GDP. After the financial crisis, the ratio more than doubled. Now, thanks to the second of the twin shocks, federal debt held by the public will soon match the 106 percent record set at the end of World War II.
Whether this debt triggers a crisis will depend on the behavior of interest rates. Before the pandemic, the Congressional Budget Office expected the average interest rate on the debt to hover around 2.5 percent. The Fed’s aggressive bond buying has pulled U.S. rates lower—hence the free lunch. But even if interest rates went back to what they were before, the debt would still be sustainable: servicing it would cost more than the average of 1.5 percent of GDP that the country has paid over the past two decades but still less than the peak of 3.2 percent of GDP reached at the start of the 1990s.
Another way of gauging debt sustainability is to compare debt payments with the growth outlook. If nominal growth—real growth plus inflation—outstrips debt payments, a country can usually grow out of its problem. In the United States, estimates of real sustainable growth range from 1.7 percent to 2.0 percent; estimates of future inflation range from the 1.5 percent expected by the markets to the Fed’s official target of 2.0 percent. Putting these together, U.S. nominal growth is likely to average around 3.6 percent. If debt service payments are 2.5 percent of GDP, and if the government meets those obligations by borrowing and so expanding the debt stock, nominal growth of 3.6 percent implies that the federal government can run a modest deficit in the rest of its budget and still whittle away at the debt-to-GDP ratio.
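A minimal sketch of that arithmetic, using the article's figures for interest costs and nominal growth; the starting debt ratio of about 100 percent of GDP and the size of the "modest" remaining deficit are assumptions added for illustration, not numbers from the text.

# Illustrative debt-to-GDP dynamics under the figures quoted above.
debt_ratio = 1.00        # assumed starting debt stock, as a share of GDP
interest_rate = 0.025    # average interest rate on the debt (~2.5%)
nominal_growth = 0.036   # real growth plus inflation (~3.6%)
primary_deficit = 0.005  # assumed "modest" deficit excluding interest, as a share of GDP

for year in range(1, 11):
    # Interest and the primary deficit are both financed by new borrowing,
    # while nominal GDP in the denominator grows at 3.6 percent.
    debt_ratio = (debt_ratio * (1 + interest_rate) + primary_deficit) / (1 + nominal_growth)
    print(f"year {year:2d}: debt/GDP = {debt_ratio:.3f}")
# Because nominal growth exceeds the interest rate, the ratio drifts down
# even though the government never runs an overall surplus.
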
Japan’s experience reinforces the point that high levels of debt can be surprisingly sustainable. The country’s central government debt passed 100 percent of GDP in 2000, and the ratio has since almost doubled, to nearly 200 percent. Yet Japan has not experienced a debt crisis. Instead, interest rates have declined, keeping the cost of servicing the debt at an affordable level. Japan’s track record also disproves the notion that high levels of debt impede vigorous emergency spending. The country’s pandemic stimulus is large, especially relative to the scale of its health challenge.
Pedestrians in downtown Tokyo, May 2020
Issei Kato / Reuters
In short, the recent prevalence of low interest rates across the rich world encourages the view that U.S. debt levels will be manageable, even if they expand further. The more central banks embrace quantitative easing, the lower interest rates are likely to remain: the rock-bottom yields on Japan’s government debt reflect the fact that the Bank of Japan has vacuumed up more than a third of it. In this environment of durably low interest rates, governments enter a looking-glass world: by taking on more debt, they can reduce the burden of the debt, since their debt-financed investments offset the debt by boosting GDP. Based on this logic, the age of magic money may usher in expanded federal investments in a wide range of sectors. When investors the world over clamor for U.S. government bonds, why not seize the opportunity?
The question is whether Tokyo’s experience—rising debt offset by falling interest rates—anticipates Washington’s future. For the moment, the two countries have one critical feature in common: a central bank that is eagerly engaged in quantitative easing. But that eagerness depends on quiescent inflation. Because of a strong tradition of saving, Japan has experienced outright deflation in 13 of the past 25 years, whereas the United States has experienced deflation in only one year over that period. The danger down the road is that the United States will face an unexpected price surge that in turn forces up interest rates faster than nominal GDP, rendering its debt unsustainable.
To see how this could work, think back to 1990. That year, the Fed’s favorite measure of inflation, the consumer price index, rose to 5.2 percent after having fallen to 1.6 percent four years earlier—thus proving that inflation reversals do happen. As inflation built, the Fed pushed up borrowing costs; rates on ten-year Treasury bonds went from about seven percent in late 1986 to over nine percent in 1988, and they hovered above eight percent in 1990. If a reversal of that sort occurred today, it could spell disaster. If long-term interest rates rose by two percentage points, the United States would face debt payments worth 4.5 percent of GDP rather than 2.5 percent. The burden of the national debt would hit a record.
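The jump from 2.5 to 4.5 percent of GDP follows directly once the debt stock is near 100 percent of GDP, the level the article says is being approached; a short check under that assumption:

# Assumes a debt stock of roughly 100 percent of GDP (an approximation).
debt_to_gdp = 1.0
baseline_rate = 0.025                 # the CBO's pre-pandemic expectation (~2.5%)
shocked_rate = baseline_rate + 0.02   # a two-percentage-point rise in long-term rates
print(f"debt service: {debt_to_gdp * baseline_rate:.1%} of GDP "
      f"-> {debt_to_gdp * shocked_rate:.1%} of GDP")
# roughly 2.5 percent of GDP rising to about 4.5 percent
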
That would have significant political consequences. In 1990, the unsustainable debt trajectory forced the adoption of a painful deficit-cutting package, causing President George H. W. Bush to renege on his “no new taxes” campaign pledge, arguably costing him the 1992 election. Given today’s political cynicism, it seems unwise to count on a repeat of such self-sacrifice. It is therefore worth recalling the other debt-management tactic that Bush’s administration attempted. By attacking the Fed chair, Alan Greenspan, with whispered slanders and open scolding, Bush’s advisers tried to bully the central bank into cutting interest rates. The way they saw things, lower rates, faster growth, and higher inflation would combine to solve the debt problem.
Greenspan stood his ground, and Bush was not reckless enough to get rid of him. But if a future president were more desperate, the Fed could be saddled with a leader who prioritized the stability of the national debt over the stability of prices. Considering the Fed’s recent business bailouts, it would be a small step to argue that the central bank also has a duty to protect citizens from budget austerity. Given its undershooting of the inflation target over the past few years, it would be easy to suggest that a bit of overshooting would be harmless. Unfortunately, if not checked fairly quickly, this seductive logic could open the way to a repeat of the 1970s, when U.S. financial mismanagement allowed inflation to reach double digits and the dollar came closer than ever in the postwar period to losing its privileged status.
The age of magic money heralds both opportunity and peril. The twin shocks of 2008 and 2020 have unleashed the spending power of rich-world governments, particularly in the United States. They have made it possible to imagine public investments that might speed growth, soften inequality, and tackle environmental challenges. But too much of a good thing could trigger a dollar crisis that would spread worldwide. As U.S. Treasury Secretary John Connally put it to his European counterparts in 1971, “The dollar is our currency but your problem.”

THE FED’S DILEMMA

Nobody is sure why inflation disappeared or when it might return. A supply disruption resulting from post-pandemic deglobalization could cause bottlenecks and a price surge; a rebound in the cost of energy, recently at absurd lows, is another plausible trigger. Honest observers will admit that there are too many unknowns to make forecasting dependable. Yet precisely because the future is uncertain and contingent, a different kind of prediction seems safe. If inflation does break out, the choices of a handful of individuals will determine whether finance goes over the precipice.
The United States experienced an analogous moment in 1950. China had sent 300,000 infantry across the frozen Yalu River, which marked its border with Korea; they swarmed U.S. soldiers sleeping on the frigid ground, stabbing them to death through their sleeping bags. The following month, with the fate of the Cold War as uncertain as it would ever be, U.S. President Harry Truman called Thomas McCabe, the Fed chair, at home and insisted that the interest rate on ten-year bonds stay capped at 2.5 percent. If the Fed failed to buy enough bonds to keep the interest rate at that level, “that is exactly what Mr. Stalin wants,” the president lectured. In a time of escalating war, the government’s borrowing capacity had to be safeguarded.
This presented the Fed with the kind of dilemma that it may confront again in the future. On the one hand, the nation was in peril. On the other hand, inflation was accelerating. The Fed had to choose between solving an embattled president’s problem and stabilizing prices. To Truman’s fury, McCabe resolved to put the fight against inflation first; when the president replaced McCabe with William McChesney Martin, a Treasury official Truman expected would be loyal, he was even more shocked to find that his own man defied him. In his first speech after taking office, Martin declared that inflation was “an even more serious threat to the vitality of our country than the more spectacular aggressions of enemies outside our borders.” Price stability should not be sacrificed, even if the president had other priorities.
Years later, Truman encountered Martin on a street in New York City. “Traitor,” he said, and then walked off. Before the age of magic money comes to an end, the United States might find itself in need of more such traitors.
  • SEBASTIAN MALLABY is Paul A. Volcker Senior Fellow for International Economics at the Council on Foreign Relations.

Thursday, May 28, 2020

The EU Recovery Plan

The EU Recovery Plan: A "Merkel" but not a "Hamilton" moment
by Hung Tran

EU Economic Affairs Commissioner Paolo Gentiloni holds a joint press conference with European Commission Vice-President Valdis Dombrovskis on Recovery and Resilience at the European Commission headquarters in Brussels, Belgium May 28, 2020. Aris Oikonomou/Pool via REUTERS
Amid high expectations, the European Commission has just released its proposed €750 billion recovery plan, labeled “Next Generation EU,” as a “temporary reinforcement” of its draft budget of €1.1 trillion for 2021-2027. Built on the suggestion by German Chancellor Angela Merkel and French President Emmanuel Macron of a €500 billion recovery fund financed by EU borrowing on capital markets and disbursed to members as grants, the package has been viewed by some observers as signaling a “Hamilton” moment for the EU. The analogy is to the agreement Alexander Hamilton secured in 1790 for the US federal government to assume the states’ debts (used to finance the Revolutionary War) and to fund those debts by issuing federal government bonds to be repaid by new tariffs on imports—thus strengthening the financial position of the newly established federal government. While the EU recovery plan is a good step toward more fiscal cohesion, it is nowhere near fostering a fiscal union—hence the analogy is not quite accurate.
The EU recovery plan consists of three pillars. The first pillar contains the most important instrument, the €560 billion Recovery and Resilience Facility, €310 billion of which will be in grants and €250 billion in loans (adding in other programs, the grant portion will come to €451 billion of the €750 billion total). This recovery facility will support regions and sectors in need, with projects in line with EU priorities for a green and digital transformation of the economy. In addition, there is €95 billion to top up cohesion policy programs and to support a Just Transition Fund to help member states move toward climate neutrality. The second pillar aims to kickstart the economy by leveraging €56.3 billion, ideally by a factor of ten, to attract private-sector participation in providing solvency support for viable companies and other investment projects. The third pillar, of €38.7 billion, addresses the health care needs revealed by the COVID-19 pandemic.
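As a quick consistency check on the figures just listed, the pillar amounts do sum to the €750 billion headline (a minimal sketch; grouping the €95 billion cohesion top-up under the first pillar follows the description above):

# Figures in billions of euros, as quoted in the plan description above.
recovery_and_resilience = 560.0    # pillar 1 core facility (310 in grants + 250 in loans)
cohesion_top_up = 95.0             # cohesion programs and the Just Transition Fund
solvency_support_seed = 56.3       # pillar 2 amount, intended to be leveraged ~10x
health_pillar = 38.7               # pillar 3
total = recovery_and_resilience + cohesion_top_up + solvency_support_seed + health_pillar
print(f"Total: EUR {total:.0f} billion")   # -> 750
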
There are several novel features in the proposal. First, the EU will borrow the €750 billion on international capital markets. This represents a big step-up in issuance: the EU’s traditionally modest bond issues to fund various programs have left an outstanding amount of only about €50 billion. The new issuance should be welcomed by market participants eager to acquire safe European assets rated AAA (AA by S&P). The bonds will be repaid over the period 2028-2058 from the EU budget, to which member states contribute according to their relative shares of the EU economy. As such, the new EU borrowing is not the joint Eurobond or Coronabond that some have proposed, with “joint and several” liability among the member states.
Second, grants in substantial amounts are a new feature compared with the traditional use of loans to support members in need, albeit at lower interest rates than those members can get on capital markets. However, since member states contribute to the EU budget, the net transfer portion of any grant to a member is much less than the headline numbers may suggest. For example, according to Commission staff estimates, the net transfer to Italy, taking into account its allocation (including the grant element) of €153 billion and its budget contribution, will amount to €56.7 billion (3.2% of gross domestic product—GDP) over the next several years: a useful amount but not a game changer, given that Italy’s government debt will jump by about 20 percentage points of GDP to more than 155 percent this year. As expected, the northern countries, including Germany and the “frugal four” (the Netherlands, Austria, Denmark, and Sweden), will become net contributors to the plan, in the range of 3.5%-5.4% of Gross National Income (GNI).
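For concreteness, a back-of-the-envelope reading of those Italy figures; the implied budget contribution and GDP level below are derived from the quoted numbers rather than stated in the article, so treat them as rough assumptions.

# Quoted figures, in billions of euros, from the Commission staff estimates cited above.
gross_allocation = 153.0    # Italy's allocation, including the grant element
net_transfer = 56.7         # net transfer after Italy's own budget contribution

# Derived implications, not numbers given in the text.
implied_contribution = gross_allocation - net_transfer   # roughly 96 billion over the period
implied_gdp = net_transfer / 0.032                        # roughly 1,770 billion, since 56.7 is 3.2% of GDP
print(f"Implied contribution: EUR {implied_contribution:.1f} billion")
print(f"Implied GDP: EUR {implied_gdp:.0f} billion")
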
Third, the Commission has proposed to raise its own-resources ceiling (the maximum amount of resources that can be called from member states in any given year) from the current 1.2% of GNI to 1.4% to accommodate Brexit, and to add a “temporary and ring-fenced” element of 0.6%, bringing the ceiling to 2 percent of GNI, so as to create sufficient fiscal headroom to borrow in capital markets. The 0.6% temporary element will be removed when the new bond issue is paid off. Furthermore, the Commission has also suggested new ways to raise its own revenues: an extended Emission Trading System (estimated to raise €10 billion per year); a tax on big corporations benefiting from the Single Market (raising €10 billion per year); a carbon border adjustment mechanism (raising €5-14 billion per year); and a digital tax on companies with global turnover greater than €750 million (raising €1.3 billion per year—this, however, would put the EU in conflict with the United States, which has threatened to retaliate with tariffs if any European country implements a digital tax).
Counting the previously approved €540 billion stimulus package (for unemployment insurance support and the European Stability Mechanism facility), the just-announced €750 billion “Next Generation EU” recovery plan, and the €1.1 trillion 2021-2027 budget it reinforces (itself an almost 15% increase over the 2014-2020 budget), the total fiscal mobilization comes to €2.39 trillion—quite a robust and unprecedented step. The recovery plan and the 2021-2027 budget proposal will be on the agenda of the next European Council meeting on June 18-19, with intense debate, particularly with the “frugal four,” continuing until then. The fact that the proposal contains a mix of loans and grants (with a somewhat smaller grant amount than the Merkel-Macron suggestion), presented as a “one-off” temporary addition to the budget (to which all member states contribute and which they must unanimously approve), can create some room for compromise. It will then be debated in the European Parliament, with final approval planned for December.
On balance, the large volume of fiscal measures funded in part by substantial EU borrowing, the increase in the Commission’s own-resources ceiling and the new ways for it to raise its own revenues—which enable the Commission to implement more EU-wide measures—as well as the acceptance of the principle of grants, are positive steps that move a good way toward more fiscal cohesion and demonstrate solidarity with fiscally challenged member states. These moves have been positively received in financial markets, reducing pressure on the European Central Bank (ECB) to buy the bonds of highly indebted member countries. The proposal can be said to reflect a “Merkel” moment, part of her legacy to the cause of European integration.
However, these measures do not constitute a fiscal union—which can only come about by Treaty change. Here lies the relevance of the German Constitutional Court’s ruling in early May. The EU, the Euro Area, and the ECB can try to take incremental steps to sustain the Union and the euro during crises, but until the EU Treaty is changed from the “Articles of Confederation” to a “Federal Constitution,” tension within the Union and the Euro Area will remain.
Hung Tran is a nonresident senior fellow at the Atlantic Council and former executive managing director at the Institute of International Finance.


Chronicle of a Pandemic Foretold
Learning From the COVID-19 Failure—Before the Next Outbreak Arrives

Michael T. Osterholm and Mark Olshaker
MICHAEL T. OSTERHOLM is Regents Professor and Director of the Center for Infectious Disease Research and Policy at the University of Minnesota.
MARK OLSHAKER is a writer and documentary filmmaker.
They are the authors of Deadliest Enemy: Our War Against Killer Germs.


“Time is running out to prepare for the next pandemic. We must act now with decisiveness and purpose. Someday, after the next pandemic has come and gone, a commission much like the 9/11 Commission will be charged with determining how well government, business, and public health leaders prepared the world for the catastrophe when they had clear warning. What will be the verdict?”

That is from the concluding paragraph of an essay entitled “Preparing for the Next Pandemic” that one of us, Michael Osterholm, published in these pages in 2005. The next pandemic has now come, and even though COVID-19, the disease caused by the new coronavirus that emerged in late 2019, is far from gone, it is not too soon to reach a verdict on the world’s collective preparation. That verdict is a damning one.
There are two levels of preparation, long range and short range, and government, business, and public health leaders largely failed on both. Failure on the first level is akin to having been warned by meteorologists that a Category 5 hurricane would one day make a direct hit on New Orleans and doing nothing to strengthen levees, construct water-diversion systems, or develop a comprehensive emergency plan. Failure on the second is akin to knowing that a massive low-pressure system is moving across the Atlantic toward the Gulf of Mexico and not promptly issuing evacuation orders or adequately stocking emergency shelters. When Hurricane Katrina hit New Orleans on August 29, 2005, preparation on both levels was inadequate, and the region suffered massive losses of life and property as a result. The analogous failure both over recent decades to prepare for an eventual pandemic and over recent months to prepare for the spread of this particular pandemic has had an even steeper toll, on a national and global scale.

The long-term failure by governments and institutions to prepare for an infectious disease outbreak cannot be blamed on a lack of warning or an absence of concrete policy options. Nor should resources have been the constraint. After all, in the past two decades, the United States alone has spent countless billions on homeland security and counterterrorism to defend against human enemies, losing sight of the demonstrably far greater threat posed by microbial enemies; terrorists don’t have the capacity to bring Americans’ way of life to a screeching halt, something COVID-19 accomplished handily in a matter of weeks.

And then, in addition to the preparations that should have been started many years ago, there are the preparations that should have started several months ago, as soon as reports of an unknown communicable disease that could kill started coming out of China. The public health community has for years known with certainty that another major pandemic was on the way, and then another one after that—not if but when. Mother Nature has always had the upper hand, and now she has at her disposal all the trappings of the modern world to extend her reach. The current crisis will eventually end, either when a vaccine is available or when enough of the global population has developed immunity (if lasting immunity is even possible), which would likely require some two-thirds of the total population to become infected. Neither of those ends will come quickly, and the human and economic costs in the meantime will be enormous.

Yet some future microbial outbreak will be bigger and deadlier still. In other words, this pandemic is probably not “the Big One,” the prospect of which haunts the nightmares of epidemiologists and public health officials everywhere. The next pandemic will most likely be a novel influenza virus with the same devastating impact as the pandemic of 1918, which circled the globe two and a half times over the course of more than a year, in recurring waves, killing many more people than the brutal and bloody war that preceded it. Examining why the United States and the world are in this current crisis is thus not simply a matter of accountability or assigning blame. Just as this pandemic was in many ways foretold, the next one will be, as well. If the world doesn’t learn the right lessons from its failure to prepare and act on them with the speed, resources, and political and societal commitment they deserve, the toll next time could be considerably steeper. Terrible as it is, COVID-19 should serve as a warning of how much worse a pandemic could be—and spur the necessary action to contain an outbreak before it is again too late.

WAKE-UP CALL

For anyone who wasn’t focused on the threat of an infectious disease pandemic before, the wake-up call should have come with the 2003 outbreak of SARS. A coronavirus—so named because, under an electron microscope, the proteins projecting out from the virion’s surface resemble a corona, a halo-like astronomical phenomenon—jumped from palm civets and ferret badgers in the markets of Guangdong, China, made its way to Hong Kong, and then spread to countries around the world. By the time the outbreak was stopped, the animal sources eliminated from the markets, and infected people isolated, 8,098 cases had been reported and 774 people had died.

Nine years later, in 2012, another life-threatening coronavirus, MERS, spread across the Arabian Peninsula. In this instance, the virus originated in dromedaries, a type of camel. (Since camel owners in the Middle East understandably will not kill their valuable and culturally important animals, MERS remains a regional public health challenge.) Both coronaviruses were harbingers of things to come (as we wrote in our 2017 book, Deadliest Enemy), even if, unlike COVID-19, which can be transmitted by carriers not even aware they have it, SARS and MERS tend not to become highly infectious until the fifth or sixth day of symptomatic illness.

SARS, MERS, and a number of other recent outbreaks—the 2009 H1N1 flu pandemic that started in Mexico, the 2014–16 Ebola epidemic in West Africa, the 2015–16 spread of the Zika flavivirus from the Pacific Islands to North and South America—have differed from one another in a number of ways, including their clinical presentation, their degree of severity, and their means of transmission. But all have had one notable thing in common: they all came as surprises, and they shouldn’t have.

For years, epidemiologists and public health experts had been calling for the development of concrete plans for handling the first months and years of a pandemic. Such a “detailed operational blueprint,” as “Preparing for the Next Pandemic” put it in 2005, would have to involve everyone from private-sector food producers, medical suppliers, and health-care providers to public-sector health, law enforcement, and emergency-management officials. And it would have to anticipate “the pandemic-related collapse of worldwide trade . . . the first real test of the resiliency of the modern global delivery system.” Similar calls came from experts and officials around the world, and yet they largely went unheeded.

PREEXISTING CONDITIONS

If anything, despite such warnings, the state of preparedness has gotten worse rather than better in recent years—especially in the United States. The problem was not just deteriorating public health infrastructure but also changes in global trade and production. During the 2003 SARS outbreak, few people worried about supply chains. Now, global supply chains are significantly complicating the U.S. response. The United States has become far more dependent on China and other nations for critical drugs and medical supplies. The Center for Infectious Disease Research and Policy at the University of Minnesota (where one of us, Osterholm, is the director) has identified 156 acute critical drugs frequently used in the United States—the drugs without which patients would die within hours. All these drugs are generic; most are now made overseas; and many of them, or their active pharmaceutical ingredients, are manufactured in China or India. A pandemic that idles Asian factories or shuts down shipping routes thus threatens the already strained supply of these drugs to Western hospitals, and it doesn’t matter how good a modern hospital is if the bottles and vials on the crash cart are empty. (And in a strategic showdown with its great-power rival, China might use its ability to withhold critical drugs to devastating effect.)

Financial pressure on hospitals and health systems has also left them less able to handle added stress. In any pandemic-level outbreak, a pernicious ripple effect disturbs the health-care equilibrium. The stepped-up need for ventilators and the tranquilizing and paralytic drugs that accompany their use produces a greater need for kidney dialysis and the therapeutic agents that requires, and so on down the line. Even speculation that the antimalarial hydroxychloroquine might be useful in the treatment of COVID-19 caused a shortage of the drug for patients with rheumatoid arthritis and lupus, who depend on it for their daily well-being. It remains unclear what impact COVID-19 has had on the number of deaths due to other conditions, such as heart attacks. Even if it’s mostly a matter of patients with severe or life-threatening chronic conditions avoiding care to minimize their risk of exposure to the virus, this could ultimately prove to be serious collateral damage of the pandemic.

In normal times, the United States’ hospitals have little in the way of reserves and therefore little to no surge capacity for emergency situations: not enough beds, not enough emergency equipment such as mechanical ventilators, not enough N95 masks and other personal protective equipment (PPE). The result during a pandemic is the equivalent of sending soldiers into battle without enough helmets or rifles. The National Pharmaceutical Stockpile was created during the Clinton administration and renamed the Strategic National Stockpile in 2003. It has never had sufficient reserves to meet the kind of crisis underway today, and it is fair to say that no administration has devoted the resources to make it fully functional in a large-scale emergency.

Even more of an impediment to a rapid and efficient pandemic response is underinvestment in vaccine research and development. In 2006, Congress established the Biomedical Advanced Research and Development Authority (BARDA). Its charge is to provide an integrated and systematic approach to the development and purchase of vaccines, drugs, and diagnostic tools that will become critical in public health emergencies. But it has been chronically underfunded, and the need to go to Congress and ask for new money every year has all but killed the possibility of major long-term projects.

Following the 2014–16 West African Ebola outbreak, there was a
clear recognition of the inadequacy of international investment in
new vaccines for regional epidemic diseases such as Ebola, Lassa fever,
Nipah virus disease, and Zika, despite the efforts of BARDA and
other government and philanthropic programs around the world. To address
this hole in preparedness, CEPI, the Coalition for Epidemic Preparedness
Innovations, a foundation that receives support from public, private,
philanthropic, and civil society organizations, was conceived in
2015 and formally launched in 2017. Its purpose is to finance independent
research projects to develop vaccines against emerging infectious
diseases. It was initially supported with $460 million from the Bill & Melinda Gates Foundation, the Wellcome Trust, and a consortium of
nations, including Germany, Japan, and Norway. Although CEPI has
been a central player since early this year in developing a vaccine for
SARS-CoV-2, the virus that causes Covid-19, the absence of a prior
major coronavirus vaccine initiative highlights the ongoing underinvestment
in global infectious disease preparedness.
Had the requisite financial and pharmaceutical resources gone into
developing a vaccine for SARS in 2003 or MERS in 2012, scientists already
would have done the essential research on how to achieve
coronavirus immunity, and there would likely be a vaccine platform
on which to build (such a platform is a technology or modality that
can be developed for a range of related diseases). Today, that head start would
have saved many precious months or even years.

FIRST SYMPTOMS

By late 2019, the lack of long-range preparation had gone on for years,
despite persistent warnings. Then, the short-range failure started. Early
surveillance data suggested to epidemiologists that a microbial storm
was brewing. But the action to prepare for that storm came far too slowly.
By the last week of December, reports of a new infectious disease
in the Chinese city of Wuhan and surrounding Hubei Province
were starting to make their way to the United States and
around the world. There is no question that the Chinese government
suppressed information during the first weeks of the outbreak,
evident especially in the shameful attempt to silence the
warnings of Li Wenliang, the 34-year-old ophthalmologist who tried
to alert the public about the threat. Yet even with such dissembling
and delay, the warning signs were clear enough by the start of this
year. For example, the Center for Infectious Disease Research and
Policy published its first description of the mystery disease on December
31 and publicly identified it as a novel coronavirus on January
8. And by January 11, China had published the complete genetic
sequence for the virus, at which point the World Health Organization
(WHO) immediately began developing a diagnostic test. By the
second half of January, epidemiologists were warning of a potential
pandemic (including one of us, Osterholm, on January 20). Yet the
U.S. government at the time was still dismissing the prospect of a
serious outbreak in the United States—despite valid suspicions
that the Chinese government was suppressing information on the
Wuhan outbreak and underreporting case figures. It was the moment
when preparation for a specific coming storm should have
started in earnest and quickly shifted into high gear.

U.S. President Donald Trump would later proffer the twin assertions
that he “felt it was a pandemic long before it was called a pandemic”
and that “nobody knew there’d be a pandemic or an epidemic of
this proportion.” But on January 29, Peter Navarro, Trump’s trade adviser,
wrote a memo to the National Security Council warning that when
the coronavirus in China reached U.S. soil, it could risk the health or
lives of millions and cost the economy trillions of dollars. That same day,
as reported by The Wall Street Journal, Alex Azar, the health and human
services secretary, told the president that the potential epidemic was well
under control. Navarro sent an even more urgent memo on February 23,
according to The New York Times, pointing to an “increasing probability
of a full-blown Covid-19 pandemic that could infect as many as 100
million Americans, with a loss of life of as many as 1–2 million souls.”
Washington’s lack of an adequate response to such warnings is by
now a matter of public record. Viewing the initially low numbers of
clinically recognized cases outside China, key U.S. officials were either
unaware of or in denial about the risks of exponential viral spread. If
an infectious disease spreads from person to person and each individual
case causes two more, the total numbers will remain low for a
while—and then take off. (It’s like the old demonstration: if you start
out with a penny and double it every day, you’ll have just 64 cents after
a week and $81.92 after two weeks, and then more than $5 million by
the end of a month.) Covid-19 cases do not typically double overnight,
but every five days is a pretty good benchmark, allowing for rapid
growth even from just a few cases. Once the virus had spread outside
East Asia, Iran and Italy were the first to experience this effect.
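
To make that arithmetic concrete, the growth follows a simple rule (the case numbers below are illustrative assumptions, not epidemiological estimates): with a doubling time of $d$ days, an initial count of $N_0$ cases grows as
\[ N(t) = N_0 \cdot 2^{t/d}, \]
so 100 undetected cases doubling every five days become roughly $100 \cdot 2^{30/5} = 6{,}400$ cases after a month and more than 400,000 after two months.
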
Even with the lack of long-range planning and investment, there was
much that the U.S. government could and should have done by way of
a short-range response. As soon as the novel and deadly coronavirus
was identified, Washington could have conducted a quick but comprehensive
review of national PPE requirements, which would have
led to the immediate ramping up of production for N95 masks and
protective gowns and gloves and plans to produce more mechanical
ventilators. Relying on the experience of other countries, it should
have put in place a comprehensive test-manufacturing capability
and been ready to institute testing and contact tracing while the number
of cases was still low, containing the virus as much as possible
wherever it cropped up. It could have appointed a supply chain coordinator
to work with governors, on a nonpartisan basis, to allocate and
distribute resources. At the same time, Congress could have been
drafting emergency-funding legislation for hospitals, to prepare them
for both the onslaught of Covid-19 patients and the sharp drop in
elective surgeries, routine hospitalizations, and visits by foreign patients,
all essential sources of revenue for many institutions.

Instead, the administration resisted calls to advise people to stay at
home and practice social distancing and was unable or unwilling to
coordinate a government-wide effort among relevant agencies and
departments. The Centers for Disease Control and Prevention initially
shipped its own version of a test to state public health labs, only to
find that it didn’t work. This should have immediately triggered an
elevation of the issue to a crisis-driven priority for both the CDC
and the U.S. Food and Drug Administration, including bringing
the private clinical laboratory industry into the process to help manufacture
test kits. Instead, the problem languished, and the FDA took
until the end of February to approve any independent tests. At that
point, the United States had 100 or so recognized cases of Covid-19.
A little over a week later, the number would break 1,000, and after
that, the president declared a national emergency.
In 1918, cities that reacted to the flu early, preventing public gatherings
and advising citizens to stay home, suffered far fewer casualties
overall. But for this approach to work, they had to have reliable information
from central authorities in public health and government,
which requires honesty, responsiveness, and credibility from the beginning.
In the current crisis, the output from the White House was
instead—and continues to be—a stream of self-congratulatory tweets,
mixed messages, and contradictory daily briefings in which Trump
simultaneously asserted far-reaching authority and control and denied
responsibility for anything that went wrong or didn’t get done.
Everything was the governors’ responsibility and fault—including
not planning ahead, the very thing the administration refused to do.
Two years earlier, it had even disbanded the pandemic-readiness arm
of the National Security Council.

“You go to war with the army you have, not the army you might
want or wish to have at a later time,” U.S. Secretary of Defense
Donald Rumsfeld famously declared in 2004, addressing U.S. troops
on the way to Iraq, where the military’s vehicles lacked armor that
could protect the service members inside from explosive devices. That grim
message could apply to the pandemic response, too, with, for example,
frontline health-care workers going to war against Covid-19
without PPE. But in many ways, the current situation is even worse.
The United States and other countries went to war against a rapidly
spreading infectious disease without a battle plan, sufficient personnel,
adequate facilities or stocks of equipment and supplies, a reliable
supply chain, centralized command, or a public educated about or
prepared for the struggle ahead.

In the absence of strong and consistent federal leadership, state
governors and many large-city mayors have taken the primary responsibility
of pandemic response on themselves, as they had to, given
that the White House had even advised them to find their own ventilators
and testing supplies. (And health-care workers, forced into
frontline treatment situations without adequate respiratory protection,
are of course the hero-soldiers of this war.) But fighting the virus
effectively demands that decision-makers start thinking strategically—
to determine whether the actions being taken right now are
effective and evidence-based—or else little will be accomplished despite
the best of intentions. In this regard, it is not too late for the
United States to take on its traditional leadership role and be an example
in this fight, rather than lagging behind, as it has so far, places
such as Germany, Hong Kong, Singapore, and South Korea, and even,
despite its initial missteps, China.

THE BIG ONE

Why did so many policymakers ignore the virus until it was too late
to slow it down? It was not a failure of imagination that prevented them
from understanding the dimensions and impact of a mass infectious
disease outbreak. In the United States, numerous high-level simulated
bioterror and pandemic tabletop exercises—from Dark Winter
in 2001 through Clade X in 2018 and Event 201 in 2019—have demonstrated
the confusion, poor decision-making, and lack of coordination
of resources and messaging that can undermine a response in the
absence of crisis contingency planning and preparation. The problem
is mainly structural, one that behavioral economists call “hyperbolic
discounting.” Because of hyperbolic discounting, explains Eric Dezenhall,
a crisis manager and one-time Reagan White House staffer who has long
studied the organizational reasons for action and inaction in government
and business, leaders “do what is easy and pays immediate
dividends rather than doing what is hard, where the dividends
seem remote. . . . With something like a pandemic, which sounds like
a phenomenon from another century, it seems too remote to plan for.”
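
In the textbook formulation of hyperbolic discounting (a standard behavioral-economics model, offered here only as an illustration, not one the authors cite), the present value $V$ of a benefit $A$ that arrives after a delay $D$ falls off as
\[ V = \frac{A}{1 + kD}, \]
where $k$ measures impatience. Because the denominator grows with the delay, a payoff that lies years or decades away, such as averting a hypothetical pandemic, is valued at next to nothing against an immediate, visible win.
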
The phenomenon is hardly new. Daniel Defoe relates in A Journal
of the Plague Year that in 1665, municipal authorities in London first
refused to accept that anything unusual was happening, then tried to
keep information from the public, until the spike in deaths made it
impossible to deny the much-feared bubonic plague. By that point, all
they could do was lock victims and their families in their homes in a
vain attempt to stop the spread.

Short of a global thermonuclear war and the long-term impact of
climate change, an infectious disease pandemic has the greatest potential
to devastate health and economic stability across the globe. All
other types of disasters and calamities are limited in geography and
duration—whether a hurricane, an earthquake, or a terrorist attack. A
pandemic can occur everywhere at once and last for months or years.
Worldwide mortality estimates for the 1918 influenza pandemic
range as high as 100 million—as a percentage of the global population,
equivalent to more than 400 million people today—making it easily
the worst natural disaster in modern times. So profound were the pandemic’s
effects that average life expectancy in the United States immediately
fell by more than ten years. Unlike a century ago, the world
today has four times the population; more than a billion international
border crossings each year; air travel that can connect almost any two
points on the globe in a matter of hours; wide-scale human encroachment
on forests and wildlife habitats; developing-world megacities in
which impoverished people live in close confines with others and
without adequate nutrition, sanitation, or medical care; industrial
farming in which animals are kept packed together; a significant overuse
of antibiotics in both human and animal populations; millions of
people living cheek by jowl with domestic birds and livestock (creating
what are essentially genetic reassortment laboratories); and a dependence
on international just-in-time supply chains with much of the
critical production concentrated in China.

The natural tendency might be to reassuringly assume that a century’s
worth of medical progress will make up for such added vulnerabilities.
(The human influenza virus wasn’t even discovered until 1933, when the
virologists Wilson Smith, Christopher Andrewes, and Patrick Laidlaw,
working at London’s National Institute for Medical Research, first isolated
the influenza A virus from the nasal secretions and throat washings of
infected patients.) That would be a grave misconception. Even in a nonpandemic
year, aggregated infectious diseases—including malaria, tuberculosis,
HIV/AIDS, seasonal influenza, diarrheal diseases, and vector-borne illnesses—
represent one of the major causes of death worldwide and by far the leading
cause of death in low-income countries, according to the WHO.

In fact, given those realities of modern life, a similarly virulent influenza
pandemic would be exponentially more devastating than the
one a century ago—as the current pandemic makes clear. In the absence
of a reliable vaccine produced in sufficient quantities to immunize
much of the planet, all the significant countermeasures to
prevent the spread of Covid-19 have been nonmedical: avoiding public
gatherings, sheltering in place, social distancing, wearing masks of
variable effectiveness, and washing hands frequently. As of this writing,
scientists and policymakers don’t even have a good handle on which of the
RT-PCR tests (which determine whether an individual currently has the
virus) and which of the serology tests (which detect antibodies and
indicate whether someone has already had it) are reliable. Meanwhile,
international demand for reagents—the chemicals that make
both kinds of tests work—and sampling swabs is already outstripping
supply and production. It is hard to conclude that the world today is
much better equipped to combat a massive pandemic than doctors,
public health personnel, and policymakers were 100 years ago.
Some are calling the Covid-19 pandemic a once-in-100-year event,
comparable to 100-year floods or earthquakes. But the fact that the
world is enduring a pandemic right now is no more predictive of when
the next one will occur than one roll of the dice is of the result of the next
roll. (Although the 1918 flu was the most devastating influenza pandemic
in history, an 1830–32 outbreak was similarly severe, only in a
world with around half of 1918’s population.) The next roll, or the one
after that, could really be “the Big One,” and it could make even the
current pandemic seem minor by comparison.

When it comes, a novel influenza pandemic could truly bring the
entire world to its knees—killing hundreds of millions or more, devastating
commerce, destabilizing governments, skewing the course of history for generations
to come. Unlike Covid-19, which tends to most seriously affect older people and those
with preexisting medical problems, the 1918 influenza took a particularly heavy toll on otherwise
healthy men and women between the ages of 18 and 40 (thought to be a result of their
more robust immune systems overreacting to the threat through a “cytokine storm”).
There is no reason to think that the next big novel influenza pandemic couldn’t have similar results.

PLANS VS. PLANNING

Humans do not have the power to prevent all epidemics or pandemics.
But with sufficient will, resources, and commitment, we do have
the power to mitigate their awesome potential for causing premature
deaths and attendant misery.
To begin with, Americans must change how they think about the
challenge. Although many people in the public health sphere don’t like
associating themselves with the military—they heal rather than kill, the
thinking goes—there is much that they can learn from military planning.
The military focuses on flexibility, logistics, and maintaining
readiness for any foreseeable situation. As U.S. General Dwight Eisenhower
noted, “Peace-time plans are of no particular value, but peace-time
planning is indispensable.”

The starting point should be to prioritize health threats in terms of
their likelihood and potential consequences if unchecked. First on
that list is a deadly virus that spreads by respiratory transmission
(coughing, sneezing, even simple breathing). By far the most likely
candidate would be another high-mortality influenza strain, like the
1918 one, although as revealed by SARS, MERS, Zika, and Covid-19,
new and deadly noninfluenza microbes are emerging or mutating in
unpredictable and dangerous ways.

Even before a specific threat has arisen, a broad group of actors
should be brought together to develop a comprehensive strategy—
with enough built-in flexibility that it can evolve as conditions demand—
and then they should repeatedly review and rehearse it. That
effort should involve everyone from high-level government and public
health officials to emergency responders, law enforcement, medical
experts and suppliers, food providers, manufacturers, and specialists
in transportation and communications. (As emergency planners are
fond of saying, you don’t want to be exchanging business cards at a
disaster site.) The strategy should offer an operational blueprint for how to get
through the one or two years a pandemic would likely last;
among the benefits of such a blueprint would be helping ensure that
leaders are psychologically prepared for what they might face in a
crisis, just as military training does for soldiers anticipating battlefield
conditions. The Bipartisan Commission on Biodefense—jointly
chaired by Tom Ridge, a former Pennsylvania governor and the first
secretary of homeland security under President George W. Bush,
and Joseph Lieberman, a former Democratic senator from
Connecticut—has suggested that the operation could be located in the
Office of the Vice President, with direct reporting to the president.
Wherever it is based, it must be run by a smart and responsible coordinator,
experienced in the mechanics of government and able to
communicate effectively with all parties—as Ron Klain was as Ebola
czar in the Obama administration.

In addition to the gaming out of various potential scenarios, adequate
preparation must include a military-like model of procurement
and production. The military doesn’t wait until war is declared to start
building aircraft carriers, fighter jets, or other weapons systems. It
develops weapons over a period of years, with congressional funding
projected over the entire development span. The same type of approach
is needed to develop the weapons systems to fight potential
pandemics. Relying solely on the market and the private sector to
take care of this is a recipe for failure, because in many cases, there
will be no viable customer other than the government to fund both
the development and the manufacturing process.

That has proved particularly true when it comes to drug development,
even when there is no pandemic. For many of the most critical
drugs, a market-driven approach that relies on private pharmaceutical
companies simply doesn’t work. The problem is evident, for example,
in the production of antibiotics. Because of the growing problem of
antimicrobial resistance—which threatens to bring back a pre-antibiotic
dark age, in which a cut or a scrape could kill and surgery was a
risk-filled nightmare—any powerful new antibiotic would likely be held
in reserve for only the most extreme cases in order to preserve its
effectiveness, so it makes little commercial sense for pharmaceutical companies
to devote enormous human and financial resources to developing one. But in a flu pandemic, such highly
effective antibiotics would be essential, since a primary cause of death
in recent flu outbreaks has been secondary bacterial pneumonia infecting
lungs weakened by the virus.


The same holds for developing vaccines or treatments for diseases
such as Ebola. Such drugs have virtually no sales most of the time but
are critical to averting an epidemic when an outbreak strikes. Governments
must be willing to subsidize the research, development, clinical
trials, and manufacturing capacity for such drugs the same way they
subsidize the development and manufacture of fighter planes and tanks.
Preparation for pandemics and for the necessary surge of medical
countermeasures will also require being more attentive to where drugs
and medical supplies are produced. In times of pandemic, every nation
will be competing for the same critical drugs and medical supplies at the
same time, so it is entirely reasonable to expect that each will prioritize
its own needs when distributing what it produces and controls. There is
also the ongoing threat that a localized infectious hot spot will close down
a manufacturing facility that produces critical drugs or medical supplies.
Despite the higher costs that it would involve, it is absolutely essential
that the United States lessen its dependence on China and India for its
lifesaving drugs and develop additional manufacturing capacity in the
United States itself and in reliably friendly Western nations.
The U.S. government must also get more strategic in overseeing
the Strategic National Stockpile. Not only does it need to perform
realistic evaluations of what should be on hand to meet surges in
demand at any given time, in order to avoid repeating the current
shame of not having enough PPE for health-care workers and first
responders; supplies should also be rotated in and out on a regular
basis, so that, for instance, the store doesn’t end up including masks
with degraded rubber bands or expired medications.

HOLISTIC TREATMENT

To make progress on either a specific vaccine or a vaccine platform for
diseases of pandemic potential, governments have to play a central
role. That includes funding basic research, development, and the
Phase 3 clinical trials necessary for validation and licensing. (This
phase is often referred to as “the valley of death,” because it is the
point at which many drugs with early laboratory promise don’t pan
out in real-world applications.) It is also imperative that governments
commit to purchasing these vaccines.
With its current concentration on the development of a vaccine for
Covid-19 and other medical countermeasures, BARDA has had to put
other projects on the back burner. For all the complaints about its cumbersome
contracting process and tight oversight controls (said by critics to stifle outside-the-box
thinking and experimentation), BARDA is the closest thing the U.S. government has
to a venture capital firm for epidemic response. Covid-19 should spur a commitment
to upgrading it, and a panel of experts should undertake a review of
BARDA’s annual budget and scope to determine what the agency needs
to meet and respond to future biomedical challenges.

Of all the vaccines that deserve priority, at the very top of the list
should be a “universal” influenza vaccine, which would be game changing.
Twice a year, once for the Northern Hemisphere and once for the
Southern Hemisphere, through an observational and not very precise
committee process, international public health officials try to guess which
flu strains are likely to flare up the next fall, and then they rush a new
vaccine based on these guesstimates into production and distribution.
The problem is that influenza can mutate and reassort its genes with
maddening ease as it passes from one living animal or human host to the
next, so each year’s seasonal flu vaccine is usually only partly effective—
better than nothing, but not a precise and directly targeted bullet like the
smallpox or the measles vaccine. The holy grail of influenza immunity
would be to develop a vaccine that targets the conserved elements of the
virus—that is, the parts that don’t change from one flu strain to the next,
no matter how many mutations or iterations the virus goes through.
A universal influenza vaccine would require a monumental scientific
effort, on the scale of the billion-dollar annual investment that
has gone into fighting HIV/AIDS. The price tag would be enormous,
but since another population-devouring flu pandemic will surely
visit itself on the globe at some point, the expense would be justified
many times over. Such a vaccine would be the greatest public health
triumph since the eradication of smallpox.

Of course, no single nation can fight a pandemic on its own. Microbes
do not respect borders, and they manage to figure out workarounds
to restrictions on international air travel. As the Nobel
Prize–winning molecular biologist Joshua Lederberg warned, “The
microbe that felled one child in a distant continent yesterday can reach
yours today and seed a global pandemic tomorrow.” With that insight
in mind, there should be a major, carefully coordinated disaster drill
every year, similar to the military exercises the United States holds
with its allies, but with a much broader range of partners. These should
involve governments, public health and emergency-response institutions,
and the major medically related manufacturing industries of
various nations that will need to work together quickly when worldwide
disease surveillance—another vital component of pandemic preparedness—
recognizes an outbreak.

The world was able to eradicate smallpox, one of the great
scourges of history, because the two superpowers, the United States
and the Soviet Union, both committed to doing so, following an appeal
at the 1958 convening of the World Health Assembly, the decision-
making body of the WHO. Today’s tense geopolitics makes such
a common commitment hard to achieve. But without it, there is
little chance of adequate preparation for the next pandemic. The
current global health architecture is far from sufficient. It has little
hope of containing an even more threatening outbreak. Instead,
something along the lines of NATO will be necessary—a public-health-oriented
treaty organization with prepositioned supplies, a deployment
blueprint, and an agreement among signatories that an epidemic outbreak
in one country will be met with a coordinated and equally vigorous
response by all. Such an organization could work in concert with the
WHO and other existing institutions but act with greater speed, efficiency, and resources.

It is easy enough to dismiss warnings of another 1918-like pandemic:
the next pandemic might not arise in our lifetimes, and by
the time it does, science may have come up with robust medical
countermeasures to contain it at lower human and economic cost.
These are reasonable possibilities. But reasonable enough to collectively
bet our lives on? History says otherwise.