Friday, October 2, 2020

Capitalism After the Pandemic: Getting Recovery Right

 

  • MARIANA MAZZUCATO is a Professor at University College London and the author of The Value of Everything: Making and Taking in the Global Economy.

After the 2008 financial crisis, governments across the world injected over $3 trillion into the financial system. The goal was to unfreeze credit markets and get the global economy working again. But instead of supporting the real economy—the part that involves the production of actual goods and services—the bulk of the aid ended up in the financial sector. Governments bailed out the big investment banks that had directly contributed to the crisis, and when the economy got going again, it was those companies that reaped the rewards of the recovery. Taxpayers, for their part, were left with a global economy that was just as broken, unequal, and carbon-intensive as before. “Never let a good crisis go to waste,” goes a popular policymaking maxim. But that is exactly what happened.

Now, as countries are reeling from the COVID-19 pandemic and the resulting lockdowns, they must avoid making the same mistake. In the months after the virus first surfaced, governments stepped in to address the concomitant economic and health crises, rolling out stimulus packages to protect jobs, issuing rules to slow the spread of the disease, and investing in the research and development of treatments and vaccines. These rescue efforts are necessary. But it is not enough for governments to simply intervene as the spender of last resort when markets fail or crises occur. They should actively shape markets so that they deliver the kind of long-term outcomes that benefit everyone.

The world missed the opportunity to do that back in 2008, but fate has handed it another chance. As countries climb out of the current crisis, they can do more than spur economic growth; they can steer the direction of that growth to build a better economy. Instead of handing out no-strings-attached assistance to corporations, they can condition their bailouts on policies that protect the public interest and tackle societal problems. They can require COVID-19 vaccines receiving public support to be made universally accessible. They can refuse to bail out companies that won’t curb their carbon emissions or won’t stop hiding their profits in tax havens.

For too long, governments have socialized risks but privatized rewards: the public has paid the price for cleaning up messes, but the benefits of those cleanups have accrued largely to companies and their investors. In times of need, many businesses are quick to ask for government help, yet in good times, they demand that the government step away. The COVID-19 crisis presents an opportunity to right this imbalance through a new style of dealmaking that forces bailed-out companies to act more in the public interest and allows taxpayers to share in the benefits of successes traditionally credited to the private sector alone. But if governments instead focus only on ending the immediate pain, without rewriting the rules of the game, then the economic growth that follows the crisis will be neither inclusive nor sustainable. Nor will it serve businesses interested in long-term growth opportunities. The intervention will have been a waste, and the missed opportunity will merely fuel a new crisis. 

THE ROT IN THE SYSTEM

Advanced economies had been suffering from major structural flaws well before COVID-19 hit. For one thing, finance is financing itself, thus eroding the foundation of long-term growth. Most of the financial sector’s profits are reinvested back into finance—banks, insurance companies, and real estate—rather than put toward productive uses such as infrastructure or innovation. Only ten percent of all British bank lending, for example, supports nonfinancial firms, with the rest going to real estate and financial assets. In advanced economies, real estate lending constituted about 35 percent of all bank lending in 1970; by 2007, it had risen to about 60 percent. The current structure of finance thus fuels a debt-driven system and speculative bubbles, which, when they burst, bring banks and others begging for government bailouts. 

Another problem is that many large businesses neglect long-term investments in favor of short-term gains. Obsessed with quarterly returns and stock prices, CEOs and corporate boards have rewarded shareholders by buying back stocks, increasing the value of the remaining shares and hence of the stock options that form part of most executive pay packages. In the last decade, Fortune 500 companies have repurchased more than $3 trillion worth of their own shares. These buybacks come at the expense of investment in wages, worker training, and research and development.

Then there is the hollowing out of government capacity. Governments usually step in only after an explicit market failure, and the policies they put forward then are too little, too late. When the state is viewed not as a partner in creating value but as a mere fixer, public institutions are starved of resources. Social programs, education, and health care all go underfunded.


These failures have added up to mega-crises, both economic and planetary. The financial crisis was to a large extent caused by excessive credit flowing into the real estate and financial sectors, inflating asset bubbles and household debt rather than supporting the real economy and generating sustainable growth. Meanwhile, the lack of long-term investments in green energy has hastened global warming, to the point where the UN Intergovernmental Panel on Climate Change has warned that the world has just ten years left to avoid its irreversible effects. And yet the U.S. government subsidizes fossil fuel companies to the tune of some $20 billion a year, largely through preferential tax exemptions. The EU’s subsidies total around $65 billion per year. At best, policymakers trying to deal with climate change are considering incentives, such as carbon taxes and official lists of which investments count as green. They have stopped short of issuing the type of mandatory regulations that are required to avert disaster by 2030.

The COVID-19 crisis has only worsened all these problems. For the moment, the world’s attention is focused on surviving the immediate health crisis, not on preventing the coming climate crisis or the next financial crisis. The lockdowns have devastated people who work in the perilous gig economy. Many of them lack both the savings and the employer benefits—namely, health care and sick leave—needed to ride out the storm. Corporate debt, a key cause of the previous financial crisis, is only climbing higher as companies take on hefty new loans to weather the collapse in demand. And many companies’ obsession with pleasing the short-term interests of their shareholders has left them with no long-term strategy to see them through the crisis.

The pandemic has also revealed how imbalanced the relationship between the public and the private sector has become. In the United States, the National Institutes of Health (NIH) invests some $40 billion a year on medical research and has been a key funder of the research and development of COVID-19 treatments and vaccines. But pharmaceutical companies are under no obligation to make the final products affordable to Americans, whose tax money is subsidizing them in the first place. The California-based company Gilead developed its COVID-19 drug, remdesivir, with $70.5 million in support from the federal government. In June, the company announced the price it would charge Americans for a treatment course: $3,120.

It was a typical move for Big Pharma. One study looked at the 210 drugs approved by the U.S. Food and Drug Administration from 2010 to 2016 and found that “NIH funding contributed to every one.” Even so, U.S. drug prices are the highest in the world. Pharmaceutical companies also act against the public interest by abusing the patent process. To ward off competition, they file patents that are very broad and hard to license. Some of them are too upstream in the development process, allowing companies to privatize not only the fruits of research but also the very tools for conducting it.


Equally bad deals have been made with Big Tech. In many ways, Silicon Valley is a product of the U.S. government’s investments in the development of high-risk technologies. The National Science Foundation funded the research behind the search algorithm that made Google famous. The U.S. Navy did the same for the GPS technology that Uber depends on. And the Defense Advanced Research Projects Agency, part of the Pentagon, backed the development of the Internet, touchscreen technology, Siri, and every other key component in the iPhone. Taxpayers took risks when they invested in these technologies, yet most of the technology companies that have benefited fail to pay their fair share of taxes. Then they have the audacity to fight against regulations that would protect the privacy rights of the public. And although many have pointed to the power of artificial intelligence and other technologies being developed in Silicon Valley, a closer look shows that in these cases, too, it was high-risk public investments that laid the foundations. Without government action, the gains from those investments could once again flow largely to private hands. Publicly funded technology needs to be better governed by the state—and in some cases owned by the state—in order to ensure that the public benefits from its own investments. As the mass closure of schools during the pandemic has made clear, only some students have access to the technology needed for at-home schooling, a disparity that only furthers inequality. Access to the Internet should be a right, not a privilege.

RETHINKING VALUE

All of this suggests that the relationship between the public and the private sector is broken. Fixing it requires first addressing an underlying problem in economics: the field has gotten the concept of value wrong. Modern economists understand value as interchangeable with price. This view would be anathema to earlier theorists such as François Quesnay, Adam Smith, and Karl Marx, who saw products as having intrinsic value related to the dynamics of production, value that wasn’t necessarily related to their price.

The contemporary concept of value has enormous implications for the way economies are structured. It affects how organizations are run, how activities are accounted for, how sectors are prioritized, how the government is viewed, and how national wealth is measured. The value of public education, for example, does not figure into a country’s GDP because it is free—but the cost of teachers’ salaries does. It is only natural, then, that so many people talk about public “spending” rather than public “investment.” This logic also explains why Goldman Sachs’s then CEO, Lloyd Blankfein, could claim in 2009, just a year after his company received a $10 billion bailout, that its workers were “among the most productive in the world.” After all, if value is price, and if Goldman Sachs’s income per employee is among the highest in the world, then of course its workers must be among the most productive in the world. 

Changing the status quo requires coming up with a new answer to the question, What is value? Here, it is essential to recognize the investments and creativity provided by a vast array of actors across the economy—not only businesses but also workers and public institutions. For too long, people have acted as if the private sector were the primary driver of innovation and value creation and therefore were entitled to the resulting profits. But this is simply not true. Pharmaceutical drugs, the Internet, nanotechnology, nuclear power, renewable energy—all were developed with an enormous amount of government investment and risk taking, on the backs of countless workers, and thanks to public infrastructure and institutions. Appreciating the contribution of this collective effort would make it easier to ensure that all efforts were properly remunerated and that the economic rewards of innovation were distributed more equitably. The road to a more symbiotic partnership between public and private institutions begins with the recognition that value is created collectively. 

BAD BAILOUTS

Beyond rethinking value, societies need to prioritize the long-term interests of stakeholders rather than the short-term interests of shareholders. In the current crisis, that should mean developing a “people’s vaccine” for COVID-19, one that is accessible to everyone on the planet. The drug-innovation process should be governed in a way that fosters collaboration and solidarity among countries, both during the research-and-development phase and when it comes time to distribute the vaccine. Patents should be pooled among universities, government labs, and private companies, allowing knowledge, data, and technology to flow freely around the world. Without these steps, a COVID-19 vaccine risks becoming an expensive product sold by a monopoly, a luxury good that only the richest countries and citizens can afford.

More generally, countries must also structure public investments less like handouts and more like attempts to shape the market to the public’s benefit, which means attaching strings to government assistance. During the pandemic, those conditions should promote three particular objectives: First, maintain employment to protect the productivity of businesses and the income security of households. Second, improve working conditions by providing adequate safety, decent wages, sufficient levels of sick pay, and a greater say in decision-making. Third, advance long-term missions such as reducing carbon emissions and applying the benefits of digitization to public services, from transport to health. 

The United States’ main response to COVID-19—the CARES (Coronavirus Aid, Relief, and Economic Security) Act, passed by Congress in March—illustrates these points in reverse. Rather than put in place effective payroll supports, as most other advanced countries did, the United States offered enhanced temporary unemployment benefits. This choice led to over 30 million workers being laid off, causing the United States to have one of the highest rates of pandemic-related unemployment in the developed world. Because the government offered trillions of dollars in both direct and indirect support to large corporations without meaningful conditions, many companies were free to take actions that could spread the virus, such as denying paid sick days to their employees and operating unsafe workplaces.

The CARES Act also established the Paycheck Protection Program, under which businesses received loans that would be forgiven if employees were kept on the payroll. But the PPP ended up serving more as a massive cash grant to corporate treasuries than as an effective method of saving jobs. Any small business, not just those in need, could receive a loan, and Congress quickly loosened the rules regarding how much a firm needed to spend on payroll to have the loan forgiven. As a result, the program put a pitifully small dent in unemployment. An MIT team concluded that the PPP handed out $500 billion in loans yet saved only 2.3 million jobs over roughly six months. Assuming that most of the loans are ultimately forgiven, the annualized cost of the program comes out to roughly $500,000 per job. Over the summer, both the PPP and the expanded unemployment benefits ran out, and the U.S. unemployment rate still exceeded ten percent.
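The cost-per-job arithmetic behind that figure can be sketched in a few lines. The $500 billion and 2.3 million numbers are the article's; the only assumption added here is that annualizing a six-month program means doubling the per-job cost.

```python
# Back-of-envelope check of the PPP cost-per-job figure cited above.
# Inputs are the article's figures: ~$500 billion in loans and ~2.3 million
# jobs saved over roughly six months (the MIT estimate).
loans = 500e9        # total PPP loans, in dollars
jobs_saved = 2.3e6   # jobs saved over ~6 months

per_job_six_months = loans / jobs_saved        # cost per job over six months
annualized_per_job = per_job_six_months * 2    # scale six months to a year

print(f"${per_job_six_months:,.0f} per job over six months")
print(f"${annualized_per_job:,.0f} per job annualized")
```

The annualized figure lands in the mid-$400,000s, which the article rounds up to roughly $500,000 per job.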


Congress has so far authorized over $3 trillion in spending in response to the pandemic, and the Federal Reserve injected an additional $4 trillion or so into the economy—together totaling more than 30 percent of U.S. GDP. Yet these vast expenditures have achieved nothing in terms of addressing urgent, long-term issues, from climate change to inequality. When Senator Elizabeth Warren, Democrat of Massachusetts, proposed attaching conditions to the bailouts—to ensure higher wages and greater decision-making power for workers and to restrict dividends, stock buybacks, and executive bonuses—she could not get the votes.

The point of the government’s intervention was to prevent the collapse of the labor market and to maintain firms as productive organizations—essentially, to act as a catastrophic risk insurer. But this approach cannot be allowed to impoverish government, nor should the funds be permitted to bankroll destructive business strategies. In the case of insolvencies, the government might consider demanding equity positions in the companies it is rescuing, as happened in 2008 when the U.S. Treasury took ownership stakes in General Motors and other troubled firms. And when rescuing businesses, the government should impose conditions that prohibit all sorts of bad behavior: handing out untimely CEO bonuses, issuing excessive dividends, conducting share buybacks, taking on unnecessary debt, diverting profits to tax havens, engaging in problematic political lobbying. It should also stop firms from price gouging, especially in the case of COVID-19 treatments and vaccines. 

Other countries show what a proper response to the crisis looks like. When Denmark offered to pay 75 percent of firms’ payroll costs at the start of the pandemic, it did so on the condition that firms could not make layoffs for economic reasons. The Danish government also refused to bail out companies that were registered in tax havens and barred the use of relief funds for dividends and share buybacks. In Austria and France, airlines were saved on the condition that they reduce their carbon footprint.

The British government, by contrast, gave easyJet access to more than $750 million in liquidity in April, even though the airline had paid out nearly $230 million in dividends to shareholders a month earlier. The United Kingdom declined to attach conditions to its bailout of easyJet and other troubled firms in the name of market neutrality, the idea that it is not the government’s job to tell private companies how to spend their money. But a bailout can never be neutral: by definition, a bailout involves the government choosing to spare one company, and not another, from disaster. Without conditions, government assistance runs the risk of subsidizing bad business practices, from environmentally unsustainable business models to the use of tax havens. The United Kingdom’s furlough scheme, whereby the government paid up to 80 percent of furloughed employees’ wages, should at the very least have been conditioned on workers not being fired as soon as the program ended. But it wasn’t. 

THE VENTURE CAPITALIST MENTALITY

The state cannot just invest; it must strike the right deal. To do so, it needs to start thinking like what I have called an “entrepreneurial state”—making sure that as it invests, it is not just derisking the downside but also getting a share of the upside. One way to do that is to take an equity stake in the deals it makes. 

Consider the solar company Solyndra, which received a $535 million guaranteed loan from the U.S. Department of Energy before going bust in 2011 and becoming a conservative byword for the government’s inability to pick winners. Around the same time, the Department of Energy gave a $465 million guaranteed loan to Tesla, which went on to experience explosive growth. Taxpayers paid for the failure of Solyndra, but they were never rewarded for the success of Tesla. No self-respecting venture capitalist would structure investments like that. Worse, the Department of Energy structured Tesla’s loan so that it would get three million shares in the company if Tesla was unable to repay the loan, an arrangement designed to not leave taxpayers empty-handed. But why would the government want a stake in a failing company? A smarter strategy would have been to do the opposite and ask Tesla to pay three million shares if it was able to repay the loan. Had the government done that, it would have earned tens of billions of dollars as Tesla’s share price grew over the course of the loan—money that could have covered the cost of the Solyndra failure with plenty left over for the next round of investments. 

But the point is to worry not just about the monetary reward of public investments. The government should also attach strong conditions to its deals to ensure they serve the public interest. Medicines developed with government help should be priced to take that investment into account. The patents that the government issues should be narrow and easily licensable, so as to foster innovation, promote entrepreneurship, and discourage rent seeking. 

Lining up for free groceries in Chelsea, Massachusetts, April 2020

Governments also need to consider how to use the returns on their investments to promote a more equitable distribution of income. This is not about socialism; it is about understanding the source of capitalistic profits. The current crisis has led to renewed discussions about a universal basic income, whereby all citizens receive an equal regular payment from the government, regardless of whether they work. The idea behind this policy is a good one, but the narrative around it is problematic. Because a universal basic income is seen as a handout, it perpetuates the false notion that the private sector is the sole creator, not a co-creator, of wealth in the economy and that the public sector is merely a toll collector, siphoning off profits and distributing them as charity. 

A better alternative is a citizen’s dividend. Under this policy, the government takes a percentage of the wealth created with government investments, puts that money in a fund, and then shares the proceeds with the people. The idea is to directly reward citizens with a share of the wealth they have created. Alaska, for example, has distributed oil revenues to residents through an annual dividend from its Permanent Fund since 1982. Norway does something similar with its Government Pension Fund. California, which hosts some of the richest companies in the world, might consider doing something similar. When Apple, headquartered in Cupertino, California, set up a subsidiary in Reno, Nevada, to take advantage of that state’s zero percent corporate tax rate, California lost an enormous amount of tax revenue. Not only should such tax gimmicks be blocked, but California should also fight back by creating a state wealth fund, which would offer a way besides taxation to directly capture a share of the value created by the technology and companies it fostered.

A citizen’s dividend allows the proceeds of co-created wealth to be shared with the larger community—whether that wealth comes from natural resources that are part of the common good or from a process, such as public investments in medicines or digital technologies, that has involved a collective effort. Such a policy should not serve as a substitute for getting the tax system to work right. Nor should the state use the lack of such funds as an excuse not to finance key public goods. But a public fund can change the narrative by explicitly recognizing the public contribution to wealth creation—a recognition that is key in the political contest over who creates value. 

THE PURPOSE-DRIVEN ECONOMY

When the public and private sectors come together in pursuit of a common mission, they can do extraordinary things. This is how the United States got to the moon and back in 1969. For eight years, NASA and private companies in sectors as varied as aerospace, textiles, and electronics collaborated on the Apollo program, investing and innovating together. Through boldness and experimentation, they achieved what President John F. Kennedy called “the most hazardous and dangerous and greatest adventure on which man has ever embarked.” The point was not to commercialize certain technologies or even to boost economic growth; it was to get something done together.

More than 50 years later, in the midst of a global pandemic, the world has a chance to attempt an even more ambitious moonshot: the creation of a better economy. This economy would be more inclusive and sustainable. It would emit less carbon, generate less inequality, build modern public transport, provide digital access for all, and offer universal health care. More immediately, it would make a COVID-19 vaccine available to everyone. Creating this type of economy will require a type of public-private collaboration that hasn’t been seen in decades.

Some who talk about recovering from the pandemic cite an appealing goal: a return to normalcy. But that is the wrong target; normal is broken. Rather, the goal should be, as many have put it, to “build back better.” Twelve years ago, the financial crisis offered a rare opportunity to change capitalism, but it was squandered. Now, another crisis has presented another chance for renewal. This time, the world cannot afford to let it go to waste.

Thursday, October 1, 2020

Why Armenia and Azerbaijan Are on the Brink of War

 

  • JEFFREY MANKOFF is a Distinguished Research Fellow at the U.S. National Defense University’s Institute for National Strategic Studies and the author of the forthcoming book Empires of Eurasia: How Imperial Legacies Shape International Security.
  • The views expressed here are his own.

On September 27, significant fighting broke out between the militaries of Armenia and Azerbaijan, two states that have been locked in an intractable conflict over the disputed region of Nagorno-Karabakh since the last days of the Soviet Union. Nagorno-Karabakh and surrounding regions have seen periodic outbursts of violence in recent years, but the current fighting is the most serious since Armenia and Azerbaijan signed a cease-fire in 1994.

Domestic political factors in both countries militate against compromise. The international context surrounding the conflict in Nagorno-Karabakh has also shifted in ways that complicate efforts to peacefully address the underlying dispute. In particular, Turkey’s growing involvement in a conflict in which Russia has long been the dominant player risks both giving the protagonists—especially Azerbaijan—an incentive to keep fighting and opening up a new front in the Turkish-Russian rivalry that has already engulfed Syria, Libya, and to a lesser extent Ukraine.

A FROZEN CONFLICT HEATS UP

The origins of the Nagorno-Karabakh conflict can be traced back to the Kremlin’s decision to include the Armenian-majority region within Soviet Azerbaijan. When Moscow relaxed restrictions on popular mobilization in the late 1980s, ethnic Armenians began demanding Nagorno-Karabakh’s transfer to Armenia. Moscow refused, and when the Soviet Union collapsed a few years later, a full-scale war broke out between Armenia and Azerbaijan, leaving around 30,000 dead and over one million displaced. With Azerbaijan led by the pan-Turkic nationalist Abulfaz Elchibey for much of the conflict, Russian forces largely supported the Armenian side. A Russian-brokered cease-fire ended the war in May 1994, but not the underlying dispute: today, Nagorno-Karabakh and seven surrounding districts are under Armenian control, but Azerbaijan regards them as illegally occupied. Although Nagorno-Karabakh typically gets little attention in the West, it is perhaps the most dangerous flash point across post-Soviet Eurasia.

The current clashes broke out on September 27, with barrages of artillery and the deployment of heavy armor along the Line of Contact separating Armenian-controlled Nagorno-Karabakh from Azerbaijan proper. While each side blames the other for firing the first shot, local observers had been reporting for weeks that escalation seemed imminent. Both countries declared martial law and partially mobilized their reserves, suggesting an expectation of sustained conflict. Clips of this week’s fighting posted online show evidence of significant conflict involving artillery, armor, unmanned aerial vehicles (UAVs), and infantry forces. On Monday, Stepanakert, the capital of Nagorno-Karabakh, came under artillery fire.


This week’s clashes are hardly the first since the 1994 cease-fire. Sporadic sniping across the Line of Contact is common. In April 2016, an Azerbaijani offensive recaptured several strategic high points, leaving around 200 dead. Although Moscow was able to convince the two governments to return to the cease-fire after a few days, the clash was a warning sign that the status quo—frozen in place since 1994—was in danger of unraveling. Fighting along the Line of Contact broke out again in July 2020, raising tensions and expectations of further conflict.

Unlike previous bouts of fighting, this one may result in significant changes to the status quo. Baku and Yerevan both face increasing pressure to resort to harsh measures. In Armenia, the government of Prime Minister Nikol Pashinyan—which came to power amid a popular uprising in 2018 that Russia largely opposed—is worried about what it sees as Moscow’s increasingly ambivalent support for maintaining the status quo. Despite some initial indications that he would be more open to a negotiated solution, Pashinyan has taken a harder line, including calling for Nagorno-Karabakh to be formally integrated into Armenia.


In Azerbaijan, an economic downturn and frustration at the authoritarian rule of President Ilham Aliyev have fed popular discontent. As the losing side in the initial war, Baku has made public calls for the return of Nagorno-Karabakh to mobilize nationalist support but risks being outflanked by public opinion. During the fighting this summer, protesters stormed the parliament building in Baku demanding war with Yerevan.

The fighting so far has encompassed an Azerbaijani offensive against Fizuli and Jabrayil, two of the Armenian-occupied districts outside Nagorno-Karabakh whose relatively flat terrain facilitates offensive operations. The bulk of their Azeri-majority population fled during the 1990s war, and in recent years, Yerevan has started settling them with Armenians. While the overall population of the two districts remains low, a continued Azerbaijani offensive into Nagorno-Karabakh proper could result in significant refugee flows, possibly in the hundreds of thousands.

ON THE FRONTLINES OF EMPIRE

Unlike many other so-called frozen conflicts in the former Soviet Union, the Nagorno-Karabakh dispute is driven almost entirely by local actors. Russia remains the most important outside actor, but its ability to manage, much less control, the conflict is limited.

Russia maintains upward of 5,000 soldiers in Armenia, which most Armenians tolerate as a guarantee of their security. Although it has sided with Armenia throughout the conflict, Moscow has also cultivated relations with Azerbaijan and is the leading supplier of weapons to both sides. As Azerbaijan’s relations with the West have deteriorated in recent years amid declining interest in its oil and gas reserves and growing concern about Aliyev’s authoritarian rule, Russia has made additional inroads with Baku.

While Moscow does not call the shots on the ground, both sides understand that any resolution to the conflict can come only with Russian support. During previous rounds of fighting (including in July), Russian officials were instrumental in brokering a truce. Today, Russia has little interest in a wider conflict, which could force it to make difficult decisions about how far to take its commitments to Armenia and to devote additional resources to the South Caucasus at a time when it is already engaged on multiple other fronts.

ENTER THE TURKS

Although Russia remains the most important power broker, another external power with historical ties to the region has increasingly sought to shape the outcome of the conflict. Turkey sided with Azerbaijan in the initial conflict in the 1990s, and the two countries share close ethnic and cultural ties. Commentators and officials—mostly Turks—describe the relationship as “one nation, two states.” Until recently, Turkey’s involvement in the dispute was relatively limited. But as Ankara has adopted a more assertive posture in the Middle East and eastern Mediterranean under President Recep Tayyip Erdogan, it has become more forthright in its support for Azerbaijan.

Over the past year, Turkey has sold Azerbaijan a wide range of weapons, including UAVs, missiles, and electronic warfare equipment. Once the fighting started in Nagorno-Karabakh, Turkey also offered Azerbaijan strong political support. Erdogan declared that Turkey would “remain by the side of our friend and brother Azerbaijan” and demanded that Armenia immediately return its “occupied territory.” Turkey’s main opposition parties joined Erdogan’s ruling Justice and Development Party in passing a resolution condemning Armenian actions. Turkey has also reportedly dispatched Syrian mercenaries to Azerbaijan, and Armenia claimed this week that a Turkish F-16 shot down one of its fighters (a claim Turkey rejects).

Turkey’s deepening involvement in Nagorno-Karabakh is a dangerous game. Within the South Caucasus, strong Turkish support could encourage Baku to take an uncompromising line and resist calls for a cease-fire that maintains some version of the status quo ante. Turkish involvement could also transform the conflict into an existential one in the eyes of the Armenian public, especially in light of the World War I–era massacres of Armenians by Ottoman forces.

Russo-Turkish relations have thrived despite the two countries’ postimperial competition. But Turkey’s intervention in Nagorno-Karabakh is Ankara’s most overt challenge to Russian influence in the former Soviet Union, where Moscow is extremely protective of its claims to preeminence. Even if Russia remains committed to limiting the fighting between Armenia and Azerbaijan, the overlapping presence of Russian and Turkish forces in many other theaters gives Moscow multiple opportunities to escalate. Indeed, Turkey’s direct involvement in Nagorno-Karabakh raises the stakes not only in the South Caucasus but throughout the areas where Ankara and Moscow are at odds. Already, the two countries back opposing sides in the Libyan and Syrian conflicts, where their proxies have engaged in occasional clashes, and have incompatible ambitions in the Balkans and Ukraine. Ankara likely sees its involvement in Nagorno-Karabakh in part as a bargaining chip not only in the Caucasus but in its wider rivalry with Moscow. And the involvement of Turkish mercenaries from Syria, another theater where Russian and Turkish interests clash, suggests that this time the Nagorno-Karabakh conflict may not stay confined to the South Caucasus.

Renewed fighting in and around Nagorno-Karabakh was not unexpected. But the scale of the ongoing clashes, Turkey’s more prominent role, and the potential for the conflict to spill over into other contested regions have already raised the stakes considerably. For now, Russia is calling on all sides to de-escalate, seemingly caught off guard by the extent of the fighting and Turkey’s role in it. Thanks in part to its recent success in building ties with Baku, Moscow remains reluctant to take sides or intervene directly. Yet Russia is the only outside power in a position to force the parties back to the negotiating table. Turkey’s intervention threatens Russia’s traditional mediator role, but Moscow still has considerable financial and political leverage to push for a stop to the fighting. It should do so, even if it will ultimately be up to the protagonists in Baku and Yerevan to step back from the brink.

The U.S. Intelligence Community Is Not Prepared for the China Threat

 

  • ADAM B. SCHIFF serves as the Chairman of the House Permanent Select Committee on Intelligence, and represents California’s 28th Congressional District.

We are witnessing the resurgence of authoritarianism across the globe, and it poses a growing challenge to the very idea of liberal democracy. China, with its expanding economic, military, and diplomatic might, is at the forefront of this neoauthoritarian challenge. Beijing seeks to build a world in which its ambitions are unchallenged and individual freedoms give way to the needs of the state. The United States must rise to meet this challenge—and that task begins with understanding China’s intentions and capabilities.

The House Intelligence Committee has spent the last two years looking at whether our nation’s intelligence apparatus is properly focused, postured, and resourced to understand the many dimensions of the China threat and preparing to advise policymakers on how to respond. We conducted hundreds of hours of interviews, visited facilities operated by over a dozen intelligence agencies, and reviewed thousands of analytic assessments in order to produce a classified report with a public summary and recommendations.

What we found was unsettling. Our nation’s intelligence agencies are not ready—not by a long shot. Absent a significant realignment in resources and organization, the United States will be ill prepared to compete with China on the global stage for decades to come.

AMBITIOUS REJUVENATION

China’s rise as a global power has come startlingly fast, and its ambitions have grown even more quickly. The leaders of the Chinese Communist Party believe that they must restore China to its rightful place as the “Middle Kingdom” by achieving what CCP leadership terms the rejuvenation of the Chinese nation. Chinese President Xi Jinping has inextricably linked this concept to the development of a “world-class” military capable of defending China’s core interests, the achievement of “one country, two systems” in Hong Kong and Taiwan, and the elimination of “lax and weak governance” within the CCP itself. In addition to the domestic aspects of its self-proclaimed rejuvenation, Beijing increasingly sees itself as a preeminent power that can dictate terms to its neighbors to achieve its global ambitions. To that end, Beijing has erected an elaborate system of domestic control to maintain power and to manage the flow of information. This model of technology-driven totalitarianism has also become a growing Chinese export, enabling other would-be autocrats to follow the Chinese example.

The disturbing degree to which the Chinese government has developed a model of domestic repression is most evident in the western region of Xinjiang, where the Uighur population lives in a vast panopticon of constant surveillance, with little contact with the outside world. Not content with mere control, Beijing has also sought to destroy Uighur religion, culture, and society, erecting concentration camps that hold millions of Uighurs in one of the worst human rights abuses of the twenty-first century.

China itself views competition with the United States as unfolding in ideological and zero-sum terms. It has sought to modernize the People’s Liberation Army and develop doctrine for new domains, such as space and cyber, that would redefine existing conceptions of how a twenty-first century war would unfold, extending the battlefield to our political discourse, mobile devices, and the very infrastructure that modern digital communication and communities rely upon. In that vein, we have also seen China seek to distort the stark reality of the coronavirus pandemic, preventing the world from learning about early indicators and pushing disinformation to blame anyone but itself for the start and rapid spread of the virus.

A NECESSARY REALIGNMENT 

For the United States to effectively anticipate and respond, we will need the expertise of our intelligence agencies. But as our review found, the intelligence community’s focus and expertise on China is lacking. After 9/11, the United States and its intelligence agencies rapidly reoriented toward a counterterrorism mission to protect the homeland. Although those moves were both necessary and largely successful, our abilities and resources devoted to other priority missions—such as China—waned. In the interim, China has transformed itself into a nation potentially capable of supplanting the United States as the leading power in the world. In tandem with this transformation, Beijing’s increasing control of the domestic information environment and opaque decision-making process has continued to vex U.S. leaders seeking to develop sound and impactful policy toward China.

Going forward, if we fail to accurately predict and characterize Beijing’s intent, we will continue to struggle to understand how and why the leadership of the CCP makes decisions and fail to respond effectively. The good news is that we still have time to change course.

First, our intelligence agencies need to significantly realign resources and personnel to meet the challenge that China poses, quickly and across almost every single agency. The China mission cannot be viewed through an Asia-specific lens alone; it must instead be integrated throughout the intelligence enterprise and its functional missions. This is especially true when it comes to our ability to provide analysis and warnings on “soft” threats such as pandemics, climate change, and economic trends, which recent experience has shown can have immense consequences for national security. As the intelligence community prioritizes analytic questions related to China, it must focus on the areas of competition that will enable the United States to succeed.

Second, intelligence agencies must do a better job of adapting to the sheer amount of open-source data available to them about global threats and competitors and of quickly getting the resulting intelligence to decision-makers. Given the increasing pace of global events, driven partly by social media and mobile communications, we need to quickly adapt and modernize. That means properly utilizing artificial intelligence and machine learning to analyze data to find what we need to make decisions quickly. An external organization should be tasked with conducting a study on the intelligence community’s prosecution of the open-source intelligence mission and make formal recommendations to streamline and strengthen its governance and capabilities. Similarly, the intelligence community should prioritize transferring successful start-up initiatives to long-term sustainment at the earliest possible date, protecting dedicated funding for future innovation whenever possible.

Third, we need to change how we view the threat from China. Beijing presents not only a military threat but also economic, technological, health, and counterintelligence threats. Addressing these dimensions of the challenge will require a significant realignment of the types of individuals and skill sets we recruit, retain, invest in, and grant security clearances to, including through hiring analysts with nontraditional backgrounds in technology and science. The intelligence community should expand its practice of hiring technical experts, such as trained health professionals, economists, and technologists, to serve throughout its analytic corps. It should also formalize and broaden programs designed to hire and mentor the next generation of China analysts. That is also why we should adopt one of the best lessons from our counterterrorism mission and embed real-time intelligence support on China within different agencies—especially those outside the Defense Department, such as the Office of the U.S. Trade Representative, the Department of Commerce, and science and health agencies, which are often on the frontlines of this new multidimensional struggle.

The intelligence community must also continue to prioritize the counterintelligence challenge that China poses. Beyond the known threat emanating from China’s intelligence services, there are a range of Chinese influence actors and operations, many of which are funded and organized by the CCP’s United Front Work Department. According to the Department of Defense’s 2019 China Military Power Report, Chinese influence efforts have targeted cultural institutions, state- and municipal-level government offices, media organizations, educational institutions, businesses, think tanks, and policy communities. The U.S. government must strengthen its ability to categorize, disrupt, and deter such Chinese influence operations occurring on U.S. soil.  

It has become all too clear that the United States cannot give up on global leadership, because if it does, China will gladly step into the breach with its own malign intentions. Even as we contend with the threat from China, we must dramatically increase our own engagement with the rest of the world, including by championing democracy and human rights.

Yet for all the talk in Washington about the need to be “tough on China,” there has been scant action within the U.S. intelligence community—because action, unlike talk, requires hard choices about funding and priorities. But these are not choices we should shrink from. We have to face them before it is too late to act, because unless our intelligence apparatus has its eyes squarely trained on Beijing’s malevolent activity around the world, it is not just our national security that will suffer—so will our economy, health security, and technological edge.


The United States Is Not Entitled to Lead the World

 

  • JAMES GOLDGEIER is Robert Bosch Senior Visiting Fellow at the Brookings Institution and Professor of International Relations at American University.
  • BRUCE W. JENTLESON is William Preston Few Distinguished Professor of Public Policy at Duke University and a former U.S. State Department official in the Obama and Clinton administrations.

That the United States should lead the world is often taken for granted, at least in Washington, D.C. The country played that role for more than seven decades after World War II, and most Americans don’t want China to assume it. So it would be easy to think that if the American people vote Donald Trump out of office and bring in committed internationalist Joe Biden, the United States can just go back to “the head of the table,” as Biden’s recent Foreign Affairs article claimed. But global leadership is not an American entitlement.

Trump has broken with traditions of U.S. global leadership in a long and familiar list of ways. But while most of Washington’s allies (with a few notable exceptions, such as Israel and Saudi Arabia) are of the “Anybody but Trump” inclination, restoring a constructive U.S. role in the world will require much more than proclaiming that the United States is back and reverting to a pre-Trump playbook. The country must come to terms with fundamental shifts in its global position. Seen in historical perspective, the country has gone from being apart to atop and, now, to amid the world, and the transition requires some adjustments.

A Tarnished Model

The self-proclaimed “greatest democracy in the world” has been an erratic one since the late 1990s: in just more than two decades, the country has seen two presidents impeached, an election ultimately decided by the Supreme Court, an internationally controversial war in Iraq, and a financial crisis that sent shock waves around the world. In 2008, the country elected a globally popular Black senator to the presidency—only to lurch in a very different direction eight years later by electing a racist reality TV host who blames American allies for the country’s ills.

If we think of politics as we do monetary currencies—measuring stability by fluctuations within an equilibrium zone—why should the United States’ friends trust that even if Trump loses the 2020 election, American politics will stay within that political equilibrium zone for long? Instead, even close allies will have to hedge their bets, in case the United States shifts yet again in the following presidential election or even after the 2022 midterms.

The country’s domestic policy performance has hardly made the United States a paragon of effective governance. The country ranks 27th of 31 Organization for Economic Cooperation and Development countries for social justice, reflecting policies that go back much further than Trump. Economic equality has been declining for more than 40 years, while “deaths of despair” are rapidly rising. Systemic racism tarnishes the country’s image abroad as a champion of democracy, justice, and the rule of law.

Such are the “pre-existing conditions of our body politic,” in the words of the Pulitzer Prize–winning author Viet Thanh Nguyen, and the U.S. response to the coronavirus pandemic has reflected them. No country has been perfect on COVID-19 (even New Zealand saw some community transmission after going more than three months without), but nowhere else is the government fighting so much within itself, with lockdown protesters flaunting guns inside legislative assemblies. The Centers for Disease Control and Prevention, once considered the gold standard for global disease detection and control, has been demeaned and weakened.

By mid-July, more Americans had died from COVID-19 than in the Vietnam, Persian Gulf, Afghanistan, and Iraq wars combined; by late September, COVID-19 deaths had increased another 43 percent. Despite a continuing economic downturn, the country still struggles to move past political gamesmanship. Why would anyone in the world think that the United States could provide serious global leadership? Even Hans Morgenthau, the intellectual godfather of power politics, stressed the need to “concentrate . . . efforts upon creating a society at home which can . . . serve as a model for other nations to emulate.”

Think about it: if only 17 percent of Americans trust the government, why should others trust the United States? 

Apart, Atop, Amid

For much of its first century and a half, the United States took advantage of its geographic distance from Europe and Asia to stay generally apart from the world. The country did not strictly isolate itself, but it selectively chose when and where to engage. After 1945, the United States sat mostly atop the world, as the dominant power in military, economic, ideological, and diplomatic terms—all the more so after the Soviet Union collapsed in 1991. Today the United States finds itself not apart or atop but rather amid the world, both shaping and being shaped by global events and forces.

Today’s world is no longer one in which any nation—whether the United States or China—can sit atop the others. Shifts in the relationships between states have made such domination less likely.

Great powers can most easily dominate when a single security threat unites a group of states, superseding other, possibly divergent interests. Take, for instance, the early nineteenth century, when the Concert of Europe emerged after the havoc of the Napoleonic period, or the Cold War, when the United States and the Soviet Union viewed each other as existential threats and countries sought protection from one or the other superpower. The twenty-first century has yet to furnish an overarching, shared security threat. The administration of President George W. Bush did not succeed in creating one with its post-9/11 global “war on terror.” China has become more aggressive and will likely remain the United States’ major competitor for decades, but the U.S. effort to stir up a “new China scare” has limited appeal to countries that want to maintain relations with both nations.

In today’s world of comparatively diffuse threats and interests, few states feel best served by a largely exclusive relationship with just one major power. During the Cold War, U.S. allies in Europe and Asia genuinely feared that the Soviet Union would either invade or try to undermine their political systems. Few harbor comparable fears today, and correspondingly few feel the need to choose sides. India and Australia have significant tensions with China but still cooperate with Beijing on matters of mutual interest. Even with all the support the Trump administration has given Israel, China is now that country’s largest Asian trading partner and an increasingly prominent investor in its economy. Saudi Arabia, another Trump favorite, may be turning to China for a nuclear weapons program.  

During the Cold War and immediately after, the United States was an attractive protector because of its military preponderance and centrality to the international economy. Neither, though, provides comparable leverage today. While American military power remains crucial for extended deterrence through NATO and Indo-Pacific partnerships, close to 20 years of war in Afghanistan and Iraq, at a cost of more than $6 trillion, demonstrates the limited utility of military superiority for achieving strategic objectives. The U.S. share of global GDP—51 percent in 1951 and 25 percent in 1991—has declined to around 15 percent. And the sanctions the United States has imposed against Iran, Venezuela, and North Korea have inflicted heavy economic costs, but without producing compliance with U.S. demands. 

During the Cold War, the United States was able to lead in part by dividing the world into democracies and autocracies. But such ideological bifurcation has its limits. Democratic allies are the United States’ natural partners, and the strongest foreign policy to counter China, Russia, and other autocratic states remains a collective one. But the United States has always been inconsistent, if not hypocritical, in counting certain nondemocracies as allies or partners. Both during the Cold War and today, issues such as arms control, nuclear nonproliferation, climate change, and, as we now know, pandemics require the United States to cooperate with authoritarian regimes in order to achieve American objectives.

Indeed, the combination of COVID-19 and the ever-worsening climate crisis has driven home that we live amid rather than atop the world. Even if the United States had first-rate domestic policies on pandemic and climate change prevention, it would still be vulnerable to what others in the world do and don’t do. Climate change causes 400,000 deaths globally each year, compared with fewer than 16,000 from terrorism in 2018, and this rate is projected to increase by 50 percent by 2030. Building resilience against threats such as these has to be a globally shared enterprise.

Leadership, but Chastened

The United States remains enormously powerful, but being more amid than atop requires a chastened rather than restorationist approach to internationalism. Washington needs to recognize the global leadership roles others can and must play. In the future, there will be very few issues on which the United States plays a solo leadership role and some on which others are better suited to take the lead. Public opinion within the United States supports such an approach, with 68 percent preferring that the United States share leadership rather than dominate.

A Biden administration cannot just return to multilateral agreements that Trump has abandoned, such as the Paris climate accord and the Iran nuclear deal. It needs to push them further. Even if Trump had not reneged on the Paris agreement and all countries were on track to meet their pledges (which few are), greenhouse gas emissions would still be close to double what they should be. A Biden administration should therefore not only rejoin but push for a Paris-plus, which would make signatories’ commitments more binding and enforceable. Biden can build on the energy and enthusiasm of his party’s progressive wing in order to go big in dealing with climate change. Recent Pew Research Center polls suggest that he would have the support of close to two-thirds of the American public in doing so.

As for the Iran nuclear deal, its original intent was both to address nuclear proliferation and to establish a basis for working out other tensions in the Iranian-U.S. relationship over time. Now the deal’s sunset provisions are closer to expiration, and geopolitical tensions have intensified. Rejoining the deal won’t be enough to solve these problems: the participants will need new agreements that are stronger and up to date. But Tehran is understandably wary about the durability of any American commitment, the Europeans are furious about the secondary sanctions Trump has imposed on them, and Russia and China are exploiting the situation despite the fundamental interest they share in a non-nuclear Iran. These circumstances will require an even greater collaborative effort from the P5+1 than before.

A chastened internationalism would require reassessing the state of U.S. alliances. The United States should neither automatically maintain nor precipitously end its longtime commitments, but it does need to recalibrate them based on current national interests. The transatlantic relationship is the perfect place to start. At the end of the Cold War, Europe still seemed to need the United States to remain in charge of its security: Germany’s neighbors feared the country’s unification, and violence convulsed the former Yugoslavia. But the time has come for the United States to actively support European Union efforts to deliver on the promise of a European defense capability. Such efforts should not be seen as a threat to NATO. In fact, cooperation between NATO and the EU, along with a strengthening of U.S.-EU ties, will be critical to their success.

The United States should view NATO as an instrument for coordinating security policies with Canada and Europe, not as a means of dominating its allies. A true rebalancing of U.S. foreign policy toward Asia depends on a stronger Europe, able to do more in its own backyard. The United States should play more of a supporting role over time, rather than having to take charge, as it did, for example, in the Western Balkans in the 1990s.

Within the Indo-Pacific, regional allies need reassurance of American presence and commitment, but they have their own interests in relations with China and are resistant to being pressured into with-us-or-with-them alignments. Japan, for its part, conducts bilateral summits with China. Intraregional agreements—between Australia and India, between Australia and Japan, and within the Association of Southeast Asian Nations, for example—play an ever larger part in regional security. U.S. policy needs to work with, not against, these currents, reinforcing other countries’ interests in restraining China rather than pressuring for bipolarization of the region.

In keeping a more realistic vision of its leadership, the United States can and should recalibrate its role in the Middle East, including relations with Saudi Arabia, and more generally adjust to a reduced expectation of shaping the region’s future. And in Afghanistan, it should commit to a high-level, regional, diplomatic initiative, engaging Pakistan, India, Russia, China, Saudi Arabia, and Iran, all of whom have their own favored groups but also a lot at risk once the United States gets off the frontlines. Success is never guaranteed, but such strategic diplomacy has a better chance of working than making the United States’ longest war in history that much longer.

The United States can play a global leadership role on COVID-19 that is comparable to its twentieth-century one but appropriate to the twenty-first by collaborating with others and allowing others to take the lead as their ideas and capacity warrant. The World Health Organization needs reform, but rather than taking a punitive approach, the United States should follow the German and French lead in increasing funding, building broad support for stronger and more independent WHO authority, and pushing constructively for change. Moreover, after years of telling others to learn from it, the United States would do well to learn from others. A liberal government in New Zealand, a conservative government in Australia, a centrist government in Germany, a congenitally weak government in Italy, and any number of others, such as in South Korea and Taiwan, have done far better than the United States. While policies can’t be surgically transplanted from one country to another, lessons can be learned. When this pandemic abates, U.S. officials should immediately undertake fact-finding missions to those more successful countries, so that the United States will be better prepared for the next global health crisis.

While the United States won’t always be—and shouldn’t always be—at the head of the table, the pandemic has shown what happens when Washington isn’t even at the table. But other nations are not just waiting for Godot. The United States needs to abandon any sense of entitlement—and do what it takes at home and abroad to be a leader amid this twenty-first-century world.