America-Led Liberal World Order, R.I.P.

March 22, 2018

by Richard N. Haass*

America’s decision to abandon the global system it helped build, and then preserve for more than seven decades, marks a turning point, because others lack either the interest or the means to sustain it. The result will be a world that is less free, less prosperous, and less peaceful, for Americans and others alike.

NEW DELHI – After a run of nearly one thousand years, quipped the French philosopher and writer Voltaire, the fading Holy Roman Empire was neither holy nor Roman nor an empire. Today, some two and a half centuries later, the problem, to paraphrase Voltaire, is that the fading liberal world order is neither liberal nor worldwide nor orderly.

The United States, working closely with the United Kingdom and others, established the liberal world order in the wake of World War II. The goal was to ensure that the conditions that had led to two world wars in 30 years would never again arise.

To that end, the democratic countries set out to create an international system that was liberal in the sense that it was to be based on the rule of law and respect for countries’ sovereignty and territorial integrity. Human rights were to be protected. All this was to be applied to the entire planet; at the same time, participation was open to all and voluntary. Institutions were built to promote peace (the United Nations), economic development (the World Bank) and trade and investment (the International Monetary Fund and what years later became the World Trade Organization).

All this and more was backed by the economic and military might of the US, a network of alliances across Europe and Asia, and nuclear weapons, which served to deter aggression. The liberal world order was thus based not just on ideals embraced by democracies, but also on hard power. None of this was lost on the decidedly illiberal Soviet Union, which had a fundamentally different notion of what constituted order in Europe and around the world.

The liberal world order appeared to be more robust than ever with the end of the Cold War and the collapse of the Soviet Union. But today, a quarter-century later, its future is in doubt. Indeed, its three components – liberalism, universality, and the preservation of order itself – are being challenged as never before in its 70-year history.

Liberalism is in retreat. Democracies are feeling the effects of growing populism. Parties of the political extremes have gained ground in Europe. The vote in the United Kingdom in favor of leaving the EU attested to the loss of elite influence. Even the US is experiencing unprecedented attacks from its own president on the country’s media, courts, and law-enforcement institutions. Authoritarian systems, including China, Russia, and Turkey, have become even more top-heavy. Countries such as Hungary and Poland seem uninterested in the fate of their young democracies.

It is increasingly difficult to speak of the world as if it were whole. We are seeing the emergence of regional orders – or, most pronounced in the Middle East, disorders – each with its own characteristics. Attempts to build global frameworks are failing. Protectionism is on the rise; the latest round of global trade talks never came to fruition. There are few rules governing the use of cyberspace.

At the same time, great power rivalry is returning. Russia violated the most basic norm of international relations when it used armed force to change borders in Europe, and it violated US sovereignty through its efforts to influence the 2016 election. North Korea has flouted the strong international consensus against the proliferation of nuclear weapons. The world has stood by as humanitarian nightmares play out in Syria and Yemen, doing little at the UN or elsewhere in response to the Syrian government’s use of chemical weapons. Venezuela is a failing state. One in every hundred people in the world today is either a refugee or internally displaced.

There are several reasons why all this is happening, and why now. The rise of populism is in part a response to stagnating incomes and job loss, owing mostly to new technologies but widely attributed to imports and immigrants. Nationalism is a tool increasingly used by leaders to bolster their authority, especially amid difficult economic and political conditions. And global institutions have failed to adapt to new power balances and technologies.

But the weakening of the liberal world order is due, more than anything else, to the changed attitude of the US. Under President Donald Trump, the US decided against joining the Trans-Pacific Partnership and to withdraw from the Paris climate agreement. It has threatened to leave the North American Free Trade Agreement and the Iran nuclear deal. It has unilaterally introduced steel and aluminum tariffs, relying on a justification (national security) that others could use, in the process placing the world at risk of a trade war. It has raised questions about its commitment to NATO and other alliance relationships. And it rarely speaks about democracy or human rights. “America First” and the liberal world order seem incompatible.

My point is not to single out the US for criticism. Today’s other major powers, including the EU, Russia, China, India, and Japan, could be criticized for what they are doing, not doing, or both. But the US is not just another country. It was the principal architect of the liberal world order and its principal backer. It was also a principal beneficiary.

America’s decision to abandon the role it has played for more than seven decades thus marks a turning point. The liberal world order cannot survive on its own, because others lack either the interest or the means to sustain it. The result will be a world that is less free, less prosperous, and less peaceful, for Americans and others alike.

*Richard N. Haass, President of the Council on Foreign Relations, previously served as Director of Policy Planning for the US State Department (2001-2003), and was President George W. Bush’s special envoy to Northern Ireland and Coordinator for the Future of Afghanistan. He is the author of A World in Disarray: American Foreign Policy and the Crisis of the Old Order.


Inequality in the 21st Century

March 19, 2018

by Kaushik Basu

As inequality continues to deepen worldwide, we do not have the luxury of sticking to the status quo. Unless we confront the inequality challenge head on – as we have just begun to do with another existential threat, climate change – social cohesion, and especially democracy, will come under growing threat.

At the end of a low and dishonest year, reminiscent of the “low, dishonest decade” about which W.H. Auden wrote in his poem “September 1, 1939,” the world’s “clever hopes” are giving way to recognition that many severe problems must be tackled. And, among the severest, with the gravest long-term and even existential implications, is economic inequality.

The alarming level of economic inequality globally has been well documented by prominent economists, including Thomas Piketty, François Bourguignon, Branko Milanović, and Joseph E. Stiglitz, and well-known institutions, including Oxfam and the World Bank. And it is obvious even from a casual stroll through the streets of New York, New Delhi, Beijing, or Berlin.

Voices on the right often claim that this inequality is not only justifiable, but also appropriate: wealth is a just reward for hard work, while poverty is an earned punishment for laziness. This is a myth. The reality is that the poor, more often than not, must work extremely hard, often in difficult conditions, just to survive.

Moreover, if a wealthy person does have a particularly strong work ethic, it is likely attributable not just to their genetic predisposition, but also to their upbringing, including whatever privileges, values, and opportunities their background may have afforded them. So there is no real moral argument for outsize wealth amid widespread poverty.

This is not to say that there is no justification for any amount of inequality. After all, inequality can reflect differences in preferences: some people might consider the pursuit of material wealth more worthwhile than others. Moreover, differential rewards do indeed create incentives for people to learn, work, and innovate, activities that promote overall growth and advance poverty reduction.

But, at a certain point, inequality becomes so severe that it has the opposite effect. And we are far beyond that point.

Plenty of people – including many of the world’s wealthy – recognize how unacceptable severe inequality is, both morally and economically. But if the rich speak out against it, they are often shut down and labeled hypocrites. Apparently, the desire to lessen inequality can be considered credible or genuine only by first sacrificing one’s own wealth.

The truth, of course, is that the decision not to renounce, unilaterally, one’s wealth does not discredit a preference for a more equitable society. To label a wealthy critic of extreme inequality as a hypocrite amounts to an ad hominem attack and a logical fallacy, intended to silence those whose voices could make a difference.

Fortunately, this tactic seems to be losing some of its potency. It is heartening to see wealthy individuals defying these attacks, not only by openly acknowledging the economic and social damage caused by extreme inequality, but also by criticizing a system that, despite enabling them to prosper, has left too many without opportunities.

In particular, some wealthy Americans are condemning the current tax legislation being pushed by Congressional Republicans and President Donald Trump’s administration, which offers outsize cuts to the highest earners – people like them. As Jack Bogle, the founder of Vanguard Group and a certain beneficiary of the proposed cuts, put it, the plan – which is all but guaranteed to exacerbate inequality – is a “moral abomination.”

Yet recognizing the flaws in current structures is just the beginning. The greater challenge is to create a viable blueprint for an equitable society. (It is the absence of such a blueprint that has led so many well-meaning movements in history to end in failure.) In this case, the focus must be on expanding profit-sharing arrangements, without stifling or centralizing market incentives that are crucial to drive growth.

A first step would be to give all of a country’s residents the right to a certain share of the economy’s profits. This idea has been advanced in various forms by Marty Weitzman, Hillel Steiner, Richard Freeman, and, just last month, Matt Bruenig. But it is particularly vital today, as the share of wages in national income declines, and the share of profits and rents rises – a trend that technological progress is accelerating.

There is another dimension to profit-sharing that has received little attention, related to monopolies and competition. With modern digital technology, the returns to scale are so large that it no longer makes sense to demand that, say, 1,000 firms produce versions of the same good, each meeting one-thousandth of total demand.

A more efficient approach would have 1,000 firms each creating one part of that good. So, when it comes to automobiles, for example, one firm would produce all of the gears, another all of the brake pads, and so on.

Traditional antitrust and pro-competition legislation – which began in 1890 with the Sherman Act in the US – prevents such an efficient system from taking hold. But a monopoly of production need not mean a monopoly of income, as long as the shares in each company are widely held. It is thus time for a radical change, one that replaces traditional anti-monopoly laws with legislation mandating a wider dispersal of shareholding within each company.

These ideas are largely untested, so much work would need to be done before they could be made operational. But as the world lurches from one crisis to another, and inequality continues to deepen, we do not have the luxury of sticking to the status quo. Unless we confront the inequality challenge head on, social cohesion and democracy itself will come under growing threat.

*Kaushik Basu, former Chief Economist of the World Bank, is Professor of Economics at Cornell University and Nonresident Senior Fellow at the Brookings Institution.

US Foreign Policy: Misjudging Kim Jong-un

March 16, 2018

by John C. Hulsman*

If US President Donald Trump and his advisers continue to assume that traditional deterrence does not apply to North Korea, they are likely to lose the latest geopolitical chess match. History shows that those who mistake their political or military adversaries for lunatics are usually disastrously wrong.

MILAN – Throughout history, political observers have found decision-makers who are deemed “crazy” the most difficult to assess. In fact, the problem is rarely one of psychopathology. Usually, the label merely indicates behavior that is different from what conventional analysts were expecting.

This was surely true of the twelfth-century Syrian religious leader Rashid al-Din Sinan. During the Third Crusade, the supposedly mad “Old Man of the Mountain,” as he was known, succeeded in disrupting a Crusader advance on Jerusalem by directing his followers to carry out targeted assassinations. After carrying out their orders, the assassins often stayed put and awaited capture in full view of the local populace, to ensure that their leader received proper credit for the act.

At the time, such actions were incomprehensible to the Western mind. Westerners took to calling the Old Man’s followers hashashin, or users of hashish, because they regarded intoxication as the only possible explanation for such “senseless” disregard for one’s own physical wellbeing. But the hashashin were not drug users on the whole. And, more to the point, they were successful: their eventual assassination of Conrad of Montferrat led directly to the political collapse of the Crusader coalition and the defeat of Richard the Lionheart of England. As Polonius says of Hamlet, there was method to the Old Man’s madness.

Today, the problem of analyzing supposedly lunatic leaders has reappeared with the North Korean nuclear crisis. Whether North Korean dictator Kim Jong-un is mad is not merely an academic question; it is the heart of the matter.

US President Donald Trump’s administration has stated unequivocally that it will not tolerate a North Korean capability to threaten the mainland United States with nuclear weapons. According to Trump’s national security adviser, H.R. McMaster, the administration’s position reflects its belief that Kim is crazy, and that “classical deterrence theory” thus does not apply.

During the Cold War, US President Dwight Eisenhower reasoned that even if Stalin (and later Mao) was homicidal, he was also rational, and did not wish to perish in a US counter-strike. The logic of “mutually assured destruction” that underlay nuclear deterrence worked.

If, however, the leader of a nuclear-armed state is a lunatic who is indifferent to his physical safety and that of those around him, the entire deterrence strategy falls apart. If Kim is insane, the only option is to take him out before his suicidal regime can kill millions of people.

But is Kim truly crazy, or does he simply have a worldview that discomfits Western analysts? His dramatic overture to hold a summit with Trump by May hardly seems to fit the “madman” narrative. In fact, it looks like the act of someone who knows exactly what he is doing.

Consider three strategic considerations that Kim could be weighing. First, his regime might be planning to offer concessions that it has no intention of fulfilling. After all, an earlier nuclear deal that the US brokered with his father, Kim Jong-il, was derailed by duplicity. In 2002, the US discovered that the regime was secretly enriching weapons-grade uranium in direct violation of its earlier pledge.

In fact, North Korea has demonstrated time and again that it doesn’t play by the rules. It enters into negotiations to extract concessions such as food aid, and then returns to its objectionable activities, thus starting the entire Sisyphean cycle again. There is no reason to think that this time will be different. But the regime’s deviousness should not be mistaken for irrationality or madness. Simply by expressing his openness to talks, Kim has already won some of the political legitimacy he craves.

Second, rather than being a lunatic, Kim seems mindful of recent history. Whereas Saddam Hussein in Iraq and Muammar el-Qaddafi in Libya paid the ultimate price for giving up their nuclear programs, Kim has advanced his regime’s nuclear capabilities and is now publicly treated as a near-equal by the most powerful man on the planet. The Kim regime has always sought such vindication above everything else.

A third and final consideration is that North Korea is playing for time. Though it has agreed to halt nuclear and missile tests in the run-up to the summit, it could be using the intervening months to develop related technologies. For example, it still needs to perfect an atmospheric re-entry mechanism to make its intercontinental ballistic missiles capable of striking the US mainland reliably and accurately. Moreover, as long as the summit is in play, North Korea need not fear a US military strike. That is a perfectly rational and sensible prize for Kim to pursue.

All told, North Korea’s “opening” will most likely amount to much less than meets the eye. But one can still glean valuable strategic insights from Kim’s diplomatic gambit. North Korean thinking reflects cunning, to be sure; but it also betrays the regime’s will to survive, and its desire to master the current situation. This suggests that Kim is not “crazy” after all, and that conventional deterrence will still work, as it has since 1945.

That is good news for everyone, but particularly for the Trump administration, given that it will almost certainly fail to secure any meaningful concessions from North Korea in the upcoming talks.

*John C. Hulsman is President and Co-Founder of John C. Hulsman Enterprises, a global political risk consulting firm, and the author of To Dare More Boldly (Princeton University Press, 2018).

Economists vs. Scientists on Long-Term Growth

March 4, 2018

Artificial intelligence researchers and conventional economists may have very different views about the impact of new technologies. But right now, and forgetting the possibility of an existential battle between man and machine, it seems quite plausible to expect a significant pickup in productivity growth over the next five years.

CAMBRIDGE – Most economic forecasters have largely shrugged off recent advances in artificial intelligence (for example, the quantum leap demonstrated by DeepMind’s self-learning chess program last December), seeing little impact on longer-term trend growth. Such pessimism is surely one of the reasons why real (inflation-adjusted) interest rates remain extremely low, even if the bellwether US ten-year bond rate has ticked up half a percentage point in the last few months. If supply-side pessimism is appropriate, the recent massive tax and spending packages in the United States will likely do much more to raise inflation than to boost investment.

There are plenty of reasons to object to recent US fiscal policy, even if lowering the corporate-tax rate made sense (albeit not by the amount enacted). Above all, we live in an era of rising inequality and falling income shares for labor relative to capital. Governments need to do more, not less, to redistribute income and wealth.

It is hard to know what US President Donald Trump is thinking when he boasts that his policies will deliver up to 6% growth (unless he is talking about prices, not output!). But if inflationary pressures do indeed materialize, current growth might last significantly longer than forecasters and markets believe.

In any case, the focus of economists’ pessimism is long-term growth. Their stance is underpinned by the belief that advanced economies cannot hope to repeat the dynamism that the US enjoyed from 1995 to 2005 (and other advanced economies a bit later), much less the salad days of the 1950s and 1960s.

But the doubters ought to consider the fact that many scientists, across many disciplines, see things differently. Young researchers, in particular, believe that advances in basic knowledge are coming as fast as ever, even if practical applications are taking a long time to develop. Indeed, a small but influential cult touts the Hungarian-American mathematician John von Neumann’s “singularity” theory. Someday, thinking machines will become so sophisticated that they will be able to invent other machines without any human intervention, and suddenly technology will advance exponentially.

If so, perhaps we should be far more worried about the ethical and social implications of material growth that is faster than humans can spiritually absorb. The angst over AI mostly focuses on inequality and the future of work. But as science fiction writers have long warned us, the potential threats arising from the birth of silicon-based “life” forms are truly frightening.

It is hard to know who is right: neither economists nor scientists have a great track record when it comes to making long-term predictions. But right now, and leaving aside the possibility of an existential battle between man and machine, it seems quite plausible to expect a significant pickup in productivity growth over the next five years.

Consider that the main components of economic growth are increases in the labor force, increases in investment (both public and private), and “productivity,” namely the output that can be produced with a given amount of inputs, thanks to new ideas. Over the past 10-15 years, all three have been dismally low in the advanced economies.
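The decomposition described above can be sketched numerically. The following is only an illustrative back-of-the-envelope calculation; the factor weights and growth figures are assumptions chosen to resemble a sluggish advanced economy, not data from the article.

```python
# Illustrative growth-accounting sketch. This simplifies heavily: real
# national accounts use Solow-style decompositions with measured factor
# shares, and investment feeds growth via the capital stock.
def output_growth(labor_growth: float, capital_growth: float,
                  productivity_growth: float,
                  labor_share: float = 0.6) -> float:
    """Approximate output growth as a factor-share-weighted sum of
    input growth plus productivity growth."""
    capital_share = 1.0 - labor_share
    return (labor_share * labor_growth
            + capital_share * capital_growth
            + productivity_growth)

# Hypothetical numbers: slow labor force growth, weak investment,
# and low measured productivity growth.
g = output_growth(labor_growth=0.005, capital_growth=0.015,
                  productivity_growth=0.005)
print(f"trend output growth: {g:.1%}")  # roughly 1.4% per year
```

On these assumptions, even a modest pickup in the productivity term, which is where AI would show up, moves trend growth noticeably, which is the crux of the economists-versus-scientists disagreement.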

Labor force growth has slowed sharply, owing to declining birth rates, with immigration failing to compensate even in pre-Trump America. The influx of women into the labor force played a major role in boosting growth in the latter part of the twentieth century. But now that has largely played out, although governments could do more to support female labor force participation and pay equity.

Similarly, global investment has collapsed since the 2008 financial crisis (though not in China), lowering potential growth. And measured productivity growth has declined everywhere, falling roughly by half in the US since the tech boom of the mid-1990s. No wonder global real interest rates are so low, with high post-crisis savings chasing a smaller supply of investment opportunities.

Still, the best bet is that AI and other new technologies will eventually come to have a much larger impact on growth than they have up to now. It is well known that it can take a very long time for businesses to reimagine productive processes to exploit new technologies: railroads and electricity are two leading examples. The pickup in global growth is likely to be a catalyst for change, creating incentives for firms to invest and introduce new technologies, some of which will substitute for labor, offsetting the slowdown in the growth of the workforce.

With the after-effects of the financial crisis fading, and AI perhaps starting to gain traction, trend US output growth can easily stay strong for the next several years (though, of course, a recession is also possible). The likely corresponding rise in real global interest rates will be tricky for central bankers to navigate. In the best case, they will be able to “ride the wave,” as Alan Greenspan famously did in the 1990s, though more inflation is likely this time.

The bottom line is that neither policymakers nor markets should be betting on the slow growth of the past decade carrying over to the next. But that might not be entirely welcome news. If the scientists are right, we may come to regret the growth we get.


The Myth of Sound Fundamentals

February 26, 2018

by Stephen S. Roach*

The recent correction in the US stock market is now being characterized as a fleeting aberration – a volatility shock – in what is still deemed to be a very accommodating investment climate. In fact, for a US economy that has a razor-thin cushion of saving, dependence on rising asset prices has never been more obvious.

NEW HAVEN – The spin is all too predictable. With the US stock market clawing its way back from the sharp correction of early February, the mindless mantra of the great bull market has returned. The recent correction is now being characterized as a fleeting aberration – a volatility shock – in what is still deemed to be a very accommodating investment climate. After all, the argument goes, economic fundamentals – not just in the United States, but worldwide – haven’t been this good in a long, long time.

But are the fundamentals really that sound? For a US economy that has a razor-thin cushion of saving, nothing could be further from the truth. America’s net national saving rate – the sum of saving by businesses, households, and the government sector – stood at just 2.1% of national income in the third quarter of 2017. That is only one-third the 6.3% average that prevailed in the final three decades of the twentieth century.

It is important to think about saving in “net” terms, which excludes the depreciation of obsolete or worn-out capacity in order to assess how much the economy is putting aside to fund the expansion of productive capacity. Net saving represents today’s investment in the future, and the bottom line for America is that it is saving next to nothing.
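The "net" concept above is simple arithmetic: gross saving minus the capital used up in depreciation, expressed as a share of national income. The dollar figures below are hypothetical round numbers, chosen only so that the ratio lands near the 2.1% rate the article cites.

```python
# Net national saving = gross saving minus depreciation (the "consumption
# of fixed capital"). All dollar figures are hypothetical illustrations.
gross_saving = 3.7e12      # gross national saving
depreciation = 3.3e12      # worn-out or obsolete capacity replaced
national_income = 19.0e12  # national income

net_saving_rate = (gross_saving - depreciation) / national_income
print(f"net national saving rate: {net_saving_rate:.1%}")  # → 2.1%
```

The point of the calculation: because depreciation absorbs most of gross saving, a seemingly healthy gross figure can leave almost nothing to fund new productive capacity.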

Alas, the story doesn’t end there. To finance consumption and growth, the US borrows surplus saving from abroad to compensate for the domestic shortfall. All that borrowing implies a large balance-of-payments deficit with the rest of the world, which spawns an equally large trade deficit. While President Donald Trump’s administration is hardly responsible for this sad state of affairs, its policies are about to make a tough situation far worse.

Under the guise of tax reform, late last year Trump signed legislation that will increase the federal budget deficit by $1.5 trillion over the next decade. And now the US Congress, in its infinite wisdom, has upped the ante by another $300 billion in the latest deal to avert a government shutdown. Never mind that deficit spending makes no sense when the economy is nearing full employment: this sharp widening of the federal deficit is enough, by itself, to push the already-low net national saving rate toward zero. And it’s not just the government’s red ink that is so troublesome. The personal saving rate fell to 2.4% of disposable (after-tax) income in December 2017, the lowest in 12 years and only about a quarter of the 9.3% average that prevailed over the final three decades of the twentieth century.

As domestic saving plunges, the US has two options – a reduction in investment and the economic growth it supports, or increased borrowing of surplus saving from abroad. Over the past 35 years, America has consistently opted for the latter, running balance-of-payments deficits every year since 1982 (with a minor exception in 1991, reflecting foreign contributions for US military expenses in the Gulf War). With these deficits, of course, come equally chronic trade deficits with a broad cross-section of America’s foreign partners. Astonishingly, in 2017, the US ran trade deficits with 102 countries.

The multilateral foreign-trade deficits of a saving-short US economy set the stage for perhaps the most egregious policy blunder being committed by the Trump administration: a shift toward protectionism. Further compression of an already-weak domestic saving position spells growing current-account and trade deficits – a fundamental axiom of macroeconomics that the US never seems to appreciate.

Attempting to solve a multilateral imbalance with bilateral tariffs directed mainly at China, such as those just imposed on solar panels and washing machines in January, doesn’t add up. And, given the growing likelihood of additional trade barriers – as suggested by the US Commerce Department’s recent recommendations of high tariffs on aluminum and steel – the combination of protectionism and ever-widening trade imbalances becomes all the more problematic for a US economy set to become even more dependent on foreign capital. Far from sound, the fundamentals of a saving-short US economy look shakier than ever.

Lacking a cushion of solid support from income generation, the US is also left far more beholden to fickle asset markets than might otherwise be the case. That’s especially true of American consumers who have relied on appreciation of equity holdings and home values to support over-extended lifestyles. It is also the case for the US Federal Reserve, which has turned to unconventional monetary policies to support the real economy via so-called wealth effects. And, of course, foreign investors are acutely sensitive to relative returns on assets – the US versus other markets – as well as the translation of those returns into their home currencies.

Driven by the momentum of trends in employment, industrial production, consumer sentiment, and corporate earnings, the case for sound fundamentals plays like a broken record during periods of financial market volatility. But momentum and fundamentals are two very different things. Momentum can be fleeting, especially for a saving-short US economy that is consuming the seed corn of future prosperity. With dysfunctional policies pointing to a further compression of saving in the years ahead, the myth of sound US fundamentals has never rung more hollow.

*Stephen S. Roach, former Chairman of Morgan Stanley Asia and the firm’s chief economist, is a senior fellow at Yale University’s Jackson Institute for Global Affairs and a senior lecturer at Yale’s School of Management. He is the author of Unbalanced: The Codependency of America and China.


Tech and Higher Education

February 21, 2018

Universities pride themselves on producing creative ideas that disrupt the rest of society, yet higher-education teaching techniques continue to evolve at a glacial pace. Given education’s centrality to raising productivity, shouldn’t efforts to reinvigorate today’s sclerotic Western economies focus on how to reinvent higher education?

CAMBRIDGE – In the early 1990s, at the dawn of the Internet era, an explosion in academic productivity seemed to be just around the corner. But it never arrived. Instead, teaching techniques at colleges and universities, which pride themselves on producing creative ideas that disrupt the rest of society, have continued to evolve at a glacial pace.

Sure, PowerPoint presentations have displaced chalkboards, enrollments in “massive open online courses” often exceed 100,000 (though the number of engaged students tends to be much smaller), and “flipped classrooms” replace homework with watching taped lectures, while class time is spent discussing homework exercises. But, given education’s centrality to raising productivity, shouldn’t efforts to reinvigorate today’s sclerotic Western economies focus on how to reinvent higher education?

One can understand why change is slow to take root at the primary and secondary school level, where the social and political obstacles are massive. But colleges and universities have far more capacity to experiment; indeed, in many ways, that is their raison d’être.

For example, what sense does it make for each college in the United States to offer its own highly idiosyncratic lectures on core topics like freshman calculus, economics, and US history, often with classes of 500 students or more? Sometimes these giant classes are great, but anyone who has gone to college can tell you that is not the norm.

At least for large-scale introductory courses, why not let students everywhere watch highly produced recordings by the world’s best professors and lecturers, much as we do with music, sports, and entertainment? This does not mean a one-size-fits-all scenario: there could be a competitive market, as there already is for textbooks, with perhaps a dozen people dominating much of the market.

And videos could be used in modules, so a school could choose to use, say, one package to teach the first part of a course, and a completely different package to teach the second part. Professors could still mix in live lectures on their favorite topics, but as a treat, not as a boring routine.

A shift to recorded lectures is only one example. The potential for developing specialized software and apps to advance higher education is endless. There is already some experimentation with using software to help understand individual students’ challenges and deficiencies in ways that guide teachers on how to give the most constructive feedback. But so far, such initiatives are very limited.

Perhaps change in tertiary education is so glacial because the learning is deeply interpersonal, making human teachers essential. But wouldn’t it make more sense for the bulk of faculty teaching time to be devoted to helping students engage in active learning through discussion and exercises, rather than to sometimes hundredth-best lecture performances?

Yes, outside of traditional brick-and-mortar universities, there has been some remarkable innovation. The Khan Academy has produced a treasure trove of lectures on a variety of topics, and it is particularly strong in teaching basic mathematics. Although the main target audience is advanced high school students, there is a lot of material that college students (or anyone) would find useful.

Moreover, there are some great websites, including Crash Course and TED-Ed, that contain short general-education videos on a huge variety of subjects, from philosophy to biology to history. But while a small number of innovative professors are using such methods to reinvent their courses, the tremendous resistance they face from other faculty holds down the size of the market and makes it hard to justify the investments needed to produce more rapid change.

Let’s face it, college faculty are no keener to see technology cut into their jobs than any other group. And, unlike most factory workers, university faculty members have enormous power over the administration. Any university president who tries to run roughshod over them will usually lose her job long before any faculty member does.

Of course, change will eventually come, and when it does, the potential effect on economic growth and social welfare will be enormous. It is difficult to suggest an exact monetary figure, because, like many things in the modern tech world, money spent on education does not capture the full social impact. But even the most conservative estimates suggest the vast potential. In the US, tertiary education accounts for over 2.5% of GDP (roughly $500 billion), and yet much of this is spent quite inefficiently. The real cost, though, is not the squandered tax money, but the fact that today’s youth could be learning so much more than they do.
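The $500 billion figure above is a simple share-of-GDP calculation. A minimal back-of-envelope check, assuming US GDP of roughly $20 trillion (an assumed figure for the period; the text gives only the percentage):

```python
# Sanity check: "tertiary education accounts for over 2.5% of GDP (roughly $500 billion)".
# The GDP level below is an assumption (US GDP circa 2018), not a figure from the article.
us_gdp = 20e12          # assumed US GDP, in dollars
tertiary_share = 0.025  # "over 2.5% of GDP"

tertiary_spend = us_gdp * tertiary_share
print(f"Tertiary-education spending: ${tertiary_spend / 1e9:.0f} billion")
```

With those inputs the product comes to $500 billion, consistent with the figure cited.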

Universities and colleges are pivotal to the future of our societies. But, given impressive and ongoing advances in technology and artificial intelligence, it is hard to see how they can continue playing this role without reinventing themselves over the next two decades. Education innovation will disrupt academic employment, but the benefits to jobs everywhere else could be enormous. If there were more disruption within the ivory tower, economies just might become more resilient to disruption outside it.