‘The goal is to automate us’: welcome to the age of surveillance capitalism


February 3, 2019

Shoshana Zuboff’s new book is a chilling exposé of the business model that underpins the digital world. Observer tech columnist John Naughton explains the importance of Zuboff’s work and asks the author 10 key questions

 

‘Technology is the puppet, but surveillance capitalism is the puppet master.’ Photograph: Getty Images

We’re living through the most profound transformation in our information environment since Johannes Gutenberg’s invention of printing around 1439. And the problem with living through a revolution is that it’s impossible to take the long view of what’s happening. Hindsight is the only exact science in this business, and in that long run we’re all dead. Printing shaped and transformed societies over the next four centuries, but nobody in Mainz (Gutenberg’s home town) in, say, 1495 could have known that his technology would (among other things): fuel the Reformation and undermine the authority of the mighty Catholic church; enable the rise of what we now recognise as modern science; create unheard-of professions and industries; change the shape of our brains; and even recalibrate our conceptions of childhood. And yet printing did all this and more.

Why choose 1495? Because we’re about the same distance into our revolution, the one kicked off by digital technology and networking. And although it’s now gradually dawning on us that this really is a big deal and that epochal social and economic changes are under way, we’re as clueless about where it’s heading and what’s driving it as the citizens of Mainz were in 1495.

That’s not for want of trying, mind. Library shelves groan under the weight of books about what digital technology is doing to us and our world. Lots of scholars are thinking, researching and writing about this stuff. But they’re like the blind men trying to describe the elephant in the old fable: everyone has only a partial view, and nobody has the whole picture. So our contemporary state of awareness is – as Manuel Castells, the great scholar of cyberspace once put it – one of “informed bewilderment”.

Which is why the arrival of Shoshana Zuboff’s new book is such a big event. Many years ago – in 1988, to be precise – as one of the first female professors at Harvard Business School to hold an endowed chair, she published a landmark book, The Age of the Smart Machine: The Future of Work and Power, which changed the way we thought about the impact of computerisation on organisations and on work. It provided the most insightful account up to that time of how digital technology was changing the work of both managers and workers. And then Zuboff appeared to go quiet, though she was clearly incubating something bigger. The first hint of what was to come was a pair of startling essays – one in an academic journal in 2015, the other in a German newspaper in 2016. What these revealed was that she had come up with a new lens through which to view what Google, Facebook et al were doing – nothing less than spawning a new variant of capitalism. Those essays promised a more comprehensive expansion of this Big Idea.

And now it has arrived – the most ambitious attempt yet to paint the bigger picture and to explain how the effects of digitisation that we are now experiencing as individuals and citizens have come about.

The headline story is that it’s not so much about the nature of digital technology as about a new mutant form of capitalism that has found a way to use tech for its purposes. The name Zuboff has given to the new variant is “surveillance capitalism”. It works by providing free services that billions of people cheerfully use, enabling the providers of those services to monitor the behaviour of those users in astonishing detail – often without their explicit consent.

“Surveillance capitalism,” she writes, “unilaterally claims human experience as free raw material for translation into behavioural data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioural surplus, fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioural futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behaviour.”

While the general modus operandi of Google, Facebook et al has been known and understood (at least by some people) for a while, what has been missing – and what Zuboff provides – is the insight and scholarship to situate them in a wider context. She points out that while most of us think that we are dealing merely with algorithmic inscrutability, in fact what confronts us is the latest phase in capitalism’s long evolution – from the making of products, to mass production, to managerial capitalism, to services, to financial capitalism, and now to the exploitation of behavioural predictions covertly derived from the surveillance of users. In that sense, her vast (660-page) book is a continuation of a tradition that includes Adam Smith, Max Weber, Karl Polanyi and – dare I say it – Karl Marx.

Viewed from this perspective, the behaviour of the digital giants looks rather different from the roseate hallucinations of Wired magazine. What one sees instead is a colonising ruthlessness of which John D Rockefeller would have been proud. First of all there was the arrogant appropriation of users’ behavioural data – viewed as a free resource, there for the taking. Then the use of patented methods to extract or infer data even when users had explicitly denied permission, followed by the use of technologies that were opaque by design and fostered user ignorance.

And, of course, there is also the fact that the entire project was conducted in what was effectively lawless – or at any rate law-free – territory. Thus Google decided that it would digitise and store every book ever printed, regardless of copyright issues. Or that it would photograph every street and house on the planet without asking anyone’s permission. Facebook launched its infamous “beacons”, which reported a user’s online activities and published them to others’ news feeds without the knowledge of the user. And so on, in accordance with the disrupter’s mantra that “it is easier to ask for forgiveness than for permission”.

When the security expert Bruce Schneier wrote that “surveillance is the business model of the internet” he was really only hinting at the reality that Zuboff has now illuminated. The combination of state surveillance and its capitalist counterpart means that digital technology is separating the citizens in all societies into two groups: the watchers (invisible, unknown and unaccountable) and the watched. This has profound consequences for democracy because asymmetry of knowledge translates into asymmetries of power. But whereas most democratic societies have at least some degree of oversight of state surveillance, we currently have almost no regulatory oversight of its privatised counterpart. This is intolerable.

And it won’t be easy to fix because it requires us to tackle the essence of the problem – the logic of accumulation implicit in surveillance capitalism. That means that self-regulation is a nonstarter. “Demanding privacy from surveillance capitalists,” says Zuboff, “or lobbying for an end to commercial surveillance on the internet is like asking old Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck, or a cow to give up chewing. These demands are existential threats that violate the basic mechanisms of the entity’s survival.”

The Age of Surveillance Capitalism is a striking and illuminating book. A fellow reader remarked to me that it reminded him of Thomas Piketty’s magnum opus, Capital in the Twenty-First Century, in that it opens one’s eyes to things we ought to have noticed, but hadn’t. And if we fail to tame the new capitalist mutant rampaging through our societies then we will only have ourselves to blame, for we can no longer plead ignorance.

Ten questions for Shoshana Zuboff: ‘Larry Page saw that human experience could be Google’s virgin wood’

Continuing a tradition that includes Adam Smith, Max Weber, Karl Polanyi, Marx… Shoshana Zuboff.

John Naughton: At the moment, the world is obsessed with Facebook. But as you tell it, Google was the prime mover.

Shoshana Zuboff: Surveillance capitalism is a human creation. It lives in history, not in technological inevitability. It was pioneered and elaborated through trial and error at Google in much the same way that the Ford Motor Company discovered the new economics of mass production or General Motors discovered the logic of managerial capitalism.

Surveillance capitalism was invented around 2001 as the solution to a financial emergency in the teeth of the dotcom bust, when the fledgling company faced the loss of investor confidence. As investor pressure mounted, Google’s leaders abandoned their declared antipathy toward advertising. Instead they decided to boost ad revenue by using their exclusive access to user data logs (once known as “data exhaust”) in combination with their already substantial analytical capabilities and computational power, to generate predictions of user click-through rates, taken as a signal of an ad’s relevance.

Operationally this meant that Google would both repurpose its growing cache of behavioural data, now put to work as a behavioural data surplus, and develop methods to aggressively seek new sources of this surplus.

The company developed new methods of secret surplus capture that could uncover data that users intentionally opted to keep private, as well as infer extensive personal information that users did not or would not provide. And this surplus would then be analysed for hidden meanings that could predict click-through behaviour. The surplus data became the basis for new prediction markets called targeted advertising.

Sheryl Sandberg, says Zuboff, played the role of Typhoid Mary, bringing surveillance capitalism from Google to Facebook. Photograph: John Lee for the Guardian

 

Here was the origin of surveillance capitalism in an unprecedented and lucrative brew: behavioural surplus, data science, material infrastructure, computational power, algorithmic systems, and automated platforms. As click-through rates skyrocketed, advertising quickly became as important as search. Eventually it became the cornerstone of a new kind of commerce that depended upon online surveillance at scale.

The success of these new mechanisms only became visible when Google went public in 2004. That’s when it finally revealed that between 2001 and its 2004 IPO, revenues increased by 3,590%.

JN: So surveillance capitalism started with advertising, but then became more general?

SZ: Surveillance capitalism is no more limited to advertising than mass production was limited to the fabrication of the Ford Model T. It quickly became the default model for capital accumulation in Silicon Valley, embraced by nearly every startup and app. And it was a Google executive – Sheryl Sandberg – who played the role of Typhoid Mary, bringing surveillance capitalism from Google to Facebook, when she signed on as Mark Zuckerberg’s number two in 2008. By now it’s no longer restricted to individual companies or even to the internet sector. It has spread across a wide range of products, services, and economic sectors, including insurance, retail, healthcare, finance, entertainment, education, transportation, and more, birthing whole new ecosystems of suppliers, producers, customers, market-makers, and market players. Nearly every product or service that begins with the word “smart” or “personalised”, every internet-enabled device, every “digital assistant”, is simply a supply-chain interface for the unobstructed flow of behavioural data on its way to predicting our futures in a surveillance economy.

JN: In this story of conquest and appropriation, the term “digital natives” takes on a new meaning…

SZ: Yes, “digital natives” is a tragically ironic phrase. I am fascinated by the structure of colonial conquest, especially the first Spaniards who stumbled into the Caribbean islands. Historians call it the “conquest pattern”, which unfolds in three phases: legalistic measures to provide the invasion with a gloss of justification, a declaration of territorial claims, and the founding of a town to legitimate the declaration. Back then Columbus simply declared the islands as the territory of the Spanish monarchy and the pope.

The sailors could not have imagined that they were writing the first draft of a pattern that would echo across space and time to a digital 21st century. The first surveillance capitalists also conquered by declaration. They simply declared our private experience to be theirs for the taking, for translation into data for their private ownership and their proprietary knowledge. They relied on misdirection and rhetorical camouflage, with secret declarations that we could neither understand nor contest.

Google began by unilaterally declaring that the world wide web was its to take for its search engine. Surveillance capitalism originated in a second declaration that claimed our private experience for its revenues that flow from telling and selling our fortunes to other businesses. In both cases, it took without asking. Page [Larry, Google co-founder] foresaw that surplus operations would move beyond the online milieu to the real world, where data on human experience would be free for the taking. As it turns out his vision perfectly reflected the history of capitalism, marked by taking things that live outside the market sphere and declaring their new life as market commodities.

We were caught off guard by surveillance capitalism because there was no way that we could have imagined its action, any more than the early peoples of the Caribbean could have foreseen the rivers of blood that would flow from their hospitality toward the sailors who appeared out of thin air waving the banner of the Spanish monarchs. Like the Caribbean people, we faced something truly unprecedented.

Once we searched Google, but now Google searches us. Once we thought of digital services as free, but now surveillance capitalists think of us as free.

JN: Then there’s the “inevitability” narrative – technological determinism on steroids.

SZ: In my early fieldwork in the computerising offices and factories of the late 1970s and 80s, I discovered the duality of information technology: its capacity to automate but also to “informate”, which I use to mean to translate things, processes, behaviours, and so forth into information. This duality set information technology apart from earlier generations of technology: information technology produces new knowledge territories by virtue of its informating capability, always turning the world into information. The result is that these new knowledge territories become the subject of political conflict. The first conflict is over the distribution of knowledge: “Who knows?” The second is about authority: “Who decides who knows?” The third is about power: “Who decides who decides who knows?”

Now the same dilemmas of knowledge, authority and power have surged over the walls of our offices, shops and factories to flood each one of us… and our societies. Surveillance capitalists were the first movers in this new world. They declared their right to know, to decide who knows, and to decide who decides. In this way they have come to dominate what I call “the division of learning in society”, which is now the central organising principle of the 21st-century social order, just as the division of labour was the key organising principle of society in the industrial age.

JN: So the big story is not really the technology per se but the fact that it has spawned a new variant of capitalism that is enabled by the technology?

SZ: Larry Page grasped that human experience could be Google’s virgin wood, that it could be extracted at no extra cost online and at very low cost out in the real world. For today’s owners of surveillance capital the experiential realities of bodies, thoughts and feelings are as virgin and blameless as nature’s once-plentiful meadows, rivers, oceans and forests before they fell to the market dynamic. We have no formal control over these processes because we are not essential to the new market action. Instead we are exiles from our own behaviour, denied access to or control over knowledge derived from its dispossession by others for others. Knowledge, authority and power rest with surveillance capital, for which we are merely “human natural resources”. We are the native peoples now whose claims to self-determination have vanished from the maps of our own experience.

While it is impossible to imagine surveillance capitalism without the digital, it is easy to imagine the digital without surveillance capitalism. The point cannot be emphasised enough: surveillance capitalism is not technology. Digital technologies can take many forms and have many effects, depending upon the social and economic logics that bring them to life. Surveillance capitalism relies on algorithms and sensors, machine intelligence and platforms, but it is not the same as any of those.

JN: Where does surveillance capitalism go from here?

SZ: Surveillance capitalism moves from a focus on individual users to a focus on populations, like cities, and eventually on society as a whole. Think of the capital that can be attracted to futures markets in which population predictions evolve to approximate certainty.

This has been a learning curve for surveillance capitalists, driven by competition over prediction products. First they learned that the more surplus the better the prediction, which led to economies of scale in supply efforts. Then they learned that the more varied the surplus the higher its predictive value. This new drive toward economies of scope sent them from the desktop to mobile, out into the world: your drive, run, shopping, search for a parking space, your blood and face, and always… location, location, location.

The evolution did not stop there. Ultimately they understood that the most predictive behavioural data comes from what I call “economies of action”, as systems are designed to intervene in the state of play and actually modify behaviour, shaping it toward desired commercial outcomes. We saw the experimental development of this new “means of behavioural modification” in Facebook’s contagion experiments and the Google-incubated augmented reality game Pokémon Go.

It is no longer enough to automate information flows about us; the goal now is to automate us. These processes are meticulously designed to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination. As one data scientist explained to me, “We can engineer the context around a particular behaviour and force change that way… We are learning how to write the music, and then we let the music make them dance.”

This power to shape behaviour for others’ profit or power is entirely self-authorising. It has no foundation in democratic or moral legitimacy, as it usurps decision rights and erodes the processes of individual autonomy that are essential to the function of a democratic society. The message here is simple: Once I was mine. Now I am theirs.

JN: What are the implications for democracy?

SZ: During the past two decades surveillance capitalists have had a pretty free run, with hardly any interference from laws and regulations. Democracy has slept while surveillance capitalists amassed unprecedented concentrations of knowledge and power. These dangerous asymmetries are institutionalised in their monopolies of data science, their dominance of machine intelligence, which is surveillance capitalism’s “means of production”, their ecosystems of suppliers and customers, their lucrative prediction markets, their ability to shape the behaviour of individuals and populations, their ownership and control of our channels for social participation, and their vast capital reserves. We enter the 21st century marked by this stark inequality in the division of learning: they know more about us than we know about ourselves or than we know about them. These new forms of social inequality are inherently antidemocratic.

At the same time, surveillance capitalism diverges from the history of market capitalism in key ways, and this has inhibited democracy’s normal response mechanisms. One of these is that surveillance capitalism abandons the organic reciprocities with people that in the past have helped to embed capitalism in society and tether it, however imperfectly, to society’s interests. First, surveillance capitalists no longer rely on people as consumers. Instead, supply and demand orients the surveillance capitalist firm to businesses intent on anticipating the behaviour of populations, groups and individuals. Second, by historical standards the large surveillance capitalists employ relatively few people compared with their unprecedented computational resources. General Motors employed more people during the height of the Great Depression than either Google or Facebook employs at their heights of market capitalisation. Finally, surveillance capitalism depends upon undermining individual self-determination, autonomy and decision rights for the sake of an unobstructed flow of behavioural data to feed markets that are about us but not for us.

This antidemocratic and anti-egalitarian juggernaut is best described as a market-driven coup from above: an overthrow of the people concealed as the technological Trojan horse of digital technology. On the strength of its annexation of human experience, this coup achieves exclusive concentrations of knowledge and power that sustain privileged influence over the division of learning in society. It is a form of tyranny that feeds on people but is not of the people. Paradoxically, this coup is celebrated as “personalisation”, although it defiles, ignores, overrides, and displaces everything about you and me that is personal.

‘The power to shape behaviour for others’ profit or power is entirely self-authorising,’ says Zuboff. ‘It has no foundation in democratic or moral legitimacy.’

JN: Our societies seem transfixed by all this: we are like rabbits paralysed in the headlights of an oncoming car.

SZ: Despite surveillance capitalism’s domination of the digital milieu and its illegitimate power to take private experience and to shape human behaviour, most people find it difficult to withdraw, and many ponder whether it is even possible. This does not mean, however, that we are foolish, lazy, or hapless. On the contrary, in my book I explore numerous reasons that explain how surveillance capitalists got away with creating the strategies that keep us paralysed. These include the historical, political and economic conditions that allowed them to succeed. And we’ve already discussed some of the other key reasons, including the nature of the unprecedented, conquest by declaration. Other significant reasons are the need for inclusion, identification with tech leaders and their projects, social persuasion dynamics, and a sense of inevitability, helplessness and resignation.

We are trapped in an involuntary merger of personal necessity and economic extraction, as the same channels that we rely upon for daily logistics, social interaction, work, education, healthcare, access to products and services, and much more, now double as supply chain operations for surveillance capitalism’s surplus flows. The result is that the choice mechanisms we have traditionally associated with the private realm are eroded or vitiated. There can be no exit from processes that are intentionally designed to bypass individual awareness and produce ignorance, especially when these are the very same processes upon which we must depend for effective daily life. So our participation is best explained in terms of necessity, dependency, the foreclosure of alternatives, and enforced ignorance.

JN: Doesn’t all this mean that regulation that just focuses on the technology is misguided and doomed to fail? What should we be doing to get a grip on this before it’s too late?

SZ: The tech leaders desperately want us to believe that technology is the inevitable force here, and their hands are tied. But there is a rich history of digital applications before surveillance capitalism that really were empowering and consistent with democratic values. Technology is the puppet, but surveillance capitalism is the puppet master.

Surveillance capitalism is a human-made phenomenon and it is in the realm of politics that it must be confronted. The resources of our democratic institutions must be mobilised, including our elected officials. GDPR [a recent EU law on data protection and privacy for all individuals within the EU] is a good start, and time will tell if we can build on that sufficiently to help found and enforce a new paradigm of information capitalism. Our societies have tamed the dangerous excesses of raw capitalism before, and we must do it again.

While there is no simple five-year action plan, much as we yearn for that, there are some things we know. Despite existing economic, legal and collective-action models such as antitrust, privacy laws and trade unions, surveillance capitalism has had a relatively unimpeded two decades to root and flourish. We need new paradigms born of a close understanding of surveillance capitalism’s economic imperatives and foundational mechanisms.

For example, the idea of “data ownership” is often championed as a solution. But what is the point of owning data that should not exist in the first place? All that does is further institutionalise and legitimate data capture. It’s like negotiating how many hours a day a seven-year-old should be allowed to work, rather than contesting the fundamental legitimacy of child labour. Data ownership also fails to reckon with the realities of behavioural surplus. Surveillance capitalists extract predictive value from the exclamation points in your post, not merely the content of what you write, or from how you walk and not merely where you walk. Users might get “ownership” of the data that they give to surveillance capitalists in the first place, but they will not get ownership of the surplus or the predictions gleaned from it – not without new legal concepts built on an understanding of these operations.

Another example: there may be sound antitrust reasons to break up the largest tech firms, but this alone will not eliminate surveillance capitalism. Instead it will produce smaller surveillance capitalist firms and open the field for more surveillance capitalist competitors.

So what is to be done? In any confrontation with the unprecedented, the first work begins with naming. Speaking for myself, this is why I’ve devoted the past seven years to this work… to move forward the project of naming as the first necessary step toward taming. My hope is that careful naming will give us all a better understanding of the true nature of this rogue mutation of capitalism and contribute to a sea change in public opinion, most of all among the young.

The Age of Surveillance Capitalism by Shoshana Zuboff is published by Profile (£25). To order a copy for £22 go to guardianbookshop.com or call 0330 333 6846. Free UK p&p over £15, online orders only. Phone orders min p&p of £1.99.

Warning! Everything Is Going Deep: The Age of ‘Surveillance Capitalism’


February 3, 2019

Around the end of each year major dictionaries declare their “word of the year.” Last year, for instance, the most looked-up word at Merriam-Webster.com was “justice.” Well, even though it’s early, I’m ready to declare the word of the year for 2019.

The word is “deep.”

Why? Because recent advances in the speed and scope of digitization, connectivity, big data and artificial intelligence are now taking us “deep” into places and into powers that we’ve never experienced before — and that governments have never had to regulate before. I’m talking about deep learning, deep insights, deep surveillance, deep facial recognition, deep voice recognition, deep automation and deep artificial minds.

Some of these technologies offer unprecedented promise and some unprecedented peril — but they’re all now part of our lives. Everything is going deep.

We sure are going deep. But the lifeguard is still on the beach and — here’s what’s really scary — he doesn’t know how to swim! More about that later. For now, how did we get so deep down where the sharks live?

The short answer: Technology moves up in steps, and each step, each new platform, is usually biased toward a new set of capabilities. Around the year 2000 we took a huge step up that was biased toward connectivity, because of the explosion of fiber-optic cable, wireless and satellites.

Suddenly connectivity became so fast, cheap, easy for you and ubiquitous that it felt like you could touch someone whom you could never touch before and that you could be touched by someone who could never touch you before.

Around 2007, we took another big step up. The iPhone, sensors, digitization, big data, the internet of things, artificial intelligence and cloud computing melded together and created a new platform that was biased toward abstracting complexity at a speed, scope and scale we’d never experienced before.

So many complex things became simplified. Complexity became so fast, free, easy to use and invisible that soon with one touch on Uber’s app you could page a taxi, direct a taxi, pay a taxi, rate a taxi driver and be rated by a taxi driver.

That’s why the adjective that so many people are affixing to all of these new capabilities to convey their awesome power is “deep.”

On Jan. 20, The London Observer looked at Harvard Business School professor Shoshana Zuboff’s new book, the title of which perfectly describes the deep dark waters we’ve entered: “The Age of Surveillance Capitalism.”

“Surveillance capitalism,” Zuboff wrote, “unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as ‘machine intelligence,’ and fabricated into prediction products that anticipate what you will do now, soon and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioral futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behavior.”

Unfortunately, we have not developed the regulations or governance, or scaled the ethics, to manage a world of such deep powers, deep interactions and deep potential abuses.

 

Two quotes tell that story: Last April, Senator Orrin Hatch was questioning Facebook C.E.O. Mark Zuckerberg during a joint hearing of the commerce and judiciary committees. At one point Hatch asked Zuckerberg, “So, how do you sustain a business model in which users don’t pay for your service?”

Zuckerberg, clearly trying to stifle a laugh, replied, “Senator, we run ads.” Hatch did not seem to understand that Facebook’s business model is to mine users’ data and then run targeted ads — and Hatch was one of Facebook’s regulators.

But then Zuckerberg was also clueless about how deep the powers of the Facebook platform had gone — deep enough that a few smart Russian hackers could manipulate it to help Donald Trump win the presidency.


When faced with evidence that fake news spread on Facebook influenced the outcome of the 2016 election, Zuckerberg dismissed that notion as a “pretty crazy idea.” It turns out that it was happening at an industrial scale and he later had to apologize.

Regulations often lag behind new technologies, but when they move this fast and cut this deep, that lag can be really dangerous. I wish I thought that catch-up was around the corner. I don’t. Our national discussion has never been more shallow — reduced to 280 characters.

This has created an opening and burgeoning demand for political, social and religious leaders, government institutions and businesses that can go deep — that can validate what is real and offer the public deep truths, deep privacy protections and deep trust.

But deep trust and deep loyalty cannot be forged overnight. They take time. That’s one reason this old newspaper I work for — the Gray Lady — is doing so well today. Not all, but many, people are desperate for trusted navigators.

Many will also look for that attribute in our next president, because they sense that deep changes are afoot. It is unsettling, and yet, there’s no swimming back. We are, indeed, far from the shallow now.

Follow The New York Times Opinion section on Facebook, Twitter (@NYTopinion) and Instagram.

Thomas L. Friedman is the foreign affairs Op-Ed columnist. He joined the paper in 1981, and has won three Pulitzer Prizes. He is the author of seven books, including “From Beirut to Jerusalem,” which won the National Book Award.

 


A version of this article appears in print on Page A23 of the New York edition with the headline: Warning! Everything Is Going Deep.

 

Dr. Fareed on DAVOS without America


January 28, 2019


A Davos without America mirrors a world without America: The United States has withdrawn from the world.

DAVOS, Switzerland

https://fareedzakaria.com/columns/2019/1/24/a-davos-without-america-mirrors-a-world-without-america

The atmosphere at the 2019 World Economic Forum reflects the global picture perhaps more genuinely than in years past, and the painting is not very pretty. The mood here is subdued, cautious and apprehensive. There’s not much talk of a global slowdown, but no one is confident about a growth story, either. There is no great global political crisis, yet people speak in worried tones about the state of democracy, open societies and the international order.

The White House scrapped the official U.S. delegation’s trip to this year’s conference — an outgrowth of President Trump’s spat with Congress — providing a perfect metaphor for the broader outlook: The United States has withdrawn from the world.

Meanwhile, Europe is distracted, divided and despondent. Of the continent’s three major leaders, only one, Germany’s lame-duck Chancellor Angela Merkel, even showed up. British Prime Minister Theresa May did not attend because of turmoil over Brexit. French President Emmanuel Macron chose not to come because he faces ongoing populist protests from the right and left. In this environment, there is a gaping absence of leadership in Davos from the usual defenders of liberal democracy and the rules-based international system.

This does not mean that any new global leaders have stepped into the void. Contrary to some speculation, China is playing a more muted role at the forum than in the past. It sent a respected statesman, Vice President Wang Qishan, with an anodyne message aiming to reassure the world that Beijing seeks “win-win” solutions and global cooperation. This probably reflects the reality that — politically and economically — China faces its own challenges at home, with slowing growth and President Xi Jinping trying to tighten his grip over China’s vast society. India’s Prime Minister Narendra Modi faces a tougher-than-expected fight in upcoming national elections, so he didn’t show up, either.


This is not really the dawn of the dictators, either: few of them came, perhaps a reflection of the fact that global norms and fora like Davos still do not celebrate strongmen. Although Western democracies may be flagging, Russia’s Vladimir Putin and Turkey’s Recep Tayyip Erdogan hold a much weaker hand than most people realize. They, too, along with Crown Prince Mohammed bin Salman of Saudi Arabia, stayed home. Jair Bolsonaro, the new president of Brazil, did attend and gave a much-anticipated speech, but it was barely six minutes long — and was received with decidedly mixed reviews.

The one area of consistent optimism among the attendees remains technology. Executives from multinational corporations such as Novartis and Cargill spoke about the next great technological opportunity — leveraging artificial intelligence to make their companies far more efficient and productive. This is a trend that they see as inexorable, forcing them to adapt or watch the competition grow. Executives and experts alike foresee that another layer of white-collar jobs could be at risk — those involving routine analytic skills. But chief executives here voiced optimism that it will all work out.

Businessmen and executives are more openly pessimistic about trade. They worry that a U.S.-China trade war could spill over across the world. Whether or not that happens, it seems clear that the great expansion of globalization is over. For the past 15 years, there has been no significant forward movement on trade, and many minor setbacks. This hasn’t yet translated into large-scale protectionism and tariff wars, but it is a new stagnancy.

If the West is divided, so are other regions. Almost no Arab leaders showed up to last weekend’s Arab League meeting in Beirut, relegating the summit to even greater irrelevance than usual. Latin America is now split between leaders such as the right-wing Bolsonaro and the new leftist president of Mexico, Andrés Manuel López Obrador.

The leaders of several smaller countries (all of whom insisted on staying off the record) described the world as adrift and lacking in any collective purpose, with only voices about narrow self-interest and conflict being heard. “When the Americans are engaged, we have a sense of direction,” one of them said to me. “We might disagree on some points, but at least there is a larger conversation, some efforts at cooperation. Now the only energy is negative — worries about retreat, trade wars. That’s not a world in which it is easy for us to move forward. We are all stuck.”

This, then, is the post-American world. Not one marked by Chinese dominance or Asian arrogance. Not an outright anti-American one, but one in which many yearn for a greater U.S. presence. One in which countries are freelancing, narrowly pursuing their own interests, and hoping that the framework of international order remains reasonably stable. But with no one actively shoring up the international system, the great question remains: In a world without leaders, will that system over time weaken and eventually crumble?

(c) 2019, Washington Post Writers Group

 

 

The ‘Next America’


How do we govern in the age that will begin with the 2020 election?

By Thomas L. Friedman, Opinion Columnist

Credit: Steven Senne/Associated Press

I have this feeling that the 2020 presidential election in the United States will be unlike any in my lifetime — not only because it will likely involve Donald Trump running as an incumbent (he alone is a one-man, three-ring circus), but also because the huge issue that should have been the focus of the 2016 election will be unavoidable by 2020. That is: How do we govern the “Next America”?


“You know William Gibson’s line, ‘The future is already here, it’s just not evenly distributed’? Well, the future is here, and now it’s starting to really get distributed. This is the Next America. But our institutions and political parties have not adapted to it,’’ Gautam Mukunda, a Harvard Kennedy School Research Fellow and the author of “Indispensable: When Leaders Really Matter,’’ remarked in an interview. By 2020, it will be impossible to ignore the Next America. “The basic premises of how the economy works have shifted under our feet and the government will have to respond.’’

This Next America will raise a whole web of new intertwined policy, legal, moral, ethical and privacy issues because of changes in technology, demographics, the environment and globalization that are reaching critical mass.

Where do I start? A good place is with 5G — fifth-generation wireless systems. With the two telecom giants Verizon and AT&T now beginning to deploy 5G technology across the country, the metabolism of business, entertainment, education and health care will dramatically accelerate in the Next America, beginning around … 2020.

Getting the most from artificial intelligence and machine learning — like deploying self-driving vehicles — requires quickly transmitting massive amounts of data with very low latency. We will have that capacity in the Next America. With 5G, a Hollywood movie that now takes six or seven minutes to download onto your iPad will take six or seven seconds and microsensors in your shirt will gather intelligence and broadcast vital signs to your doctor.
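The arithmetic behind that minutes-to-seconds claim is easy to sanity-check. A minimal back-of-envelope sketch, assuming a 5 GB movie, a roughly 100 Mbps 4G LTE link and a roughly 10 Gbps peak 5G link (all three numbers are illustrative assumptions, not figures from the column):

```python
# Back-of-envelope check of the 4G-vs-5G download claim.
# File size and link speeds are illustrative assumptions.

BITS_PER_GB = 8e9  # 1 gigabyte = 8 billion bits

def download_seconds(size_gb: float, link_bps: float) -> float:
    """Idealized transfer time, ignoring protocol overhead and congestion."""
    return size_gb * BITS_PER_GB / link_bps

movie_gb = 5.0                            # assumed HD movie size
lte = download_seconds(movie_gb, 100e6)   # ~100 Mbps: a good 4G LTE link
nr = download_seconds(movie_gb, 10e9)     # ~10 Gbps: a peak 5G target rate
print(f"4G: {lte / 60:.1f} min   5G: {nr:.1f} s")
```

At these assumed rates the same file drops from roughly seven minutes to about four seconds — a hundredfold change, which is the scale of shift the column is describing.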

As AT&T notes in one of its 5G ads, “Think of this as the next frontier in untethering, giving you the ability to take the ultrafast experience you have in your home or business with you virtually anywhere.’’

It could be as revolutionary as the internet.

But it will require all kinds of new regulations to govern applications from self-driving cars to drone delivery systems to robots that will work as security guards and home health aides.

An Associated Press report on Monday said that the government estimated there were currently “about 110,000 commercial drones operating in U.S. airspace, and the number is expected to soar to about 450,000 in 2022.’’

All of this new technology will have important implications for the education-to-work pipeline. My friend Heather E. McGowan, a future-of-work strategist, puts it this way: “The old model of work was three life blocks: Get an education. Use that education for 40 years. And then retire. We then made the faulty assumption that the next new model would be: Get an education. Use it for 20 years. Then get retrained. Then use that for 20 more years and then retire.’’

But in fact, in the Next America, argues McGowan, the right model will be “continuous lifelong learning’’ — because when the pace of change is accelerating, “the fastest-growing companies and most resilient workers will be those who learn faster than their competition.”

That means that in addition to our traditional big safety nets — Social Security and Medicare — we will need new national trampolines.

We will need to make some level of postsecondary education free to every American who meets a minimum grade and attendance requirement, so that every adult and every high school graduate can earn an associate degree or technical certificate free of tuition at a community college at any time.

Tennessee has already done that.

These same technological transformations mean the Next America will require changes in antitrust policy. Since the 1980s, antitrust policy has judged whether a company was getting too big largely by one question: Was the loss of competition hurting consumers through higher prices or fewer services?

“But that definition is increasingly irrelevant in an age in which the most powerful companies in the world offer products and services for ‘free’ in exchange for personal data,” Rana Foroohar, the Financial Times technology columnist, noted in a June 24 essay. “This has provoked calls for a return to the definition of monopoly in the 1890 Sherman Antitrust Act, which emphasizes the need to ensure that the economic power of large companies does not result in the corruption of the political process.’’

That’s because we are more than consumers. “We’re citizens,’’ notes Mukunda. “We have interests that stretch far beyond consumer pricing, and it’s the job of the government to protect citizens’ liberty, not just consumers’ interests. It says so right in the Constitution, and we’ve forgotten that.’’

Just one person — Mark Zuckerberg — controls Facebook, WhatsApp and Instagram. The fact that he has shown himself to be much more interested in scaling his platforms than combating those who abused them for political and economic gain — and that his lieutenants were ready to go after their high-profile critics, like George Soros — should make breaking up or regulating Facebook a front-and-center issue in 2020. But just the raw political weight of behemoths like Facebook, Amazon, Google, Microsoft and Apple needs a closer look.

The Next America is more than technology. It literally will be born in 2020. The United States Census Bureau has predicted that by 2020, for the first time, “more than half of the nation’s children are expected to be part of a minority race or ethnic group.” That will begin a process by which by 2044 “no one racial or ethnic group will dominate the U.S. in terms of size,” NPR reported.

Alas, though, the fiscal tools we need to build the Next America have been weakened by President Trump’s tax cuts. The federal deficit was not supposed to hit $1 trillion until 2020, but the White House now says it will hit that number in 2019. We’ve had deficits this size in response to the 2008 financial crisis, but we’ve never run one so huge during a boom.

That means the Next America may have to be built in the face of higher interest rates on more debt, with less fiscal ammunition to stimulate the economy should it slow down or face a crisis. So the Next America may very likely have to raise taxes or trim military spending, or Social Security or Medicare — just when all the baby boomers are retiring.

In sum, the Next America requires addressing each of those issues, and many more — from climate change to zoning rules — and how they interact. So the next election must too. The craziness around Trump has delayed much of this discussion. But 2020 won’t let us do that again. The Next America won’t wait.


 


 

A version of this article appears in print on Page A31 of the New York edition with the headline: Next America, A New Age, Starts in 2020.