February 3, 2019
Around the end of each year, major dictionaries declare their “word of the year.” Last year, for instance, the most looked-up word at Merriam-Webster.com was “justice.” Well, even though it’s early, I’m ready to declare the word of the year for 2019.
The word is “deep.”
Why? Because recent advances in the speed and scope of digitization, connectivity, big data and artificial intelligence are now taking us “deep” into places and into powers that we’ve never experienced before — and that governments have never had to regulate before. I’m talking about deep learning, deep insights, deep surveillance, deep facial recognition, deep voice recognition, deep automation and deep artificial minds.
Some of these technologies offer unprecedented promise and some unprecedented peril — but they’re all now part of our lives. Everything is going deep.
We are in deep, indeed. But the lifeguard is still on the beach and — here’s what’s really scary — he doesn’t know how to swim! More about that later. For now, how did we get so deep down where the sharks live?
The short answer: Technology moves up in steps, and each step, each new platform, is usually biased toward a new set of capabilities. Around the year 2000 we took a huge step up that was biased toward connectivity, because of the explosion of fiber-optic cable, wireless and satellites.
Suddenly connectivity became so fast, cheap, easy for you and ubiquitous that it felt like you could touch someone whom you could never touch before and that you could be touched by someone who could never touch you before.
Around 2007, we took another big step up. The iPhone, sensors, digitization, big data, the internet of things, artificial intelligence and cloud computing melded together and created a new platform that was biased toward abstracting complexity at a speed, scope and scale we’d never experienced before.
So many complex things became simplified. Complexity became so fast, free, easy to use and invisible that soon with one touch on Uber’s app you could page a taxi, direct a taxi, pay a taxi, rate a taxi driver and be rated by a taxi driver.
That’s why the adjective that so many people are affixing to all of these new capabilities to convey their awesome power is “deep.”
On Jan. 20, The Observer of London looked at the Harvard Business School professor Shoshana Zuboff’s new book, whose title perfectly describes the deep, dark waters we’ve entered: “The Age of Surveillance Capitalism.”
“Surveillance capitalism,” Zuboff wrote, “unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as ‘machine intelligence,’ and fabricated into prediction products that anticipate what you will do now, soon and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioral futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behavior.”
Unfortunately, we have not developed the regulations or governance, or scaled the ethics, to manage a world of such deep powers, deep interactions and deep potential abuses.
Two quotes tell that story: Last April, Senator Orrin Hatch was questioning Facebook C.E.O. Mark Zuckerberg during a joint hearing of the commerce and judiciary committees. At one point Hatch asked Zuckerberg, “So, how do you sustain a business model in which users don’t pay for your service?”
Zuckerberg, clearly trying to stifle a laugh, replied, “Senator, we run ads.” Hatch did not seem to understand that Facebook’s business model is to mine users’ data and then run targeted ads — and he was one of Facebook’s regulators.
But then Zuckerberg was also clueless about how deep the powers of the Facebook platform had gone — deep enough that a few smart Russian hackers could manipulate it to help Donald Trump win the presidency.
When faced with evidence that fake news spread on Facebook influenced the outcome of the 2016 election, Zuckerberg dismissed that notion as a “pretty crazy idea.” It turns out that it was happening at an industrial scale and he later had to apologize.
Regulations often lag behind new technologies, but when technologies move this fast and cut this deep, that lag can be really dangerous. I wish I thought that catch-up was around the corner. I don’t. Our national discussion has never been more shallow — reduced to 280 characters.
This has created an opening and burgeoning demand for political, social and religious leaders, government institutions and businesses that can go deep — that can validate what is real and offer the public deep truths, deep privacy protections and deep trust.
But deep trust and deep loyalty cannot be forged overnight. They take time. That’s one reason this old newspaper I work for — the Gray Lady — is doing so well today. Many people, though not all, are desperate for trusted navigators.
Many will also look for that attribute in our next president, because they sense that deep changes are afoot. It is unsettling, and yet there’s no swimming back. We are, indeed, far from the shallow now.
Thomas L. Friedman is the foreign affairs Op-Ed columnist. He joined the paper in 1981, and has won three Pulitzer Prizes. He is the author of seven books, including “From Beirut to Jerusalem,” which won the National Book Award.