

December 16, 2019

“2016: the year it all changed” is Chapter 12 of Rana Foroohar’s book, Don’t Be Evil: The Case Against Big Tech. Underneath the heading I scribbled in pencil: “It’s not though. It’s just the year you stopped liking it.”

That was the year the UK voted for Brexit, and the US voted for Trump, so yes, you could say that we saw major political changes. But Foroohar falls into the trap of seeking explanations for the status-quo-shaking votes of 2016 in technology, instead of asking what has changed in politics. There’s a whiff of disillusionment from somebody who thought technology was going to change everything for the better, and then doesn’t like the changes that actually happened.

Whether good or evil, Foroohar seems to ascribe supernatural powers to technology, or at least to the companies that create and provide it.

Let me backtrack for a moment. This is a highly readable and thoroughly researched book. In particular, it’s good to learn about the business side of Silicon Valley from somebody who has spent years reporting on the big companies whose businesses make our business their business. Foroohar even turned poacher for a while, working for a dotcom startup shortly before the first bubble burst at the beginning of this century, before going back to being gamekeeper at the Financial Times.

It makes a refreshing change to look at the FAANGs — Facebook, Amazon, Apple, Netflix and Google — not as magical communities of wizards (good or evil) creating technology in some kind of other-worldly, post-material universe, but as companies that employ people, sell stuff and seek profits.

It’s useful to explore in what ways they’re just like the dominant industries of earlier eras, the railway or oil companies that formed monopolies, lobbied politicians for regulation that would hamper their competitors but not them, and discouraged their employees from organising for better lives.

It’s also useful to look at what’s different about Big Tech. Google’s chief economist, Hal Varian, describes how the importance of data to Silicon Valley rests on capturing relationships with consumers in digital form, enabling companies to both access and understand us on a mass scale. This lends itself to a new form of natural monopoly, as we gravitate to platforms that are more popular, concentrating data about us with a few big companies. This in turn advantages those companies, and the bigger they grow the more they can simply buy up the competition, as Facebook did to Instagram and WhatsApp.

But sometimes Foroohar’s description of what’s changed sounds overblown, even mystical. “A smartphone’s powers are both unknowable and immensely powerful in the way that magic, by definition, is,” she asserts, contrasting the new tech to old tech such as cars, light bulbs and telephones. But hang on, isn’t electricity also “silent and invisible”, as she puts it? How many of us even today could explain how electricity is generated, how a landline telephone connects us to somebody many miles away, or even the workings of an internal combustion engine? And as for the financial structures that keep the amps flowing and the automobiles rolling, they are as opaque as they have always been.

True, the business models of Google and Facebook, which provide free services to millions of users while getting their income from selling adverts, are not the same as railroads and oil companies, which made their money by selling goods and services to the people who used them. But they’re not utterly different from magazine publishers whose real profits came from advertisers, not the cover price.

In fact, the more I read, the more I felt that Big Tech is not as different from past industries as it would like us to believe, or as Foroohar appears to have believed until her recent disillusionment.

Interviewing the former Google design ethicist Tristan Harris, she tells us: “He saw something that most people ‘on the inside’ didn’t: that we had reached a tipping point in which the interests of the tech giants and the customers they supposedly served were no longer aligned.”

I have two problems with this revelation. First, the users of free services are not the customers. Remember the aphorism — if you’re not paying for the product, you are the product. We users of free internet platforms, apps and email are served only in the sense that a buffet is served: on a platter, pre-sorted for easier targeting by advertisers. We should never have expected our interests to be the priority of these platforms. Dinner tables are not designed for the comfort and happiness of the sandwiches.

And second, even from the point of view of the real customers, the ones paying to advertise to us or to access data about us, who would be naïve enough to assume that the customers’ interests are aligned with the interests of the sellers? True, most companies want satisfied customers who will keep coming back, but one side wants to turn a profit and the other to bag a bargain.

“Their goals are not your goals,” says Harris, perhaps revealing more about the unworldliness of tech company insiders than about the constant nature of capitalism. Call me a cynical old Marxist, but nobody should be allowed to leave home before understanding that life is a constant process of negotiating conflicts of interest, and that no tech company is going to love you like your mother does.

Which brings us back to ‘2016: the year it all changed’.

Foroohar has already told us some of the history of how technology was designed to keep us online and interacting for as long as possible, using behaviourist theories of human psychology. She has also told us the story of her 10-year-old son, bamboozled into spending a lot of (her) money while playing FIFA Mobile, notching up $947.73 on one credit card bill. “I couldn’t help it!” he tells her, “… the game just kind of took over.”

“He described a kind of brain fog, a trance, in which he simply lost himself.”

So what changed in 2016?

Not the way Big Tech is involved in political campaigning as well as commercial advertising. Foroohar herself notes that politicians have been using data and marketing techniques for years, and that Facebook, Twitter and Google played important roles in the 2012 US presidential election (working for both sides). She could have pointed to UK examples too, as all major British political parties borrowed strategies, software and people from successful US campaigns.

What changed seems to be Foroohar’s own attitude to how Big Tech is used. Methods that Barack Obama and Hillary Clinton used became “dark”, “dirty” and “corrupt” when Donald Trump’s campaign used them. She describes the Facebook investor Roger McNamee as being “shocked, as were most people, by the outcome of Brexit”. Which is a bizarre way to describe a majority vote on a high turnout. We can probably assume that more than 17 million British people were not shocked, though they might have been pleasantly surprised to find their Leave votes on the winning side.

There is much to criticise about the way data and social media platforms are used in political campaigning. Microtargeting, in particular, allows political parties to evade arguing for principles that could unite a majority of voters, by instead sending different messages to different voters, tailoring their policies for niche audiences. More broadly, the reduction of politics to a marketing exercise, using the same tools as those that sell us soap or shiny gadgets, is a travesty of democracy. That’s why a few of us started raising the alarm well before 2016.

But because Foroohar points the finger at technology, instead of politics, her remedies are technical, not political. Ironically, they involve asking those same tech companies to control political messaging, and to take responsibility for political campaigns.

Much as it pains me, I agree with Mark Zuckerberg, who said (in a statement that Foroohar calls “truly stunning”): “I find the notion that people would only vote some way because they were tricked to be almost viscerally offensive.”

Foroohar takes seriously the challenge of Big Tech companies, and the monopolies they hold not just in services but — through data — in relationships with us, the public. She calls for more discussion by democratic governments about what can be done to defend the interests of citizens, rather than narrowly defined consumer rights. There is much to learn and discuss in this book.

But as she looks to the same powerful companies to exert more control over how democracy works, I am reminded that her model for the pernicious influence of gadgets and games designed to bypass our rational selves is her 10-year-old son. Children don’t (so far) have the vote, precisely because we don’t expect them to have the necessary powers of judgment and self-control. We also don’t let them buy houses or drive cars, sign contracts or have their own credit cards (which is how Foroohar ended up with that $900 bill).

But we are not children. If we let ourselves be bamboozled into voting for something we don’t believe, that’s our responsibility. If politicians seek to address our fears and resentments instead of our hopes and our capacity to weigh up arguments, it’s those politicians who should be held to account, not the platforms they use to find and target us.

And we should know better than to blindly trust a big company to do what’s best for society, just because its services are free, its founders are cool, and it once had the motto, “Don’t Be Evil”.


Timandra Harkness presents the BBC Radio 4 series, FutureProofing and How To Disagree. Her book, Big Data: Does Size Matter? is published by Bloomsbury Sigma.
