Tim Wu: ‘The internet is like the classic story of the party that went sour’ | Technology | The Guardian


John Naughton

Tim Wu is a law professor at Columbia University. His specialities include competition, copyright and telecommunications law. So far, so conventional. But Wu is an unconventional academic. For one thing, he ran for the Democratic nomination for lieutenant governor of New York (and won 40% of the popular vote, though not the primary election). For another, he served for a time in the office of New York’s attorney general, specialising in issues involving technology, consumer protection and ensuring fair competition among online companies. “If I have a life mission,” he said once, “it is to fight bullies. I like standing up for the little guy and I think that’s what the state attorney general’s office does.”

As I said, no ordinary academic. But it gets better. Wu is also the guy who coined the phrase “net neutrality”, which has turned out to be a key concept in debates about regulation of the internet. He was for a time a senior adviser to the Federal Trade Commission, America’s main consumer protection agency. And somehow, in the middle of all this activity, he writes books that make a big impact.

In 2010, for example, he published The Master Switch: The Rise and Fall of Information Empires, a sobering history of the great communications technologies of the 20th century – the telephone, movies, broadcast radio and television. In telling the history, Wu perceived a recurring cycle in the evolution of these technologies. Each started out as open, chaotic, diverse and intensely creative; each stimulated utopian visions of the future, but in the end they all wound up “captured” by industrial interests.

The cue for his new book, The Attention Merchants, is an observation the Nobel prize-winning economist Herbert Simon made in 1971. “In an information-rich world,” Simon wrote, “the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

Wu’s book is a history of the attention industry: the enterprises that harvest human attention and sell it to advertisers. This is not quite the same thing as a history of advertising, which is a more ancient business. His story therefore begins not with the paid advertisements that featured in early newspapers and printed pamphlets, but with two separate developments.

The first was Benjamin Day’s scandal sheet, the New York Sun, the first issue of which appeared on 3 September 1833. Day’s big idea was to build circulation fast with sensational content (including what we would now call “fake news”) and a very low cover price. In the process, he established the attention merchant’s basic modus operandi: “draw attention with apparently free stuff and then resell it”, a business model that is still alive and prospering on the contemporary internet.

The second development appeared in, of all places, 1860s Paris, in the form of huge, wall-mounted posters portraying beautiful, half-dressed women cavorting over fields of vibrant colour. They were the brainchild of Jules Chéret, an aspiring artist and former printer, who understood that there are few better ways of gaining male attention than by displaying images of the barely clothed female form. Thus was the billboard ad born.

The Attention Merchants chronicles the attempts that publishers and entrepreneurs have made to capture and resell human attention over nearly two centuries, from Day in 1833 to BuzzFeed, Instagram, Google and Facebook today, with a major detour into state propaganda (Britain during the first world war, Goebbels during the second) along the way.

In large measure, this is a story of communication technologies, starting with print, moving on to broadcast media (radio, television) and winding up with the internet and the technologies it has spawned (email, blogging, search engines, social media). But the striking feature of the book is the way it interweaves this story of technological development with two other strands. The first is an account of how the human subjects whose attention is being sought eventually rebel, giving rise to outbreaks of resistance that sometimes lead to regulatory intervention, but more often to changes of tack by the attention merchants. The second strand is a series of meditations on the cultural implications of the attention merchants’ success.

As in his previous book, Wu identifies periodic cycles in the evolution of an industry for which nothing succeeds like excess – at least for a time. As Google’s executive chairman, Eric Schmidt, once put it in an uncharacteristically unguarded moment: “The ideal is to get right up to the creepy line and not cross it.”

Periodically, though, the line has been crossed. Day’s sensationalist approach, for example, generated an epidemic of patent-medicine fraud (there really was something called snake oil), until it was eventually unhorsed by crusading journalists and scientific research. In the 1950s, American television networks overdosed on quiz shows until revelations that all of the big ones were fixed. And in our own time, the ubiquity of intrusive smartphone ads has led to the ad-blocking revolt that currently threatens to undermine the basic business model of cyberspace.

What is even more sobering, though, is the direction of travel of the long journey on which the industry has taken our culture. In the US, radio brought advertising into people’s homes in a compelling way, especially after the invention of the soap opera. (The situation was different in the UK, because of the BBC.) Then television arrived, adding the mesmeric power of images and in due course creating the “primetime moment” that Goebbels craved, a moment when the entire nation’s attention was fixed on a single broadcast event. Not for nothing were the 1950s a decade of stultifying complacency.

Timothy Leary and the counterculture movement tried to break the hypnotic stranglehold of network TV, but in fact what really killed it was the fragmentation of the national audience with the arrival of cable TV and, later, the internet. Thus did we go from an era of mass audiences in which we were all treated as members of an undifferentiated whole to an age when personalisation technology enables Facebook and Google to profit from ads aimed – allegedly precisely – at a single individual.

The Attention Merchants is a sobering and significant book that takes the long view of technology and in the process escapes the “sociology of the last five minutes” that characterises so much discourse about the subject. Much of what we have to grapple with today is genuinely new. But there also appear to be some eternal verities. One is that there is no such thing as a free lunch. The other is that HL Mencken was right when he observed that no one ever lost money by underestimating the intelligence of the “great masses of the plain people”. Amen.

In The Master Switch, you found that the great communication technologies of the 20th century all went through a cycle: each started out open, chaotic, creative and exciting, but each eventually wound up being controlled by industrial interests. At the end of the book, you said that the great question for the internet was whether the same fate would befall it. My reading of The Attention Merchants is that you now have the answer and it’s yes. Have I got that right?

Yep. I had hoped that the historic cycle could be broken, but the power of fate or economics or whatever has proved irresistible. Everyone thought the web, in particular, would remain more competitive. Yet, as Lenin put it, quantity has a quality all of its own, and the last 10 years have seen the emergence of a class of superpowers driven by old-school scale economics – especially Facebook, Google and Amazon – which have gained control over their respective domains and seem unlikely to be dislodged soon. Generally, as you gaze out across the main internet ecosystem, you see less a competitive marketplace than a succession of globally dominant firms as far as the eye can see, followed by a tired group of companies fighting over the few crumbs left over.

Looking back at the 00s, the great mistake of the web’s idealists was a near-total failure to create institutions designed to preserve that which was good about the web (its openness, its room for a diversity of voices and its earnest amateurism), and to ward off that which was bad (the trolling, the clickbait, the demands of excessive and intrusive advertising, the security breaches). There was too much faith that everything would take care of itself – that “netizens” were different, that the culture of the web was intrinsically better. Unfortunately, that excessive faith in web culture left a void, one that became filled by the lowest forms of human conduct and the basest norms of commerce. It really was just like the classic story of the party that went sour.

The lesson should have been obvious, if you consider important public-spirited institutions like public parks, universities, museums, charities, some parts of the media. None of the best of these maintains a public character by just assuming people will be good or by adopting not-for-profit business models and assuming, arrogantly, that they’d be different somehow. The exception that proves the point is Wikipedia, which did commit itself to a structured non-profit path. Today, I think Wikipedia can hold its head high: it has thrived without advertising or other commercial distortions, while attracting and handling more traffic than nearly any other site on Earth.

Unfortunately, most of the rest, despite plenty of California idealism, either just allowed themselves to self-destruct or just accepted a standard corporate form with its unrelenting demands for constant revenue growth. In so doing, they shed much of their potential to be the kind of truly remarkable institutions that their founders might have wanted.

In some ways, the history of early 00s web idealism reminds me a little of the history of the counterculture of the 1960s. Both clearly had a major cultural impact in their times. But both were overconfident in believing that they’d overcome some of humanity’s worst tendencies. Over the long term, however, it was only those who managed to create some kind of structure to preserve what they believed in that had any lasting impact.

In chronicling the history of advertising you also discern cycles: a new medium appears; entrepreneurs find ways of using it to capture people’s attention, which they then sell to advertisers; the advertising becomes increasingly intrusive and objectionable; eventually there’s a revolt or pushback. At the moment, you see mobile ad-blocking and the success of Netflix-type immersive content as signs of the latest revolt. But you also clearly believe that the cycle will continue. What drives this process?

Simply put: the profit motive. Industries, unlike organisms, have no organic limits on their own growth; they are constantly in search of new markets, or of new ways to exploit old ones more effectively. Having found a way to make money, a firm will persist in trying to make more, even to the point of self-destruction. If the business model is advertising, that means squeezing in ever more ads that are increasingly intrusive, thereby making the product worse. In a normal market, a firm realises it has set its price too high when people stop buying, and it backs off. But with an advertising model, there is a delayed reaction – and then a revolt. One day people start to say: “I’ve had it, I quit.”

There’s a clear direction of travel in your history of advertising – towards more and more intrusion, culminating with the smartphone as the Trojan horse in everyone’s pocket. And you ask a plaintive rhetorical question: “Do we draw any lines between the private and the commercial?” Given where we are now, could we draw such a line? And how would we go about it?

When President Obama threw parties at the White House, his guests were required to check their phones and other devices at the door. So it was a party – no selfies, no tweeting – where everyone was really at the party.

Obama’s example shows how you do it: by reclaiming physical spaces and making them non-commercial. The easiest place to do this is your home. You might have a policy of checking devices at the door, say, or confining their usage to one room in the house. However you do it, the real key is drawing physical lines, not mental lines, in recognition of just how weak our wills truly are. For you will not win trying to fight a running battle with the forces of commerce. You’ll end up like the alcoholic who goes to bars and says to himself: “I’ll be fine with just one drink.”

One of the themes in the story you tell is fragmentation and atomisation: it takes us from mass media/society to the individual locked in his/her own tech bubble. You also have, at one point, a neat throwaway line: “Technology always embodies ideology.” Do you see a connection in this evolutionary story between information technology and neoliberalism?

The internet clearly embodied an ideological reaction to the mass media’s collectivist and nationalist ideology – particularly in broadcasting, whether it was the power of American and British national broadcasting or the propaganda media of Nazi Germany and the Soviet Union. Overtly or implicitly, the goal of national broadcasting was always the fusing of the nation into a more unified whole. The American networks NBC and CBS saw “unifying the nation” as among their goals (conveniently, that also maximised advertising revenue). More darkly, Joseph Goebbels saw the point of broadcasting as achieving Volksgemeinschaft, the people’s community, the elevation of the nation over the individual.

The internet was a reaction to these impulses. Invented in the 70s and 80s, it embodied a mixture of countercultural and libertarian instincts, and with it brought the attractions and dangers of both. But it is a bit more complicated than that, I think, because the web, in particular, always served to elevate not just individuals, but subcultures and groups over the great undifferentiated whole. I think this helps account for a broader fragmentation not just along individual lines, but cultural and political lines as well. While once upon a time one nation tended to consist of a dominant culture and various subcultures, nowadays there is sometimes no real centre, no mainstream left, just a collection of powerful subgroups that command deep allegiances.

Whether that’s better or worse than the old conformist media I leave to the reader to decide. Personally, I think it is much better when it comes to culture, but can be absolutely terrible when it comes to politics.


