Advertising and Technology

Most industry folks know the name Tim Wu. He's the Columbia professor who coined the phrase 'net neutrality' and who has been an advisor to the FCC on telecom issues. He's written a new book, The Attention Merchants, about the history of advertising, culminating with the advertising we see today on the Internet.

Wu specifically looks at what he calls the attention industry – the part of advertising that works hard to get people's attention, as opposed to the part that produces advertising copy and materials. Wu pegs the start of the attention industry to the New York Sun, a scandal sheet started in 1833 that built circulation by selling papers at a low price and filling them with sensational (and often untrue) content. The Sun was the forerunner of publications like today's National Enquirer and of the many websites that now peddle fake news. But the model worked, and Benjamin Day of the Sun created an industry and made a lot of money selling advertisements.

Wu paints a picture of advertising in terms of its place in the larger society. He observes that advertising has always come in cycles. At times advertising grows too pervasive and annoying, and society then reacts, either by tuning out the ads or by forcing an end to the industry's worst abuses.

Wu traces the history of the attention industry through the years. He looks at the development of billboards and at state-sponsored propaganda machines like those of the British during WW1 and the Germans in WW2. He ends by looking at Google, Facebook, Instagram and others as the latest manifestation of an industry built on capturing people's attention.

The attention industry has changed along with technology, and so Wu's story is as much about technology as about advertising. From the early days of sensational newspapers, the attention industry morphed over the years to adapt to the new technologies of radio, television and now the Internet.

Probably the heyday of advertising was during the 1950s in the US, when as many as two-thirds of the nation tuned in to watch the same shows, like I Love Lucy or The Ed Sullivan Show. Advertisers on those shows caught the attention of the whole nation at the same time. But that huge, uniform market fragmented over time with the advent of cable TV and its multitude of channels.

Today we are carrying advertising to its ultimate degree, with ads aimed at specific individuals. The attention industry is spending heavily on big data, building a profile of each of us that is then sold to specific advertisers.

But we are already seeing the pushback from this effort. At the end of 2016 it was reported that over 70 million Americans were using ad blockers. These ad blockers don't stop all ads, and the advertising industry is working hard to do an end run around them. But it's clear that, as at times in the past, advertisers have gone too far for many people. In the early days of the tabloids there was a lot of advertising for fake health products and other dangerous items, and the government stepped in and stopped the worst of the practices. When TV ads became too pervasive and repetitive, people invested in TiVo and DVRs in order to skip the ads.

And the same is happening with online advertising. I am probably a good example: I rarely notice online advertising anymore. I use an ad blocker to block a lot of it. I refuse to use websites that are too annoying with pop-ups or other ads. And over time I've trained my eyes to simply not notice online ads on web pages and in social media streams. And so advertisers are wasting their money on me, as they are on the many people who have grown immune to the new forms of online ads.

But advertisers wouldn't be going to all this effort if it didn't work. Obviously online advertising is bringing tangible results, or companies wouldn't be moving the majority of their ad spending from other media to the web. Wu's book is a fascinating read that puts today's advertising into perspective – it's mostly the attention industry doing the same things it has always done, wrapped in a new medium. The technology may be new, but this is still the same attention industry that was chasing eyeballs in the 1800s. If nothing else, the book reminds us that the goal of the industry is to get your attention – and that you have a choice to participate or not.

A History of Net Neutrality

These days it seems like everybody has an opinion about net neutrality. But the debate we are having today is almost as old as the public Internet itself. So today I thought I would look back at some of the key history of the net neutrality debate.

The first key event that could be called the beginning of the net neutrality debate was the 1981 publication of a paper entitled End-to-End Arguments in System Design by three computer scientists: Jerome Saltzer, David Reed and David Clark. For the real nerds among us I've included a link to that paper. This paper was written for a conference and was not intended as a scholarly piece, and yet it shaped the thinking behind the early public Internet.

In the paper the authors argued that the only logical way to design a network with limited resources, serving a large number of users with widely different interests, was to perform logical operations at the edges of the network rather than in the core. What they meant was that the core of the Internet should consist only of fast but dumb pipes, and that any manipulation of data (the paper used error correction as its example) should be done at or near the edge of the network, by the last-mile ISP or the user.
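The end-to-end idea can be illustrated with a toy sketch (my own, not from the paper): the sender attaches a checksum, the core blindly forwards bytes, and only the receiving endpoint checks integrity and decides whether a retransmission is needed.

```python
import hashlib

def send(payload: bytes) -> dict:
    # The sender attaches a checksum; the network never looks at it.
    return {"data": payload, "checksum": hashlib.sha256(payload).hexdigest()}

def dumb_core(packet: dict) -> dict:
    # The core is a fast, dumb pipe: it just forwards the packet,
    # performing no inspection and no error correction.
    return packet

def receive(packet: dict) -> bytes:
    # Integrity is verified only at the edge, by the endpoint itself.
    if hashlib.sha256(packet["data"]).hexdigest() != packet["checksum"]:
        raise IOError("corrupted in transit - the endpoint requests a retransmit")
    return packet["data"]

message = send(b"hello, edge")
assert receive(dumb_core(message)) == b"hello, edge"
```

The design choice this captures is that correctness is the endpoints' job; the core can then stay simple and fast, which is exactly the property the paper's argument defends.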

This paper had a big influence on the way the Internet was operated, and for many years the Internet ran in a way consistent with it. Everything on the Internet was done near the edge. For instance, the servers for large services like CompuServe or AOL sat at the edge. The functions that ISPs performed to receive and reconstruct files lived at the edge. And end-user software ran on computers at the edge. In the middle were a handful of large carriers that transmitted data from hub to hub.

As the general public got introduced to the Internet, the idea arose that the Internet ought somehow to be regulated. People who used the Internet liked its wide-open feel and worried that commercial uses would change the nature and experience of it for everybody. During the 1990s we started seeing things like early versions of VPNs, where large corporate data was given priority over other data. There was talk of creating priority bits for real-time traffic like voice calls and video. And so the discussion began on whether the government ought to intervene and regulate the Internet in some fashion.

In 1999 Harvard law professor Lawrence Lessig published the book Code and Other Laws of Cyberspace, a scholarly work that explored the topic of Internet regulation. Lessig said the end-to-end principle was one of the most important reasons the Internet had produced growth and innovation, and that a free and open Internet ought to be maintained. Lessig argued that there was a role for government: to maintain the end-to-end principle. He thought that without government regulation of some sort, commercial interests would chip away at the freedom and function of the Internet until it lost the characteristics that make it so beneficial to society.

He used the word 'code' as a surrogate for software, meaning that whoever controls the software of the Internet can control what happens on it. He thought, rightly, that either commercial or government code could eventually interfere with the operation of the Internet. Today it's obvious that both kinds of control are going on. Entire countries have been carved away from the open Internet by their governments, and others like Russia are considering doing the same. US carriers want to create Internet fast lanes, and ones in Europe have already done so. And we find ourselves being spied upon by governments and by commercial entities that either record everything we do or plant spyware on our computers.

Tim Wu, then a law professor at the University of Virginia, built on the ideas in Lessig's book and in 2002 published an article, A Proposal for Network Neutrality. Wu argued for the same end-to-end principle and said that an open Internet created a Darwinian competition among every conceivable use of the network, with only the best uses surviving. He said network neutrality (a phrase he coined) was necessary to make sure there was no bias against any use of the Internet.

Wu understood that some network bias is unavoidable, such as giving priority to voice packets so that voice calls can work over the Internet. But he thought there should be some sort of defined dividing line between permissible bias and impermissible bias. And that dividing line, almost by definition, has to be drawn by regulators.
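As a toy illustration of what 'permissible bias' looks like in practice (my own sketch, not anything from Wu's paper), a router might drain a priority queue in which latency-sensitive voice traffic is always served before bulk data:

```python
import heapq
import itertools

# Lower number = higher priority; the counter breaks ties so packets
# of the same class keep their first-in, first-out order.
PRIORITY = {"voice": 0, "video": 1, "data": 2}
_seq = itertools.count()
queue = []

def enqueue(kind: str, payload: str) -> None:
    heapq.heappush(queue, (PRIORITY[kind], next(_seq), payload))

def drain():
    # Yield payloads highest-priority first.
    while queue:
        yield heapq.heappop(queue)[2]

enqueue("data", "backup-chunk")
enqueue("voice", "call-frame")
enqueue("video", "stream-frame")
assert list(drain()) == ["call-frame", "stream-frame", "backup-chunk"]
```

Prioritizing by traffic class like this is the kind of bias Wu considered acceptable; the hard regulatory question is where scheduling for performance ends and discrimination against particular uses begins.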

And so today we are still at the same point where Wu left the argument. Sadly, much of the debate about network neutrality has wandered off in political directions and no longer has much to do with the way we manage packets. But absent some sort of regulation, it seems clear to me that commercial and government interests will continue to chip away a little at a time until the Internet is a controlled environment, and any user's Internet experience will be subject to the whims of whoever controls their local part of the Internet.