A Real Chance for OTT?

On Friday the FCC released an NPRM in Docket FCC 14-210 that asks a host of questions about allowing Internet content providers to be treated as cable companies. The NPRM contains a very thorough discussion of all of the rights and obligations of being a cable company, and anybody who doesn’t understand the regulation of cable companies can get a quick education just by reading the NPRM.

It’s obvious that by raising the issue the FCC is in favor of promoting more competition for cable TV. This is something that the public clearly wants. But the FCC has to walk a fine line with this issue. If they make it too easy for online content providers, they might accelerate the collapse of the traditional cable TV business. I know many would applaud that, but there are a lot of homes that can’t get cable over the Internet and are not situated to get it from a satellite. On the other hand, if they make it too hard to qualify to deliver content online, then not many companies will try and the FCC will have accomplished little.

One might think it’s an easy question to answer, until you read the NPRM. There are some very tricky issues for the FCC to wrestle with:

  • For example, should somebody who only wants to deliver a package of a few channels be able to buy them? (Cable companies can’t do that).
  • Should they require an Internet provider to carry the major network channels like cable companies must do, and if so, would they be required to carry the channels in every market and have to swing deals with hundreds or even thousands of stations?
  • Can an Internet provider that only wants to deliver content on a delayed basis, like Netflix, buy any programming it wants?
  • Can a content provider like Disney offer a package of programming online that only includes content they own?
  • Do online providers have to provide services like closed captioning (for the deaf) and video description (for the blind)?
  • Would ad-based online companies have to comply with the rules about the loudness of commercials?
  • Does an online provider have to notify customers of things like weather alerts or other emergency announcements?
  • Can the FCC require content providers to negotiate with possibly thousands of new online market entrants? Even today many content providers send smaller providers to somebody like the National Cable Television Cooperative to get content. Would this mean that NCTC would have to accept online providers into the Coop?
  • Would online providers have the same restrictions against making exclusive deals with MDU owners?
  • What do they do about the more arcane rules such as cable cards, inside wiring and signal leakage?
  • Can a company with no business presence in the US become a US cable company since they have access to customers through the Internet?

I think it’s pretty obvious that the FCC is going to do something to allow online competition. But they are starting with a regulatory framework that was written specifically with coaxial networks in mind and that has many rules that don’t make sense for an Internet provider.

I think there are a lot of people who would become cord cutters if they could buy smaller packages of the programming they want online. I know I would personally be very happy with a package of Netflix, Amazon Prime, ESPN and the Big 10 Network. But I think a lot of people are going to be disappointed when they find out that online cable competition is not going to be the same thing as a la carte programming, where subscribers can choose only the channels they want to buy. It might be that online packages cost as much as the ones from the cable companies.

Once a company qualifies as an online cable company they are going to be saddled with many of the same rules that apply to cable companies. And they are going to be in an industry where the balance of power has swung very much toward the content providers. For example, it’s common today that a cable company that wants to buy one channel from one of the big eight content providers has to take virtually every channel that provider owns.

There is also an issue faced by many customers that is not addressed in the NPRM. It’s a very common trend these days for cable companies to require at least some bundling in order to buy Internet access. For example, in my town I can only buy Comcast’s slowest Internet speed without having to subscribe to at least some cable channels. But without treating the Internet as a Title II service, it’s doubtful that the FCC can order cable companies to sell all speeds of broadband as a standalone product. This is one of the issues stopping potential cord cutters. So here is yet another issue that is tangled up in the Title II regulatory debate along with net neutrality.

How We Get Our Entertainment

Nielsen just published The Total Audience Report for the third quarter of 2014, in which they look in detail at how Americans are getting their entertainment. They define entertainment broadly and include things like web browsing on computers and cellphones. But they do not count voice calling or texting, which are communications. This report concentrates on the time that people spend on different devices. The report looks at the data in a number of different ways and I found a few of the comparisons to be quite interesting.

The report shows that the way we are accessing entertainment is changing rapidly. Consider the following statistics comparing the number of hours per day that the average person uses various devices, for the third quarters of 2012 and 2014 (in hours and minutes):

                                     3Q 2012      3Q 2014
Watch Live TV                           4:50         4:32
Watch Time Shift TV                     0:24         0:32
Watch DVD / Blu-Ray                     0:09         0:09
Use a Game Console                      0:09         0:12
Use Internet on a Computer              1:04         1:06
Use a Smartphone                        0:53         1:33
Listen to AM/FM Radio                   2:51         2:44
Use a Multimedia Device                 0:00         0:04

The total time spent by the average person doing these activities increased over the two years from 10 hours and 20 minutes to 10 hours and 52 minutes. The time spent watching traditional TV and listening to AM/FM radio dropped while everything else stayed the same or climbed. The most dramatic shift was in the use of smartphones for entertainment, which grew by 75% in just two years.
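That 75% figure can be checked directly against the table above. A quick sketch (the helper name is mine, not Nielsen's) that converts the hours:minutes entries and computes the percentage change:

```python
def to_minutes(hhmm):
    # Convert an "H:MM" table entry into total minutes
    hours, minutes = hhmm.split(":")
    return int(hours) * 60 + int(minutes)

smartphone_2012 = to_minutes("0:53")   # 53 minutes per day
smartphone_2014 = to_minutes("1:33")   # 93 minutes per day

growth = (smartphone_2014 - smartphone_2012) / smartphone_2012 * 100
print(round(growth))   # 75
```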

It’s also interesting to look at these same statistics by age group. Consider the following that shows weekly statistics for the average person in three different age groups (in hours and minutes):

                                     18 – 24      35 – 49        65+
Watch Live TV                          17:34        29:41      47:13
Watch Time Shift TV                     1:43         3:40       3:19
Watch DVD / Blu-Ray                     0:46         1:08       0:37
Use a Game Console                      3:35         1:03       0:07
Use Internet on a Computer              4:54         7:22       2:48
Watch Video on Internet                 1:46         1:48       0:26
Use an App on a Smartphone              9:40         9:39       1:16
Watch Video on a Smartphone             0:29         0:14       0:00
Listen to AM/FM Radio                  10:30        13:48      12:06
Use a Multimedia Device                 0:38         0:30       0:13

This shows a dramatic difference by age in watching traditional TV. The younger you are, the less TV you watch. Young people in the 18-24 age group watch 63% less live TV than those over 65, while 35-49 year olds watch 37% less. It’s the dramatic decrease in TV viewing by younger viewers that has the TV industry worried. This is certainly going to mean a major shift in advertising dollars away from TV, something that has recently become noticeable. And this same trend of caring less about TV might be what breaks the traditional cable model rather than cord cutters. Young people still watch TV, but a lot less than older generations.
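Those percentages fall straight out of the live-TV row of the table. A quick verification (the helper name is mine):

```python
def to_minutes(hhmm):
    # Convert an "H:MM" table entry into total minutes
    hours, minutes = hhmm.split(":")
    return int(hours) * 60 + int(minutes)

live_tv = {"18-24": "17:34", "35-49": "29:41", "65+": "47:13"}
base = to_minutes(live_tv["65+"])   # 2,833 minutes per week

for group in ("18-24", "35-49"):
    pct_less = (1 - to_minutes(live_tv[group]) / base) * 100
    print(group, round(pct_less))   # 18-24 63, then 35-49 37
```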

There is also a huge difference between generations in terms of total hours spent using these devices. The 18–24 year-olds spend 51 hours and 20 minutes per week, those 35-49 spend 68 hours and 20 minutes, and those over 65 spend 68 hours and 6 minutes.

People under age 50 have made a dramatic shift to using their smartphones for entertainment, be that playing games, browsing the web or shopping. Both the 18-24 year olds and the 35-49 year olds use their smartphones over 9 hours per week. Interestingly, I have read a lot of articles talking about how smartphone video usage is growing rapidly and will eventually swamp other kinds of viewing, but these numbers don’t support that contention. They show that even those in the 18-24 group are watching video on their smartphones less than a half-hour per week on average. Certainly usage of smartphones in general is way up, but they still only represent a very tiny sliver of the market for watching video.

These charts also reminded me how much people still listen to AM/FM radio. I listen to Sirius XM radio in my car since I am a talk radio junkie, and I haven’t listened to regular radio in years. But these numbers show that all age groups are still listening to the radio more than 10 hours per week.

The Unregulated Texting Market

Comcast recently started texting me about my cable bill. And like most things they do, they didn’t quite get it right. They sent me three texts telling me I was being billed, and when they sent me two more telling me I had paid, I discontinued the texting service. Since my bill is identical month to month and I always pay on time, I thought getting five texts was annoying.

I would contrast this to the texting service that I have had with AT&T wireless for many years. They send me a text when my bill is ready to review and they notify me a second time when they have billed against my credit card. I think in the dozen years I have used them they have only sent me a few other texts, such as asking me to rate customer service after I visited their store. I am sure that if I didn’t pay my bill on time, AT&T would text me more to prompt me to pay. But overall I am satisfied with the AT&T texting service. It’s not intrusive and it keeps me adequately informed.

Comcast is a bit late to this game and many other carriers and other types of companies already use texting to connect to customers. When I moved to Florida I found that a lot of businesses here use text messages. For example, I bought some furniture from Haverty’s who texted me throughout the delivery process. They let me know when my furniture was delivered to their warehouse, and they texted me several times to coordinate delivery. Using texts they were able to pin down the time of my delivery to about an hour. I think a lot of people would be happy if Comcast technicians could do the same thing.

So texting can be a great tool when used correctly. A lot of people don’t want to talk to customer service reps and a two-way texting service provides a great alternative. With AT&T I can do such things as make queries about my bill and I don’t have to call or be at a computer.

But there is a darker side to texting, because the large wireless carriers control the market very tightly. SMS texting as we know it was introduced on cellphones in the late 1990s. But, like the Internet, texting is not covered by Title II regulation, and so there are very few FCC rules that apply to it. The FCC has a few rules, such as mandating that texting can’t interfere with voice calling, but otherwise the product is largely controlled by the big carriers like AT&T and Verizon.

Since Comcast is not a wireless carrier they must buy texts wholesale from one of these large wireless carriers. Interestingly, those carriers are quite strict about how texting is used. For example, they limit the number of times per month that texting can be used to send a sales message to a given customer. I assume that the carriers are careful about this because they don’t want a lot of customer complaints at the FCC, which might result in becoming regulated by Title II.

The big carriers have a good reason to be cautious, because they make a fortune on texting. It costs almost nothing to send a text, as in a very tiny fraction of a penny (with many zeros before the first digit). The bandwidth used for a text message is tiny, and the data path being used has to be there anyway, since it is a control channel for some of the functions of cellular calls. The texts the carriers have been selling for years at ten cents apiece have to be among the most obscenely profitable products in the world.
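To put rough numbers on that margin, here is some back-of-the-envelope arithmetic. The per-message cost below is my own illustrative assumption (the text only says "a tiny fraction of a penny"), not a figure from any carrier:

```python
retail_price = 0.10      # the classic ten-cents-per-text a la carte rate
assumed_cost = 0.00001   # hypothetical carrier cost per SMS, for illustration only

markup = retail_price / assumed_cost
print(f"markup: {markup:,.0f}x")   # 10,000x under these assumptions
```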

But the carriers often go further than just limiting the number of texts. For instance, there have been instances in the past where the big carriers blocked texts. One well-known case was when Verizon blocked text messages coming from NARAL Pro-Choice America. Verizon thought the content of the texts was too controversial and graphic, and blocked the group from texting. In an unregulated world Verizon is free to establish any rules they want for the texting service they sell, and so they are free to block NARAL. But I find it disturbing when Verizon gets into the censorship business while using spectrum they got from the government.

This is a good example of what might happen to the Internet without any net neutrality rules. In the texting world the carriers have become judge, jury and executioner and they control texting with an iron fist. One can imagine over time that the major ISPs could do the same thing to the Internet.

Regulation by the carriers has a positive side. Verizon is actually more likely than the FCC to quickly slap Comcast’s wrist if they get carried away with the number of texts they send to a given customer. But do we really want a large company like Verizon deciding what can and cannot be done in the texting world?

Texting is directly analogous to the regulation of the Internet. Today we have no net neutrality rules since the last set are in limbo. The Internet is being controlled right now by the large carriers. I think the only thing stopping the carriers from making deals for Internet fast lanes, or worse, is that they are afraid the FCC will use that as an excuse to implement Title II regulation. But if the day comes when the carriers stop worrying about that threat, then we only have to look at the texting market to see what carrier regulation looks like. It’s not particularly pretty.

Is 10 Mbps Really Broadband?

On December 11 the FCC released an order in Docket No 10-90 that raised the definition of broadband for rural landline connections that can receive funding from the Universal Service Fund. The new baseline definition of broadband is 10 Mbps download and 1 Mbps upload, which replaces the old definition of 4 Mbps download and 1 Mbps upload. In today’s world, is 10 Mbps really broadband?

The FCC came to this number based upon tables they included in the Tenth Broadband Progress Notice of Inquiry released last August. The FCC suggested the following as representative of the broadband usage today in different sizes of homes:

                            Light Use         Moderate Use          Heavy Use
One User                   1 – 2 Mbps           1 – 2 Mbps        6 – 15 Mbps
Two Users                  1 – 2 Mbps           1 – 2 Mbps        6 – 15 Mbps
Three Users                1 – 2 Mbps          1 – 15 Mbps    More than 15 Mbps
Four Users                1 – 15 Mbps          6 – 15 Mbps    More than 15 Mbps

The first thing that is obvious is that the FCC didn’t set the new standard high enough to satisfy households with three or four users. I know that in my household with three users we often look something like the following in the evening:

1 user watching an HD movie                 5.0 Mbps
1 user watching an SD movie                 3.0 Mbps
Web browsing                                0.5 Mbps
Cloud storage                               1.0 Mbps
Background (syncing email, etc.)            0.4 Mbps
Total                                       9.9 Mbps

But we can use more than that. For instance, we might be watching three HD videos at the same time while still doing the background stuff, using over 16 Mbps, as the FCC suggests.
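The two scenarios can be tallied with simple arithmetic, using the stream figures from the household list above:

```python
# Typical evening: figures from the household list above (Mbps)
typical = {
    "HD movie": 5.0,
    "SD movie": 3.0,
    "web browsing": 0.5,
    "cloud storage": 1.0,
    "background sync": 0.4,
}
print(round(sum(typical.values()), 1))   # 9.9 Mbps, right at the new 10 Mbps floor

# Heavier evening: three simultaneous HD streams plus the same overhead
heavy = 3 * 5.0 + 0.5 + 1.0 + 0.4
print(round(heavy, 1))                   # 16.9 Mbps, already past the new standard
```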

Clearly the old metric of 4 Mbps that was adopted in 2011 is now too low. But I think the new standard is already too low for today’s usage and it will probably be three years or more before this is considered again.

This new definition is going to be used in the upcoming reverse auction for Universal Service Fund support. Carriers can ask for a monthly subsidy from that fund to help offset the construction of broadband facilities that will deliver the 10 Mbps speeds. That is not a lump sum grant, but instead a payment per month over 5 – 7 years that helps pay for the new investment over time.

So what kind of landline technology can deliver this much speed? Clearly fiber can do it. Cable companies with their hybrid fiber coaxial plant can do it. And DSL can do it on good copper up to about 7,000 feet from the central office. The problem in rural areas is that the copper is often not in good shape. Plus we know that both AT&T and Verizon want to ditch copper and are unlikely to take any funding to expand copper capability. The only other way that DSL can deliver this speed any significant distance is to locate the DSLAMs (DSL hubs) in the field. But that means building more fiber.

The way I understand this change, it only applies to companies looking for a subsidy out of the Universal Service Fund. This change would be a lot more impressive if it was also the new definition of bandwidth for all purposes including the National Broadband Map. If that map was accurate, then changing the minimum definition of broadband to 10 Mbps would mean that many millions of rural homes would suddenly be classified as having no broadband. But that map is full of inaccuracies because the speeds that are ‘available’ to customers are self-reported by carriers which often exaggerate the actual speeds they can deliver.

I know there are a lot of rural small towns where the real speeds of either DSL or cable modems today are under 6 Mbps, and sometimes only a few Mbps. These areas are mostly counted today as having broadband, and under the new definition they would no longer have it. But making that change is a public hot potato, and no FCC staffer or politician wants to say that fewer Americans have broadband.

Anything that brings more broadband to rural areas is good, but this increase just feels inadequate. If CAF funds are used to build DSL that can barely deliver 10 Mbps, then households getting the speed upgrades will find their connections too slow again before the ink has dried on the paperwork. Statistics show that household bandwidth consumption is doubling every three years, and so in six short years a household that needs 10 Mbps today is going to need 40 Mbps.
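Compounding at a doubling every three years, here is a sketch of where today's 10 Mbps requirement lands (the growth rate is the statistic cited above, applied naively):

```python
def needed_mbps(today_mbps, years, doubling_period=3):
    # Demand doubles every `doubling_period` years
    return today_mbps * 2 ** (years / doubling_period)

for years in (0, 3, 6, 9):
    print(years, needed_mbps(10, years))   # 10, 20, 40, 80 Mbps
```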

In the recent experimental grants that were just awarded by the FCC, the broadband speeds required to get funding varied between 25 Mbps and 100 Mbps. Those kinds of speeds are enough broadband to provide a little future-proofing. It is a mistake to give anybody federal money to build a new network that can only deliver 10 Mbps. Any town that gets such an investment will get very temporary relief but will then fall behind the curve in a few years when the rest of the country has 100 Mbps or faster service.

The FCC’s Plate is Full

I don’t think I can remember a time when the FCC had more major open dockets that could impact small carriers. Let’s look at some of the things that are still hanging open:

Net Neutrality. This is the granddaddy of all FCC dockets, if for no other reason than the number of responses filed in the docket. The network neutrality docket asks the basic question of whether there is any legal mechanism the FCC can use to ensure that the Internet remains open. The public debate on the issue has concentrated on whether there should be Internet fast lanes, meaning that some companies could buy priority access to customers. Of course, the flip side of that question is whether most of the Internet could be made slower in favor of a handful of large companies willing to pay a premium price to ISPs to be faster.

The issue has become political and there are polarized positions on opposite sides of the topic. The large players in the industry have also lined up in predictable ways with the giant cable companies and telcos against any form of regulation on broadband and almost everybody else on the opposite side of the fence.

Municipal Broadband. Petitions filed by Chattanooga TN and Wilson NC prompted the FCC to investigate whether it should overturn the various state restrictions against municipal broadband. There are roughly twenty states that have some sort of restriction against municipalities either entering the business or operating as a retail provider of services. In some states there is an outright ban against any form of municipal broadband competition. In others, municipalities can build networks but can only provide wholesale access to those networks.

This issue is a classic case of pitting states’ rights against the ability of a federal agency to preempt them. The FCC has overturned numerous state laws in the past and certainly has that ability in terms of telecom law. But in most past cases the FCC overturned rules established by state commissions, and here it would be overturning laws created by state legislatures. There are a number of states that say they will sue over the issue, as well as some members of Congress who are vehemently against overturning state laws.

IP Transition. The IP transition can have huge repercussions for LECs and CLECs. At issue is the replacement of the traditional TDM network with an all-IP network. From a technical perspective this transition is very straightforward, and the carrier world is already in the process of implementing IP connections in the voice network.

But there is a long list of carrier compensation issues tied deeply to TDM network rules that must be dealt with. For example, one of the primary principles that helps make CLECs competitive is that they can choose to meet incumbent networks at any technically feasible point of their choice. The RBOCs view the IP transition as a way to change this balance, and they want CLECs to pay to bring all voice traffic to them.

And rural consumers have a huge stake in this docket since the large telcos see this as an opportunity to ditch customers on rural copper. AT&T, for example, has made it clear that they would like to cut the copper to millions of rural customers.

Mergers. The FCC is processing two large mergers, between Comcast and Time Warner and between AT&T and DirecTV. The Comcast merger is the one with the most practical market consequences since it merges the two largest incumbent cable companies. The cable industry already suffers from the lowest customer satisfaction among all industries, and the two companies are near the bottom of the pack within the industry.

So customers are worried that the merger will lead to even worse service. And competitors worry that the mega-company that would result from these two mergers would have too much market power. FCC Chairman Tom Wheeler has publicly expressed some concern about whether this merger is good for the industry, so it doesn’t sound like a slam dunk.

Internet TV. The FCC is looking at whether it should regulate Internet TV. For example, should a channel line-up broadcast over the Internet have to follow the same rules as a broadcast over a cable network? This ruling is going to have a huge influence over how small companies deliver cable TV.

Everything Else. In addition to these big issues the FCC has a lot of other open dockets. Some of them are relatively small, such as the docket that looks at whether the FCC should regulate robocalls. But some cover large issues, such as the docket that is examining how the FCC sells wireless spectrum.

New Technology – December 2014

Here are some of the interesting new technologies I’ve run across in recent weeks:

Faster Data Speeds. Researchers at Aalborg University, MIT and Caltech have developed a new mathematically-based technique that can boost Internet data speeds up to 10 times. In a nutshell they code data packets and embed them within an equation. The equation can be solved when all of the packets are received at the other end.

While this sounds complicated, it is vastly faster than the current TCP/IP standard that is used to transmit packets. With TCP/IP, once a data file begins to be transmitted the packets must be both sent and received in order, and they use the same data path over the Internet. If a packet is bad or gets lost, the TCP/IP process slows down trying to find the missing packet. But under the new technique, different packets can take different paths over the Internet and it doesn’t matter if they are received in the right order. They are reordered as the equation is solved.
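The article doesn't spell out the math, which is a form of network coding. As a toy illustration only (real systems use random linear coding over larger fields, not this simple XOR parity), here is the core idea that coded combinations let a receiver tolerate loss and reordering:

```python
def xor_bytes(a, b):
    # Combine two equal-length packets into a coded packet
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"packet-A"
p2 = b"packet-B"
coded = xor_bytes(p1, p2)   # a third, "equation" packet sent alongside

# Suppose p1 is lost in transit: any two of the three transmissions
# are enough to solve for the missing one, in any arrival order.
recovered = xor_bytes(p2, coded)
print(recovered)   # b'packet-A'
```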

In prototype trials this sped up data transmissions by between 5 and 10 times. And transmissions are inherently safer because all of the packets don’t take the same path, making it a lot harder to intercept them. This technology can apply to any data transmission network. This is the kind of change that could be a fundamental breakthrough, because we have been using TCP/IP for decades and everything is geared to use it. But this has promise to become the new data transmission standard.

Any Surface Can be an Antenna. Scientists at Southeast University in Nanjing China have developed a meta-material that can turn any hard surface into an antenna. They do this by embedding tiny U-shaped metallic components in the surface. These little Us act like what is called a Luneburg lens. Normal lenses are made out of one material and refract light in a consistent way. But a Luneburg lens is made up of multiple materials and can bend the light in multiple ways. For example, these materials can be used to focus on a point that is off to the side of the lens (something normal lenses can’t do) or they can radiate all incoming radiation in the same direction.

These meta-material surfaces can be designed to act as an antenna, meaning that almost any surface could become an antenna without having to have an external dish or receiver. Perhaps even more interesting, these same meta-materials can be used to scatter radiation which could make fighter jets invisible to radar.

Another Step Towards Photonic Chips. Researchers at Stanford have developed an optical link that uses silicon strips to bend light at right angles. This adds a 3D aspect to the chip topography, which will help accommodate the speeds needed by future faster computers. The strips can be reconfigured on the fly to use different light wavelengths, making it possible to change the nature of the computer as needed. This is one of the many steps needed to create a purely photonic computer chip.

Cooling With Magnets. Scientists in Canada and Bulgaria have developed a way to produce cooling using magnetic fields. This works by removing ferromagnetic materials from magnetic fields, which causes them to cool down. They have found several substances that are efficient in heat transfer. Further, they are using water as the heat transfer fluid, eliminating harmful hydrofluorocarbons. This can be used for refrigerators or air conditioners without the coils and pipes, by just rotating the cooling element in a magnetic field.

Synthetic Gasoline out of Water. The German company Sunfire GmbH has developed a process that can make synthetic fuel from water and carbon dioxide. The underlying technology has been around for a long time and uses a method called the Fischer-Tropsch process. But the company has found a way to make the process far more efficient. The fuel that is produced has a high energy efficiency of 50%, similar to diesel fuel, compared to a much lower efficiency for gasoline of between 14% and 30%. And the company thinks it can get the efficiency up to 70%.

The interesting thing about the technology is that it is carbon neutral, since it takes the carbon dioxide needed to create the fuel out of the atmosphere, as compared to pulling it out of the ground. There are also numerous benefits from having a more efficient fuel. With this technology we could keep our gasoline cars without having to rely on the petroleum industry. It could help to take the politics out of oil and could let us cut back on the amount of petroleum we need to refine.

The Dark Side of Web Advertising

Yesterday I talked about the general way that Internet ads function. But today I want to look at one of the darker aspects of web advertising by looking at how ads spread malware.

Cisco’s Annual Security Report for 2013 provided some pretty amazing statistics about Internet advertising:

  • They said that the highest concentration of online security threats is not found on pornography, pharmaceutical or gambling sites, but rather that the most danger today comes from major search engines, retail web pages and social media outlets.
  • They said that online shopping sites are 21 times more likely, and search engines 27 times more likely, to deliver a malicious piece of software than a counterfeit software site.
  • But no threat compares to online advertising, and Internet ads are 182 times more likely to give you a virus than searching the web for porn. (Of course, they didn’t say how the intrepid Cisco researchers made the comparison to porn.)

Probably the major culprit of malware in advertising is a practice called real-time bidding. When you go to load a web page that uses real-time bidding, an ad company like AppNexus (or many others) asks for bids for placing ads on your page. The solicitation gives a quick profile of who you are in terms of age, demographics, geography, etc. The highest bidder then gets the ad space, and this all happens in a flash. The problem with this kind of system is that nobody has time to monitor the ads that are placed, and so malicious advertisers gain access to you by bidding the highest. And they don’t have to bid much. It takes only a very tiny fraction of a penny to get an ad placed in front of one specific user.
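A real exchange runs this across thousands of bidders in milliseconds; the function below is only a schematic of the auction step (the names and bid amounts are mine), which shows why a malicious bidder willing to pay a sliver more simply wins the slot:

```python
def run_auction(bids):
    """bids: {bidder_name: price_per_impression}. Highest bid wins;
    many exchanges charge the runner-up's price (a second-price auction)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, top_bid = ranked[0]
    price_paid = ranked[1][1] if len(ranked) > 1 else top_bid
    return winner, price_paid

bids = {
    "retailer": 0.0012,
    "car_brand": 0.0010,
    "malicious_ad": 0.0013,   # outbids the legitimate advertisers by a hair
}
print(run_auction(bids))   # ('malicious_ad', 0.0012)
```

Nothing in this loop inspects the ad creative itself, which is the monitoring gap the text describes.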

The malicious ads don’t look malicious and are usually disguised to look like an ad for some normal company. But the purpose of the malicious ad is to put a piece of code on your computer. The bad news these days is that you don’t have to click on the ad to get the bad software – the act of opening the web page is often enough to activate it.

I run a malware checker regularly and I am amazed at how many pieces of malicious software I get regularly. It is not unusual for my computer to have picked up a hundred pieces of malware within three days after having scrubbed it. I don’t shop much on-line, but I read a lot of articles and I assume that is the source of most of my malware.

According to my malware software, most of the bad things I pick up are adware, which they define as code that gathers and transmits data about me to the world. These days adware is generally something a little more complex than a cookie. Cookies are somewhat passive files that sit on your machine to tell somebody later that you have already been to a certain web site, or something similar. Think of adware as cookies-plus, in that it gathers specific data and either stores it for later retrieval or, in the worst cases, sends it out to the world.

I’d say 99% of what I get is adware, with only the occasional nastier item such as a virus or some other truly malicious piece of code. But think about what I am getting. I am inadvertently downloading a hundred pieces of adware within just a few days, each of which is looking for specific facts about me and reporting back to whoever placed it. I am sure that mostly they are tracking the web sites I’ve visited in order to build up a more detailed profile about me. But these little pieces of malware can pick up almost anything else, from bank account numbers to passwords.

I think we all understand that half of what is on the web these days is designed to build a profile for each of us. But I don’t think most people realize how intrusive this effort has become. They are not building a profile by slowly studying your web usage. They are spying on you directly to know everything you do. It’s a bit scary when the most dangerous place on the web is a search engine or a major news site that has ads.

Yesterday I talked about ad blocking and perhaps this is what is going to save us from this rash of malicious ads and adware. Certainly if somebody will block all ads to my computer then I can’t be receiving ads with malware. But I would be just as happy if somebody could deliver ads to my machine that are certifiably safe. It doesn’t take a lot of effort for an ad company to test an ad first to make sure it doesn’t leave bad code behind. But that can’t be done in a process where an ad space is auctioned and filled in milliseconds. This gives the bad guys a really cheap way to get their ads to anybody they want.
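The kind of pre-flight test mentioned above does not have to be exotic. Here is a toy sketch of one: scan an ad creative for disallowed active content before serving it. A real scanner would be far more thorough (sandboxed rendering, URL reputation checks, and so on); the patterns here are purely illustrative:

```python
# Toy sketch of a pre-serve safety check on an ad creative.
# The pattern list is illustrative, not a complete malvertising filter.

import re

DISALLOWED = [
    re.compile(r"<script\b", re.IGNORECASE),    # embedded JavaScript
    re.compile(r"<iframe\b", re.IGNORECASE),    # hidden framed content
    re.compile(r"javascript:", re.IGNORECASE),  # script URLs inside links
]

def creative_is_safe(creative_html):
    """Return True if the creative matches none of the disallowed patterns."""
    return not any(p.search(creative_html) for p in DISALLOWED)

print(creative_is_safe("<img src='shoes.png'>"))               # True
print(creative_is_safe("<script src='badcode.js'></script>"))  # False
```

A check like this takes microseconds per creative; the obstacle in real-time bidding is not the cost of the test but that nobody in the millisecond auction chain is obligated to run it.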

So I think Google is onto something with their product that can block all ads. But as I described yesterday, Google is not the last company in the chain between a web site and a user, so I am guessing that even with Google’s ad blocking, some ads and malware are still introduced after Google has wiped out the first ones. Your ISP is the last entity to touch the data stream coming to your house and thus has the final chance to get rid of malware. I think ISPs might be missing the opportunity to offer better security to their customers by either blocking ads or by making sure that ads are safe.

Who Owns Internet Ad Space?

Google made a very interesting announcement a few weeks ago that led me to find out more about the ad space on web sites. Google announced that for $2 per month they would block all ads on web sites for a customer as long as they browse through the Chrome browser.

I find this fascinating because it means that Google thinks that they have the ability to block an ad, even when they are not the one to have placed the ad in the first place. Google sells a lot of ads, and so it makes sense that they can block ads that they have placed on a web page. But when they say they can block all ads it also means that they think they have the ability to block ads placed by somebody else.

Just to be clear about what I mean by ads, look at this web page. At the top is a banner ad. At the top right of the story is an ad. And across the bottom of the article are four ads. After loading this web site multiple times I noticed that the ads changed.

It turns out that there are two kinds of ads on a web page: fixed ads and remnant ads. Fixed ads are placed there by the web site owner or somebody they partner with to advertise for them. Fixed ads are embedded into the web page and can only be changed by the website owner. The other kind are remnant ads. These are coded in such a way as to be available to outsiders, and anybody that has access to a website before it reaches a customer can change what is in the remnant ad space.
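The fixed-versus-remnant distinction can be made concrete with a toy sketch. The markup and slot names below are invented for illustration, but the idea is that the page ships with the fixed ad baked in and an open placeholder that whoever handles the page last gets to fill:

```python
# Toy sketch of a remnant ad slot. The page carries a placeholder that
# any party handling the page before the user can fill (or re-fill).
# Domain names and the placeholder token are invented for illustration.

PAGE = """
<html><body>
  <h1>Article</h1>
  <!-- fixed ad: placed by the site owner, baked into the markup -->
  <img src="https://site.example/house-ad.png">
  <!-- remnant slot: left open for whoever touches the page last -->
  <div class="remnant-slot">REMNANT_AD_HERE</div>
</body></html>
"""

def fill_remnant(page_html, ad_markup):
    """Swap the open remnant slot for a given ad creative."""
    return page_html.replace("REMNANT_AD_HERE", ad_markup)

# An ad network fills the slot first...
page = fill_remnant(PAGE, "<img src='https://adnet.example/ad1.png'>")
# ...but a later party in the chain (a browser, a portal, an ISP) could
# swap in its own creative the same way before the page is rendered.
```

Note that the fixed ad survives untouched no matter who handles the page; only the remnant slot is up for grabs, which is why so many different companies can end up supplying the ads you actually see.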

And as you would expect, these remnant ad spaces get changed all of the time. There are a lot of companies that sell advertising into the remnant ad space including Google (DoubleClick), Yahoo, Amazon, Facebook, AOL, AppNexus, Openx, Adroll, RightMedia and dECN. It was very easy for me to spot remnant ads in the recent election season, because I swear that every web page I looked at here in Florida had a political ad for Rick Scott who was running for reelection as Governor. So somebody was being paid in Florida to put those ads onto Florida computers.

The first question this raised for me is: who owns this ad space? The web page example is from the TechCrunch web site. TechCrunch chose to make the ads open to the public and I assume it gets revenues from at least some of the parties that use that space, which is its motivation to use remnant ad space. Google thinks they have a right to go in and block whatever is on the remnant ad space on that page, so they clearly believe it is theirs to grab. I know that some of the larger ISPs like cable companies are also in the advertising business, through partners, and I wouldn’t be surprised if it was Comcast that gave me all of the Rick Scott ads.

I was shown a recent legal opinion by one of the companies that advertises in the remnant space, who was gracious enough to share it with me as long as I don’t publish it. The opinion basically says that nobody owns the remnant ad space: the act of a web site owner making it available to the public means just that, and it can be used by anybody who somehow has access to the website before it reaches a customer. That generally is going to mean some company that is part of the chain between a web site and the customer. Obviously the web site owner can hire somebody to place ads in the remnant space. If you reach the web site through a browser then the browser owner can place the ad in there. If you get to a web site through a link on another web site like Yahoo News then they can place ads there. And your ISP also would have access to this ad space.

I really like the Google product that blocks ads. I think there are plenty of customers who would love to avoid all of those ads. Further, blocking ads means a faster Internet experience for a customer. I know there are web sites I go to that have multiple videos automatically running, which seems like an extravagant use of my bandwidth. I have a 50 Mbps Internet connection and there are still web sites that load very slowly due to all of the extra videos that have been layered into the ad spaces. I also learned that remnant ads are one of the most common sources today of adware and malware, and I will talk about that more in tomorrow’s blog.

A History of Net Neutrality

These days it seems like everybody has an opinion about net neutrality. Ever since Arpanet was opened to the public in 1981 we have had almost the same debate we are having today. So today I thought I would look back at some of the key history in the net neutrality debate.

The first key event that could be called the beginning of the net neutrality debate was the publication of a paper entitled End-to-End Arguments in System Design by three computer scientists, Jerome Saltzer, David Reed and David Clark. For the real nerds among us I’ve included a link to that paper. This paper was written for a conference and was not intended as a scholarly piece, and yet it shaped the thinking of the early public Internet.

In the paper the authors said that the only logical way to design a network that had limited resources and that had to serve a large number of users with widely different interests was to have a network that performed logical operations on the edges, rather than the core. What they meant by this was that the core of the Internet should consist only of fast but dumb pipes and that any manipulation of data, and the paper focused on error correction as the example, should be done at or near the edge of the network with the last mile ISP or the user.

This paper had a big influence on the way the Internet was operated, and for many years the Internet operated in a way consistent with it. Everything that was done on the Internet was done near the edge. For instance, the servers for large services like CompuServe or AOL were on the edge. The functions that ISPs used to receive and reconstruct files were on the edge. And end user software was contained in computers on the edge. In the middle were a handful of large carriers that transmitted data from hub to hub.

As the general public got introduced to the Internet, the idea arose that the Internet ought to somehow be regulated. People who used the Internet liked the wide open feel of it and were worried that commercial uses of the Internet would change the nature and experience for everybody. During the 1990s we started seeing things like early versions of VPNs where large corporate data was given priority over other data. There was talk of creating priority bits for real time events like voice calls and video. And so the discussion began on whether the government ought to intervene and regulate the Internet in some fashion.

In 1999 Harvard law professor Lawrence Lessig published a book, Code and Other Laws of Cyberspace. This was a scholarly work that explored the topic of Internet regulation. Lessig said that the end-to-end principle was one of the most important reasons that the Internet had produced growth and innovation and that a free and open Internet ought to be maintained. Lessig argued that there was a role for government, which was to maintain the end-to-end principle. He thought that without government regulation of some sort, commercial interests would chip away at the freedom and function of the Internet until it would lose the characteristics that make it so beneficial to society.

He used the word ‘code’ as a surrogate for software, meaning that whoever controls the software of the Internet can control what happens on it. He thought, rightfully so, that either commercial or government code could eventually interfere with the operation of the Internet. Today it’s obvious that both kinds of control are going on. Entire countries have been carved away from the open Internet by their governments, and other countries like Russia are considering doing the same. US carriers want to create Internet fast lanes and the ones in Europe have already done so. And we find ourselves being spied upon by governments and by commercial entities who either record everything we do or who plant spyware on our computers.

Tim Wu, a law professor at the University of Virginia, built on the ideas in Lessig’s book and published an article in 2002, A Proposal for Network Neutrality. Wu argued in favor of the same end-to-end principle and said that an open Internet creates a Darwinian competition among every conceivable use of the Internet, and that only the best uses would survive. He said that network neutrality (he coined the phrase) was necessary to make sure that there was no bias against any use of the Internet.

Wu understood that some network bias was unavoidable, such as giving priority to voice packets so that voice could be transmitted over the Internet. But he thought that there should be some sort of defined dividing line between permissible bias and impermissible bias. And that dividing line, almost by definition, has to be drawn by regulators.
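The sort of permissible bias Wu had in mind can be sketched with a toy scheduler: latency-sensitive voice packets jump ahead of bulk data, but nothing about the content itself is inspected or blocked. The packet format and priority values here are invented for illustration:

```python
# Minimal sketch of class-based packet prioritization. The priority
# values and "packets" are invented for illustration only.

import heapq

VOICE, DATA = 0, 1  # lower number = higher priority

class PriorityScheduler:
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker keeps arrival order within a class

    def enqueue(self, priority, payload):
        heapq.heappush(self._queue, (priority, self._seq, payload))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._queue)[2]

sched = PriorityScheduler()
sched.enqueue(DATA, "web page chunk 1")
sched.enqueue(VOICE, "voice frame 1")
sched.enqueue(DATA, "web page chunk 2")
sched.enqueue(VOICE, "voice frame 2")

# Voice frames go out first even though data arrived earlier;
# within each class, arrival order is preserved.
order = [sched.dequeue() for _ in range(4)]
print(order)
```

The regulatory question Wu posed is exactly where this stops being benign: prioritizing by traffic class (voice before bulk data) looks permissible, while prioritizing by who paid, or by which company's content is in the payload, is where impermissible bias begins.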

And so today we are still at the same point where Wu left the argument. Sadly, much of the debate about network neutrality has wandered off into political directions and no longer has to do with the way we manage packets. But absent some sort of regulation it seems clear to me that commercial and government use of the Internet will continue to chip away a little at a time until the Internet is a controlled environment, and that any user’s Internet experience is going to be subject to the whims of whoever controls their local part of the Internet.

Watching Networks Die

A few weeks ago I went to Phoenix, and in driving around various neighborhoods I noticed a lot of problems with the copper network. I was out in an outer suburb, in horse country where the lots are large and where most people live down long unpaved lanes. It was a rural area, but a pretty upscale rural area. What I saw was that a number of poles were looking pretty ragged, with some looking ready to fall down in a stiff wind. But what was most noticeable was that a lot of pedestals looked to be in bad shape. Many had been knocked over at some point in the past and were lying on the ground. Some were cracked open, leaving the wiring exposed. But my favorite was a pedestal that was held up against a stop sign using duct tape.

A few months ago I reported on a telephone network on an Indian reservation where the carrier cabinets were hanging wide open exposing the electronics to the elements. And this wasn’t in the dry southwest desert but in the snowy northern plains. That same network had telephone cables draped for long distances over the tops of barbed wire fences.

I also did a Google search and it appears that it has become a common practice to leave cables running across sidewalks. I know when I got cable service at my house they ran the cables across the sidewalk and the yard, but within a week a crew showed up and buried them. But I am reading about cases where telephone or coaxial cable has been left lying across sidewalks for years at a time.

I can’t recall ever hearing of this practice until sometime during the last decade. Before that the cable or telephone companies simply did not string cables over the ground for more than a day or so as part of a new installation.

It certainly is possible for poorly maintained cables to lead to disaster. Years ago when I worked for CP National, one of our customers in rural Nevada was killed after riding into a low-hanging cable while on horseback. That certainly is a rare occurrence, but it is not hard to foresee all sorts of problems arising when cables are left where they can touch people.

I also recently read an article at Stop the Cap that gave pictorial and video evidence of cables that have been draped permanently over backyard fences or left on the ground for long periods of time. That same article talks about how Cleveland has dozens of complaints about telephone wires that have been cut and are dangling to the ground. The City has tried to get AT&T to fix the dangling cables, but it turns out that due to deregulation the City has no legal authority to require the company to clean up its mess.

One can think of many reasons why we are seeing more and more of these kinds of situations. A lot of carriers are now using contract labor that is paid by the installation, which gives them the incentive to take shortcuts to finish jobs quickly. Years ago installations were done by trained employees who worked to good standards and who took the time to make sure that an installation was done correctly.

The issue in Cleveland is probably the result of competition. As competitors bring new service to a home they often just cut the old service drop without caring what happens to it. This seems like something that state Commissions could deal with, even in this day of declining regulation.

But the really bad networks like the ones I saw in Phoenix and the ones on the Indian reservation are due to the total neglect of the copper network by companies that plan to walk away from copper at some point. Both AT&T and Verizon have made it clear to the FCC that they intend to walk away from large swaths of their rural networks within just a few years.

The large telcos have systematically ignored rural areas. They have closed customer service centers and cut back on maintenance staff to the point where an average rural installer is often in charge of a huge geographic area. It often takes more than a week to get a technician to the house when a customer has a problem.

The big telcos don’t neglect all copper, just the rural copper. CenturyLink serves the area where I live and the network looks to be in great shape. This is partly due to the fact that the town I live in was devastated a decade ago by Hurricane Charley and much of the plant has been rebuilt. But this is also an upscale area where CenturyLink is pushing their Prism TV product, which requires a decent copper network. But I don’t have to travel too far inland away from the water in Florida to see older and more neglected networks.

The neglect of rural networks is not new. The large telcos have severely cut back on copper maintenance for years and even decades in rural areas. It was widely reported by people I know in West Virginia that Verizon basically walked away from the rural parts of the state almost 25 years ago when they decided to sell the whole network to somebody else. It took them almost two decades to find a buyer and in the meantime the copper network degraded significantly. I have a nephew there who is a lineman for Frontier, who now owns that network, and he is not sure that what is left can ever be made to work well.

It seems pretty clear that the telcos are going to walk away from copper. And so perhaps it does no real good to complain about the quality of the copper networks they plan to abandon. In just a few years we will instead be talking about a whole lot of rural people who won’t even be able to get dial-tone to access dial-up Internet. Very rural places are just going to have a harder and harder time being connected to the rest of us.