How’s Cable Doing?

With all of the talk of cord cutting, cord-shaving and the general demise of the cable industry, I thought it would be useful to take a snapshot of the cable industry at the end of the third quarter of 2014 to see how the industry is doing. Here are some key facts for a number of major cable providers:

Comcast. For the quarter they lost 81,000 TV subscribers compared to losing 127,000 in the 3rd quarter of 2013. Meanwhile they gained 315,000 data customers compared to 297,000 customers a year before. Overall profits were up 4% over the year before. Comcast now has 22.4 million video customers and 21.6 million data customers.

Time Warner Cable. The company lost 184,000 cable subscribers in the third quarter compared to 122,000 in the previous year. But the company did add 92,000 residential data customers for the quarter. Earnings were up 3.6%, driven by cable rate increases and growth in the business services group. The company saw a 9.6% increase in programming costs, driven by a bad deal they made for the programming rights to the LA Dodgers.

Charter Communications. Charter lost 22,000 video customers for the quarter compared to 27,000 a year earlier. They saw data customers increase by 68,000 compared to 46,000 a year ago. Overall profits were up 8% driven by rate increases and data customer gains. Charter finished the quarter with 4.15 million cable customers.

CableVision. The company saw a significant loss of 56,000 cable customers. Profits for the company dropped to $71.5 million for the quarter, down from $294.6 million a year earlier.

Cable One. The company lost 14,000 video subs and ended the quarter with 476,000. The company has not carried Viacom programming since declining to renew it in April of this year.

Suddenlink. The company added 2,200 video customers for the quarter compared to a loss of 3,200 subs the previous year, even though they have dropped Viacom programming. Revenues increased by 6.6% compared to a year ago.

AT&T. U-verse added 216,000 cable customers for the quarter and added 601,000 data customers. The company now has more than 6 million video customers and 12 million data customers. U-verse profits were up 23.8% compared to a year earlier.

Verizon. The company added 114,000 new video customers and 162,000 new data customers for the quarter. The company now has 5.5 million video customers and 6.5 million data customers.

DirecTV. The company saw a decrease of 28,000 customers for the quarter while revenues grew by 6% due to rate increases. The average satellite bill is up to $107.27 per customer per month.

Netflix. Netflix added 1 million US subscribers and 2 million international subscribers for the quarter. They now have 37 million US customers and almost 16 million international ones. But these growth rates were less than their predictions and their stock tumbled 25% on the news.

Amazon Prime. The company does not report number of customers. But their earnings release says they gained significant customers even while increasing their annual fee from $79 to $99.

What does all of this mean? Looking at all of the major players who make quarterly releases (companies like Cox do not), one can see that total video subs are down by maybe a net of 100,000 for the quarter. But cord cutting is growing when you consider that the industry used to routinely grow by 250,000 customers per quarter from new households being built. So it looks like cord cutting is growing by perhaps 1.5 million per year.
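The back-of-the-envelope arithmetic behind that estimate can be sketched out. The quarterly figures below are the rough numbers cited above, not exact industry statistics:

```python
# Back-of-the-envelope sketch of the cord-cutting estimate, using the
# approximate figures cited in the text (not exact industry numbers).
historical_net_adds_per_quarter = 250_000   # what the industry used to gain
actual_net_change_per_quarter = -100_000    # rough net loss this quarter

# The swing between expected growth and the actual result is the
# implied number of cord cutters.
implied_cord_cutting_per_quarter = (
    historical_net_adds_per_quarter - actual_net_change_per_quarter
)
implied_cord_cutting_per_year = implied_cord_cutting_per_quarter * 4

print(implied_cord_cutting_per_year)  # 1400000, i.e. roughly 1.5 million
```

The point of framing it as a swing rather than a raw loss is that households that would have signed up in the past, but never do, count as cord cutters just as much as households that cancel.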

Within these numbers one can’t see the effects of cord shaving. It’s been widely reported that customers are downsizing their cable package as a way to save money. None of these companies report on their mix of types of customers.

Netflix and Amazon Prime continue to grow significantly along with other on-line content providers. It’s been reported that over half of the households in the country pay for at least one of the on-line services and many others watch free content available at Hulu and other sites.

One thing that is obvious is that broadband is still growing for all of the service providers. In fact, Comcast and other traditional cable providers are starting to refer to themselves more as ISPs than as cable companies.

The FCC to Unbundle Fiber?

Chairman Wheeler at the FCC announced last week that he would be bringing two proposals to the FCC meeting on November 21 associated with the IP Transition. The first involves some rules that will ensure that 911 continues to function on an IP network, and there ought to be no controversy with that. But his second idea is going to be very controversial: giving competitors access to RBOC fiber networks in the same manner that they have access today to the copper network.

The Chairman says that he doesn’t want customers, particularly business voice customers, to lose competitive options – and he believes that the unbundled network elements that are in place for copper today have brought competition to that market.

Let me step back and look at this idea at the big-picture level. What the Chairman is proposing is a form of arbitrage. In general, telecom arbitrage comes when regulators force an artificial price on a product or service. In this specific case, the arbitrage would come from having the FCC or state commissions define the price, terms and conditions for a competitor to gain access to a fiber network. Arbitrage is not necessarily good or bad, but if the price is set too low then there is a larger demand for the product than ought to be expected.

The industry does not have a very good history over the last two decades of dealing with arbitrage and the last mile network. There have been three times when FCC-administered arbitrage turned out badly for a lot of the industry and the public. First came the unbundled network elements on copper that the Chairman is now acknowledging – the primary one being the unbundled T1. This was incredibly popular in the late 90s and dozens of huge CLECs were funded to compete in this business. I had an office then in a business high-rise near the DC beltway and I remember a dozen different CLECs knocking on my door trying to sell a bundled T1/data connection.

After that came UNE-P, a virtual unbundling of the network. With UNE-P a competitor didn’t have to collocate to get access to the RBOC copper. Instead they just bought all of the UNE elements and reconstructed a network. Finally, there was resale, which forced the RBOCs to give set discounts on many retail products. Both UNE-P and resale were mostly used to compete for residential customers and some giant companies grew in the space. I remember Talk America, for example, which had well over a million residential customers on resale.

But for the most part all of the companies that leaped into these arbitrage situations failed. I remember well over a dozen UNE CLECs that went public with a few dozen more hoping to do so. Heck, the telecom industry was so juiced in the late 90s that there were even several telecom consulting firms that worked for the large CLECs who tried to go public. But in the end, the arbitrage opportunity became less attractive, as always happens, and all of these companies crashed and burned. The same thing happened with UNE-P and resale and all of the companies that tried to make a business using these arbitrage opportunities ultimately failed.

Arbitrage is rarely permanent, and this makes it almost impossible to build a business plan to take long-term advantage of an arbitrage opportunity. The main reason for this is that the RBOCs are really good at resisting and challenging arbitrage. They file lawsuits and lobby, and within 5-7 years after the start of an arbitrage situation they largely get it killed, or at least weakened to the point of being useless for a competitor.

Now we are looking at a new arbitrage opportunity of allowing competitors to get access to fiber networks. I have dozens of questions about how this might work, because it’s not as obvious on a physical basis how one unbundles a fiber network in the same way that has been done for copper. With copper, in essence, the copper line from a customer is physically redirected to a CLEC. But that is not going to easily work for a fiber connection.

How big this opportunity might be depends upon how the FCC implements it. For example, if they only allow fiber interconnection in places where there had once been a copper UNE connection then this is going to be very limited in scope. But it’s hard to see how they can stop there. After all, CLECs that compete using RBOC copper have always been allowed to grow, and if a competitor can’t ever add a new customer then this form of competition will be nearly worthless.

But if all of the fiber in the RBOC network becomes available to competitors, then we are looking at the possibility of a whole new major push of competition. Competitors have largely been kept off the RBOC fiber network and this opens up huge market possibilities.

My advice to my clients is going to be to approach this kind of opportunity cautiously. History has shown us that AT&T and Verizon will be working to kill this kind of arbitrage from the minute that it’s proposed – and so it’s likely that this will only remain lucrative for a few years before those companies squeeze the ability to use unbundled fiber.

Don’t get me wrong. As a consultant this opens up all sorts of new work for me. But having lived through the last arbitrage trials in the industry, my alarm bells are already going off and I am going to be advising caution. If the FCC tilts the arbitrage opportunity enough in favor of the competitor then there is going to be money to be made, but I will be reminding everybody that whatever the FCC giveth they can also someday take away.

What Makes Cellphone Coverage Vary?

It seems I have been writing about cellphones for a few days, so I thought I would cover a question that I have been asked many times. I travel a lot and it’s not unusual to sit next to somebody and note that the two of you are having a very different cellular experience. One of you may be getting one bar for data and voice while the other might be getting four, sitting only a few feet apart. What explains this difference in cellular performance? I will start with the obvious explanations, but sometimes the differences are due to more subtle issues.

Who is your carrier? Both people might have an iPhone, but if one has Verizon and the other has AT&T the experience is different because both are connected to completely different technologies and totally separate networks. AT&T and T-Mobile use GSM (Global System for Mobile) technology, the technology that is used in most of the rest of the world. But Verizon and Sprint use CDMA (Code Division Multiple Access) technology. These technologies are so different that a handset that is made only for one technology won’t work on the other. This is why you can’t take your Verizon handset to most of the rest of the world when you travel.

Who’s on the nearest tower? I’ve often been driving with somebody and hear them be glad to see an upcoming cell tower because they assume this means they’ll get better coverage. But you can’t assume this because not every carrier is on every cell tower. There are a large number of cell towers in the country. Some of these are owned by the wireless carriers but many are leased. The cellular companies look at available towers and then cobble together the combination of towers that make the most effective and cost-efficient network for them.

This task has gotten harder for the carriers because cellphones now carry data. The original cell tower network with all of the giant towers was created back when cellphones only carried voice. But now that the networks are deploying data and using higher frequencies, it turns out that a more ideal network would place the towers closer together than the traditional locations. This is causing massive reconfigurations of the networks as the carriers try to improve data reception.

Cell sites get busy. Or said another way, any one carrier on a tower might get busy while another carrier might not be busy. As cell sites get busy they do a couple of things to handle the excess traffic. Most carriers still give preference to voice over data, so as more voice calls are added to a network the amount of bandwidth allocated to data is often choked down (but not always). And eventually the tower site refuses to add new customers. But when sites get busy the performance normally degrades.

You might be roaming. Roaming is when a customer is riding a different network than the one to which they subscribe. If you are an AT&T customer and are roaming on a T-Mobile site, you will not get the same priority as a T-Mobile customer. This might mean getting slower data speeds if the site becomes busy, and it could also mean being booted from the site as it becomes full.

Spectrum is not created equal. There is not just one spectrum being used for cellular data. There are already nearly a dozen different slices of spectrum being used and the FCC is going to be auctioning more over the next two years. Every time you move to a different cell site you might be changing the frequency you are using. Carriers not only cobble together a network of the ideal cell sites, but they also play a chess game of deciding which spectrum to deploy at each tower. None of the carriers owns all of the different spectrum available, and the spectrums they own in different cities can be drastically different. This means getting four bars at your home might not give you the same experience as getting four bars when you are traveling.

What your phone allows. Perhaps one of the biggest differences in reception is that each cellphone maker decides what spectrum a given handset is going to receive. It costs a lot of energy, meaning battery time, for a phone to always search on all of the different frequencies. So different handsets allow different frequency bands. This is why LTE coverage differs so widely – there are many sets that don’t even see some of the LTE frequencies. All handsets look for the basic cellular bands, but only the most expensive sets are designed to look for almost everything out there. And as more cellular bands are allowed into the mix this will get more pronounced. Of course, you have to read very deep into the specifications of your phone to understand what it does and does not receive. Good luck asking that question at the cellphone store.

Plain old interference. Every cellular frequency has a different propagation characteristic. If you and the guy next to you are talking on different frequencies then you each will be dealing with a different set of interference. This is one of the reasons that cellular coverage is so wonky in complicated buildings like airports and hospitals. Each cellular frequency is likely to find a different set of problems in a complex environment and one frequency might get killed in a given part of the airport while another is fine. This is why you might find yourself walking around trying to find a signal while people around you are still talking.

The Upcoming AWS Spectrum Auction

The FCC’s auction for new cellular data spectrum will begin on November 13. This is the first big spectrum auction in six years, so it’s worth watching. The spectrum being auctioned is being referred to as AWS or Advanced Wireless Spectrum. There are three separate bands being auctioned that go from 1,695 MHz to 1,710 MHz, from 1,755 MHz to 1,780 MHz and from 2,155 MHz to 2,180 MHz.

The FCC has set a reserve bid for the auction at $10.5 billion. That means that if they don’t receive bids totaling at least that much in the first round, the FCC has the right to cancel the auction. Assuming that price is met, then the normal FCC bidding process will take place and one would expect the auction to go for a few more rounds.

The AWS spectrum is expected to be used almost entirely for data, and both Verizon and AT&T already own some spectrum that sits next to these new blocks. That is going to make it fairly easy for carriers to incorporate the spectrum into handsets. Further, this same spectrum is used in Europe for wireless data, meaning that there are already a wide array of handsets capable of using the spectrum.

Because it’s high frequency, this spectrum is capable of handling a lot of data. However, like other high frequencies it’s not great at penetrating building walls and other obstacles. Contrast this to the next auction that’s on the horizon. In two years the FCC will be auctioning chunks of the 600 MHz spectrum that is being vacated by television stations. This frequency can penetrate into elevators but doesn’t carry as much data per channel as the higher frequencies.
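One rough way to quantify the propagation gap between those bands is the free-space path loss formula, whose loss grows with the logarithm of frequency. The sketch below compares 600 MHz against the 2,155 MHz AWS band over the same path; real-world building and elevator penetration adds further frequency-dependent losses that this simple model doesn’t capture:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare the 600 MHz band with the 2,155 MHz AWS band at 1 km.
loss_600 = free_space_path_loss_db(1.0, 600)
loss_2155 = free_space_path_loss_db(1.0, 2155)

# The higher band loses about 11 dB more over the same path, before even
# counting walls and obstacles, which hit high frequencies harder still.
print(round(loss_2155 - loss_600, 1))  # 11.1
```

Because the distance terms cancel, that roughly 11 dB gap holds at any range, which is one way to see why lower-frequency spectrum gives better reach and penetration while the higher bands compensate by carrying more data per channel.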

As you would expect, the bulk of the spectrum is going to be auctioned to the largest carriers. It is expected that T-Mobile is going to be aggressive in the auction, with AT&T and Verizon also buying a lot of spectrum. Sprint is expected to sit out the auction since they already own a lot of high-frequency bandwidth. The wildcard player is going to be Dish Network, which may go after a lot of this spectrum. Dish has announced plans to offer a fixed data product using wireless spectrum that will also be used to deliver a cable TV line-up. This spectrum would give them more bandwidth for that offering.

The AWS spectrum is not immediately available since the Department of Defense and a few other government agencies still occupy some of the spectrum. It is expected that the bulk of the government usage will be gone in about two years, but these kinds of transitions almost invariably take longer than expected. This means that it’s unlikely that the bandwidth will have much of an impact on wireless data speeds until the two to three year time frame.

The spectrum is being auctioned off by market, and as you would expect this means a wide variance in the interest by the carriers in any given market. In similar auctions in the past some markets went unclaimed, meaning that nobody was willing to pay the FCC’s minimum bid for the market, and if that happens again you can expect a second auction of the leftover markets, certainly the rural ones. This auction does have some incentives for small bidders, and while the big carriers will grab the vast majority of the spectrum you can expect to see smaller companies going after secondary and rural markets.

The auction is expected to be tactical in that each carrier has holes they are trying to fill in certain markets. And the big carriers are keeping the upcoming 600 MHz auction in mind and may hold off on bidding now in markets where they would rather have that spectrum. This makes the auction a big chess game by market. The funny thing is that the carriers know exactly what each other already owns in terms of spectrum, so they know basically what each other is most interested in. But because there are two auctions close together for very different spectrum, nobody is going to know each other’s strategies until the first round of bidding is done. The auction is often finished after the first round for a lot of markets and the following rounds are usually only for the prime markets.

I looked late last week at the amount of data that cellphone users consume. The current statistics show that the average landline connection is using almost 100 times more aggregate data in a month (download and upload combined) than the average cell phone. With that said, Cisco has predicted that the amount of wireless data usage will triple over the next five years, and many analysts think this is conservative.

It’s obvious that cellphone data is never going to rival landline data usage or even come close. I chuckle whenever I see somebody say that wireless data will win the bandwidth battle. There just is not enough wireless spectrum for that to ever happen. While cellular data usage is now doubling every five years, landline data is doubling every three years, and one has to carry that trend out twenty years to see that the average landline home connection might be using nearly a terabyte of data each month.
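As a sketch of that extrapolation, doubling every three years over twenty years works out to roughly a 100-fold increase. The starting usage figure below is an assumed illustrative number, not one from the text:

```python
# Sketch of the 20-year extrapolation: landline usage doubling every
# three years. Only the doubling period comes from the text; the
# starting monthly usage is an assumed illustrative figure.
years = 20
doubling_period_years = 3
growth_multiple = 2 ** (years / doubling_period_years)  # roughly 100-fold

assumed_start_gb_per_month = 10  # hypothetical starting point
future_gb_per_month = assumed_start_gb_per_month * growth_multiple

print(round(growth_multiple))      # 102 -- roughly a 100-fold increase
print(round(future_gb_per_month))  # about 1,000 GB, on the order of a terabyte
```

The same arithmetic shows why wireless can’t catch up: the cellular doubling period of five years only yields about a 16-fold increase over the same twenty years.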

But we like using data on our cellphones. The wireless carriers have trained us to be very cautious in that usage because of the severe data caps and the horrendously high price for exceeding your data cap. But even with those restrictions, the wireless carriers need more spectrum and are expected to make this an interesting auction.

Comments to the FCC on Data Speeds

I’ve been reading through the comments in FCC Docket 14-126, which asks whether the FCC should increase the definition of broadband. The comments are sticking mostly to the expected script. It seems that all of the large incumbents think the current definition of 4 Mbps download and 1 Mbps upload is just fine. And just about everybody else thinks broadband should be something faster. In the Docket the FCC suggested that a low-use home today needs 4 Mbps download, a moderate-use home needs 7.9 Mbps and a high-use home needs 10 Mbps.

AT&T says that the current definition of 4 Mbps is adequate to define ‘advanced telecommunications capability’ per Section 706 of the FCC rules. They argue that customers don’t use as much bandwidth as the FCC is suggesting. For example, they argue that most of their customers who pay for 12 Mbps service rarely hit a maximum of 10 Mbps during a typical month. They argue that the FCC is trying to change the definition of broadband by only looking at what the heaviest users of broadband are using.

AT&T goes on to say that they and other companies like Google and the large cable companies are now deploying gigabit-capable technology, and so the FCC has no reason to worry about data speeds since the industry will take care of the problem by increasing speeds. I obviously disagree with AT&T on this argument. They are using the red herring of what is happening in places like Austin, Texas and extrapolating that to mean that the whole country is seeing huge broadband upgrades. As I have written many times, small-town America is not getting any of the new broadband investment that AT&T touts in their comments. And rural America is still often stuck with dial-up, satellite or cellphone data. Further, AT&T has been actively saying elsewhere that they want to kick millions of customers off copper and get rid of their DSL option.

Verizon took a different tack in their filing. They also don’t want the definition increased from 4 Mbps. They first argue that they have made a lot of investments in broadband, and they certainly have done so with their FiOS fiber network in cities and suburbs. But they then go on to argue that cellular data ought to be counted as broadband and that they are offering a great cellular alternative to people. They cite that 97.5% of people in the country have access to LTE with broadband speeds greater than 10 Mbps download and that this should be counted as broadband.

There are a few problems with their claim. First, Akamai collects the speeds from millions of cellular data downloads and they report that the average cellular data speed actually achieved in the country is 4.4 Mbps, not Verizon’s theoretical 10 Mbps. And cellular data is bursty, meaning that it’s designed to be fastest for the first few seconds of a download and then normally slows down. More interestingly, a few months back Comcast cited Verizon and AT&T cellular data as evidence that Comcast has robust broadband competition. Verizon Wireless’s CEO countered Comcast’s claim and said, “LTE certainly can compete with broadband, but if you look at the physics and the engineering of it, we don’t see LTE being as efficient as fiber coming into the home.” Finally, everybody is aware that cellular data plans include tiny data caps of only a few cumulative gigabytes of download per month, and cellphone users know that they must park on WiFi from landline data sources as much as possible to make their cellphones usable for video and other heavy data usage.

Verizon goes on to cite the National Broadband Map several times as justification that there is already great broadband coverage in the US today. They say that 99% of households already have access to broadband according to the map. I have written several times about the massive inaccuracies in that map due to the fact that all of the data in it is self-reported by the carriers.

The big cable companies did not make comments in the docket, but there is a filing from the National Cable & Telecommunications Association on behalf of all of them. NCTA says that the definition of broadband should not be increased. Their major argument is that the FCC is not measuring broadband deployment correctly and should measure it every year and report within six months of such measurements. They also say that the FCC should take more consideration of the availability of cellular and satellite data, which they say are broadband. I haven’t commented on satellite data for a while. Some parts of the country can now get a satellite connection advertised with a maximum download speed of 15 Mbps. It’s been reported to be a little slower than that, but like cellular data a satellite connection has tiny data caps that make it nearly impossible for a family with a satellite connection to watch video.

In a speech last week FCC Chairman Tom Wheeler said that 10 Mbps is too low to be considered broadband and that federal funds like the Connect America Fund should not be funding the construction of any broadband with speeds lower than that. It’s going to be interesting to see where the FCC comes out on this. Because if they raise the threshold too much then a whole lot of households are going to be declared to no longer have true broadband, which is pretty much the truth.

Retiring the Copper Networks

Attached is a copy of FCC Docket DA-14-1272 in which Verizon is asking to discontinue copper service in the towns of Lynnfield, MA, Farmingdale, NJ, Belle Harbor, NY, Orchard Park, NY, Hummelstown, PA and Ocean View, VA. In this docket the FCC is asking for public comments before it will consider the request.

In these particular towns Verizon is claiming that almost all of the households are already served by fiber and they are seeking to move the remaining households to fiber so they can disconnect and discontinue the use of the copper networks there. And perhaps if there are only five percent of lines left on copper in these towns that might be a reasonable request by Verizon. But this does prompt me to talk about the whole idea of discontinuing older copper networks, because both Verizon and AT&T have said that they would like to eliminate most of their copper by 2020.

In the case of Verizon it’s a tall order to get rid of all copper because they still have 4.9 million customers on copper with 5.5 million customers that have been moved to fiber. AT&T has a much larger problem since they don’t use fiber to serve residential customers except in a few rare cases. But both big carriers have made it a priority to get people off copper.

Many customers are unhappy with the idea of losing their copper and many have complained that they are getting a lot of pressure from the big telcos to drop their copper. There are numerous Verizon customers who say they are contacted monthly to get off the copper and they feel like they are being harassed. There are a few different issues to consider when talking about this topic.

Not everybody that loses copper will get fiber. Of the big telcos only Verizon even owns a residential fiber network. But even the Verizon FiOS network doesn’t go everywhere and they are not expanding the fiber network to new neighborhoods. For customers that live where there is no fiber, the goal is to move them to a DSL-based service or, in the case of AT&T, to cellular phones.

Interestingly, when a telco moves a customer from POTs (Plain Old Telephone Service) on copper to VoIP on DSL, the telco will keep using the identical old copper wires. They will have changed the technology being used from analog to digital. But more importantly, in most cases they will have changed the customer from being on a regulated product to an unregulated one. And that is one of the primary thrusts to get people off POTs.

POTs service is fully covered by a slew of regulations that are aimed at protecting consumers, such as carrier-of-last-resort obligations that require telcos to connect anybody who asks for service. But in most states those same protections don’t apply to VoIP or fiber service. The most important protection that customers lose with VoIP is capped prices, meaning that the prices for VoIP or fiber service could be raised at any time by any amount. And the carrier-of-last-resort obligations have real-life impact even for existing customers. If a customer is late paying their bill on a VoIP network, Verizon would be within their rights to refuse to connect them back to service when they pay.

There are customers who want to stay on POTs on copper for various reasons. One reason is that POTs phones are powered by the copper network and so they keep working when the power goes out. There are still parts of the country where the power goes out regularly or where there is a reasonable expectation of hurricanes or ice storms. For example, houses that still had copper could make calls for up to a week after Hurricane Sandy.

Another reason to keep copper is for security and medical monitoring. The copper POTs network has always been very reliable. But it is much more common for households to lose Internet service. Once a phone is converted to VoIP, then any time the Internet is down for a customer then their security and medical monitoring services that use those phones don’t work.

The FCC is going to be flooded with requests like this one to disconnect people from POTs. Certainly the copper networks are getting old. There might be merit for disconnecting copper in towns that are almost entirely fiber and where the customer losing POTs will move to fiber. In most cases fiber seems to be as reliable as copper, although it cannot power the phones when the electricity goes out.

But it seems somewhat ludicrous for the FCC to approve shuttling people from POTs to DSL while still using the same old copper lines. That is clearly being done as a way to avoid regulation and customer protections and not for the carrier to save money. And it is clearly not in the customer’s best interest to move customers from POTs to cellular.

The FCC and Peering

As the politics of net neutrality keep heating up, Senator Pat Leahy and Representative Doris Matsui introduced the Online Competition and Consumer Choice Act of 2014.

This bill requires the FCC to forbid paid prioritization of data. But then Senator Leahy was quoted in several media outlets talking about how the bill would stop things like the recent peering deal between Netflix and Comcast. I’ve read the proposed bill and it doesn’t seem to ban those kinds of peering arrangements. His comments point out that there is still a lot of confusion between paid prioritization (Internet fast lanes) and peering (interconnection between large carriers). The bill basically prohibits ISPs from creating Internet fast lanes or from disadvantaging customers through commercial arrangements in the last mile.

The recent deals between Netflix and Comcast, and Netflix and Verizon are examples of peering arrangements, and up to now the FCC has not found any fault with these kinds of arrangements. The FCC is currently reviewing a number of industry peering agreements as part of investigating the issue. These particular peering arrangements might look suspicious due to their timing during this net neutrality debate, but similar peering arrangements have been around since the advent of the Internet.

Peering first started as connection agreements between tier 1 providers. These are the companies that own most of the long haul fiber networks that comprise the Internet backbone. In this country that includes companies today like Level3, Cogent, Verizon and AT&T. And around the world it includes companies you may not have heard of, like TeliaSonera and Tata. The tier 1 providers carry the bulk of the Internet traffic, and peering was necessary to create the Internet since these large carriers need to be connected to each other.

Most of the peering arrangements between the tier 1 carriers have been transit-free, or what is often referred to as bill-and-keep. The traffic between the major carriers tends to balance out in terms of originating and terminating volumes, and in such cases it doesn’t make a lot of sense for two carriers to bill each other for swapping similar amounts of data traffic.
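The bill-and-keep logic above can be sketched in a few lines of code. This is only an illustration: the 2:1 imbalance threshold and the traffic figures are hypothetical, not any real carrier's published peering policy.

```python
def settlement_model(sent_tb: float, received_tb: float, max_ratio: float = 2.0) -> str:
    """Classify a peering relationship by traffic balance.

    sent_tb / received_tb: monthly traffic exchanged, in terabytes.
    max_ratio: the largest imbalance tolerated before settlement-free
    (bill-and-keep) peering gives way to paid transit. The 2:1 figure
    is illustrative; real peering policies vary by carrier.
    """
    if sent_tb == 0 or received_tb == 0:
        return "paid transit"  # wholly one-directional traffic
    ratio = max(sent_tb, received_tb) / min(sent_tb, received_tb)
    return "settlement-free" if ratio <= max_ratio else "paid transit"

# Two tier 1 backbones with roughly balanced traffic swap bits for free...
print(settlement_model(900, 800))    # settlement-free
# ...while a heavily one-directional video source ends up paying to connect.
print(settlement_model(5000, 50))    # paid transit
```

The point of the sketch is simply that payment tracks imbalance: when the two sides impose similar costs on each other, billing each other is pointless overhead.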

But over time peering arrangements were made between the tier 1 carriers and tier 2 providers, which include the large ISPs and telcos. Peering was generally done in these cases to make the network more efficient. It makes more sense to exchange traffic between an ISP and somebody like Level3 at a few places rather than at hundreds of places. It’s always been typical for these kinds of peering arrangements to include a fee for the tier 2 carrier, something that is often referred to as a transit fee.

There is no industry standard arrangement for interconnection between tier 1 and tier 2 providers, because tier 2 providers come in every configuration imaginable. Some of them own significant fiber assets of their own. Others, like Netflix, have a mountain of one-directional content and own almost zero network. And so tier 2 providers scramble to find the best commercial arrangement they can in the marketplace. One thing that is almost universal is that tier 2 providers pay something to connect to the Internet. There is no standard level of payment and transit is a very fluid market, but payment generally recognizes the relative level of mutual benefit. If the traffic between two parties is balanced then the payments might be small or even zero. If one party causes a lot of costs for the other then payments typically reflect that imbalance.

Netflix has complained about paying Comcast and Verizon. But those ISPs wanted payments from Netflix since the traffic from Netflix is large and totally one-directional. Comcast or Verizon needs to construct a lot of facilities in order to accept the Netflix traffic and they don’t get any offsetting benefit of being able to send traffic back to Netflix on the same connection.

In economic terms, on a national scale the peering market is referred to as an n-dimensional market, meaning that a large tier 2 provider has the ability to negotiate with multiple parties to achieve the same result. For example, Verizon has a lot of options for moving data from the east to the west coast. But eventually the Internet becomes local, and that is where the cost and the contention arises. As Internet traffic enters a local metropolitan market it begins to hit choke points where the traffic can overwhelm the local facilities and cause congestion. The payments that Comcast or Verizon want from Netflix are to build the facilities needed for getting Netflix movie traffic to and through these local hubs and chokepoints.

Peering arrangements like this make sense. I find it hard to believe that the FCC is going to get too deeply involved in peering arrangements. It’s an incredibly dynamic market and carriers are constantly rearranging the network as they find better prices or more efficient network arrangements. If there is any one place where the market works it is between the handful of large carriers that handle the majority of the Internet traffic. Most of the bad things that can happen to customers are going to happen in the last mile network, and that is where net neutrality should properly be focused.

And why the picture of the kitten? I work at home and at my very local part of the network this is the kind of peering that I often get.

Net Neutrality Comments at the FCC

The FCC’s Net Neutrality docket got over 1 million comments, most from ordinary Americans who are worried about the large ISPs and web companies colluding to restrict or hijack their Internet experience. I read through some of these comments and people are universally worried about companies like Comcast and Google getting together to limit what they can do on the web. The public does not want to see a network provider have the ability to slow down their Internet experience or to dictate which web sites they can use.

Obviously I didn’t read all of the comments in this docket and one has to wonder if anybody at the FCC can or will read it all. That’s a tall task. But I did look at the comments of the larger carriers and web companies to see what they have to say. There were no surprises with the big ISPs on one side of the issue and almost everybody else on the other.

AT&T is in favor of no additional regulation of the Internet, meaning they would be free to prioritize traffic if they wish. This could obviously make them a lot of money. AT&T says that if there must be regulation, they would prefer it come through Section 706 of the Telecommunications Act, the authority the FCC has used to address blocking of Internet traffic. AT&T is totally against a Title II classification of the Internet as a common carrier business. And not surprisingly, AT&T is not in favor of regulating data for wireless carriers.

Comcast is also against Title II classification as a common carrier and prefers no regulation at all. Comcast says that they are already a good web citizen and don’t need to be regulated, but that even if they were regulated there would be loopholes that would allow carriers like them to discriminate. This seems like an odd argument to make from a company that wants approval for a giant merger. Comcast says that if there is regulation, it should also apply to public Wi-Fi and mobile broadband.

Verizon had the longest comments I saw. Verizon believes the best solution is the least amount of regulation possible. They think the market will control carriers because customers won’t accept being throttled. Verizon says the real threat to the Internet comes from companies like Google, Netflix and Amazon. And obviously they are very much against Title II regulation.

On the other side of the argument is, well, just about everybody else except a few other cable companies. There were a few filings that represented groups of Internet-based companies. The Information Technology Industry Council represented companies like Apple, Facebook, Google, Intel, Microsoft, Yahoo and many others. They argue that the FCC needs to put in rules to protect consumers, but also to protect both small and large web-based companies. They are not in favor of Title II regulation but instead would like to see something similar to the rules that were vacated by the courts.

The Internet Association represents Amazon, eBay, Expedia, Facebook, Google, LinkedIn, Twitter, Netflix, Yahoo, Yelp and many others. As you might have noticed, Google and Yahoo are in both industry groups. This group also doesn’t support full Title II regulation, but it thinks that the FCC needs to find ways to stop the ISPs from discriminating and wants the FCC to support application-agnostic network management. They want the same rules to apply to wireless carriers.

Netflix is at the core of a current battle over network neutrality. Netflix is about the only big tech company I could find in favor of Title II regulation. They think anything short of full Title II reclassification will just be asking for another court battle that the FCC will eventually lose.

One has to wonder if the volume of public comments means anything. It’s clear where the public stands on this issue and people are afraid that the Internet is going to change to their detriment. They already see the ongoing battle between Verizon and Netflix and they don’t want to see a future where their web experience is dependent upon how ISPs and content providers are getting along. When they buy an amount of bandwidth from an ISP they want whatever fits into that bandwidth to work.

What’s the Truth About Netflix?

Clearly a lot of customers around the country are having trouble with Netflix. The latest round of finger pointing is going on between Verizon, Netflix and some intermediate transport providers.

Netflix uses adaptive streaming for its standard-quality video, which requires only about 2 Mbps at the customer end to get the quality that Netflix intends. HD videos require more bandwidth, but customers are complaining about standard video. A Netflix download requires a burst of data up front so that the movie can buffer ahead of the viewer, but after that it stays steady at the 2 Mbps rate, and the download even pauses when the customer pauses. It’s getting hard to find an urban ISP that doesn’t deliver at least that much speed, so one would assume that any customer who subscribes to at least 2 Mbps should not be having trouble watching Netflix.
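To put the 2 Mbps figure in perspective, here is the back-of-the-envelope arithmetic for how much data a steady stream at that rate actually consumes. The calculation ignores the initial buffering burst, which only adds a few extra seconds of video.

```python
def stream_gb_per_hour(mbps: float) -> float:
    """Data consumed by a steady stream, in gigabytes per hour.

    Uses decimal units (1 GB = 1000 MB). mbps is megabits per second,
    so divide by 8 to get megabytes per second.
    """
    megabytes_per_second = mbps / 8
    return megabytes_per_second * 3600 / 1000

# A standard-quality Netflix stream at ~2 Mbps:
print(round(stream_gb_per_hour(2.0), 2))   # 0.9 GB per hour
```

Less than a gigabyte per hour is a trickle for a connection sold at 25 or 75 Mbps, which is what makes the customer complaints so puzzling.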

But they are. On its blog, Verizon talks about a customer who has a 75 Mbps product and who was not getting good Netflix quality. Verizon says it checked every bit of its own network for possible choke points and found none. For those not familiar with how networks operate, a choke point is any place in a network where the amount of data traffic passing through could exceed the capacity at that point. In most networks there are several potential choke points between a customer and the outside world. Verizon swears that there is nothing in its network for this particular customer that would cause the slowdown, and claims that the only thing running slow is Netflix.
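The choke-point idea boils down to a simple rule: the speed a customer actually sees is bounded by the narrowest link anywhere along the path, not by the last-mile speed they purchased. A minimal sketch, with entirely hypothetical link capacities:

```python
def effective_throughput(link_capacities_mbps):
    """A flow can move no faster than the slowest link it crosses,
    so the narrowest link on the path is the choke point."""
    return min(link_capacities_mbps)

# Hypothetical path: a 75 Mbps FiOS last mile, roomy metro and backbone
# links, but a congested transit interconnect yielding ~4 Mbps per flow.
path = [75.0, 1000.0, 10000.0, 4.0]
print(effective_throughput(path))   # 4.0 -- the interconnect is the choke point
```

This is why a 75 Mbps customer can still see a stuttering 2 Mbps stream: if any single hop between Netflix and the home is congested, the purchased last-mile speed is irrelevant.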

This is not to say that there are no overloaded choke points anywhere in Verizon’s networks. It’s a big company, and with the growth of demand for data they are bound to have choke points pop up – every network does. But one would think that their all-fiber FiOS network would have few choke points, so it’s fairly easy to believe Verizon in this instance.

Verizon goes on to say that the problem with this Los Angeles customer is either Netflix or the transit providers who are carrying Netflix traffic to Verizon. Verizon is not the only one who thinks it’s the transit interface between the networks. Here is a long article from Peter Sevcik of NetForecast Inc. that shows what happened to the Netflix traffic at numerous carriers both before and after Netflix started peering directly with Comcast. This data shows that traffic got better for everybody else immediately upon the Comcast transition, which certainly indicates that the problem is somewhere in the transit between Netflix and the ISPs.

Verizon says the problem is that Netflix, or the intermediate carriers, don’t want to buy enough bandwidth to eliminate choke points. Sounds like a reasonable explanation for the troubles, right? But then Dave Schaeffer, the CEO of Cogent, came forward and pointed the finger back at Verizon. He agrees that the problem is in the interface between Cogent and Verizon, but claims this is Verizon’s fault since Verizon won’t turn up additional ports to relieve the traffic pressure.

So now we are back to square one. The problem is clearly in the interface between Verizon and carriers like Cogent, but they are blaming each other publicly, and none of us outside of this squabble are going to know the truth. Very likely this is a tug-of-war over money, which would fall in line with complaints made by Level3, which says that Verizon is holding traffic hostage to extract more money from the transit carriers.

The FCC is looking into this and it will be interesting to see what they find. It wouldn’t be surprising if there is a little blame on both sides, which is often the case when network issues devolve into money issues. Carriers don’t always act altruistically and sometimes these kinds of fights almost seem personal at the higher levels of the respective companies. The shame from a network perspective is that a handful of good technicians could solve this problem in a few hours. But in this case even the technicians at Verizon and the transit carriers might not know the truth about the situation.

Is There a Web Video Crisis – Part IV and Final

In the previous three installments of this blog I looked at the issues behind the demands of Comcast and Verizon to charge content providers for creating an Internet ‘fast lane’. In particular I have focused on the recent actions between Comcast and Netflix. In everything I have read about this issue I never saw any specific reason cited for why Comcast thought it needed the extra payments from Netflix, and this blog series has been about looking for such reasons.

In the earlier blogs I looked at the various components of the Comcast network and my conclusion is that end-user customer fees ought to be covering the cost of the wires, or at least that is how all of the companies smaller than Comcast and Verizon see the issue. I then looked at the issue of preparing the network for peak video usage during simulcasts. Again, my conclusion is that this is a function that is a normal part of making your network operational and doesn’t seem like a reason to charge a premium price to get what is supposed to be there. Finally, I looked at peering, data centers and the network of routers and switches. My conclusion there was that peering generally saves money for Comcast and Verizon and that their savings from peering are far larger than their costs.

In the months leading up to the announcement that the two parties had reached a deal, I had seen numerous complaints from customers on both Comcast and Verizon who said that their Netflix was not working well. And there were numerous articles like this one asking if Comcast and Verizon were throttling Netflix. There was clearly something fishy going on, and it was clear that both Verizon and Comcast were somehow slowing down Netflix bits as compared to other bits. The complaints were all about Netflix traffic, and we didn’t see the same complaints about Amazon Prime or other video providers. And I heard no complaints anywhere about the speeds on the TV Everywhere products offered directly by Comcast and Verizon. I know I was watching Game of Thrones online in HD through my Comcast subscription and it always worked perfectly.

Then, when there was an announcement, it was made to sound like Netflix was the one requesting premium access from Comcast. The Verizon deal was done much more quietly and there was no similar insinuation there. But almost instantly after Comcast struck the deal with Netflix, the speeds popped back up to former levels.

One has to ask if Netflix really got premium treatment of their bits or if Comcast simply removed whatever impediments were slowing them down. I will be the first to admit that I, like almost everybody else, am an outsider, and we really don’t know what the two parties discussed as part of this deal. But when I look at the facts that are known to me, what I see is that Comcast and Verizon were flexing their monopoly power and slowing Netflix down to extract payment.

There is no doubt that the Netflix traffic creates costs for these two companies. Video traffic has been growing rapidly on the Internet, and Netflix is the largest single provider of video. But I step back and have to ask the basic question of what end-user fees for Internet access are supposed to cover. A customer pays for a connection of a given speed, and it seems to me that these companies have promised the customer that they can use that speed. There is the caveat that Comcast has a data cap – a topic for another blog – but as long as a customer stays under that cap they ought to always get the speed they have purchased. It shouldn’t matter whether that customer chooses to use that speed and capacity to watch Netflix or to read silly telecom blogs – they have paid for a certain level of performance.
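The data-cap caveat is worth quantifying. A rough sketch of how many hours of steady streaming a monthly cap allows; the 300 GB figure is illustrative of the caps Comcast was trialing at the time, not a quoted plan term:

```python
def viewing_hours_under_cap(cap_gb: float, stream_mbps: float) -> float:
    """Hours of steady streaming a monthly data cap permits.

    cap_gb: monthly cap in gigabytes (decimal units, 1 GB = 1000 MB).
    stream_mbps: the stream's bit rate in megabits per second.
    """
    gb_per_hour = stream_mbps / 8 * 3600 / 1000   # Mbps -> GB/hour
    return cap_gb / gb_per_hour

# A hypothetical 300 GB cap at Netflix's ~2 Mbps standard quality:
print(round(viewing_hours_under_cap(300, 2.0)))   # ~333 hours a month
```

Even under a cap, a household would have to stream standard-quality video for more than ten hours a day to exhaust it, which reinforces the point that customers staying under the cap are well within what they paid for.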

For Comcast to say that their network is not capable of delivering the accumulated speeds they have sold to customers sounds to me like they have oversold the capacity of their network. They want customers to buy fast speeds, but they don’t actually want them to use it. I’m not a lawyer, but this starts sounding like fraud, or something similar to fraud.

I simply don’t understand why the FCC would listen to any argument that says content providers have to somehow pay extra to get normal performance, because that is what it looks like Netflix had to do. I can imagine that the agreement included a nondisclosure of its terms, and that is why Netflix is not out yelling like they probably ought to be.

But the long-term result of what Comcast and Verizon have done is that end users are going to pay twice for video access. They already pay for a data pipe large enough to receive video. And now the cost of movies or movie subscriptions is going to increase to cover what Netflix has to pay to deliver those movies. Netflix is certainly not going to eat such costs.

And so the consumer is being screwed by a clear case of corporate greed. I have come to the conclusion that Comcast extracted payments out of NetFlix simply because they are large enough to do so. That is an abuse of monopoly power, and that power is only going to get worse if they are allowed to buy Time Warner.