Charter Upgrading Broadband

We are now starting to see the results of cable companies upgrading to DOCSIS 3.1. Charter, the second biggest ISP in the country, recently announced that it will be able to offer gigabit speeds to virtually its whole footprint of over 40 million passings.

DOCSIS 3.1 is the newest protocol from CableLabs and allows bonding an essentially unlimited number of spare channel slots for broadband. A gigabit data path requires roughly 24 channels on a cable network using the new DOCSIS protocol. In bigger markets this replaces DOCSIS 3.0, which was limited to maximum download speeds in the range of 250 Mbps. I know there are Charter markets with even slower speeds that either operate under older DOCSIS standards or that are slow for some other reason.
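
For readers who like the arithmetic, here's a quick back-of-envelope sketch of where that 24-channel figure comes from. The per-channel rate is my assumption (the typical raw rate of a 6 MHz 256-QAM cable channel in North America), not a number from Charter:

```python
import math

# Assumed raw rate of one bonded 6 MHz 256-QAM cable channel -- a typical
# North American DOCSIS figure, not a number from the post.
MBPS_PER_CHANNEL = 42.9
TARGET_MBPS = 1000  # a gigabit data path

channels = math.ceil(TARGET_MBPS / MBPS_PER_CHANNEL)
print(f"{channels} bonded channels needed for {TARGET_MBPS} Mbps")  # -> 24
```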

Charter has already begun the upgrades and is now offering gigabit speeds to 9 million passings in major markets like Oahu, Hawaii; Austin, Texas; San Antonio, Texas; Charlotte, North Carolina; Cincinnati, Ohio; Kansas City, Missouri; New York City; and Raleigh-Durham, North Carolina. It's worth noting that those are all markets with fiber competition, so it's natural they would upgrade these first.

The new increased speed won't actually be a full gigabit – it will be 940 Mbps download and 35 Mbps upload. (It's hard to imagine anybody really caring about that distinction.) CableLabs recently came out with a DOCSIS upgrade that can increase upload speeds, but there's been no talk from Charter about making that upgrade. Like the other big cable companies, Charter serves businesses that want faster upload speeds with fiber.

Along with the introduction of gigabit broadband the company also says it's going to increase the speed of its minimum broadband product. In the competitive markets listed above Charter has already increased the speed of its base product to 200 Mbps download, up from 100 Mbps.

It's going to be interesting to find out what Charter means by the promise to cover 'virtually' their whole footprint. Charter grew by purchasing systems in a wide range of conditions. I know of smaller Charter markets where customers don't get more than 20 Mbps. There is also a well-known lawsuit against Charter in New York State that claims that a lot of households in upstate New York are getting speeds far slower than advertised due to outdated cable modems.

The upgrade to DOCSIS 3.1 can be expensive in markets that have not yet been upgraded to DOCSIS 3.0. An upgrade might mean replacing power taps and other portions of the network, and in some cases might even require a replacement of the coaxial cable. My guess is that the company won't rush to upgrade these markets to DOCSIS 3.1 this year. I'm sure the company will look at them on a case-by-case basis.

The company has set a target price for a gigabit at $124.95. But in competitive markets like Oahu the company was already selling introductory packages for $104.99. There is also a bundling discount for cable subscribers.

The pricing list highlights that they still have markets with advertised speeds as low as 30 Mbps – and the company's price for the minimum speed is the same everywhere, regardless of whether that product is 30 Mbps or 200 Mbps. And as always with cable networks, these are 'up to' speeds and, as I mentioned, there are markets that don't meet these advertised speeds today.

Overall this ought to result in a lot of homes and businesses getting faster broadband than today. We saw something similar back when the cable companies implemented DOCSIS 3.0 and the bigger companies unilaterally increased speeds to customers without increasing prices. Like other Charter customers, I will be interested in what they do in my market. I have the 60 Mbps product and I'll be interested to see if my minimum speed is increased to 100 Mbps or 200 Mbps and if I'm offered a gigabit here. With the upgrade time frame they are promising I shouldn't have to wait long to find out.

Spectrum and 5G

All of the 5G press has been talking about how 5G is going to bring gigabit wireless speeds everywhere. But that is only going to be possible with millimeter wave spectrum, and even then it requires a reasonably short distance between transmitter and receiver as well as bonding together more than one signal using multiple MIMO antennas.

It's a shame that we've let the wireless marketers equate 5G with gigabit speeds, because that's what the public is going to expect from every 5G deployment. As I look around the industry I see a lot of other uses for 5G that are going to produce speeds far slower than a gigabit. 5G is a standard that can be applied to any wireless spectrum and which brings some benefits over earlier standards. 5G makes it easier to bond multiple channels together for reaching one customer. It also can increase the number of connections that can be made from any given transmitter – with the biggest promise being that the technology will eventually allow connections to large numbers of IoT devices.

Anybody who follows the industry knows about the 5G gigabit trials. Verizon has been loudly touting its gigabit 5G connections using the 28 GHz frequency and plans to launch the product in up to 28 markets this year. They will likely use this as a short-haul fiber replacement to allow them to more quickly add a new customer to a fiber network or to provide a redundant data path to a big data customer. AT&T has been a little less loud about their plans and is going to launch a similar gigabit product using 39 GHz spectrum in three test markets soon.

But there are also a number of announcements for using 5G with other spectrum. For example, T-Mobile has promised to launch 5G nationwide using its 600 MHz spectrum. This is a traditional cellular spectrum that is great for carrying signals for several miles and for going around and through obstacles. T-Mobile has not announced the speeds it hopes to achieve with this spectrum. But the data capacity of 600 MHz is limited, and bonding numerous signals together for one customer will create something faster than LTE, but not spectacularly so. It will be interesting to see what speeds they can achieve in a busy cellular environment.

Sprint is taking a different approach and is deploying 5G using its 2.5 GHz spectrum. They have been testing the use of massive MIMO antennas that contain 64 transmit and 64 receive channels. This spectrum doesn't travel far when used for broadcast, so this technology is going to be used best with small cell deployments. The company claims to have achieved speeds as fast as 300 Mbps in trials in Seattle, but that would require bonding together a lot of channels, so a commercial deployment is going to be a lot slower in a congested cellular environment.

Outside of the US there seems to be growing consensus to use 3.5 GHz – the band known in the US as the Citizens Broadband Radio Service (CBRS). That raises the interesting question of which frequencies will end up winning the 5G race. In every new wireless deployment the industry needs to reach an economy of scale in the manufacture of both the radio transmitters and the cellphones or other receivers. Only then can equipment prices drop to the point where a 5G-capable phone will be similar in price to a 4G LTE phone. So the industry at some point soon will need to reach a consensus on the frequencies to be used.

In the past we rarely saw a consensus; rather, some manufacturer and wireless company won the race to get customers and dragged the rest of the industry along. This has practical implications for early adopters of 5G. For instance, somebody buying a 600 MHz phone from T-Mobile is only going to be able to use the data function when near a T-Mobile tower or mini-cell. Until industry consensus is reached, phones that use a unique spectrum are not going to be able to roam on other networks like happens today with LTE.

Even phones that use the same spectrum might not be able to roam on other carriers if they are using the frequency differently. There are now 5G standards, but we know from practical experience with other wireless deployments in the past that true portability between networks often takes a few years as the industry works out bugs. This interoperability might be sped up a bit this time because it looks like Qualcomm has an early lead in the manufacture of 5G chip sets. But there are other chip manufacturers entering the game, so we’ll have to watch this race as well.

A word of warning to buyers of first-generation 5G smartphones: they are going to have issues. For now it's likely that the MIMO antennas are going to use a lot of power and will drain cellphone batteries quickly. And the ability to reach a 5G data signal is going to be severely limited for a number of years as the cellular providers extend their 5G networks. Unless you live and work in the heart of one of the trial 5G markets it's likely that these phones will be a bit of a novelty for a while – but will still give a user bragging rights for the ability to get a fast data connection on a cellphone.

Edging Closer to Satellite Broadband

A few weeks ago Elon Musk's SpaceX launched two test satellites, the first in a planned low-orbit satellite network that will blanket the earth with broadband. The eventual network, branded as Starlink, will consist of 4,425 satellites deployed at about 700 miles above the earth and another 7,518 deployed at around 210 miles of altitude.

Getting that many satellites into orbit is a daunting logistical task. To put this into perspective, the nearly 12,000 satellites needed are twice the number of satellites that have been launched in history. It's going to take a lot of launches to get these into the sky. SpaceX's workhorse rocket, the Falcon 9, can carry about ten satellites at a time. They have also tested a Falcon Heavy system that could carry 20 or so satellites at a time. If they can make a weekly launch of the larger rocket, that's still roughly 596 launches and would take 11.5 years. To put that number into perspective, the US led the world with 29 successful satellite launches last year, with Russia second at 21 and China at 16.
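
Here's that launch math spelled out. The 20-satellite Falcon Heavy payload is the rough figure from above, so treat the output as an estimate:

```python
import math

leo_sats, vleo_sats = 4425, 7518        # the two planned constellations
total_sats = leo_sats + vleo_sats       # 11,943 satellites

sats_per_launch = 20                    # "20 or so" on a Falcon Heavy
launches = math.ceil(total_sats / sats_per_launch)
years = launches / 52                   # at one launch per week

print(f"{total_sats} satellites -> {launches} launches -> {years:.1f} years")
# -> 11943 satellites -> 598 launches -> 11.5 years (the ~596 above is the
#    same arithmetic with slightly different rounding)
```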

SpaceX is still touting this as a network that can make gigabit connections to customers. I've read the FCC filing for the proposed network several times, and it looks to me like that kind of speed will require combining signals from multiple satellites to a single customer, and I have to wonder if that's practical when talking about deploying this network to tens of millions of simultaneous subscribers. It's likely that their standard bandwidth offering is going to be something significantly slower.

There is also a big question to me about the capacity of the backhaul network that carries signals to and from the satellites. It's going to take some major bandwidth to handle the volume of broadband users that SpaceX has in mind. We are seeing landline long-haul fiber networks today that are stressed and reaching capacity. The satellite network will face the same backhaul problems as everybody else and will have to find ways to cope with a world where broadband demand doubles every 3 years or so. If the satellite backhaul gets clogged or if the satellites get over-subscribed then the quality of broadband will degrade like on any other network.
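
To see how fast that compounding works against a fixed-capacity network, here's a sketch of the doubling math. The satellite design life in the comment is my assumption, not a SpaceX figure:

```python
DOUBLING_YEARS = 3   # broadband demand doubles every ~3 years (per above)

def demand_multiple(years: float) -> float:
    """How many times today's demand we'd see after `years` of growth."""
    return 2 ** (years / DOUBLING_YEARS)

# A low-orbit satellite's ~5-7 year design life is my assumption; over
# that life, demand on a fixed-capacity bird roughly triples to quintuples.
for year in (3, 5, 7, 10):
    print(f"year {year:>2}: {demand_multiple(year):.1f}x today's demand")
# year  3: 2.0x | year  5: 3.2x | year  7: 5.0x | year 10: 10.1x
```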

Interestingly, SpaceX is not the only one chasing this business plan. For instance, billionaire Richard Branson wants to build a similar network that would put 720 low-orbit satellites over North America. Telesat has launched two different test satellites and also wants to deploy a large satellite network. Boeing has also announced intentions to launch a 1,000-satellite network over North America. It's sounding like our skies are going to get pretty full!

SpaceX is still predicting that the network is going to cost roughly $10 billion to deploy. There's been no talk of consumer prices yet, but the company obviously has a business plan – Musk wants to use this business as the primary way to fund the colonization of Mars. But pricing is an issue for a number of reasons. The satellites will have some finite capacity for customer connections. In one of the many articles I read I saw a goal for the network of 40 million customers (I don't know if that's the right number, but there is some number of simultaneous connections the network can handle). 40 million customers sounds huge, but with a current worldwide population of over 7.6 billion people it's minuscule for a worldwide market.
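
A little arithmetic on those figures shows the shape of the business plan. The monthly price in the last line is purely an illustrative assumption:

```python
network_cost = 10e9        # the $10 billion deployment estimate
target_customers = 40e6    # the 40 million customer goal cited above
world_population = 7.6e9

print(f"Build cost per customer: ${network_cost / target_customers:,.0f}")      # $250
print(f"Share of world population: {target_customers / world_population:.2%}")  # 0.53%

assumed_price = 50         # illustrative $/month, not a SpaceX number
annual_revenue = target_customers * assumed_price * 12
print(f"Annual revenue at ${assumed_price}/month: ${annual_revenue / 1e9:.0f}B")  # $24B
```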

There are those predicting that this will be the salvation for rural broadband. But I think that’s going to depend on pricing. If this is priced affordably then there will be millions in cities who would love to escape the cable company monopoly, and who could overwhelm the satellite network. There is also the issue of local demand. Only a limited number of satellites can see any given slice of geography. The network might easily accommodate everybody in Wyoming or Alaska, but won’t be able to do the same anywhere close to a big city.

Another issue is worldwide pricing. A price that might be right in the US might be ten times higher than what will be affordable in Africa or Asia. So there is bound to be pricing differences based upon regional incomes.

One of the stickier issues will be the reaction of governments that don't want citizens using the network. There is no way China is going to let citizens bypass the Great Firewall by going through these satellites. Repressive regimes like North Korea will likely make it illegal to use the network. And even democratic countries like India might not like the idea – last year they turned down free Internet from Facebook because it wasn't an 'Indian' solution.

The bottom line is that this is an intriguing idea. If the technology works as promised, and if Musk can find the money and figure out the logistics to get this launched, it's going to be another new source of broadband. But satellite networks are not going to solve the world's broadband problems because they are only going to be able to help some small percentage of the world's population. With that said, a remote farm in the US or a village in Africa is going to love this when it's available.

Abandoned Telecom Infrastructure

I saw an article about Merrill, Oregon, where the city was wrestling with what to do with an abandoned cable TV network hanging on poles in the city. It's actually a fairly common occurrence to have abandoned telecom property on poles, and I've been contacted by a number of cities over the years wondering how to deal with the situation.

In this particular case the historic cable system in the city was operated by Rapid Communications out of Texas. That company sold cable properties to a number of companies in 2008 and the Merrill system went to Almega Cable Company, which stopped offering service in the City and went out of business in 2011.

There are all sorts of telecom assets that have been abandoned on poles, and defunct cable companies are only one example. I saw a lot of WiFi mesh networks abandoned fifteen years ago as operators folded and never retrieved their equipment. There are numerous CLECs that folded in the late 1990s and walked away from fiber networks on poles.

Having an abandoned set of wires on poles complicates the lives of any other pole users in the market. The unused wires take up space on poles and make it hard for anybody else to add additional wires onto the pole.

Abandoned networks also create havoc for the normal pole attachment process. This process requires buy-in from existing utilities to move or rearrange cables to make room for a new attacher. A new attacher can be paralyzed if they are unable to create the required clearance from existing wires.

In the end I’ve almost always seen the responsibility for getting rid of the network fall to the local government. Somebody has to go through the process of making certain there is no remaining active owner of the network before it can be condemned. Generally the pole owner is not willing to take on that role unless they have a need of their own to add wires to the poles.

Merrill is now undertaking the task of condemning the network. They have to follow the law and post public notices to make sure that nobody claims rights to the cables. In the case of a cable company the City not only has to deal with the wires on poles, but also with customer drops and pedestals scattered throughout the community.

Merrill is hoping that some new carrier will want to use the cable network for overlashing fiber. Overlashing is the process of tying the fiber to existing wires and is generally the lowest-cost method of fiber construction. But even if they find a taker for the offer, my guess is that the new fiber provider is not going to want to assume ownership of the coaxial cables, since that would give them liability for any issues or problems with the old wiring. So the City might end up owning the cables in perpetuity. If they don't find a buyer, the city will have to pay to have the cables removed – although in today's market there might be enough value in the copper inside the coaxial cables to offset the cost of removal.

We are going to see a lot more abandoned assets on poles in the future. We are just now entering a period when numerous companies are going to want to hang wireless devices of all types on poles. Some of these devices are tiny and I’ve seen others that are the size of a dorm refrigerator. It’s inevitable that some of the wireless deployments will fail, or that the wireless companies will lose the customers served by a given device.

Over time a significant inventory of abandoned wireless devices will likely grow in most cities. And unlike an abandoned cable network, my guess is that it’s often going to be hard to know which wireless devices have been abandoned or even who owns many of them. Cities ought to be considering ordinances today that require the companies that deploy wireless devices to somehow notify them of what they are doing and to also clearly label the ownership of each device.

But there is a movement at the FCC, in Congress and in state legislatures to institute rules for wireless carriers that would override any local rules. Such global rules are going to hinder cities in the coming decades when they try to deal with abandoned assets clogging their pole lines. Most of the proposed new rules I've seen don't address this issue, which will make it messy to deal with later.

5G is Fiber-to-the-Curb

The marketing from the wireless companies has the whole country buzzing with speculation that the whole world is going to go wireless with the introduction of 5G. There is a good chance that within five years a reliable pole-mounted technology could become the preferred way to go from the curb to homes and businesses. When that happens we will finally have wireless fiber-to-the-curb – something that I've heard talked about for at least 25 years.

I remember visiting an engineer in the horse country of northern Virginia in the 1990s who had developed a fiber-to-the-curb wireless technology that could deliver more than 100 Mbps from a pole to a house. His technology was limited in that there had to be one pole-mounted transmitter per customer, and there was a distance limitation of a few hundred feet for the delivery. But he was clearly on the right track and was twenty years ahead of his time. At that time we were all happy with our 1 Mbps DSL and 100 Mbps sounded like science fiction. But I saw his unit functioning at his home, and if he had caught the attention of a big vendor we might have had wireless fiber-to-the-curb a lot sooner than now.

I have to laugh when I read people talking about our wireless future, because it’s clear that this technology is going to require a lot of fiber. There is a lot of legislative and lobbying work going on to make it easier to mount wireless units on poles and streetlights, but I don’t see the same attention being put into making it easier to build fiber – and without fiber this technology is not going to work as promised.

It’s easy to predict that there are going to be a lot of lousy 5G deployments. ISPs are going to come to a town, connect to a single gigabit fiber and then serve the rest of the town from that one connection. This will be the cheap way to deploy this technology and those without capital are going to take this path. The wireless units throughout the town will be fed with wireless backhaul, with many of them on multiple wireless hops from the source. In this kind of network the speeds will be nowhere near the gigabit capacity of the technology, the latency will be high and the network will bog down in the evenings like any over-subscribed network. A 5G network deployed in this manner will not be a killer app that will kill cable networks.
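
To make that over-subscription concrete, here's a sketch where every number is an illustrative assumption rather than a figure from any real deployment:

```python
backhaul_mbps = 1000         # one gigabit fiber feeding the whole town
subscribers = 2000           # homes served off that single connection
peak_share = 0.5             # fraction of homes active in the evening
mbps_per_home = 10           # e.g. a couple of HD video streams

peak_demand = subscribers * peak_share * mbps_per_home
print(f"Evening demand: {peak_demand:,.0f} Mbps vs {backhaul_mbps} Mbps of backhaul")
print(f"Oversubscription: {peak_demand / backhaul_mbps:.0f}:1")
# -> 10,000 Mbps of demand against a 1,000 Mbps feed: a 10:1 crunch,
#    before adding the latency of multiple wireless hops.
```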

However, a 5G fiber-to-the-curb network built the right way is going to be as powerful as an all-fiber network. That means having neighborhood wireless transmitters to serve a limited number of customers, with each transmitter fed by fiber. When Verizon and AT&T talk about the potential for gigabit 5G this is what they are talking about. But they are not being this explicit, because they are not likely to deploy networks this densely today. The big ISPs still believe that people don't really need fast broadband. They will market this new technology by stressing that it's 5G while building networks that will deliver far less than a gigabit.

There are ISPs who will wait for this technology to mature before switching to it, and they will build networks the right way. In a network with fiber everywhere this technology makes huge sense. One of the problems with a FTTH network that doesn't get talked about a lot is abandoned drops. Fiber ISPs build drops to homes, and over time a substantial number of premises no longer use the network for various reasons. I know of some 10-year-old networks where as many as 10% of fiber drops have been abandoned by homes that now buy service from somebody else. A fiber-to-the-curb network solves this problem by only serving those who have active service.

I also predict that the big ISPs will make every effort to make this a customer-provisioned technology. They will mail customers a receiver kit to save on a truck roll, because saving money is more important to them than quality. This will work for many customers, but others will stick the receiver in the wrong place and never get the speed they might have gotten if the receiver was mounted somewhere else in the home.

There really are no terrible broadband technologies, but there are plenty of terrible deployments. Consider that there are a huge number of rural customers being connected to fixed wireless networks. When those networks are deployed properly – meaning customers are not too far from the transmitter and each tower has a fiber feed – the speeds can be great. I know a colleague who is 4 miles from a wireless tower and is getting nearly 70 Mbps download. But there are also a lot of under-capitalized ISPs that are delivering speeds of 5 Mbps or even far less using the same technology. They can't afford to get fiber to the towers and instead use multiple wireless hops to get to neighborhood transmitters. This is a direct analogue of what we'll see in poorly deployed 5G networks.

I think it’s time that we stop using the term 5G as a shortcut for meaning gigabit networks. 5G is going to vary widely depending upon the frequencies used and will vary even more widely depending on how the ISP builds their network. There will be awesome 5G deployments, but also a lot of so-so and even lousy ones. I know I will be advising my clients on building wireless fiber-to-the-curb – and that means networks that still need a lot of fiber.

Is AT&T Violating Net Neutrality?

I got a text on my AT&T cellphone last month that told me that my wireless plan now includes sponsored data. Specifically they told me that I could now stream movies and other content from DirecTV or U-Verse TV without the video counting against my monthly data cap. This has been available to AT&T post-paid customers for a while, but now is apparently available to all customers. What I found most interesting about the message was that it coincided with the official end of net neutrality.

AT&T is not the first cellular company to do this. Verizon tried this a few years ago, although that attempt was largely unsuccessful because they didn't offer much content that people wanted to watch. T-Mobile does something similar with their Binge On program, but since most of their data plans are unlimited, customers can watch anything on their phones, not just the Binge On video.

The sponsored data from AT&T would be a direct violation of net neutrality if it were still in effect and is a textbook example of paid prioritization. By excusing the DirecTV content from cellular data caps they have created an advantage for DirecTV compared to competitors. It doesn't really matter that AT&T happens to own DirecTV, and I imagine that AT&T is now shopping this same idea around to other video providers.

So what is wrong with what AT&T is doing? Certainly their many customers who buy both AT&T cellphones and DirecTV will like the plan. Cellular data in the US is still some of the most expensive data in the world, and letting customers watch unlimited video from a sponsored video provider is a huge benefit to them. Most people are careful not to go over monthly data limits, and that means they carefully curtail watching video on cellphones. But customers taking advantage of sponsored video are going to watch video that would likely have exceeded their monthly data cap – it doesn't take more than a handful of movies to do that.
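
A rough sketch of the cap math; the cap size and the streaming rate are illustrative assumptions, not AT&T numbers:

```python
cap_gb = 8                   # assumed monthly cellular data cap
stream_gb_per_hour = 1.5     # rough HD mobile streaming consumption
movie_hours = 2

gb_per_movie = stream_gb_per_hour * movie_hours
print(f"{gb_per_movie:.0f} GB per movie")                         # 3 GB
print(f"Movies to exhaust the cap: {cap_gb / gb_per_movie:.1f}")  # ~2.7
# A handful of movies really does blow through a typical cap --
# unless the video is sponsored and doesn't count against it.
```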

AT&T has huge market power, with almost 140 million cellphone users on their network at the end of last year. Any video provider they sponsor is going to gain a significant advantage over other video providers. AT&T customers who like watching video on their cellphones are likely to pick DirecTV over Comcast or any other video provider.

It's also going to be extremely tempting for AT&T to give prioritized routing to DirecTV video – which means implementing an Internet fast lane. AT&T is going to want their cellular customers to have a quality experience, and they can do that by making sure that DirecTV video has the best connections throughout their network. They don't necessarily have to throttle other video to make DirecTV better – they can just make sure that DirecTV video gets the best possible routing.
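
A toy simulation shows why no throttling is needed: a strict-priority queue by itself shifts all the waiting onto the non-sponsored traffic. This illustrates the queueing concept only – it says nothing about how AT&T actually routes traffic:

```python
import random
from collections import deque

random.seed(1)
sponsored, other = deque(), deque()
for pkt in range(1000):                  # a 1,000-packet evening burst
    (sponsored if random.random() < 0.2 else other).append(pkt)

# Serve one packet per tick; sponsored packets always dequeue first.
waits = {"sponsored": [], "other": []}
tick = 0
while sponsored or other:
    queue, kind = (sponsored, "sponsored") if sponsored else (other, "other")
    queue.popleft()
    waits[kind].append(tick)             # ticks spent waiting in the queue
    tick += 1

for kind, w in waits.items():
    print(f"{kind}: average wait {sum(w) / len(w):.0f} ticks")
# Same link speed for everyone, nothing throttled -- but sponsored packets
# barely wait while everyone else absorbs the whole queueing delay.
```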

I know that to many people the AT&T plan is going to feel somewhat harmless. After all, they are bundling together their own cellular and video products. But it's a short step from here for AT&T to start giving priority to content from others who are willing to pay for it. It's not too hard to imagine them offering the same plan to Netflix, YouTube or Facebook.

If this plan expands beyond AT&T’s own video, we’ll start seeing the negative impacts of paid prioritization:

  • Only the biggest companies like Netflix, Facebook or Google will be able to afford to pay AT&T for this kind of priority. This is going to shut out smaller video providers and start-ups. Already in the short history of the web we've seen a big turnover in popular platforms – gone or greatly diminished are earlier platforms like AOL, CompuServe and Prodigy. But with the boost given by paid prioritization, the big companies of today will get a step up to remain the predominant players on the web. Innovation is going to be severely hampered.
  • This is also the beginning of a curated web where many people only see the world through the filter of the predominant web services. We already see that phenomenon a lot today, but when people are funneled to only using the big web services this will grow and magnify.
  • It’s not hard to imagine the next step where we see reduced price data plans that are ‘sponsored’ by somebody like Facebook. Such platforms will likely make it a challenge for customers to step outside their platform. And that will lead to a segmentation and slow death of the web as we know it.

Interestingly, the Tom Wheeler FCC told AT&T that this practice was unacceptable. But through the change of administration AT&T never stopped the practice and is now expanding it. It’s likely that courts are going to stay some or all of the net neutrality order until the various lawsuits on the issue get resolved. But AT&T clearly feels emboldened to move forward with this, probably since they know the current FCC won’t address the issue even if net neutrality stays in effect.

Industry Shorts – March 2018

Following are a few topics that I find interesting, but which are too short to cover in a full blog:

Surge in Online Video Subscriptions. The number of households buying online video is surging. Netflix added almost 2 million US and 6.36 million international customers in the 4th quarter of 2017. That's 18% more than the same quarter a year earlier. There are also a growing number of vMVPD customers. At the end of last year CBS All Access had nearly 5 million customers. Showtime OTT also has nearly 5 million customers. Sling TV now has nearly 2 million customers. AT&T's DirecTV Now hit the 1 million customer mark in December. PlayStation Vue reported 670,000 customers in mid-December. The new YouTube service has about 300,000. Hulu is also growing but doesn't separately report its live TV customers from its video-on-demand customers (reported at 17 million total in December). Note that Hulu lets customers buy a single TV series or movie without needing a subscription.

Cellphone Data Usage Growth. According to the research firm NPD, the average US smartphone now uses 31.4 GB of data per month. This is combined usage between cellular and WiFi data and is evidence that people are starting to really accept the small screen for video. This is up over 25% from a year earlier. The firm reports that video accounts for 83% of the usage.

The number of people willing to use a cellphone for video has also surged. NPD reports that 67% of cellphone users watched at least one video during the 3rd quarter of 2017, up from 57% in the 2nd quarter. Another research firm, Strategy Analytics, reported that worldwide cellular data usage grew 115% in 2017 – more than doubling.
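
A quick sanity check of the numbers cited above:

```python
avg_gb_now = 31.4      # NPD: monthly smartphone data (cellular + WiFi)
growth = 0.25          # "up over 25% from a year earlier"
video_share = 0.83     # NPD: share of usage that is video

print(f"Implied usage a year ago: {avg_gb_now / (1 + growth):.1f} GB/month")  # ~25.1
print(f"Video portion today:      {avg_gb_now * video_share:.1f} GB/month")   # ~26.1

worldwide_growth = 1.15    # Strategy Analytics: +115% in 2017
print(f"Worldwide usage multiple: {1 + worldwide_growth:.2f}x")  # 2.15x -> 'more than doubled'
```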

Global Streaming Doubled in 2017. Conviva, which provides systems to monitor and analyze online usage, also reports that online video viewing more than doubled last year. They report that there were 12.6 billion viewing hours of online video in 2017, measured across 2.4 billion viewing devices. They report that 58% of video viewing came from North America; 21% from Europe; 19% from Asia; and 2% from the rest of the world.

Satellite TV Taking the Brunt of Cord Cutting. For some reason cord cutting seems to be hitting the two big satellite TV providers even harder than landline cable companies. Dish Networks and DirecTV together lost 4.7% of their subscribers in the fourth quarter of 2017. We can only speculate about the reasons for the shift. The bundles of the landline cable companies make it harder for customers to drop their cable subscription. But to offset this, many satellite customers are in rural areas where there is often no good broadband alternative to cable. Perhaps the FCC's CAF II and ACAM programs are speeding up rural broadband enough for households to consider cutting the cord. It should be noted that AT&T is pushing their DirecTV Now product more than their satellite TV, which also might account for part of the shift from satellite TV.

Apple Jumps into Programming. Apple has quietly gotten into the programming business. They've allocated over $1 billion in 2018 for the creation of new content. They've landed some big-name talent such as Steven Spielberg, Jennifer Aniston and Reese Witherspoon for projects. Apple doesn't have a content platform, and the industry is buzzing with speculation about how they might market and distribute the content.

Pirated Video Content on the Rise. Sandvine reports that 6.5% of North American households have accessed pirated video content in the last year. I've read reports from Canada of companies openly advertising pirated content, including providing set-top boxes that can download IPTV content directly from the Internet. Yesterday's blog talked about new efforts by content owners to force ISPs to police copyright infringement.

The Newest Battle of Copyright Infringement

For years the big ISPs have paid lip service to complaints about customers who violate copyrights by sharing music and video content on the web. Every big ISP has had a process in place that was intended to police violations of the Digital Millennium Copyright Act (DMCA).

The owners of copyrighted materials have long complained that the ISP response to violators has been weak and ineffective. And they are right, in that most ISPs notify customers that they are accused of violating copyrights, but there have been few or no consequences for violators.

However, that might now be changing due to a lawsuit that's been in the courts for a few years. Music label BMG sued Cox Communications for not providing adequate protection of its copyrighted music. Recently the 4th Circuit Court, on appeal, reversed the original verdict against Cox. However, in doing so the court threw out Cox's primary defense, which was that they were protected by the 'safe harbor' laws that are part of the DMCA.

The safe harbor rules protect ISPs like Cox against damages from customer theft of copyrighted materials. Removing the safe harbor means that the owners of copyrighted materials can seek and win damages against ISPs if they don’t take adequate steps to protect copyrights. In the specific case against Cox, the BMG issue was that Cox didn’t do anything to deter repeat offenders.

There are apparently a lot of repeat offenders – customers who share a lot of copyrighted material – so this ruling instantly got the attention of other big ISPs. Comcast responded last week by notifying customers of a new policy for repeat copyright offenders. The new policy has several progressive stages of severity (sketched in code after the list):

  • Customers notified of DMCA violations might be forced to log in fresh to their broadband account, and in doing so will probably have to agree to abide by the company’s DMCA policy before getting access. Customers might also have to talk to Comcast customer service before they can log into their broadband account.
  • Customers who continue to violate DMCA policies after this first stage face termination of their broadband and all other Comcast services.
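
Here is a minimal sketch of what graduated-response tracking along these lines could look like. The stage names and notice thresholds are my assumptions, since Comcast hasn't published its internal triggers:

```python
from collections import defaultdict

# Assumed escalation ladder -- illustrative, not Comcast's actual policy.
STAGES = ["notice", "forced_relogin", "call_customer_service", "termination"]
THRESHOLDS = [1, 3, 5, 7]   # assumed DMCA-notice counts that trigger each stage

notice_counts = defaultdict(int)   # account id -> DMCA notices received

def record_notice(account: str) -> str:
    """Log one DMCA notice and return the resulting policy stage."""
    notice_counts[account] += 1
    stage = "no_action"
    for name, threshold in zip(STAGES, THRESHOLDS):
        if notice_counts[account] >= threshold:
            stage = name
    return stage

for _ in range(7):
    stage = record_notice("acct-1001")
print(f"After {notice_counts['acct-1001']} notices: {stage}")   # -> termination
```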

This is going to have a chilling effect on those that share copyrighted materials. A majority of people live in markets where the cable company offers the best broadband, and losing the home broadband connection is drastic. I have to assume that telcos will come up with similar policies, meaning that DSL also won’t be a refuge for anybody who continues to violate copyrights.

There have always been people who share content. The old public bulletin boards were full of copyrighted songs and pictures that could be shared. Over time this morphed into Napster and other file-sharing services. Today there are still a number of sharing sites on Tor and other places on the web. And people have figured out how to use Kodi and other technologies to capture and share copyrighted video files.

Although they don't want to play the role of policeman, I suspect the big ISPs will be forced to at least somewhat enforce policies like the one Comcast just initiated. There has always been a big tug of war between ISPs and content owners. This new response from Comcast shows that content owners now have the upper hand. It certainly means that those who continue to share copyrighted materials will eventually face losing their broadband. In today's world that's a severe penalty.

Smaller ISPs need to pay attention to this and watch what the big companies are doing. I wouldn't be surprised to see BMG or some other content owner sue a smaller ISP to make the point that this applies to everybody – and nobody wants to be that ISP. If the big ISPs really enforce this, then small ISPs need to follow suit and figure out an effective way to police and deter repeat copyright violators.


Progress of the CAF II Program

If readers recall, the CAF II program is providing funds to the largest telcos to upgrade rural facilities in their incumbent operating territories to broadband speeds of at least 10 Mbps down and 1 Mbps up. The CAF II deployment began in the fall of 2015 and lasts for 6 years, so we are now almost 2.5 years into the deployment period. I was curious about how the bigger telcos are doing in meeting their CAF II build-out requirements. The FCC hasn’t published any progress reports on CAF II deployments, so I found the following from web searches:

AT&T. The company took $427 million annually for the six years ($2.56 billion) to bring broadband to 2.2 million rural customers. The company has said they are going to use a combination of improved DSL and fixed wireless broadband using their cellular frequencies to meet their build-out requirements. From their various press releases it seems like they are planning on more wireless than wireline connections (and in many rural places they plan to tear down the copper).

The only big public announcement of a wireless buildout for AT&T is a test in Georgia initiated last year. On their website the company says their goal at the end of 2018 is to offer improved broadband to 440,000 homes, which would mean 20% CAF II coverage at just past the midpoint of their 6-year build-out commitment.

On a side note, AT&T had also promised the FCC, as a condition of the DirecTV merger, that they would pass 12.5 million homes and businesses with fiber by mid-2019. They report reaching only 4 million by the end of 2017.

CenturyLink. CenturyLink accepted $500 million annually ($3 billion) in CAF II funding to reach 1.2 million rural homes. In case you're wondering why CenturyLink is covering only about half as many homes as AT&T for roughly the same funding – the funding for CAF II varies by Census block according to density. The CenturyLink coverage area is obviously less densely populated than the areas being covered by AT&T.
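
Working out the per-home subsidy from the figures above makes the density point plain:

```python
# name: (annual funding in $M, years, homes committed) -- figures from above
caf2 = {
    "AT&T":        (427, 6, 2_200_000),
    "CenturyLink": (500, 6, 1_200_000),
}

for name, (annual_m, years, homes) in caf2.items():
    total = annual_m * 1e6 * years
    print(f"{name}: ${total / 1e9:.2f}B total, ${total / homes:,.0f} per home")
# AT&T:        $2.56B total, $1,165 per home
# CenturyLink: $3.00B total, $2,500 per home -- over twice the subsidy per
# home, reflecting the lower density of CenturyLink's CAF II areas.
```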

FierceTelecom reported in January that CenturyLink had upgraded 600,000 CAF II homes by the end of last year, or 37% of their CAF II commitment. The company says that their goal is to have 60% coverage by the end of this year. CenturyLink is primarily upgrading rural DSL, although they've said that they are considering using point-to-multipoint wireless for the most rural parts of the coverage areas. The company reports that 70% of the homes upgraded so far can get 20 Mbps download or faster.

Frontier. The last major recipient of CAF II funding is Frontier. The company originally accepted $283 million per year to upgrade 650,000 passings. They subsequently acquired some Verizon properties that had accepted $49 million per year to upgrade 37,000 passings. That’s just under $2 billion in total funding.

FierceTelecom reported in January that Frontier reached 45% of the CAF II area with broadband speeds of at least 10/1 Mbps by the end of 2017. The company also notes that in making the upgrades for rural customers they've also upgraded the broadband in the towns near the CAF II areas and have increased the broadband speeds of over 900,000 passings nationwide.

Frontier is also largely upgrading DSL, though they are considering point-to-multipoint wireless for the more rural customers.

Other telcos also took major CAF II funding, but I couldn’t find any reliable progress reports on their deployments. This includes Windstream ($175 million per year), Verizon ($83 million per year), Consolidated ($51 million per year), and Hawaiian Telcom ($26 million per year).

The upcoming reverse auction this summer will provide up to another $2 billion in funding to reach nearly 1 million additional rural homes. In many cases these are the most remote customers, often found in the same areas where the CAF II upgrades are being made. It will be interesting to see if the same telcos will take the funding to finish the upgrades. There is a lot of speculation that the cellular carriers will pursue a lot of the reverse auction upgrades.

But the real question to be asked for these properties is what comes next. The CAF II funding lasts until 2021. The speeds being deployed with these upgrades are already significantly lower than the speeds available in urban America. A household today with a 10 Mbps download speed cannot use broadband in the ways that are enjoyed by urban homes. My guess is that there will be continued political pressure to continue to upgrade rural speeds and that we haven’t seen the end of the use of the Universal Service Fund to upgrade rural broadband.

States and Net Neutrality

We now know how states are going to react to the end of net neutrality. There are several different responses so far. First, a coalition of 23 states filed a lawsuit challenging the FCC’s ability to eliminate net neutrality and Title II regulation of broadband. The lawsuit is mostly driven by blue states, but there are red states included like Mississippi and Kentucky.

The lawsuit argues that the FCC has exceeded its authority in eliminating net neutrality. The lawsuit makes several claims:

  • The suit claims that under the Administrative Procedure Act (APA) the FCC can't make "arbitrary and capricious" changes to existing policies. The FCC has defended net neutrality for over a decade, and the claim is that the FCC's ruling fails to provide enough justification for abandoning the existing policy.
  • The suit claims that the FCC ignored the significant public record filed in the case that details the potential harm to consumers from ending net neutrality.
  • The suit claims that the FCC exceeded its authority by reclassifying broadband service as a Title I information service rather than as a Title II telecommunications service.
  • Finally, the suit claims that the FCC ruling improperly preempts state and local laws.

As with past challenges of major FCC rulings, one would expect this suit to go through at least several levels of courts, perhaps even to the Supreme Court. It's likely that the loser of the first ruling will appeal. This process is likely to take a year or longer. Generally, the first court to hear the case will determine quickly if some or all of the FCC's net neutrality order will be stayed until resolution of the lawsuit.

I lamented in a recent blog how partisan this and other FCCs have gotten. It would be a positive thing for FCC regulation in general if the courts put some cap on the ability of the FCC to create new policy without considering existing policies and the public record about the harm that can be caused by a shift in policy. Otherwise we face having this and future FCCs constantly changing the rules every time we get a new administration – and that’s not healthy for the industry.

A second tactic is for a state to pass a law that effectively implements net neutrality at the state level. The states of New York, New Jersey and Montana have passed laws that basically mimic the old FCC net neutrality rules. It's an interesting tactic and will trigger a lawsuit about states' rights if challenged (and I have to imagine that somebody will challenge these laws). I've read a few lawyers who opine that this tactic has some legs, since the FCC largely walked away from regulating broadband and in doing so might have accidentally opened the door for the states to regulate the issue. If these laws hold up, that would mean a hodgepodge of net neutrality rules by state – something that benefits nobody.

Another tactic being taken is for states, and even a few cities, to pass laws that change purchasing rules so that any carrier that bids for government telecom business must adhere to net neutrality. This is an interesting tactic, and I haven't seen anybody who thinks it is not allowed. Governments have wide latitude in deciding the rules for purchasing goods and services, and there are already many similar restrictions that states put onto purchasing. The only problem with this tactic will come if eventually all of the major carriers violate the old net neutrality rules. That could leave a state with few or no good choices of telecom providers.

As usual, California is taking a slightly different tactic. They want to require that carriers must adhere to net neutrality if they use state-owned telecom facilities or facilities that were funded by the state. Over the years California has built fiber of its own and also given out grants for carriers to build broadband networks. This includes a recently announced grant program that is likely to go largely to Frontier and CenturyLink. If this law is upheld it could cause major problems for carriers that have taken state money in the past.

It’s likely that there are going to be numerous lawsuits challenging different aspects of the various attempts by states to protect net neutrality. And there are likely to also be new tactics tried by states during the coming year to further muddy the picture. It’s not unusual for the courts to finally decide the legitimacy of major FCC decisions. But there are so many different tactics being used here that we are likely to get conflicting rulings from different courts. It’s clearly going to take some time for this to all settle out.

One interesting aspect of all of this is how the FCC will react if its cancellation of net neutrality is put on hold by the courts. If that happens it means that some or all of net neutrality will still be the law of the land. The FCC always has the option to enforce or not enforce the rules, so you'd suspect that they wouldn't do much about ISPs that violate the spirit of the rules. But more importantly, the FCC is walking away from regulating broadband as part of killing Title II regulation. They are actively shifting some regulatory authority to the FTC for issues like privacy. It seems to me that this shouldn't be allowed until the end of the various lawsuits. I think the one thing we can count on is that this is going to be a messy regulatory year for broadband.