A Doubling of Broadband Prices?

In what is bad news for consumers but good news for ISPs, a report by analyst Jonathan Chaplin of New Street Research predicts big increases in broadband prices. He argues that broadband is underpriced: prices haven’t increased much in a decade, while the value of broadband has grown greatly since it is now vital in people’s lives.

The report is bullish on cable company stock prices because the cable companies will be the immediate beneficiaries of higher broadband prices. The business world has not really acknowledged that in most US markets the cable companies are becoming a near-monopoly. Big telcos like AT&T have cut back on promoting DSL and are largely ceding the broadband market to the big cable companies. Hordes of customers drop DSL each quarter, and all of the growth in the broadband industry is happening at the biggest cable companies like Comcast and Charter.

I’ve been predicting for years that the cable companies will have to start raising broadband prices. The companies have been watching both cable and voice revenues drop, and they will have to make up for those losses. But I never expected the rapid and drastic increases predicted by this report. Chaplin sets the value of basic broadband at $90, which is close to double today’s prices.

The cable industry is experiencing a significant and accelerating decline in cable customers. The companies also face significant revenue declines from cord-shaving as customers elect smaller cable packages. But cable products have long been squeezed on margin by programming price increases, and one has to wonder how much the declining cable revenue really hurts the bottom line.

Chaplin reports that the price of unbundled basic broadband at Comcast is now $90 including what they charge for a modem. It’s even higher than that for some customers. Before I left Comcast last year I was paying over $120 per month for broadband since the company forced me to buy a bundle that included basic cable if I wanted a broadband connection faster than 30 Mbps.

Chaplin believes that broadband prices at Comcast will be pushed up to the $90 level within a relatively short period of time. And he expects Charter to follow.

If Chaplin is right one has to wonder what price increases of this magnitude will mean for the public. Today almost 20% of households still don’t have broadband, and nearly two-thirds of those say it’s because of the cost. It’s not hard to imagine that a drastic increase in broadband rates will drive a lot of people to broadband alternatives like cellular data, even though cellular is a far inferior substitute.

I also have to wonder what price increases of this magnitude might mean for competitors. I’ve created hundreds of business plans for markets of all sizes, and not all of them look promising. But the opportunities for a competitor improve dramatically if broadband is priced a lot higher. I would expect that higher prices are going to invite in more fiber overbuilders. And higher prices might finally drive cities to get into the broadband business just to fix what will be a widening digital divide as more homes won’t be able to afford the higher prices.

Comcast today matches the prices of any significant cable competitor. For instance, they match Google Fiber’s prices where the companies compete head-to-head. It’s not hard to foresee a market where competitive markets stay close to today’s prices while the rest have big rate increases. That also would invite in municipal overbuilders in places with the highest prices.

Broadband is already a high-margin product and any price increases will go straight to the bottom line. It’s impossible for any ISP to say that a broadband price increase is attributable to higher costs – as this report describes it, any price increases can only be justified by setting prices to ‘market’.

All of this is driven, of course, by the insatiable urge of Wall Street to see companies make more money every quarter. Companies like Comcast already make huge profits and in an ideal world would be happy with those profits. Comcast does have other ways to make money since they are also pursuing cellular service, smart home products and even now bundling solar panels. And while most of the other cable companies don’t have as many options as Comcast, they will gladly follow the trend of higher broadband prices.

Broadband Regulation is in Limbo

We have reached a point in the industry where it’s unclear who regulates broadband. I think a good argument can be made that nobody is regulating broadband issues related to the big ISPs.

Perhaps the best evidence of this is a case that is now in the Ninth Circuit Court of Appeals in San Francisco. The case involves a 2014 complaint against AT&T by the Federal Trade Commission over the way that AT&T throttled unlimited wireless data customers. The issue got a lot of press when AT&T began restricting data usage in 2011 for customers who hit some arbitrary (and unpublished) monthly data threshold. Customers got throttled back to 3G and even 2G data speeds and basically lost the ability to use their data plans. The press and the FTC saw this as an attempt by AT&T to drive customers off their grandfathered unlimited data plans (which were clearly not unlimited).

AT&T argued at the FTC that they needed to throttle customers who use too much data as a way to manage and protect the integrity of their networks. The FTC didn’t buy this argument and ruled against AT&T. As they almost always do, the company appealed the decision. The District Court in California affirmed the FTC ruling and AT&T appealed again, which is the current case in front of the Ninth Circuit. AT&T is making some interesting claims in the case, arguing that the Federal Trade Commission rules don’t allow the FTC to regulate common carriers.

There is an FTC rule called the ‘common carrier exemption’ that was established in Section 5 of the original FTC Act that created the agency. The exemption recognizes that telecom common carriers are instead regulated by the FCC. There are similar carve-outs in the FTC rules for other industries that are regulated in part by other federal agencies.

The common carrier exemption doesn’t relieve AT&T and other telecom carriers from all FTC regulation – it just means that the FTC can’t intercede in areas where the FCC has clear jurisdiction. But any practices of telecom carriers that are not specifically regulated by the FCC then fall under FTC regulations since the agency is tasked in general with regulating all large corporations.

AT&T is making an interesting argument in this appeal. They argue that since they are now deemed to be a common carrier for their data business under the Title II rules implemented in the net neutrality order, they should be free of all FTC oversight.

But there is an interesting twist to this case because the current FCC filed an amicus brief in the appeal saying that they think that the FTC has jurisdiction over some aspects of the broadband business such as privacy and data security issues. It is this FCC position that creates uncertainty about who actually regulates broadband.

We know this current FCC wants to reverse the net neutrality order, and so they are unwilling right now to tackle any major issues that arise from those rules. In this particular case AT&T’s throttling of customers occurred before the net neutrality decision and at that time the FCC would not have been regulating cellular broadband practices.

But now that broadband is considered a common carrier service it’s pretty clear that the topic is something the FCC has jurisdiction over today. Yet we have an FCC that is extremely reluctant to take on this issue because doing so would give legitimacy to the net neutrality rules they want to eliminate.

The FCC’s position in this case leads me to the conclusion that, for all practical purposes, companies like AT&T aren’t regulated at all for broadband issues. The prior FCC made broadband a common carrier service and gave themselves the obligation to regulate broadband and to tackle issues like the one in this case. But the new FCC doesn’t want to assert that authority and even goes so far as to argue that many broadband related issues ought to be regulated by the FTC.

This particular case gets a little further muddled by the timing since AT&T’s practices predate Title II regulation – but the issue at the heart of the case is who regulates the big ISPs. The answer seems to be nobody. The FCC won’t tackle the issue and AT&T may be right that the FTC is now prohibited from doing so. This has to be a huge challenge for a court because they are now being asked who is responsible for regulating the case in front of them. That opens up all sorts of possible problems. For example, what happens if the court rules that the FCC must decide this particular case but the agency refuses to do so? And of course, while this wrangling between agencies and the courts is being settled it seems that nobody is regulating AT&T and other broadband providers.

The End of Satellite TV?

Randall Stephenson, the CEO of AT&T, recently announced that the company will be working to replace their satellite TV (DirecTV) with an OTT offering over the web. The company plans to launch the first beta trials by the end of this year. The ultimate goal will be for the online offering to eventually replace the satellite offering.

He didn’t provide any specific details of the planned offering other than comparing it to the current DirecTV Now offering that carries about 100 channels and is a direct competitor to landline cable TV.

Obviously the company has a lot of details to work out. DirecTV currently has over 20 million customers and, along with Comcast, was one of only two cable providers that added customers in the year ending in the second quarter. The biggest online live broadcast offering today is Dish Network’s Sling TV with around 2 million customers. AT&T faces numerous technical challenges if they want to transfer their huge customer base onto the web.

People have always speculated about why AT&T bought DirecTV, and perhaps now we finally have the answer. The product will be marketed nationwide, not just in the AT&T footprint. The big advantage for AT&T is that they are not saddled with the FCC rules that create the large cable bundles of 200 channels, and so perhaps they have found a way to make online bundles of cable channels profitable again. There are probably more profits in a 100-channel line-up than in traditional cable offerings. The same may not be true for skinny bundles, and there is a lot of speculation that low-price OTT offerings like Sling TV at $20 don’t make any money.

This move would enable AT&T to leap forward and easily keep up with the latest video technology. Almost all legacy video uses dated technology like satellite DBS, QAM on cable networks and even AT&T’s own first-generation IPTV headends. With an online product the company can get completely out of the settop box and installation business for TV. They can also easily keep up with new formats and standards, such as the ability to immediately offer 4K video everywhere. Going online makes it a lot easier to meet future customer demands as the industry continues to change rapidly.

But this has to be scary news for rural America. AT&T and Verizon have both made it clear they would like to tear down legacy copper networks, which will make it hard or impossible for some parts of rural America to make voice calls. If the copper wires disappear, then satellite TV is the only modern telecom product left in a lot of rural America. If satellite TV is phased out as well, much of rural America falls off the telecom map entirely.

We have no idea if Dish Network has similar plans, but the fact that they are migrating customers to Sling TV suggests that they might. This could turn ugly for rural America.

Obviously a quality OTT video product requires a quality broadband connection – something that is not available in millions of rural homes. It’s not hard to envision a future in which a home without good broadband might be isolated from the outside world.

It’s clear that big companies like AT&T are focused only on the bottom line, and perhaps they should be. But one of the primary benefits of having incumbent regulated providers was that everybody in the country was offered the same choice of products. Unfortunately, the never-ending growth of broadband demand has broken the old legacy system. It was one thing to make sure that everybody was connected to the low-bandwidth voice network, but it’s something altogether different to make sure that rural America gets the same broadband as everybody else.

I can remember a time when I was a kid when a lot of rural homes didn’t have cable TV. Some rural homes were lucky enough to get a few TV stations over the air if they had a tall antenna. But many homes had no TV options due to the happenstance of their location. Satellite TV came along and fixed this issue, and today one expects to see a satellite dish in the yard or on the roof when visiting a farm. That dish might soon become another of those quaint memories of the past. But its disappearance would add to the political pressure to find a workable rural broadband solution.

Decommissioning Rural Copper, Part 2

In the last blog I wrote about my belief that AT&T and Verizon want out of the rural wireline business. They both have plans to largely walk away from their rural copper networks and replace landline copper services with cellular service. Today I want to talk about what regulators ought to do with those networks.

When these two giant telcos walk away from rural copper they will inevitably harm rural America. While many homes will get the ‘privilege’ of now buying highly-priced cellular-based broadband, other homes are going to find themselves without telephone service if they happen to live in one of the many cellular dead zones. Such homes will not only be unable to benefit from cellular broadband, but if they have poor cell service they will find themselves cut off from voice communications as well.

As somebody who has traveled extensively in rural America I can tell you that there are a lot more cellular dead zones than people realize. And it’s not only farms – there are county seats in rural America where it’s difficult to get a working cellphone signal inside a building.

As part of this transition both companies are going to walk away from a huge amount of existing copper cable. I think this copper cable is an incredibly valuable asset and that regulators ought not to allow them to tear it down.

The copper wire network today goes almost everywhere in rural America. Congressional laws and FCC policies led to most homes in the country getting access to the copper network. These copper wires occupy a valuable space on existing telephone poles – on the majority of rural poles the only two sets of wires are the power lines at the top and the telephone wires at the bottom.

If these copper wires are kept in place they could greatly reduce the cost of building rural fiber. It is far cheaper when building fiber to ‘lash’ the fiber onto an existing set of cables than to hang fiber from scratch. It was this construction technique that allowed Verizon to build a lot of its FiOS fiber network – they lashed fiber onto existing telephone wires. And my guess is that when Verizon decommissions urban copper they are still going to leave a lot of the copper wires in place as a guidewire for their fiber.

If these telcos are going to walk away from these copper wires, then they ought to be required to keep them in place for use by somebody else to hang fiber. Many states might force the big telcos to tear down the copper wires since they will eventually create safety hazards as they break away from poles if they aren’t maintained. But if somebody else is willing to take over that maintenance then it shouldn’t be an issue.

I can picture a regulatory process whereby some other carrier is allowed to come in and ‘claim’ the abandoned wires once they are empty of customers. That would allow fiber overbuilders or rural communities to claim this copper as an asset.

There is some salvage value to copper wires, and it’s possible, though not probable, that the value of the copper could exceed the cost to tear it down. So I can see the telcos fighting such an idea as a confiscation of their assets. But these rural wires have been fully depreciated for decades and the telcos have earned back the cost of these copper lines many times over. I believe that by abandoning the wires and depriving some homes of wireline service the big telcos will have forfeited any rights they might have to the remaining assets.

Anybody claiming the abandoned copper could use it in two ways. First, in many cases there is still existing life left in the copper, as witnessed by Frontier and CenturyLink rehabbing old rural copper with upgraded DSL. Local communities or small carriers could use the copper to bring the better services that the big telcos have refused to provide over the last few decades.

But more importantly these wires represent the cheapest path forward for building rural fiber. Anybody taking over the old copper can save a lot of fiber construction costs by lashing fiber onto the existing copper. If our nationwide goal is really to get better broadband to rural America, then offering abandoned copper to fiber builders might be one of the easiest tools available to help the process along.

The big telcos abandoned rural America decades ago. They stopped doing routine maintenance on rural copper and slashed the number of rural technicians. They now want to walk away from that copper and instead force rural America to buy cellular services at inflated prices. We owe it to the folks who paid for this copper many times over to get some benefit from it and to offer an alternative to the new rural cellular monopolies.

Decommissioning Rural Copper

I’ve been watching AT&T and Verizon since I’ve been in the industry (including a short stint at Southwestern Bell in the early 80s). We are about to see both of these companies unravel their rural telco properties.

Verizon got ahead of the curve and has been selling off rural properties for a few decades, many of which ended up with Frontier. Verizon still serves some rural areas but has probably shed half of their rural customers. There are still big swaths of rural Verizon customers in Pennsylvania, New York, Maryland and other northeastern states. Verizon benefitted from these sell-offs by selling completely depreciated and poorly maintained networks at high prices – as evidenced by how much Frontier is struggling to cover their massive debts. AT&T, by contrast, has sold almost no rural properties and still serves gigantic rural areas in dozens of states.

Both companies are clearly on a path to tear down the remaining rural copper networks and replace them with cellular wireless networks. There are both pros and cons for these transitions for rural customers.

On the plus side, many of these rural areas have never had broadband since the big telcos never extended their DSL to their rural service areas. We know that they could have extended DSL, because we have hundreds of examples of independent telephone companies that brought DSL to all of their customers, no matter how remote. But the big companies stopped spending money on rural properties decades ago. The remaining copper is now in terrible shape, and one has to imagine that cellular voice is often as good as or better than voice over these old copper lines.

There will now be many customers who can buy fixed cellular broadband. It uses the same frequencies as smartphone broadband, but the cellular companies are pricing it to be a little less expensive. For many households fixed cellular broadband will be the first real broadband alternative they have ever had.

But there are also big downsides to this shift from old copper to cellular networks. First, cellular networks are effective for only a few miles from any given cell site. Anybody who has driven in rural America knows that there are cellular dead spaces everywhere. Any customers living in the cellular dead spaces are going to be left with no communications to the outside world. They’ll lose their copper and they won’t have cellular voice or data. This will be a huge step backwards for many homes.

The big telcos will be taking advantage of the fact that, as cellular providers, they have no obligation to try to serve everybody. One of the reasons that we had nearly ubiquitous telephone coverage in this country is that telcos were the carriers of last resort in their service areas. They were required by law to extend telephone service to all but extremely remote customers. But that obligation doesn’t apply to a cellular carrier. We already have tons of evidence that the cellular carriers make no apologies to households that happen to be out of range of their cellular towers. With no copper landlines left we will have rural communications dead zones. It will be hard for anybody living in these dead zones to stay there, and certainly nobody is going to build a new home in a place that doesn’t have cellular service.

There is a downside even for those households that get fixed-cellular broadband. The speeds on this service are going to be slow by today’s standards, in the range of 10 – 15 Mbps for those that live relatively close to a cellular tower, but considerably slower for customers at greater distances. The real downside to getting cellular data is that the speeds are not likely to get better in rural America for many years, even decades. The whole industry is abuzz with talk about 5G cellular making a big difference, but it’s hard to see that technology making much impact in rural areas.

I think this transition away from copper is going to catch a lot of rural people by surprise. The two big telcos have already started the process of decommissioning copper, and once that gets full FCC approval the pace of decommissioning is likely to accelerate. A lot of homes are going to be surprised when they find out that the telcos no longer have an obligation to serve them.

What’s the Next FTTP Technology?

There is a lot of debate within the industry about the direction of the next generation of last mile fiber technology. There are three possible technologies that might be adopted as the preferred next generation of electronics – NG-PON2, XGS-PON or active Ethernet. All of these technologies are capable of delivering 10 Gbps streams to customers.

Everybody agrees that the current widely deployed GPON is starting to get a little frayed around the edges. That technology delivers 2.4 Gbps downstream and 1 Gbps upstream for up to 32 customers, although most networks I work with are configured to serve 16 customers at most. All the engineers I talk to think this is still adequate technology for residential customers and I’ve never heard of a neighborhood PON being maxed out for bandwidth. But many ISPs already use something different for larger business customers that demand more bandwidth than a PON can deliver.
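The oversubscription arithmetic explains why a shared 2.4 Gbps rarely gets maxed out. The sketch below is a back-of-the-envelope calculation using the GPON figures above; the worst-case framing (every home pulling data at once) is an assumption for illustration.

```python
# Back-of-the-envelope GPON math: the downstream pipe is shared by
# everyone on a PON, so the worst-case rate per customer depends on
# the split ratio.
GPON_DOWN_GBPS = 2.4
GPON_UP_GBPS = 1.0

def per_customer_mbps(shared_gbps: float, split: int) -> float:
    """Per-customer rate if every home on the PON pulled data at once."""
    return shared_gbps * 1000 / split

for split in (32, 16, 8):
    down = per_customer_mbps(GPON_DOWN_GBPS, split)
    up = per_customer_mbps(GPON_UP_GBPS, split)
    print(f"1:{split} split -> {down:.0f} Mbps down / {up:.1f} Mbps up worst case")
```

In practice customers almost never pull data simultaneously, which is why a 1:16 PON comfortably supports advertised tiers far above its worst-case 150 Mbps share.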

The GPON technology is over a decade old, which generally is a signal to the industry to look for the next generation replacement. This pressure usually starts with vendors who want to make money pushing the latest and greatest new technology – and this time it’s no different. But after taking all of the vendor hype out of the equation, it’s always been the case that a new technology is only accepted once it achieves an industry-wide economy of scale. And that almost always means being adopted by at least one large ISP. There are a few exceptions, like the first generation of telephone smart switches that found success with small telcos and CLECs first – but most technologies go nowhere until a vendor can mass manufacture units to get the costs down.

The most talked about technology is NG-PON2 (next generation passive optical network). This technology works by having tunable lasers that can function at several different light frequencies. This allows more than one PON to be transmitted simultaneously over the same fiber, each at a different wavelength. But that makes this a complex technology, and the key issue is whether it can ever be manufactured at price points that match the alternatives.

The only major proponent of NG-PON2 today is Verizon, which recently did a field trial to test the interoperability of several different vendors including Adtran, Calix, Broadcom, Cortina Access and Ericsson. Verizon seems to be touting the technology, but there is some doubt whether they alone can drag the rest of the industry along. Verizon seems enamored with the idea of using the technology to provide bandwidth for the small cell sites needed for a 5G network. But the company is not building much new residential fiber. They announced they would be building a broadband network in Boston, which would be their first new construction in years, but there is speculation that a lot of that deployment will use wireless 60 GHz radios instead of fiber for the last mile.

The big question is whether Verizon can create an economy of scale to get prices down for NG-PON2. The whole industry agrees that NG-PON2 is the best technical solution because it can deliver 40 Gbps to a PON while allowing great flexibility in assigning different customers to different wavelengths. But the best technological solution is not always the winning solution, and the concern for most of the industry is cost. Today the early NG-PON2 electronics are priced at 3 to 4 times the cost of GPON, due in part to the complexity of the technology, but also to the lack of economy of scale without any major purchaser of the technology.
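The 40 Gbps figure comes from stacking wavelengths: NG-PON2 runs several 10 Gbps PONs over the same fiber, each on its own wavelength pair. A minimal sketch of that arithmetic, assuming four wavelength pairs (the count implied by the 40 Gbps aggregate; the even customer spread is an illustration):

```python
# NG-PON2 stacks multiple 10 Gbps channels on one fiber, each on its own
# wavelength pair; tunable ONT lasers let customers be assigned (and
# reassigned) to wavelengths.
WAVELENGTH_GBPS = 10
NUM_WAVELENGTHS = 4   # implied by the 40 Gbps aggregate

aggregate = WAVELENGTH_GBPS * NUM_WAVELENGTHS
print(f"Aggregate capacity per fiber: {aggregate} Gbps")

# With a 1:32 split spread evenly across wavelengths, each 10 Gbps
# wavelength carries only 8 customers - the flexibility noted above.
customers_per_wavelength = 32 // NUM_WAVELENGTHS
print(f"{customers_per_wavelength} customers per wavelength if spread evenly")
```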

Some of the other big fiber ISPs like AT&T and Vodafone have been evaluating XGS-PON. This technology can deliver 10 Gbps downstream and 2.5 Gbps upstream – a big step up in bandwidth over GPON. The major advantage of the technology is that it uses fixed lasers, which are far less complex and costly. And unlike Verizon, these two companies are building a lot of new FTTH networks.

And while all of this technology is being discussed, ISPs today are already delivering 10 Gbps data pipes to customers using active Ethernet (AON) technology. For example, US Internet in Minneapolis has been offering 10 Gbps residential service for several years. Active Ethernet uses lower cost electronics than most PON technologies, but can still have higher costs than GPON because there is a dedicated pair of lasers – one at the core and one at the customer site – for each customer. A PON network instead uses one core laser to serve multiple customers.
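The cost difference described above largely comes down to laser counts: AON dedicates a laser at each end of every customer link, while PON shares one core laser across a whole split. A simplified sketch of that comparison (the split ratio and customer counts are illustrative):

```python
import math

# Simplified optics count. AON: a dedicated laser at each end of every
# customer link. PON: one shared OLT laser per split, plus one ONT
# laser per customer.
def aon_lasers(customers: int) -> int:
    return 2 * customers

def pon_lasers(customers: int, split: int = 32) -> int:
    olt_lasers = math.ceil(customers / split)   # one core laser per PON
    return olt_lasers + customers               # plus one ONT laser each

for n in (32, 320):
    print(f"{n} customers: AON {aon_lasers(n)} lasers, PON {pon_lasers(n)} lasers")
```

The gap is the core-side lasers: PON needs roughly one per split rather than one per customer, and the gap widens as the network grows.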

It may be a number of years until this is resolved because most ISPs building FTTH networks are still happily buying and installing GPON. One ISP client told me that they are not worried about GPON becoming obsolete because they could double the capacity of their network at any time by simply cutting the number of customers on a neighborhood PON in half. That would mean installing more cards in the core without having to upgrade customer electronics.

From what everybody tells me GPON networks are not experiencing any serious problems. But it’s obvious, as household demand for broadband keeps doubling every three years, that the day will come when these networks experience blockages. But creative solutions like splitting the PON could keep GPON working great for a decade or two. And that might make GPON the preferred technology for a long time, regardless of the vendors’ strong desire to get everybody to pay to upgrade existing networks.
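The doubling-every-three-years claim and the PON-splitting fix can be put together in a quick model. The starting numbers below (32 homes averaging 10 Mbps of concurrent demand today) are assumptions for illustration, not measurements:

```python
# If aggregate neighborhood demand doubles every three years, each
# halving of the PON split buys exactly one doubling period of headroom.
GPON_DOWN_MBPS = 2400
DOUBLING_YEARS = 3

def demand_mbps(initial_mbps: float, years: float) -> float:
    """Demand after `years`, doubling every DOUBLING_YEARS."""
    return initial_mbps * 2 ** (years / DOUBLING_YEARS)

homes, avg_today = 32, 10   # illustrative assumptions
for years in (0, 3, 6, 9, 12):
    aggregate = homes * demand_mbps(avg_today, years)
    print(f"year {years:2d}: {aggregate:5.0f} Mbps aggregate "
          f"({aggregate / GPON_DOWN_MBPS:.0%} of the PON)")
```

Under these assumptions the 1:32 PON saturates a bit past year eight; halving the split to 1:16 pushes that out by one doubling period, about three more years.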

The Louisville Pole Attachment Lawsuit

There has been a major legislative push lately to make it easier for wireless companies to get onto poles in order to deploy the small cell sites needed for 5G deployment. AT&T and Verizon have been leading the fight for easier access, and there have been attempts at both the federal and state level to enact ‘one-touch’ rules. The proposed legislation not only sets a low price for compensating pole owners but also removes the ability of pole owners or municipalities to slow down wireless deployments.

There is a lot of debate in the industry about the one-touch issue. As I have discussed in various blogs, getting onto poles is still one of the major roadblocks to many fiber deployments. And judging from the examples cited by the cellular carriers, they are seeing huge delays in deploying urban small cell sites.

Like any debate there are legitimate issues on both sides. Proponents of one-touch cite the extraordinary costs of wading through the paperwork-heavy pole attachment process as well as the dollars-and-cents costs of delaying construction projects.

But on the other side are the pole owners and the carriers whose networks already hang on the poles. Carriers are legitimately worried about safety issues for their technicians if boxes the size of refrigerators are hung on poles without constraint. They legitimately worry about how such devices could cause problems during repairs from storm damage. And carriers are also worried about network outages if a new attacher is allowed to move their wires without their knowledge or permission.

A court decision a few weeks ago might be a first step toward bringing some clarity to the issue. In that suit AT&T sued the City of Louisville to stop it from passing a one-touch make-ready ordinance. The ordinance was aimed at making it easier for Google Fiber and other competitive providers to get onto poles in the city. The City of Louisville owns most of the poles in the city and has been working with Google Fiber to deploy a fiber network to everybody in the city.

You have to let the irony of AT&T’s lawsuit sink in for a minute. This is a company that is spending millions right now lobbying for one-touch rules. AT&T not only wants to deploy small cell sites, but they are also in the process of building a huge amount of fiber to support those sites. And yet AT&T felt compelled to fight against the very kind of ordinance they are promoting because it would help one of their competitors.

It turns out that not all one-touch ordinances are the same. The ordinances that AT&T and Verizon are pushing are crafted very carefully to help them while still not making it quite so easy for their competitors. The Louisville ordinance made it easier for any new attacher to get onto poles, including AT&T.

The US District Court judge in Kentucky completely rejected all of AT&T’s claims and tossed the lawsuit, basically saying that all of AT&T’s claims in the suit were false. It’s ironic that many of the issues raised by the City in defense of the suit sound the same as the claims that AT&T makes elsewhere when lobbying for one-touch legislation.

I’ve always said that being in the regulatory department at AT&T has to be the hardest job in our industry. It’s a company that wears too many hats. AT&T owns a huge monopoly landline network and wants to protect itself from competitors. In some markets AT&T is a major pole owner. AT&T is also a huge wireless company that now wants access to poles. And AT&T is a huge builder of fiber, much of it now outside of its monopoly telco territory.

Any regulatory position the company takes to benefit one of these business lines is likely to not be in the best interest of other parts of the company. When looking at the big picture one has to think that AT&T will get far more benefit than harm from one-touch rules. Such rules will make it a lot easier to build more fiber and to deploy cell sites. And yet, a company with this many tentacles in the industry could not restrain itself from filing a lawsuit that probably was not in its own best long-term interest. The monopoly side of the company felt it could not sit back and let a competitor like Google Fiber build without the company taking steps to slow them down.

More Pressure on WiFi

As if we really needed more pressure on our public WiFi spectrum, both Verizon and AT&T are now launching Licensed Assisted Access (LAA) broadband for smartphones. This is the technology that allows cellular carriers to mix licensed LTE spectrum with the unlicensed 5 GHz spectrum when providing cellular broadband. LAA creates ‘fatter’ data pipes by combining multiple frequencies, and the wider the data pipe, the more data that reaches the end-user customer.

When carriers combine frequencies using LAA they can theoretically create a data pipe as large as a gigabit while using only 20 MHz of licensed spectrum. The extra bandwidth comes mostly from the unlicensed 5 GHz band and is similar to the fastest speeds we can get at home using the same frequency with 802.11ac. However, such high-speed bandwidth is only useful over a short distance, perhaps 150 feet, so the most practical use of LAA is to boost cellphone data signals for customers closest to a cell site. That’s going to make LAA most beneficial in dense customer environments like busy downtown areas, stadiums, etc. LAA isn’t going to provide much benefit at rural cellphone towers or those along interstate highways.

Verizon recently demonstrated the LAA technology and achieved a data speed of 953 Mbps. They did this using three 5 GHz channels combined with one 20 MHz channel of licensed AWS spectrum. Verizon used a 4x4 MIMO (multiple input / multiple output) antenna array and 256 QAM modulation to achieve this speed. The industry calls this four-carrier aggregation, since it combines four separate bands of spectrum into one data pipe. A customer would need a handset with the right MIMO antenna to receive the signal and would also need to be close to the transmitter to get this kind of speed.
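As a rough sanity check on that demo number, you can estimate the theoretical ceiling of an aggregated LTE pipe from the resource grid of each carrier. The sketch below uses standard LTE figures (100 resource blocks per 20 MHz carrier, 12 subcarriers per block, 14 OFDM symbols per millisecond), but the 25% overhead factor is an assumed illustrative value, and the result is an upper bound rather than a real-world speed:

```python
# Back-of-envelope LTE peak rate for carrier aggregation (rough sketch).
# Assumptions: 20 MHz carriers, 256-QAM (8 bits/symbol), 4x4 MIMO
# (4 spatial layers), and ~25% control/reference-signal overhead --
# an illustrative figure, not a measured one.

RESOURCE_BLOCKS = 100      # per 20 MHz LTE carrier
SUBCARRIERS_PER_RB = 12
SYMBOLS_PER_SEC = 14_000   # 14 OFDM symbols per 1 ms subframe
BITS_PER_SYMBOL = 8        # 256-QAM
MIMO_LAYERS = 4
OVERHEAD = 0.25            # assumed protocol overhead

def peak_mbps(carriers: int) -> float:
    """Theoretical downlink ceiling in Mbps for N aggregated carriers."""
    raw = (RESOURCE_BLOCKS * SUBCARRIERS_PER_RB * SYMBOLS_PER_SEC
           * BITS_PER_SYMBOL * MIMO_LAYERS * carriers)
    return raw * (1 - OVERHEAD) / 1e6

print(peak_mbps(1))   # one licensed 20 MHz carrier
print(peak_mbps(4))   # four-carrier aggregation, as in the Verizon demo
```

With four carriers the estimate lands above 1.6 Gbps, so Verizon’s 953 Mbps demo result sits comfortably under the theoretical four-carrier ceiling – real tests lose capacity to scheduling, signaling and imperfect radio conditions.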

Verizon is starting to upgrade selected cell sites with the technology this month. AT&T has announced that it will start introducing LAA with four-carrier aggregation by the end of this year. It’s important to note that there is a big difference between the 953 Mbps Verizon achieved in a test and what customers will achieve in the real world. Numerous factors will limit the benefits of the technology. There aren’t yet any handsets with the right antenna arrays, and it’s going to take a while to introduce them. These antennas also look like they will be big power eaters, meaning that handsets trying to use this bandwidth all of the time will have short battery lives. And there are more practical limitations. The distance limitation means many customers will be out of range of the strongest LAA signals. A cellular company is also not going to try to make this full four-channel connection to a single customer for several reasons, the primary one being the availability of the 5 GHz frequency.

And that’s where the real rub comes in with this technology. The FCC approved the use of LAA last year, essentially giving the cellular carriers access to the WiFi spectrum for free. The whole point of unlicensed spectrum is to provide data pipes for the many uses that fall outside the licensed carriers’ networks. WiFi is arguably the most successful spectrum policy achievement of the FCC over the last few decades; providing big data pipes for public use has spawned gigantic industries, and it’s hard to find a house these days without a WiFi router.

The cellular carriers have paid billions of dollars for spectrum that only they can use. The rest of the public uses a few bands of ‘free’ spectrum, and uses it very effectively. Allowing the cellular carriers to dip into the WiFi spectrum runs the risk of killing that spectrum for all of its other uses. The FCC supposedly requires that the cellular carriers not grab the 5 GHz spectrum when it’s already busy. But to anybody who understands how WiFi works that seems like inadequate protection, because any use of this spectrum creates interference by definition.

In practical use, if a user can see three or more WiFi networks they experience interference, meaning that more than one network is trying to use the same channel at the same time. It is the nature of this interference that causes the most problems with WiFi performance. When two signals both try to use the same channel, the WiFi standard causes all competing devices to go quiet for a short period of time, after which they restart and try to grab an open channel. If the two signals continue to collide, the delay time between restarts increases exponentially in a process called backoff. As there are more and more collisions between competing networks, the backoff windows grow and the performance of every device trying to use the spectrum decays. In a network experiencing heavy backoff, data gets transmitted in short bursts between the times that connections start and stop due to the interference.
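The doubling behavior described above can be sketched in a few lines. The contention window bounds below (15 and 1023 slots) match common 802.11 defaults; the rest is a simplified illustration of binary exponential backoff, not a full channel model:

```python
import random

# Binary exponential backoff as used by 802.11 (CSMA/CA), sketched simply.
# CW_MIN / CW_MAX match common 802.11 defaults; the collision counts below
# are illustrative, not a model of a real channel.

CW_MIN = 15     # initial contention window (slots)
CW_MAX = 1023   # cap on the contention window

def backoff_slots(collisions: int, rng: random.Random) -> int:
    """Slots a station waits after `collisions` consecutive collisions."""
    cw = min(CW_MAX, (CW_MIN + 1) * (2 ** collisions) - 1)
    return rng.randint(0, cw)  # random draw from the current window

rng = random.Random(42)
for n in range(7):
    cw = min(CW_MAX, (CW_MIN + 1) * 2 ** n - 1)
    print(f"after {n} collisions: window 0..{cw}, drew {backoff_slots(n, rng)}")
```

Each collision doubles the window (15, 31, 63, … up to 1023 slots), which is why a crowded channel degrades for everyone at once: every station spends more and more time waiting instead of transmitting.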

And this means that when the cellular companies use the 5 GHz spectrum they will be interfering with the other users of that frequency. That’s how WiFi was designed to work, so the interference is unavoidable. Other WiFi users in the immediate area around an LAA transmitter will experience more interference, and it also means a degraded WiFi signal for the cellular users of the technology – which is the reason they won’t get speeds even remotely close to Verizon’s demo speeds. But the spectrum is free for the cellular companies and they are going to use it, to the detriment of all of the other uses of the 5 GHz band. With this decision the FCC may well have undermined the tremendous benefits we’ve seen from the 5 GHz WiFi band.

Where’s the Top of the Broadband Market?

Last week I looked at the performance of the cable TV industry, and today I’m taking a comparative look at broadband customers for all of the large ISPs in the country. Following are the results for the end of 2Q 2017 compared to the end of 2Q 2016.

                    2Q 2017       2Q 2016        Change    % Change
Comcast          25,306,000    23,987,000     1,319,000       5.5%
Charter          23,318,000    21,815,000     1,503,000       6.9%
AT&T             15,686,000    15,641,000        45,000       0.3%
Verizon           6,988,000     7,014,000       (26,000)     -0.4%
CenturyLink       5,868,000     5,990,000      (122,000)     -2.0%
Cox               4,845,000     4,745,000       100,000       2.1%
Frontier          4,063,000     4,552,000      (489,000)    -10.7%
Altice            4,004,000     4,105,000      (101,000)     -2.5%
Mediacom          1,185,000     1,128,000        57,000       5.1%
Windstream        1,025,800     1,075,800       (50,000)     -4.6%
WOW                 727,600       725,700         1,900       0.3%
Cable ONE           521,724       508,317        13,407       2.6%
Fairpoint           307,100       311,440        (4,340)     -1.4%
Cincinnati Bell     304,193       296,700         7,493       2.5%
Total            94,149,417    91,894,957     2,254,460       2.5%

All of these figures come from reports published each quarter by Leichtman Research Group. Just like with cable subscribers, these large companies control over 95% of the broadband market in the country – so looking at them provides a good picture of all broadband. Not included in these numbers are the broadband customers of the smaller ISPs, the subscribers of WISPs (wireless ISPs) and customers of the various satellite services. It’s always been a bit fuzzy how MDUs (multiple-dwelling units such as apartment buildings) are counted in these numbers. The MDUs served directly by the major ISPs above are probably counted fairly well. But today there are numerous MDU owners who buy a large broadband pipe from a fiber provider and then provide broadband to their tenants. These customers are a growing demographic and are likely not counted accurately in these numbers.
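The net adds and percentage changes in the table are easy to reproduce from the raw subscriber counts. The sketch below recomputes them for three of the Leichtman figures shown above:

```python
# Recompute net adds and growth rates from the 2Q subscriber counts above
# (figures as reported by Leichtman Research Group).

subs = {                      # ISP: (2Q 2017, 2Q 2016)
    "Comcast":  (25_306_000, 23_987_000),
    "Charter":  (23_318_000, 21_815_000),
    "Frontier": (4_063_000, 4_552_000),
}

for isp, (now, prior) in subs.items():
    change = now - prior                 # net adds over the year
    pct = 100 * change / prior           # year-over-year growth rate
    print(f"{isp:8s} {change:+,}  ({pct:+.1f}%)")
```

The same arithmetic applied to the totals gives the overall market growth of roughly 2.5% discussed below.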

One of the biggest stories here is that the overall market is still growing at a significant rate of almost 2.5% per year. A little over half of that growth comes from sales of broadband to new housing units – in the last year, with a good economy, the country added almost 1.5 million new living units. But there are obviously still other homes buying broadband for the first time.

There has been a debate for years in the country about where the broadband market will top out. Those that don’t have broadband today can be put into four basic categories: 1) those that can’t afford broadband, 2) those that don’t want it, 3) those that are happy with a substitute like cellular broadband, and 4) those who have no broadband available, such as much of rural America.

It’s obvious that the cable companies are outperforming the telcos; Comcast, Charter and Mediacom each grew broadband customers by more than 5% over the last year. Compared to recent years the telcos largely held their own, except for Frontier – which had numerous problems during the year, including a botched transition of the customers it purchased from Verizon.

There are a number of industry trends that will be affecting broadband customers over the next few years:

  • We should start seeing rural customers getting broadband for the first time due to the FCC’s CAF II program. We are now in the third year of that program. The number of customers could be significant and CenturyLink estimates it will get at least a 60% penetration where it is expanding its DSL. I have seen reports from all over the country of fixed cellular wireless customers being connected by AT&T and Verizon.
  • The introduction of ‘unlimited’ cellular plans ought to make cellular broadband more attractive, at least to some demographics. While not really unlimited, the data caps of 20 GB or more per month are a huge increase over data caps from prior years.
  • There are almost a dozen companies that have filed requests with the FCC to launch new broadband satellites. The first major such launch was done recently by ViaSat, which will use the new satellite to beef up its Exede product. There’s no telling how many of the other FCC filings represent real satellites rather than vaporware, but there should be more competition from satellites, particularly those that launch into low orbits to reduce the latency issue. The really big unknown is whether Elon Musk will be able to launch the massive satellite constellation he has promised.
  • Lifeline programs. Companies like Comcast and AT&T have quietly launched low-price broadband options for low-income homes. The companies don’t advertise the plans broadly, but there are communities where significant numbers of customers have been added to these programs.

Big ISPs Want to be Regulated

I’ve always contended that the big ISPs, regardless of their public howling, want to be regulated. It is the nature of any regulated company to complain about regulation. For the last decade, even as AT&T and Verizon made the biggest telecom profits ever, they released press release after press release decrying how regulation was breaking their backs. The big telcos and cable companies spent the last few years declaring loudly that Title II regulation was killing incentives to make investments – while spending record amounts on capital.

A few months ago Comcast, Charter, and Cox filed an amicus brief in a lawsuit making its way through the US Court of Appeals for the Ninth Circuit. In that brief they asked the federal appeals court to restore the Federal Trade Commission’s jurisdiction over AT&T. The specific case under review had to do with deceptive AT&T marketing practices when it originally offered unlimited cellular data plans. It turns out that AT&T throttled customer speeds once customers reached the meager threshold of 3 – 5 GB per month.

In 2014 the FTC sued AT&T over the practice, and that’s the case now under appeal. It’s a bit extraordinary to see big ISPs siding with the government against another ISP, and the most plausible explanation for the filing is that these companies want a stable regulatory environment. In the brief the cable companies expressed the desire to “reinstate a predictable, uniform, and technology-neutral regulatory framework that will best serve consumers and businesses alike.”

That one sentence sums up very well the real benefit of regulation to big companies. As much as they might hate to be regulated, they absolutely hate making huge investments in new product lines in an uncertain regulatory environment. When a big ISP knows the rules, they can plan accordingly.

One scenario that scares the big ISPs is living in an environment where regulations can easily change. That’s where we find ourselves today. It’s clear that the current FCC and Congress are planning on drastically reducing the ‘regulatory burden’ for the big ISPs. That sounds like an ideal situation for the ISPs, but it’s not. It’s clear that a lot of the regulations are being changed for political purposes and big companies well understand that the political pendulum swings back and forth. They dread having regulations that change with each new administration.

We only have to go back a few decades to see this in action. The FCC got into and then back out of the business of regulating cable TV rates several times in the late 1970s and the 1980s. This created massive havoc for the cable industry. It created uncertainty, which hurt their stock prices and made it harder for them to raise money to expand. The cable industry didn’t become stable and successful until Congress finally passed several pieces of cable legislation to stop these regulatory swings.

Big companies also are not fond of being totally deregulated. That is the basis for the amicus brief in the AT&T case. The big ISPs would rather be regulated by the FTC instead of being unregulated. The FTC might occasionally slap them with big fines, but the big companies are smart enough to know that they have more exposure without regulations. If the FTC punishes AT&T for its marketing practices that’s the end of the story. But the alternative is for AT&T to have to fend off huge class action lawsuits that will seek damages far larger than what the FTC will impose. There is an underlying safety net by being regulated and the big ISPs understand and can quantify the risk of engaging in bad business practices.

In effect, as much as they say they hate being regulated, big companies like the safety of hiding behind regulators who protect them as much as they protect the public. It’s that safety net that allows a big ISP to invest billions of capital dollars.

I really don’t think the FCC is doing the big ISPs any favors if it eliminates Title II regulation. Almost every big ISP has said publicly that they are not particularly bothered by the general principles of net neutrality – and I largely believe them. Once those rules were put into place the big companies made plans based upon them. The big ISPs did fear that some future FCC might use Title II to impose rate regulation – much like the rate-regulation disaster the cable companies lived through in the past. But overall the regulation gives them a framework to safely invest in the future.

I have no doubt that the political pendulum will eventually swing the other way – because it always does. And when we next get a Democratic administration and Congress, we are likely to see many of the regulations being killed by the current FCC put back into place by a future one. That’s the nightmare scenario for a big ISP – to find that it has invested in a business line that is frowned upon by future regulators.