The Cable Industry – 4Q 2017

It was just a year ago when numerous industry articles were asking if cord cutting was real. Many people thought that cord cutting would fizzle out and would not be a big deal for the cable industry. But the numbers are now in from Leichtman Research Group for the end of 2017, and they show that cord cutting is quite real. The following numbers compare the fourth quarters of 2017 and 2016.

Company      4Q 2017       4Q 2016       Change        % Change
Comcast      22,357,000    22,508,000    (151,000)     -0.7%
DirecTV      20,458,000    21,012,000    (554,000)     -2.6%
Charter      16,997,000    17,236,000    (239,000)     -1.4%
Dish         11,030,000    12,025,000    (995,000)     -8.3%
AT&T          3,657,000     4,281,000    (624,000)    -14.6%
Cox           4,200,000     4,290,000     (90,000)     -2.1%
Verizon       4,619,000     4,694,000     (75,000)     -1.6%
Altice        3,405,500     3,534,500    (129,000)     -3.6%
Frontier        961,000     1,145,000    (184,000)    -16.1%
Mediacom        821,000       835,000     (14,000)     -1.7%
Cable ONE       283,001       320,246     (37,245)    -11.6%
Total        88,788,501    91,880,746  (3,092,245)     -3.4%

These companies represent roughly 95% of the entire cable market, so these numbers tell the story of the whole market. From what I see among my clients, many small cable companies are likely doing even worse than the big companies.

What’s probably most significant to me in these numbers is that overall cable penetration dropped to 70% of households by the end of 2017, down from a high of about 75% a few years ago. There were 126.2 million households at the end of 2017, per Statista, and only 70% of them are buying traditional cable – and that number has certainly dropped further into 2018.

The rate of growth of cord cutting is increasing. In 2016 the industry lost just over 1 million customers and in one year that grew to over 3 million.

It’s not hard to see where these customers went. FierceCable reported recently that 5% (over 6 million) of US households subscribe to a vMVPD service – online services that carry smaller bundles of traditional cable channels, like Sling TV, PlayStation Vue and DirecTV Now. It’s easy to forget that just a year ago most of these services were just getting started.

It’s worth noting that AT&T overall saw only a minor drop in total video subscribers. While AT&T and its DirecTV subsidiary lost nearly 1.2 million traditional customers, the DirecTV Now vMVPD ended the year with just over 1.1 million customers. But this still has to be hurting the company, since analysts all believe that the margins on vMVPD services are much slimmer than on traditional cable.

Also of note are the large percentage losses of cable customers at Dish, Frontier and Cable One.

Another way to consider these losses is on a daily basis: the industry lost nearly 8,500 customers per calendar day during the year.
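
The arithmetic behind those statements is simple. Here is a minimal sketch in Python that reproduces the loss rate, the per-day figure and the penetration estimate from the totals in the table and the Statista household count mentioned above (the household count is the only outside number):

    # Subscriber totals from the table above and the Statista household estimate.
    subs_4q2017 = 88_788_501
    subs_4q2016 = 91_880_746
    households = 126_200_000  # US households at the end of 2017, per Statista

    loss = subs_4q2016 - subs_4q2017
    print(f"Annual loss: {loss:,} ({loss / subs_4q2016:.1%})")   # ~3.09 million, ~3.4%
    print(f"Loss per calendar day: {loss / 365:,.0f}")           # ~8,500 per day
    print(f"Cable penetration: {subs_4q2017 / households:.0%}")  # ~70% of households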

It’s obvious in looking at these numbers that the cable industry is now in the same kind of free fall we saw a decade ago with landline telephones. The phenomenon is widespread, and 3 million cord cutters means this is happening in every neighborhood in the country. I believe that the pace of cord cutting will continue to accelerate. I’ve looked around my own neighborhood and I can’t find anybody who hasn’t either cut the cord or started thinking about doing so.

What surprises me the most is that the big cable companies aren’t screaming at Congress and the FCC to change the rules governing traditional cable. Those rules force the big channel line-ups, and cord cutting shows that people can be happy with far less than what the programmers are selling. The cable companies could be offering more of the skinny bundles offered by the vMVPDs and could retain more bundled customers.

Dig Once Rules Coming

US Representative Anna Eshoo of California has submitted a ‘dig once’ bill every year since 2009, and the bill has finally passed the House. For this to become law the bill still has to pass the Senate, but it received wide bipartisan support in the House.

Dig Once is a simple concept that would mandate that when roads are under construction, empty conduit is placed in the roadbed to provide inexpensive access for anybody that wants to bring fiber to an area.

Here are some specifics included in the bill:

  • This would apply to Federal highway projects, but also to state projects that get any federal funding. It encourages states to apply this more widely.
  • For any given road project there would be ‘consultation’ with local and national telecom providers and conduit would be added if there is an expected demand for fiber within 15 years.
  • The conduit would be installed under the hard surface of the road at industry standard depths.
  • The conduits would contain pull tape that would allow for easy pulling of fiber in the future.
  • Handholes would be placed at intervals consistent with industry best practices.

This all sounds like good stuff, but I want to play devil’s advocate with some of the requirements.

The initial concept of dig once was to never pass up the opportunity to place conduit into an ‘open ditch’. The cost of digging to put in conduit probably represents 80% of the cost of deployment in most places. But this law is not about tossing conduit into open construction ditches. It instead requires that the conduit be placed at depths that meet industry best practices. And that is going to mean digging a foot or more deeper than the construction that was planned for the roadbed.

To understand this you have to look at the lifecycle of roads. When a new road is constructed the road bed is typically dug from 18 inches deep to 3 feet deep depending upon the nature of the subsoil and also based upon the expected traffic on the road (truck-heavy highways are built to a higher standard than residential streets). Typically roads are then periodically resurfaced several times when the road surface deteriorates. Resurfacing usually requires going no deeper than a few inches into the roadbed. But at longer intervals of perhaps 50 years (differs by local conditions) a road is fully excavated to the bottom of the roadbed and the whole cycle starts again.

This means that the conduit needs to be placed lower than the planned bottom of the roadbed. Otherwise, when the road is finally rebuilt all of the fiber would be destroyed. And going deeper means additional excavation and additional cost. This means the conduit would not be placed in the ‘open ditch’. The road project will have dug out the first few feet of the needed excavation, but additional, and expensive, work would be needed to put the conduit at a safe depth. In places where the substrate is rock this could be incredibly expensive, but it wouldn’t be cheap anywhere. It seems to me that this is shifting the cost of deploying long-haul fiber projects to road projects, rather than to fiber providers. There is nothing wrong with that if it’s the national policy and there are enough funds to pay for it – but I worry that in a country that already struggles to maintain its roads this will just mean less money for roads, since every project just got more expensive.

The other issue of concern to me is handholes and access to the fiber. This is pretty easy for an Interstate and there ought to be fiber access at every exit. There are no customers living next to Interstates and these are true long-haul fibers that stretch between communities.

But spacing access points along secondary roads is a lot more of a challenge. For instance, if you want a fiber route to be used to serve businesses and residents in a city, this means an access point every few buildings. In more rural areas it means an access point at every home or business. Adding access points to fiber is the second most labor-intensive part of the cost after the construction itself. If access points aren’t where they are needed, in many cases the fiber will be nearly worthless. It’s probably cheaper in the future to build a second fiber route with the proper access points than it is to try to add them to a poorly designed existing fiber route.

This law has great intentions. But it is based upon the concept that we should take advantage of construction that’s already being paid for. I heartily support the concept for Interstate and other long-haul highways. But the concept is unlikely to be sufficient on secondary roads with lots of homes and businesses. And no matter where this is done it’s going to add substantial cost to highway projects.

I would love to see more fiber built where it’s needed. But this bill adds a lot of cost to building highways, which is already underfunded in this country. And if it’s not done properly – meaning placing fiber access points where they are needed – this could end up building a lot of conduit that has little practical use for a fiber provider. By making this a mandate everywhere, it is likely to mean spending a whole lot of money on conduit that might never be used, or used only for limited purposes like feeding cellular towers. This law is not going to create fiber that’s ready to serve neighborhoods or those living along highways.

Virtual Reality and Broadband

For the second year in a row Turner Sports, in partnership with CBS and the NCAA, will be streaming March Madness basketball games in virtual reality. Watching the games has a few catches. The content can only be viewed on two VR headsets – the Samsung Gear VR and the Google Daydream View. Viewers can buy individual games for $2.99 or buy them all for $19.99. And a viewer must be subscribed to the networks associated with the broadcasts – CBS, TNT, TBS and truTV.

Virtual reality viewers get a lot of options. They can choose which camera to watch from or else opt for the Turner feed that switches between cameras. When the tournament reaches the Sweet 16 viewers will receive play-by-play from a Turner team broadcasting only for VR viewers. The service also comes with a lot of cool features like the ability to see stats overlays on the game or on a particular player during the action. Games are not available for watching later, but there will be a big library of game highlights.

Last year Turner offered the same service, but only for 6 games. This year the line-up has been expanded to 21 games, including selected games from the first and second rounds plus the Sweet Sixteen and Elite Eight. The reviews from last year’s viewers were mostly great and Turner is expecting a lot more viewers this year.

Interestingly, none of the promotional materials mention the needed bandwidth. The cameras being used for the VR broadcasts are capable of capturing virtual reality in 4K. But Turner won’t be broadcasting in 4K because of the required bandwidth. Charles Cheevers, the CTO of Arris, said last year that a VR stream delivering 720p quality to the viewer requires a 4K source feed and at least a 50 Mbps connection. That’s over 30 times more bandwidth than a Netflix stream.

Instead these games will be broadcast in HD video at 60 frames per second. According to Oculus that requires a data stream of 14.4 Mbps for ideal viewing. Viewing at slower speeds results in missing some of the frames. Many VR viewers complain about getting headaches while watching VR, and the primary reason for the headaches is missing frames. While the eye might not be able to notice the missing frames, the brain apparently can.

One has to ask if this is the future of sports. The NFL says it’s not ready to go to virtual reality until there is more standardization between different VR headsets – for now they fear that VR games will have a limited audience due to the small number of viewers with the right headsets. But the technology has been tried for football, and Fox broadcast the Michigan – Notre Dame game last fall in virtual reality.

All the sports networks have to be looking at the Turner pricing of $2.99 per game and calculating the potential new revenue stream from broadcasting more games in VR in addition to traditional cable broadcasts. Some of the reviews I read of last year’s NCAA broadcasts said that after watching a game in VR, normal TV broadcasts seemed boring. Many of us are familiar with this feeling. I can’t watch linear TV any more. It’s not just sitting through the commercials, it’s being captive to the stream rather than watching the way I want. We can quickly learn to love a better experience.

Sports fans are some of the most intense viewers of any content. It’s not hard to imagine a lot of sports fans wanting to watch basketball, football, hockey or soccer in VR. Since the format favors action, it’s also not hard to imagine it drawing viewers to rugby, lacrosse and other fast-moving sports.

It’s possible that 4K virtual reality might finally be the app that justifies fast fiber connections. There is nothing else on the Internet today that requires that much speed plus low latency. Having several simultaneous viewers in a home watching 4K VR would require speeds of at least a few hundred Mbps. You also don’t need to look out too far to imagine virtual reality in 8K, requiring a data stream of at least 150 Mbps – which might be the first home application that can justify a gigabit connection.
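
For a rough sense of what this means for a single household, here is a back-of-the-envelope sketch that uses the per-stream bitrates cited above; the viewer counts are just illustrative assumptions:

    # Per-stream bitrates cited above (Mbps); household viewer counts are
    # illustrative assumptions, not figures from Turner, Arris or Oculus.
    stream_mbps = {
        "HD VR at 60 fps (Oculus figure)": 14.4,
        "4K-source VR stream (Arris estimate)": 50,
        "8K VR (projection)": 150,
    }

    for label, mbps in stream_mbps.items():
        for viewers in (1, 3):
            print(f"{viewers} viewer(s), {label}: ~{viewers * mbps:.0f} Mbps")
    # Three simultaneous 4K-source VR viewers already need ~150 Mbps, and three
    # 8K VR viewers would need ~450 Mbps - the kind of demand that starts to
    # justify a gigabit connection.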

Preemption of Local Telecom Rules

The FCC voted yesterday that telecom deployments are now exempt from environmental and historic preservation reviews. This is seen as a first step by the FCC toward making it easier to deploy 5G.

It’s an interesting rule change because in my experience those rules have more often applied to federally funded broadband projects than to local ones. For example, the BTOP stimulus grants added costs to every project by requiring both environmental and historic preservation reviews – even when it was obvious they didn’t apply. The vast majority of telecom deployments want to put fiber or telecom equipment into already-established rights-of-way. It’s hard to justify doing an environmental review when fiber is to be laid on the shoulder of an existing road or on poles. And the vast majority of 5G equipment will be on utility poles, light poles or buildings, so it’s hard to think that there can be much environmental impact from using established rights-of-way.

But that doesn’t mean that there is never a reason for a locality to have these requirements. Consider my town of Asheville, NC. There is a neighborhood around the Biltmore mansion that has strict zoning codes to keep it looking historic. The City ought to have the option to review and approve 5G or any utility deployments that might clutter the historic nature of such an area. Cities have often forced utilities underground in historic districts, but 5G transmitters can’t be put underground, by definition. I’ve seen some proposed small cell transmitters that are large and unsightly, and it doesn’t seem unreasonable for a community to have some say in where such gear can be used. Do we really need to see unsightly telecom equipment in Williamsburg, on the Gettysburg battlefield or near the Liberty Bell? Local communities also need to have some say before a telecom deployment disturbs graves or archaeological sites.

The same goes for an environmental review. A better rule would be to only allow an environmental review when new telecom facilities are to be built into virgin rights-of-way – and where there is some local concern. We don’t really want to allow somebody to lay fiber through a sensitive wetland or bird sanctuary without some local say in the deployment.

These rules are the first step in what is perceived as the FCC’s desire to preempt all local control over 5G deployments. This FCC created various Broadband Deployment Advisory Committee (BDAC) working groups to look at industry issues, and one of these groups looked at ‘Removing State and Local Regulatory Barriers’. It’s fairly obvious from the name of the group that they made a long list of local regulations that should be preempted.

That BDAC group essentially recommended that the FCC override all local control of rights-of-way and any kind of review of telecom infrastructure deployment. Their recommendations read like a wish list from the lobbyists of the large cellular carriers and ISPs. If the FCC enacts all of the BDAC group’s recommendations, it will have handed control of the 5G deployment process to wireless carriers and ISPs, with no local say in the process.

I am certainly sympathetic to carriers that encounter major barriers to infrastructure deployment. I will have a blog coming soon on a particularly egregious abuse of local authority that is greatly increasing the cost of a rural fiber deployment. But I’ve worked with hundreds of fiber deployments and mostly the local rules are sensible and realistic. For example, cities have legitimate concerns over fiber deployments. They usually insist on getting records of what is deployed under their streets. They often require contractors to use sensible traffic control and to clean up after construction. And they often have fees which compensate the city for processing permits, for locating existing utilities and for inspecting the construction. If these kinds of rules are overridden by the FCC we’ll soon see horror stories of fiber builders who dig up streets and then walk away with no consequences. In my experience local rules are needed to stop utilities from taking shortcuts to save money.

I was talking to a colleague about this topic and they asked if we really need to be as concerned about 5G as we are about fiber deployments. After some thought my answer is yes – the same sort of common-sense local rules need to be allowed for 5G. I picture a future where there will be multiple companies deploying 5G into neighborhoods. It’s not hard to picture wireless devices of various sizes hanging from every available pole and structure. It’s not hard to envision wireless providers erecting 100-foot poles on streets to reach above the tree canopy. It’s not hard to envision 5G providers drastically trimming trees to give them line of sight to homes. I know I want my city to have some say in this before AT&T and Verizon make a mess out of my own street.

I am sure these new rules will be challenged in court. The legal question will be whether the FCC has the authority to override local laws on these issues. I have no idea how the law might apply to environmental or historic preservation reviews. But if the FCC tries to do the same with 5G pole attachments they will run smack into the Telecommunications Act of 1996, which gives states (and by inference, localities) the ability to craft their own local laws concerning poles, conduits and rights-of-way. It’s always a tug of war when the FCC tries to override states, and the courts are almost always the final arbiter of these attempts.

Data Caps Again?

My prediction is that we are going to see more stringent data caps in our future. Some of the bigger ISPs have data caps today, but for the most part the caps are not onerous. But I foresee data caps being reintroduced as another way for big ISPs to improve revenues.

You might recall that Comcast tried to introduce a monthly 300 GB data cap in 2015. When customers hit that mark Comcast was going to charge $10 for every additional 50 GB of download, or $30 extra for unlimited downloading.

There was a lot of public outcry about those data caps, and Comcast backed down from the plan due to pressure from the Tom Wheeler FCC. At the time the FCC probably didn’t have the authority to force Comcast to kill the data caps, but the nature of regulation is that big companies don’t go out of their way to antagonize regulators who can cause them trouble in other areas.

To put that Comcast data cap into perspective, in September of 2017 Cisco predicted that home downloading of video would increase 31% per year through 2021. They estimated the average household data download in 2017 was already around 130 GB per month. You might think that means most people wouldn’t be worried about the data caps. But it’s easy to underestimate the impact of compound growth, and at a 31% growth rate the average household download of 130 GB would grow to 383 GB per month by 2021 – considerably over Comcast’s proposed data cap.
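
Here is that compound-growth math as a small sketch, using only the Cisco figures cited above:

    # Cisco's 2017 estimate: ~130 GB/month per household, growing 31% per year.
    usage_gb = 130
    for year in range(2018, 2022):
        usage_gb *= 1.31
        print(f"{year}: ~{usage_gb:.0f} GB/month")
    # By 2021 this works out to roughly 383 GB/month - well over a 300 GB cap.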

Even now there are a lot of households that would be over that cap. It’s likely that most cord cutters use more than 300 GB per month – and it can be argued that Comcast’s data cap would punish those who drop their video. My daughter is off to college now and our usage has dropped, but we got a report from Comcast when she was a senior saying we used over 600 GB per month.

So what are the data caps for the largest ISPs today?

  • Charter, Altice, Verizon and Frontier have no data caps.
  • Comcast moved their data cap to 1 terabyte, with $10 for each additional 50 GB or $50 monthly for unlimited downloading.
  • AT&T has some of the stingiest data caps. The cap on DSL is 150 GB, on U-verse it’s 250 GB, on 300 Mbps FTTH it’s 1 TB, and gigabit service is unlimited. They charge $10 per extra 50 GB.
  • CenturyLink has a 1 TB cap on DSL and no cap on fiber.
  • Cox has a 1 TB cap with $30 for an extra 500 GB or $50 unlimited.
  • Cable One has no overage charge, but largely forces customers who go over the caps to upgrade to more expensive data plans. Their caps are stingy – the cap on a 15 Mbps connection is 50 GB.
  • Mediacom has perhaps the most expensive data caps – the 60 Mbps plan has a cap of 150 GB and the 100 Mbps plan a cap of 1 TB. The charge for violating the cap is $10 per GB or $50 for unlimited.

Other than at AT&T, Mediacom and Cable One, none of these caps sound too restrictive.
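
To make the comparison concrete, here is a small hypothetical sketch that applies a few of these caps to the 600 GB/month household I mentioned above. The caps and fees come from the list and from the 2015 Comcast proposal; the calculation itself, including ignoring any monthly overage maximums the ISPs may apply, is my simplification:

    import math

    def overage_charge(usage_gb, cap_gb, fee_per_block=10, block_gb=50, unlimited_fee=None):
        # Raw overage charge, optionally capped at the price of an unlimited add-on.
        if usage_gb <= cap_gb:
            return 0
        charge = math.ceil((usage_gb - cap_gb) / block_gb) * fee_per_block
        return min(charge, unlimited_fee) if unlimited_fee is not None else charge

    usage = 600  # GB per month, the household figure mentioned earlier
    print("Comcast, 1 TB cap:", overage_charge(usage, 1000, unlimited_fee=50))  # $0
    print("AT&T DSL, 150 GB cap:", overage_charge(usage, 150))                  # $90
    print("2015 Comcast proposal, 300 GB cap:",
          overage_charge(usage, 300, unlimited_fee=30))                         # $30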

Why do I think we’ll see data caps again? All of the big ISPs are looking forward just a few years and wondering where they will find the revenues to meet Wall Street’s demand for ever-increasing earnings. The biggest cable companies are still growing broadband customers, mostly by taking customers from DSL. But they understand that the US broadband market is approaching saturation – much as has happened with cellphones. Once every home that wants broadband has it, these companies are in trouble, because bottom-line growth for the last decade has been fueled by the growth of broadband customers and revenues.

A few big ISPs are hoping for new revenues from other sources. For instance, Comcast has already launched a cellular product and is also seeing good success with security and smart home services. But even they will be impacted when broadband sales inevitably stall – other ISPs will feel the pinch before Comcast does.

ISPs only have a few ways to make more money once customer growth has stalled, with the primary one being higher rates. We saw some modest increases in broadband rates earlier this year – something that was noticeable because rates had been unchanged for many years. I fully expect we’ll start seeing sizable annual increases in broadband rates – which go straight to the bottom line for ISPs. The impact from broadband rate increases is major for these companies – Comcast and Charter, for example, each make roughly an extra $250 million per year from a $1 increase in broadband rates.
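
The arithmetic behind that revenue claim is straightforward; here is a sketch where the subscriber count is an illustrative round number on the scale of a Comcast or Charter broadband base, not an exact figure for either company:

    # A $1/month broadband rate increase across a big ISP's subscriber base.
    subscribers = 21_000_000   # illustrative; roughly the scale of a big cable ISP
    increase = 1               # dollars per month
    annual_gain = subscribers * increase * 12
    print(f"~${annual_gain / 1e6:.0f} million per year")  # ~$250 million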

Imposing stricter data caps can be as good as a rate increase for an ISP. They can justify it by saying that they are charging more only for those who use the network the most. As we see earnings pressure on these companies I can’t see them passing up such an easy way to increase earnings. In most markets the big cable companies are a near monopoly, and consumers who need decent speeds have fewer alternatives as each year passes. And since the FCC has now walked away from broadband regulation there will be no regulatory hindrance to the return of stricter data caps.

Charter Upgrading Broadband

We are now starting to see the results of cable companies upgrading to DOCSIS 3.1. Charter, the second biggest ISP in the country, recently announced that it will be able to offer gigabit speeds to virtually its whole footprint of over 40 million passings.

DOCSIS 3.1 is the newest protocol from Cable Labs and allows bonding an essentially unlimited number of spare channel slots for broadband. A gigabit data path requires roughly 24 channels on a cable network using the new DOCSIS protocol. In bigger markets this replaces DOCSIS 3.0, which was limited to maximum download speeds in the range of 250 Mbps. I know there are Charter markets with even slower speeds that either operate under older DOCSIS standards or that are slow for some other reason.
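
As a rough sketch of that bonding math (the roughly 40 Mbps of throughput per bonded 6 MHz channel is a common industry rule of thumb that I’m assuming here, not a Charter or Cable Labs figure):

    # Approximate downstream capacity from bonding cable channels.
    mbps_per_channel = 40   # assumed rule-of-thumb throughput per 6 MHz channel
    channels = 24
    print(f"{channels} bonded channels: ~{channels * mbps_per_channel} Mbps")  # ~960 Mbps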

Charter has already begun the upgrades and is now offering gigabit speeds to 9 million passings in major markets like Oahu, Hawaii; Austin, Texas; San Antonio, Texas; Charlotte, North Carolina; Cincinnati, Ohio; Kansas City, Missouri; New York City; and Raleigh-Durham, North Carolina. It’s worth noting that those are all markets where there is fiber competition, so it’s natural they would upgrade these first.

The new increased speed won’t actually be a full gigabit; it will be 940 Mbps download and 35 Mbps upload. (It’s hard to think there is anybody who is really going to care about that distinction.) Cable Labs recently came out with a DOCSIS upgrade that can increase upload speeds, but there’s been no talk from Charter about making that upgrade. Like the other big cable companies, Charter serves businesses that want faster upload speeds with fiber.

Along with the introduction of gigabit broadband the company also says it’s going to increase the speed of its minimum broadband product. In the competitive markets listed above Charter has already increased the speed of its base product to 200 Mbps download, up from 100 Mbps.

It’s going to be interesting to find out what Charter means by the promise to cover ‘virtually’ their whole footprint. Charter grew by purchasing systems in a wide range of conditions. I know of smaller Charter markets where customers don’t get more than 20 Mbps. There is also a well-known lawsuit against Charter in New York State that claims that a lot of households in upstate New York are getting speeds far slower than advertised due to having outdated cable modems.

The upgrade to DOCSIS 3.1 can be expensive in markets that have not yet been upgraded to DOCSIS 3.0. An upgrade might mean replacing power taps and other portions of the network, and in some cases might even require a replacement of the coaxial cable. My guess is that the company won’t rush to upgrade these markets to DOCSIS 3.1 this year. I’m sure the company will look at them on a case-by-case basis.

The company has set a target price for a gigabit at $124.95. But in competitive markets like Oahu the company is already selling introductory packages for $104.99. There is also a bundling discount for cable subscribers.

The pricing list highlights that they still have markets with advertised speeds as low as 30 Mbps – and the company’s price for the minimum speed is the same everywhere, regardless of whether that product is 30 Mbps or 200 Mbps. And as always with cable networks, these are ‘up to’ speeds and, as I mentioned, there are markets that don’t meet these advertised speeds today.

Overall this ought to result in a lot of homes and businesses getting faster broadband than they have today. We saw something similar back when the cable companies implemented DOCSIS 3.0 and the bigger companies unilaterally increased customer speeds without increasing prices. Like other Charter customers, I will be watching what they do in my market. I have the 60 Mbps product and I’ll be interested to see if my speed is increased to 100 Mbps or 200 Mbps and if I’m offered a gigabit here. With the upgrade time frame they are promising I shouldn’t have to wait long to find out.

Spectrum and 5G

All of the 5G press has been talking about how 5G is going to be bringing gigabit wireless speeds everywhere. But that is only going to be possible with millimeter wave spectrum, and even then it requires a reasonably short distance between sender and receiver as well as bonding together more than one signal using multiple MIMO antennae.

It’s a shame that we’ve let the wireless marketers equate 5G with gigabit speeds, because that’s what the public is going to expect from every 5G deployment. As I look around the industry I see a lot of other uses for 5G that are going to produce speeds far slower than a gigabit. 5G is a standard that can be applied to any wireless spectrum and which brings some benefits over earlier standards. 5G makes it easier to bond multiple channels together for reaching one customer. It also can increase the number of connections that can be made from any given transmitter – with the biggest promise being that the technology will eventually allow connections to large quantities of IoT devices.

Anybody who follows the industry knows about the 5G gigabit trials. Verizon has been loudly touting its gigabit 5G connections using the 28 GHz frequency and plans to launch the product in up to 28 markets this year. They will likely use this as a short-haul fiber replacement to allow them to more quickly add a new customer to a fiber network or to provide a redundant data path to a big data customer. AT&T has been a little less loud about their plans and is going to launch a similar gigabit product using 39 GHz spectrum in three test markets soon.

But there are also a number of announcements for using 5G with other spectrum. For example, T-Mobile has promised to launch 5G nationwide using its 600 MHz spectrum. This is a traditional cellular spectrum that is great for carrying signals for several miles and for going around and through obstacles. T-Mobile has not announced the speeds it hopes to achieve with this spectrum. But the data capacity of 600 MHz is limited, and bonding numerous signals together for one customer will create something faster than LTE, but not spectacularly so. It will be interesting to see what speeds they can achieve in a busy cellular environment.

Sprint is taking a different approach and is deploying 5G using its 2.5 GHz spectrum. They have been testing massive MIMO antennae that contain 64 transmit and 64 receive channels. This spectrum doesn’t travel far when used for broadcast, so the technology is going to be used best with small cell deployments. The company claims to have achieved speeds as fast as 300 Mbps in trials in Seattle, but that would require bonding together a lot of channels, so a commercial deployment is going to be a lot slower in a congested cellular environment.

Outside of the US there seems to be growing consensus to use 3.5 GHz – the spectrum known in the US as the Citizens Broadband Radio Service (CBRS) band. That raises the interesting question of which frequencies will end up winning the 5G race. In every new wireless deployment the industry needs to reach an economy of scale in the manufacture of both the radio transmitters and the cellphones or other receivers. Only then can equipment prices drop to the point where a 5G-capable phone will be similar in price to a 4G LTE phone. So the industry at some point soon will need to reach a consensus on the frequencies to be used.

In the past we rarely saw a consensus; rather, some manufacturer and wireless company won the race to get customers and dragged the rest of the industry along. This has practical implications for early adopters of 5G. For instance, somebody buying a 600 MHz phone from T-Mobile is only going to be able to use that data function when near a T-Mobile tower or mini-cell. Until industry consensus is reached, phones that use a unique spectrum are not going to be able to roam on other networks the way they do today with LTE.

Even phones that use the same spectrum might not be able to roam on other carriers if they are using the frequency differently. There are now 5G standards, but we know from practical experience with other wireless deployments in the past that true portability between networks often takes a few years as the industry works out bugs. This interoperability might be sped up a bit this time because it looks like Qualcomm has an early lead in the manufacture of 5G chip sets. But there are other chip manufacturers entering the game, so we’ll have to watch this race as well.

A word of warning to buyers of first-generation 5G smartphones: they are going to have issues. For now it’s likely that the MIMO antennae are going to use a lot of power and will drain cellphone batteries quickly. And the ability to reach a 5G data signal is going to be severely limited for a number of years as the cellular providers extend their 5G networks. Unless you live and work in the heart of one of the trial 5G markets it’s likely that these phones will be a bit of a novelty for a while – but will still give a user bragging rights for the ability to get a fast data connection on a cellphone.

Edging Closer to Satellite Broadband

A few weeks ago Elon Musk’s SpaceX launched two test satellites that are the first in a planned low-orbit satellite network that will blanket the earth with broadband. The eventual network, branded as Starlink, will consist of 4,425 satellites deployed at 700 miles above earth and another 7,518 deployed at around 210 miles of altitude.

Getting that many satellites into orbit is a daunting logistical task. To put this into perspective, the nearly 12,000 satellites needed are twice the number of satellites that have been launched in all of history. It’s going to take a lot of launches to get these into the sky. SpaceX’s workhorse rocket, the Falcon 9, can carry about ten satellites at a time. They have also tested a Falcon Heavy system that could carry 20 or so satellites at a time. Even with a weekly launch of the larger rocket that’s still roughly 596 launches and would take 11.5 years. For comparison, the US led the world with 29 successful satellite launches last year, with Russia second at 21 and China at 16.
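
The launch math works out roughly like this, using the satellite counts from the FCC filing and the per-launch capacity mentioned above:

    import math

    satellites = 4_425 + 7_518   # the planned Starlink constellation
    per_launch = 20              # rough Falcon Heavy capacity assumed above
    launches = math.ceil(satellites / per_launch)
    print(f"Launches needed: {launches}")                        # ~600 launches
    print(f"Years at one launch per week: {launches / 52:.1f}")  # ~11.5 years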

SpaceX is still touting this as a network that can make gigabit connections to customers. I’ve read the FCC filing for the proposed network several times, and it looks to me like that kind of speed will require combining signals from multiple satellites to a single customer, and I have to wonder if that’s practical when talking about deploying this network to tens of millions of simultaneous subscribers. It’s likely that their standard bandwidth offering is going to be something significantly slower.

There is also a big question in my mind about the capacity of the backhaul network that carries signals to and from the satellites. It’s going to take some major bandwidth to handle the volume of broadband users that SpaceX has in mind. We are already seeing landline long-haul fiber networks today that are stressed and reaching capacity. The satellite network will face the same backhaul problems as everybody else and will have to find ways to cope with a world where broadband demand doubles every 3 years or so. If the satellite backhaul gets clogged or if the satellites get over-subscribed, then the quality of the broadband will degrade like on any other network.

Interestingly, SpaceX is not the only one chasing this business plan. For instance, billionaire Richard Branson wants to build a similar network that would put 720 low-orbit satellites over North America. Telesat has launched two different test satellites and also wants to deploy a large satellite network. Boeing has also announced intentions to launch a 1,000-satellite network over North America. It’s sounding like our skies are going to get pretty full!

SpaceX is still predicting that the network is going to cost roughly $10 billion to deploy. There’s been no talk of consumer prices yet, but the company obviously has a business plan – Musk wants to use this business as the primary way to fund the colonization of Mars. But pricing is an issue for a number of reasons. The satellites will have some finite capacity for customer connections. In one of the many articles I read I saw that the goal for the network is 40 million customers (and I don’t know if that’s the right number, but there is some number of simultaneous connections the network can handle). 40 million customers sounds huge, but with a current worldwide population of over 7.6 billion people it’s minuscule for a worldwide market.
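
To put those numbers in rough perspective, here is a quick sketch treating the cited figures as given (they are targets and estimates, not official SpaceX numbers):

    customers = 40_000_000        # the customer target cited above
    population = 7_600_000_000    # current world population
    satellites = 4_425 + 7_518    # the planned constellation
    print(f"Share of world population served: {customers / population:.1%}")  # ~0.5%
    print(f"Average customers per satellite: {customers / satellites:,.0f}")  # ~3,350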

There are those predicting that this will be the salvation for rural broadband. But I think that’s going to depend on pricing. If this is priced affordably then there will be millions in cities who would love to escape the cable company monopoly, and who could overwhelm the satellite network. There is also the issue of local demand. Only a limited number of satellites can see any given slice of geography. The network might easily accommodate everybody in Wyoming or Alaska, but won’t be able to do the same anywhere close to a big city.

Another issue is worldwide pricing. A price that might be right in the US might be ten times higher than what will be affordable in Africa or Asia. So there is bound to be pricing differences based upon regional incomes.

One of the stickier issues will be the reaction of governments that don’t want citizens using the network. There is no way China is going to let citizens bypass the Great Firewall by going through these satellites. Repressive regimes like North Korea will likely make it illegal to use the network. And even democratic countries like India might not like the idea – last year India turned down free Internet from Facebook because it wasn’t an ‘Indian’ solution.

The bottom line is that this is an intriguing idea. If the technology works as promised, and if Musk can find the money and figure out the logistics to get this launched, it’s going to be another new source of broadband. But satellite networks are not going to solve the world’s broadband problems, because they are only going to be able to help a small percentage of the world’s population. With that said, a remote farm in the US or a village in Africa is going to love this when it’s available.

Abandoned Telecom Infrastructure

I saw an article about Merrill, Oregon, where the city was wrestling with what to do with an abandoned cable TV network hanging on poles in the City. It’s actually a fairly common occurrence to have abandoned telecom property on poles, and I’ve been contacted by a number of cities over the years wondering how to deal with the situation.

In this particular case the historic cable system in the city was operated by Rapid Communications out of Texas. That company sold cable properties to a number of companies in 2008 and the Merrill system went to Almega Cable Company, which stopped offering service in the City and went out of business in 2011.

There are all sorts of telecom assets that have been abandoned on poles and defunct cable companies are only one example. I saw a lot of WiFi mesh networks abandoned fifteen years ago as operators folded and never retrieved their equipment. There are numerous CLECs that folded in the late 1990s and that walked away from fiber networks on poles.

Having an abandoned set of wires on poles complicates the lives of any other pole users in the market. The unused wires take up space on poles and make it hard for anybody else to add additional wires onto the pole.

Abandoned networks also create havoc for the normal pole attachment process. This process requires buy-in from existing utilities to move or rearrange cables to make room for a new attacher. A new attacher can be paralyzed if they are unable to create the required clearance from existing wires.

In the end I’ve almost always seen the responsibility for getting rid of the network fall to the local government. Somebody has to go through the process of making certain there is no remaining active owner of the network before it can be condemned. Generally the pole owner is not willing to take on that role unless they have a need of their own to add wires to the poles.

Merrill is now undertaking the task of condemning the network. They have to follow the law and post public notices to make sure that nobody claims rights to the cables. In the case of a cable company the City not only has to deal with the wires on poles, but also with customer drops and pedestals scattered throughout the community.

Merrill is hoping that some new carrier will want to use the cable network for overlashing fiber. Overlashing is the process of tying the fiber onto existing wires and is generally the lowest-cost method of fiber construction. But even if they find a taker for the offer, my guess is that the new fiber provider is not going to want to assume ownership of the coaxial cables, since that would give them liability for any issues or problems with the old wiring. So the City might end up owning the cables in perpetuity. If they don’t find a buyer, the city will have to pay to have the cables removed – although in today’s market there might be enough value in the copper inside the coaxial cables to offset the cost of removal.

We are going to see a lot more abandoned assets on poles in the future. We are just now entering a period when numerous companies are going to want to hang wireless devices of all types on poles. Some of these devices are tiny and I’ve seen others that are the size of a dorm refrigerator. It’s inevitable that some of the wireless deployments will fail, or that the wireless companies will lose the customers served by a given device.

Over time a significant inventory of abandoned wireless devices will likely grow in most cities. And unlike an abandoned cable network, my guess is that it’s often going to be hard to know which wireless devices have been abandoned or even who owns many of them. Cities ought to be considering ordinances today that require the companies that deploy wireless devices to somehow notify them of what they are doing and to also clearly label the ownership of each device.

But there is a movement at the FCC, in Congress and in state legislatures to institute rules for wireless carriers that would override any local rules. Such blanket rules are going to hinder cities in the coming decades when they try to deal with abandoned assets clogging their pole lines. Most of the proposed new rules I’ve seen don’t address this issue, which will make it messy to deal with later.

5G is Fiber-to-the-Curb

The marketing from the wireless companies has the whole country buzzing with speculation that the world is going to go wireless with the introduction of 5G. There is a good chance that within five years a good, reliable pole-mounted technology could become the preferred way to go from the curb to homes and businesses. When that happens we will finally have wireless fiber-to-the-curb – something that I’ve heard talked about for at least 25 years.

I remember visiting an engineer in the horse country of northern Virginia in the 1990s who had developed a fiber-to-the-curb wireless technology that could deliver more than 100 Mbps from a pole to a house. His technology was limited in that there had to be one pole-mounted transmitter per customer, and there was a distance limitation of a few hundred feet for the delivery. But he was clearly on the right track and was twenty years ahead of his time. At that time we were all happy with our 1 Mbps DSL and 100 Mbps sounded like science fiction. But I saw his unit functioning at his home, and if he had caught the attention of a big vendor we might have had wireless fiber-to-the-curb a lot sooner than now.

I have to laugh when I read people talking about our wireless future, because it’s clear that this technology is going to require a lot of fiber. There is a lot of legislative and lobbying work going on to make it easier to mount wireless units on poles and streetlights, but I don’t see the same attention being put into making it easier to build fiber – and without fiber this technology is not going to work as promised.

It’s easy to predict that there are going to be a lot of lousy 5G deployments. ISPs are going to come to a town, connect to a single gigabit fiber and then serve the rest of the town from that one connection. This will be the cheap way to deploy this technology, and those without capital are going to take this path. The wireless units throughout the town will be fed with wireless backhaul, with many of them multiple wireless hops from the source. In this kind of network the speeds will be nowhere near the gigabit capacity of the technology, the latency will be high and the network will bog down in the evenings like any over-subscribed network. A 5G network deployed in this manner is not the killer technology that will displace cable networks.

However, a 5G fiber-to-the-curb network built the right way is going to be as powerful as an all-fiber network. That’s going to mean having neighborhood wireless transmitters to serve a limited number of customers, with each transmitter fed by fiber. When Verizon and AT&T talk about the potential for gigabit 5G this is what they are talking about. But they are not this explicit because they are not likely today to deploy networks this densely. The big ISPs still believe that people don’t really need fast broadband. They will market this new technology by stressing that it’s 5G while building networks that will deliver far less than a gigabit.

There are ISPs who will wait for this technology to mature before switching to it, and they will build networks the right way. In a network with fiber everywhere this technology makes huge sense. One of the problems with an FTTH network that doesn’t get talked about a lot is abandoned drops. Fiber ISPs build drops to homes, and over time a substantial number of premises no longer use the network for various reasons. I know of some 10-year-old networks where as many as 10% of fiber drops have been abandoned by homes that now buy service from somebody else. A fiber-to-the-curb network solves this problem by only serving those who have active service.

I also predict that the big ISPs will make every effort to make this a customer-provisioned technology. They will mail customers a receiver kit to save on a truck roll, because saving money is more important to them than quality. This will work for many customers, but others will stick the receiver in the wrong place and never get the speed they might have gotten if the receiver was mounted somewhere else in the home.

There really are no terrible broadband technologies, but there are plenty of terrible deployments. Consider that there are a huge number of rural customers being connected to fixed wireless networks. When those networks are deployed properly – meaning customers are not too far from the transmitter and each tower has a fiber feed – the speeds can be great. I know a colleague who is 4 miles from a wireless tower and is getting nearly 70 Mbps download. But there are also a lot of under-capitalized ISPs that are delivering speeds of 5 Mbps or far less using the same technology. They can’t afford to get fiber to the towers and instead use multiple wireless hops to get to neighborhood transmitters. This is a direct analogue of what we’ll see in poorly deployed 5G networks.

I think it’s time that we stop using the term 5G as shorthand for gigabit networks. 5G is going to vary widely depending upon the frequencies used and will vary even more widely depending on how the ISP builds its network. There will be awesome 5G deployments, but also a lot of so-so and even lousy ones. I know I will be advising my clients on building wireless fiber-to-the-curb – and that means networks that still need a lot of fiber.