A Deeper Look at 5G

The consulting firm Bain & Company recently looked at the market potential for 5G. They concluded that there is an immediate business case to be made for 5G deployment and that 5G ‘pessimists’ are wrong. I count myself as a 5G pessimist, but I admit that I look at 5G mostly from the perspective of its ability to bring better broadband to small towns and rural America. I agree with most of what Bain says, but I look at the same facts and am still skeptical.

Bain says that the most immediate use for 5G deployment is in urban areas. They cite an interesting statistic I’ve never seen before: it will cost $15,000 – $20,000 to upgrade an existing cell site with 5G, but between $65,000 and $100,000 to deploy a new 5G node. Until the cost for new nodes comes way down, it’s going to be hard for anybody to justify new 5G cell sites except in those places with enough potential business to support the high investment cost.

Bain recommends that carriers deploy 5G quickly in those places where it’s affordable in order to be first to market with the new technology. Bain also recommends that cellular carriers take advantage of improved mobile performance, but also look hard at the fixed 5G opportunities to deliver last-mile broadband. They say that an operator that maximizes both opportunities should be able to see a fast payback.

A 5G network deployed on existing cell towers is going to create small circles of prospective residential broadband customers – and those circles aren’t going to be very big. Delivering significant broadband would mean reaching customers within 1,000 to 1,500 feet of a transmitter. Cell towers today are much farther apart than that, which means a 5G delivery map consisting of scattered small circles.
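To put those distances into perspective, here’s a quick back-of-the-envelope calculation. The numbers are purely illustrative – I’m assuming a 1,500-foot reach and towers spaced roughly two miles apart – but they show how little of the territory around a tower actually falls inside one of those circles:

    import math

    # Illustrative assumptions, not field measurements: a 5G transmitter on
    # an existing macro tower reaches ~1,500 feet, and suburban macro towers
    # sit roughly 2 miles apart.
    coverage_radius_ft = 1_500
    tower_spacing_ft = 2 * 5_280

    # Each tower "owns" a square of territory roughly spacing x spacing.
    territory_sq_ft = tower_spacing_ft ** 2
    covered_sq_ft = math.pi * coverage_radius_ft ** 2

    share_covered = covered_sq_ft / territory_sq_ft
    print(f"Share of the area around each tower within 5G range: {share_covered:.1%}")
    # With these assumptions only ~6% of the territory falls inside the circle.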

There are not many carriers willing to tackle that business plan. It means selectively marketing only to those households within range of a 5G cell site. AT&T is the only major ISP that already uses this business plan. AT&T currently offers fiber to any homes or businesses close to their numerous fiber nodes. They could use that same sales plan to sell fixed broadband to customers close to each 5G cell site. However, AT&T has said that, at least for now, they don’t see a business case for 5G similar to their fiber roll-out.

Verizon could do this, but they have been walking away from a lot of their residential broadband opportunities, going so far as to sell a lot of their FiOS fiber customers to Frontier. Verizon says they will deploy 5G in several cities starting next year but has never talked about the number of potential households they might cover. This would require a major product roll-out for T-Mobile or Sprint, but in the document they filed with the FCC to support their merger they said they would tackle this market. Neither company currently has the fleet of technicians or the back office needed to support the fixed residential broadband market.

The report skims past the question of the availability of 5G technology. Like any new technology, the first few generations of field equipment are going to have problems. Most players in the industry have learned the lesson of not widely deploying any new technology until it’s well-proven in the field. Verizon says their early field trials have gone well, and we’ll have to wait until next year to see how much 5G they are ready to deploy with first-generation technology.

Bain also says there should be no ‘surge’ in capital expenditures if companies deploy 5G wisely – but the reverse is also true, and bringing 5G small cells to places without existing fiber is going to be capital intensive. I agree with Bain that, technology concerns aside, the only place where 5G makes sense for the next few years is urban areas, and mostly on existing cell sites.

I remain a pessimist about 5G being feasible in more rural areas. The cost of the electronics will need to drop to a fraction of today’s cost. There are always going to be pole issues for deploying small cells in rural America – even if regulators streamline the hanging of small cell sites, those 5G devices can’t be placed on the short poles we often see in rural America. While small circles of broadband delivery might support an urban business model, the low density in rural America might never make economic sense.

I certainly could be wrong, but I don’t see any companies sinking huge amounts of money into 5G deployments until the technology has been field-proven and until the cost of the technology drops and stabilizes. I hope I am proven wrong and that somebody eventually finds a version of the technology that will benefit rural America – but I’m not going to believe it until I can kick the tires.

5G Cellular for Home Broadband?

Sprint and T-Mobile just filed a lengthy document at the FCC that describes the benefits of allowing the two companies to merge. This kind of filing is required for any merger that needs FCC approval. The FCC immediately opened a docket on the merger and anybody that opposes the merger can make counterarguments to any of the claims made by the two companies.

The two companies decided to highlight a claim that the combined Sprint and T-Mobile will be able to roll out a 5G network that can compete with home broadband. They claim that by 2024 they could gain as much as a 7% total market penetration, making them the fourth biggest ISP in the country.

The filing claims that their 5G network will provide a low-latency broadband product with speeds in excess of 100 Mbps within a ‘few years’. They claim that customers will be able to drop their landline broadband connection and tether their home network to their unlimited cellular data plan instead. Their filing claims that this will only be possible with a merger. I see a lot of holes that can be poked in this claim:

Will it Really be that Fast? The 5G cellular standard calls for eventual speeds of 100 Mbps. If 5G follows the development path of 3G and 4G, then those speeds probably won’t be fully met until near the end of the next decade. Even if a 5G network can achieve 100 Mbps in ideal conditions, there is still a huge challenge to meet those speeds in the wild. The 5G standard achieves 100 Mbps by bonding multiple wireless paths, using different frequencies and different towers to reach a customer. Most places don’t receive true 4G speeds today, and there is no reason to think that a more complicated delivery mechanism is going to make this easier.
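As a rough sketch of what that bonding means in practice – the channel counts and per-channel speeds here are invented for illustration, not taken from the 5G standard – the delivered speed is just the sum of the bonded paths, and every path degrades in real-world conditions:

    # Illustrative channel-bonding arithmetic: the delivered speed is the sum
    # of the bonded channels, scaled by a real-world efficiency factor.
    # The channel speeds below are hypothetical, not from the 5G standard.
    channels_mbps = [40, 30, 20, 10]

    def delivered_speed(channels, efficiency):
        """Sum the bonded channels and apply an efficiency factor (0..1)."""
        return sum(channels) * efficiency

    print("Ideal conditions:  ", delivered_speed(channels_mbps, 1.0), "Mbps")
    print("Typical conditions:", delivered_speed(channels_mbps, 0.5), "Mbps")
    # 100 Mbps on paper can easily become ~50 Mbps once distance, interference
    # and congestion are factored in - the same gap we see with 4G today.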

Cellphone Coverage is Wonky.  What is never discussed when talking about 5G is how wonky all wireless technologies are in the real world. Distance from the cell site is a huge issue, particularly with some of the higher frequencies that might be used with 5G. Even more important are local interference and propagation. As an example, I live in Asheville, NC. It’s a hilly and wooded town, and at my house I have decent AT&T coverage, but Verizon sometimes has zero bars. I only have to go a few blocks to find the opposite situation, where Verizon is strong and AT&T doesn’t work. 5G is not going to automatically overcome all of the topographical and interference issues that affect cellular coverage.

Would Require Significant Deployment of Small Cell Sites. Achieving 100 Mbps in enough places to be a serious ISP is going to require a huge deployment of small cell sites, and that means deploying a lot of fiber. This is going to be a huge hurdle for any wireless company that doesn’t have a huge capital budget for fiber. Many analysts believe this hurdle might be big enough to quash a lot of the grandiose 5G plans.

A Huge Increase in Wireless Data Usage. Using the cellular network to provide the equivalent of landline data means an order-of-magnitude increase in the bandwidth carried by the cellular networks. FierceWireless, along with Strategic Analytics, recently did a study on how the customers of the major cellular companies use data. They reported that the average T-Mobile customer today uses 18.4 GB of data per month, with 5.3 GB on the cellular network and the rest on WiFi. Sprint customers use 18.2 GB per month with 4.4 GB on the cellular network. Last year Cisco reported that the average residential landline connection used over 120 GB per month – a number that is doubling every three or four years. Are cellular networks really going to be able to absorb a twenty or thirty times increase in bandwidth demand? That will require massive increases in backhaul bandwidth, along with huge capital expenditures to avoid bottlenecks in the networks.
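Using the figures cited above, the arithmetic looks roughly like this (the doubling rate in the projection is the one from the Cisco statistic; the 3.5-year figure is my assumption within that range):

    # Figures cited above: average cellular-only usage per customer vs. the
    # average landline household (Cisco).
    cellular_gb_per_month = 5.3     # T-Mobile customer, cellular portion only
    landline_gb_per_month = 120.0   # average home landline connection

    multiple_today = landline_gb_per_month / cellular_gb_per_month
    print(f"Moving a home onto cellular means roughly {multiple_today:.0f}x more data per customer")

    # Landline usage is said to be doubling every three or four years;
    # assume 3.5 years and project out.
    years = 7
    projected_gb = landline_gb_per_month * 2 ** (years / 3.5)
    print(f"In {years} years that same home could be using ~{projected_gb:.0f} GB per month")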

Data Caps are an Issue.  None of the cellular carriers offers truly unlimited data today. T-Mobile is the closest, but their plan begins throttling data speeds when a customer hits 50 GB in a month. Sprint is stingier and is closer to AT&T and Verizon and starts throttling data speeds when a customer hits 23 GB in a month. These caps are in place to restrict data usage on the network (as opposed to the ISP data caps that are meant to generate revenue). Changing to 5G is not going to eliminate network bottlenecks, particularly if we see millions of customers using cellular networks instead of landline networks. All of the carriers also have a cap on tethering data – making it even harder to use as a landline substitute – T-Mobile caps tethering at 10 GB per month.

Putting it all into Context. To put this into context, John Legere already claims today that people ought to be using T-Mobile as a landline substitute. He says people should buy a multi-cellphone plan and use one of the phones to tether their home network. 4G networks today have relatively high latency, and 4G speeds can reach 15 Mbps in ideal conditions but are usually slower. 4G also ‘bursts’ today, offering faster speeds for the first minute or two and then slowing to a crawl (you see this when you download phone apps). I think we have to take any claims made by T-Mobile with a grain of salt.

I’m pretty sure that the concept of using the merger to create a new giant ISP is mostly a red herring. No doubt 5G will eventually offer an alternative to landline broadband for homes that aren’t giant data users – but it’s also extremely unlikely that a combined T-Mobile / Sprint could somehow use 5G cellular to become the fourth biggest ISP starting ‘a few years from now’. I think this claim is being emphasized by the two companies to provide soundbites for regulators and politicians who want to support the merger.

Will 5G Phones Need WiFi?

Our cellular networks have become heavily reliant on customers using WiFi. According to Cisco’s latest Virtual Network Index about 60% of the data generated from cellphones is carried over WiFi and landline broadband connections. Most of us have our cellphones set to grab WiFi networks that we are comfortable with, particularly in the home and office.

The move to use WiFi for data was pushed by the cellular companies. As recently as just a few years ago they were experiencing major congestion at cell sites. This congestion was due to a combination of cell sites using older versions of 4G technology and of inadequate backhaul data pipes feeding many cell sites. The cellular carriers and manufacturers made it easy to switch back and forth between cellular and WiFi and most people quickly got adept at minimizing data usage on the cellular network.

Many people have also started using WiFi calling, which allows a phone to process voice through the WiFi connection. This is particularly valuable for those who live or work in a building with poor indoor cellular coverage. But it has always been a sketchy technology, susceptible to poor voice quality and unexpected dropped calls due to WiFi fluctuations. WiFi calling also doesn’t roam, so anybody walking out of range of their WiFi router automatically drops the call.

However, we’ve recently seen what is possibly the start of a trend of more broadband traffic staying on the cellular network. In a recent blog I cited evidence that unlimited cellular customers are using less WiFi and are instead staying on the cellular data network even when WiFi is available. Since most people use WiFi to preserve usage on their cellular data plans, as more people feel comfortable about not hitting a data cap we ought to see many people sticking more with cellular.

5G ought to make it even easier to keep traffic on the cellular network. The new standard will make it easier to make and hold a connection to a cell site due to a big increase in the number of simultaneous connections available at each cell site. This should finally eliminate the problem of not being able to make a cellular connection in crowded locations.

The 5G improvements are also going to increase the available bandwidth to cellphones through the use of multiple antennas and frequencies. The expectations are that cellphone download speeds will creep up with each incremental improvement in the coming 5G networks and that speeds will slowly improve over the next decade.

Unfortunately this improved performance might not make that big of a difference inside buildings with poor cellular coverage today, because for the most part the frequencies used for 5G cellular will be the same ones used today. We keep reading about the coming use of millimeter-wave spectrum, but the characteristics of those frequencies, such as the short distances covered, are going to best fit urban areas, and it’s likely to be a long while until we see those frequencies used everywhere in cellular networks. Even where they are used, those higher frequencies will have an even harder time penetrating buildings than today’s lower frequencies.

Overall, the improvements from 5G ought to mean that cellular customers will be able to stay on cellular networks more easily and won’t need WiFi to the same extent as today. The transition to less WiFi use will accelerate if the cellular carriers continue to push unlimited or large-data-cap plans.

This all has big implications for network planning. Today’s cellular networks would be instantly swamped if people stopped using WiFi. The use of cellular data is also growing at a much faster pace than the use of landline data. Those two factors together portend blazingly fast growth in the backhaul needed for cell sites. We are likely to see geometric rates of growth, making it expensive and difficult for the cellular carriers to keep up with data demand. It sounds to me like being a cellular network planner might be one of the hardest jobs in the industry right now.

Prices for Wireless Pole Attachments

The FCC’s advisory BDAC group looking at pole attachment issues gathered prices for wired and wireless pole attachments. Their goal was to understand the range of attachments prices and to see if the group could come up with a consensus pricing recommendation to the FCC. Wired pole attachments are the annual fee that the owner of a wired network (telco, cable company, fiber) pays for each place one of their wires connects to a pole. Wireless pole attachments are the fees charged for putting some kind of wireless transmitter on a pole.

There were no surprises for wired pole attachments. The group looked at 577 different attachments and found that the average price was $17.58 per year for each wired pole attachment while the median was $15.56. These are similar to the prices I see all over the country.

Wireless attachments varied a lot more. The BDAC group looked at 407 samples of wireless pole attachment prices from around the country and the average price was $505.56 while the median price was $56.60. For the median to be that low means that the sample was stacked with low readings.

That’s easy to understand if you look at wireless pole attachment rules around the country. Three states – Arizona, Indiana and North Carolina – have capped the annual price of a wireless pole attachment at $50 per year, while Texas capped it at $20. Other states like Colorado, Delaware and Virginia cap rates at actual cost. For the median price to be that low, roughly half of the 407 sample prices were likely from this group of states. And that means no conclusions can be drawn from the results of the BDAC’s sampling – it was definitely not a random or representative sample – yet the BDAC group summarized the results as if it was, and even calculated a standard deviation.
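It’s easy to see how a sample like this produces a low median and a high average at the same time. The numbers below are invented just for illustration – only the $505.56 average and $56.60 median come from the BDAC data – but they mimic the pattern:

    import statistics

    # Hypothetical sample built to mimic the BDAC result: a bit more than
    # half the prices come from states that cap wireless attachments at
    # $20-$50, the rest from uncapped states with much higher rates.
    capped = [20] * 100 + [50] * 110
    uncapped = [400] * 100 + [1600] * 97
    sample = capped + uncapped            # 407 data points, like the BDAC set

    print("average:", round(statistics.mean(sample), 2))   # ~498
    print("median:", statistics.median(sample))            # 50
    # The median lands on a capped-state value while the average is pulled
    # far higher by the uncapped prices, so neither number describes a
    # "typical" wireless attachment - and a standard deviation is meaningless.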

Thirteen states have already acted to limit the cost for wireless attachments, mostly through legislation. Florida and Rhode Island have capped the cost of a wireless pole attachment at $150; Minnesota set the rate at $175 and Ohio set the maximum rate at $200. Kansas says the rate must be ‘competitively neutral’ and Iowa caps the rate at the FCC rate.

One of the biggest issues with arbitrarily setting wireless pole attachment rates is that the wireless devices being put onto poles vary in size and can use between 1 and 10 feet of pole space. Regulators have traditionally allocated costs by the amount of usable space taken by a given attachment, and in fact use the term ‘pole real estate’ to describe the relationship between space used and price paid. Any attachment that uses more of the pole real estate should expect to pay more – largely in a linear relationship.
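As a simple illustration of that linear ‘pole real estate’ logic – using the ~$17.58 average wired rate from the BDAC sample as the price for roughly one foot of space, with the device sizes invented for the example:

    # Rough "pole real estate" pricing: the rate scales linearly with the
    # usable pole space an attachment occupies.
    wired_rate_per_year = 17.58   # BDAC average for a wired attachment (~1 foot)
    wired_space_ft = 1.0

    def linear_attachment_rate(space_used_ft):
        """Scale the wired rate by the share of pole real estate used."""
        return wired_rate_per_year * (space_used_ft / wired_space_ft)

    for device_ft in (1, 4, 10):   # illustrative wireless device sizes
        print(f"{device_ft} ft of pole space -> ${linear_attachment_rate(device_ft):.2f}/year")
    # 1 ft -> $17.58, 4 ft -> $70.32, 10 ft -> $175.80 per year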

The results of the sample might have been more valid had the group not included prices from places where legislators have capped or limited rates. Also, the big wireless companies are part of the BDAC group and I have to suspect that they brought in the worst-case examples they could find, where they are paying the highest prices. This exercise proved nothing other than that prices for wireless connections are higher in states where the rates are not capped.

It’s not surprising that the BDAC group was unable to reach a consensus on prices or pricing methodology for the FCC. The network operators – those who attach to poles – think rates should be cost-based. Pole owners think rates ought to be market-based.

There are, of course, many other factors to consider in setting pole attachment rates. In the case of wireless connections there are concerns about the safety of working near the wireless devices after storm damage. There are also significant concerns in cities about aesthetics.

The battle over setting these rates is still heating up. Additional states – AK, CA, CT, GA, HI, IL, ME, MO, NE, NM, PA, WA and WI – have considered pole attachment legislation that didn’t pass. There is also the possibility of the FCC trying to set rates, and there have been drafts of several bills in Congress that have considered the idea. Since this seems to be a primary focus of the wireless companies, there will be a lot of lobbying on the issue.

The False 5G Narrative in DC

The FCC and some members of Congress have adopted a false narrative about our need for the rapid deployment of 5G. The narrative says that the rest of the world is already ahead of the US with 5G deployment and warns about the huge downsides to our economy should we not sweep aside all barriers to deploying 5G.

This is the narrative being used to justify giving wireless carriers cheap and ubiquitous access to poles for 5G transmitters. The FCC and others want to sweep away all state and local rules on pole-related issues. They want rules that will allow wireless carriers to deploy electronics first and straighten out the paperwork later. They argue that all of this is needed so that the country can keep up with the rest of the world in 5G deployment, with some horrific, yet unspecified, disastrous result should we fail to make this happen.

The big problem with this narrative is that it’s based upon false premises. The narrative is nothing more than a fairy tale spun by the wireless industry as a way to justify bypassing the normal regulatory process, to hand them fast and cheap connections on poles for wireless devices.

First, there is no big impending need to deploy huge numbers of 5G devices, because the technology doesn’t yet exist. There are two distinct 5G technologies – 5G cellular and 5G millimeter-wave broadband. The industry agrees that it’s going to take a decade until we have a 5G-compliant cellular technology available. There are thirteen key aspects of the new 5G standard that must now be tackled by engineers and then woven into the next generation of electronics. We made numerous incremental improvements in technology to evolve from 3G to 4G, and it was only last year that we finally saw the first deployments of 4G technology that meet most of the original 4G specifications. There is no reason to think that we are going to progress any faster towards 5G; we will upgrade over time to 4.1G, 4.2G, etc. until a decade from now we finally have a 5G cellular network. By then we will no doubt start over and begin implementing 6G.

There is similarly no pressing need to deploy millimeter-wave 5G. This is a technology that promises to potentially offer a gigabit alternative in residential neighborhoods. We have a long way to go before we see widespread deployments of this technology. We are just now seeing the first early trials, and it’s going to take years before the electronics are widely available and affordable. Further, this technology is going to require a lot of concurrent fiber deployment, and that is likely to be the biggest cost barrier to deployment – not getting onto poles. I even have to wonder who is going to deploy 5G millimeter-wave radios on a big scale – every one of the big telcos has made it clear that they are backing away from residential broadband, and the big cable companies have, or will soon have, gigabit-capable networks. We might never see the gigabit wireless networks that are the bait being used to tout 5G, because there might not be any deep-pocketed ISPs willing to tackle such a large infrastructure investment.

What the wireless carriers are starting to deploy today are 4G small cell sites. These cell sites are being used to supplement and boost the existing cellular networks. The original big-tower cellular network was built to provide voice services and the cell site spacing is terrible for delivering broadband, which uses frequencies that don’t carry as far as the lower frequencies used for voice. The exploding demand for cellular broadband is driving the need for more cell sites just to accommodate the number of users and the amount of bandwidth that can be deployed in a given neighborhood.

The existing cellular networks are clearly under stress in urban areas. But the real issue we should be talking about is how to bolster 4G networks, not how we are already behind in the mythical 5G race. The cellular carriers are crafty and are using the 5G race narrative as a way to get politicians to support their demands. They are promising gigabit cellular speeds within just the next few years and cheap wireless gigabit broadband soon coming into every home. They have created a feigned panic that the current regulatory rules will stop this progress dead in its tracks unless carriers get fast and cheap pole access.

If this 5G narrative were true we’d be seeing a collapse of cable company stock prices. Cable companies have the most to lose if they are suddenly faced with gigabit cellular and gigabit wireless to the home. We are probably decades away from seeing cellular speeds approaching anything close to a gigabit – that’s the biggest myth in this narrative. And even when the new technology is developed for wireless gigabit to the home, one has to ask which ISPs are going to spend the billions needed to build that network to compete against the entrenched cable companies.

I don’t want to minimize some of the barriers faced by wireless companies when trying to get onto poles today. Wireless carriers have cited a few horror stories in FCC filings. But like anything else brand new, most pole owners aren’t sure yet how to respond to requests for wireless attachments. There are a lot of issues to work through including safety, pricing, aesthetics and the long-term impact on the real estate space on poles. These are all issues that need solutions, but I can’t find one reason why we need to tackle this at breakneck speed or why we need to give the wireless carriers everything on their wish list. It’s important to bolster the stressed 4G network and we will want to be ready for the 5G technology when it is finally available. We have the time to make the needed regulatory changes in the deliberative manner that makes sure that all aspects of the issues are considered. We don’t need a fast knee-jerk response to a false 5G narrative that might create more problems than it solves.

Are There Any Level Playing Fields?

If you follow regulatory filings, one of the most common arguments you will encounter from the big ISPs is the concept of a level playing field. The idea behind the level playing field is that every competitor in the industry should be working from the same set of rules and nobody should have a market advantage due to regulatory rules. AT&T and Verizon have both rolled out the argument many times when arguing to tighten rules against potential competitors.

There are good examples of the level playing field argument anywhere the big ISPs fight to keep municipal entities from building fiber networks. They argue, for example, that municipal entities have an unfair market advantage because they don’t pay state and federal income taxes. But this argument falls apart quickly under examination. First, many municipal ventures such as electric or broadband entities pay fees in lieu of taxes. This is a tax-like fee that the local government charges to a municipal business. While it’s not technically a tax, the fees act like taxes and can be substantial.

Even more importantly, I can remember many years when AT&T or Verizon made the news for paying no federal income taxes. Big corporations have numerous tax shelters that allow them to shield income from taxes, and the telcos have gotten numerous favorable rules into the tax code that allow them to walk away from most of their expected tax obligations. You can’t really fault a big corporation for legally avoiding taxes (unless you fault them for the lobbying that slanted the tax code in their favor to begin with). But it’s dishonest for these big ISPs to claim that a municipality has an advantage due to its tax-free status when they pay little or no taxes themselves. Under deeper examination, a municipal fiber venture paying 5% of revenues in fees in lieu of taxes is often paying a larger percentage in taxes than the big ISPs.
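A quick hypothetical shows the point. All of the numbers below are made up for illustration – they simply show how a 5% in-lieu-of-taxes fee can be a bigger share of revenue than what a big ISP ends up paying in income taxes:

    # Hypothetical comparison of tax burden as a share of revenue.
    muni_revenue = 4_000_000
    muni_in_lieu_of_taxes = muni_revenue * 0.05        # 5% of revenues

    isp_revenue = 40_000_000_000
    isp_pretax_income = 8_000_000_000                  # assumed 20% margin
    isp_effective_tax_rate = 0.10                      # assumed, after shelters
    isp_taxes = isp_pretax_income * isp_effective_tax_rate

    print(f"Municipal venture: {muni_in_lieu_of_taxes / muni_revenue:.1%} of revenue")
    print(f"Big ISP:           {isp_taxes / isp_revenue:.1%} of revenue")
    # With these assumptions the municipal venture pays 5.0% of revenue
    # while the big ISP pays 2.0%.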

The big ISPs also claim that municipalities have an unfair advantage due to being able to finance fiber networks with municipal bonds. While it’s true that bonds often have a lower interest rate, I have compared bond and bank financing side-by-side many times, and for various reasons that are too long to discuss in a blog, bond financing is usually more expensive than commercial loans. It’s also incredibly difficult for a municipality to walk away from a bond obligation, while we have numerous examples – such as the Charter bankruptcy a few years back – of a big ISP walking away from repaying the debt used to build its networks.

The big ISPs don’t only use this argument against municipal competitors. AT&T is using the argument as a way to justify hanging 5G wireless devices on poles everywhere. They think there should be a level playing field for pole access, although at this early stage they are one of the few companies looking to deploy 5G small cells. Interestingly, while AT&T wants the right to easy and cheap pole access everywhere, in those places where they own the poles they fight vigorously to keep competitors from getting access. They effectively stopped Google Fiber plans to build in Silicon Valley by denying them access to AT&T poles.

Every time I hear the level playing field argument my first thought is that I would love it if we really had a level playing field. I look at the way that the current FCC is handing the big ISPs their wish list of regulatory rule changes and wish that my clients could get the same kind of favorable regulatory treatment.

A good case in point is again the 5G small cell deployment issue. The FCC has already said that they are in favor of making it cheap and easy for wireless carriers to deploy 5G cell sites. It seems likely that the FCC will pass rules to promote 5G deployments unless Congress beats them to the punch. Yet these regulatory efforts to make it easier to deploy 5G conveniently ignore making it easier to deploy fiber. If things go in favor of the big ISPs, they will have a market advantage where it’s easier to deploy last-mile 5G than last-mile fiber. This will give them a speed-to-market advantage that will let them try to squash anybody trying to compete against them with an FTTP network.

The FCC is supposedly pro-competition, and if we really had a level playing field they would be passing rules to make it easier to deploy all broadband technologies. They have had decades to fix the pole attachment issues for fiber deployment and have not done so. But now they are in a rush to allow for 5G deployments, giving 5G ISPs a market advantage over other technologies. The consequence of this will be less competition, not more, because we’ve already seen how AT&T and Verizon don’t really compete with the cable companies. In markets where we have both Verizon FiOS and Comcast cable networks, both companies charge high prices and are happy with high-margin duopoly competition. There is no reason to think these big ISPs won’t do the same with 5G.

I look around and I don’t see any level playing fields – particularly not any that give small competitors any advantage over the big ISPs. I do, however, see scads of regulatory rules that provide unequal protection for the big ISPs, and with the current FCC that list of advantages is expanding quickly. The big ISPs don’t really want a level playing field because they don’t want actual competition. There are many reasons why other countries have far more last-mile fiber deployed than the US – but one of the biggest is regulatory rules here that protect the big ISPs.

The Looming Backhaul Crisis

I look forward a few years and I think we are headed towards a backhaul crisis. Demand for bandwidth is exploding and we are developing last-mile technologies to deliver the needed bandwidth, but we are largely ignoring the backhaul network needed to feed customer demand. I foresee two kinds of backhaul becoming a big issue in the next few years.

First is intercity backhaul. I’ve read several predictions that we are already using most of the available bandwidth on the fibers that connect major cities and the major internet POPs. It’s not hard to understand why. Most of the fiber between major cities was built in the late 1990s or even earlier, and much of that construction was funded by the telecom craze of the 90s where huge money was dumped into the sector.

But there has been very little new fiber construction on major routes since then, and I don’t see any carriers with business plans to build more. You’d think that we could get a lot more bandwidth out of the existing fiber routes by upgrading the electronics on those fibers, but that’s not how the long-haul fiber network operates. Almost all of the fiber pairs on existing routes have been leased out to various entities for their own private use. The reality is that nobody really ‘owns’ these fiber routes, since each route is full of carriers that have long-term contracts to use a few of the fibers. As long as each of these entities has enough bandwidth for its own network purposes, it is not going to sink big money into upgrading to terabit lasers, which are still very expensive.

Underlying that is a problem that nobody wants to talk about: many of those fibers are aging and deteriorating. Over time fiber runs into problems and becomes more opaque. This can come from having too many splices in the fiber, or from accumulated microscopic damage caused by stress during construction or by temperature fluctuations. Fiber technology has improved tremendously since the 1990s – contractors are more aware of how to handle fiber during construction and the manufacturers have significantly improved the glass itself.

But older fiber routes are slowly getting into physical trouble. Fibers go bad or lose capacity over time. This is readily apparent when looking at smaller markets. I was recently helping a client look at fiber going into Harrisburg, PA, and the fiber routes into the city were all built in the early 90s and are experiencing regular outages. I’m not pointing out Harrisburg as a unique case, because the same is true for a huge number of secondary communities.

We are going to see a second backhaul shortage that is related to the intercity bandwidth shortage. All of the big carriers are talking about building fiber-to-the-home and 5G networks that are capable of delivering gigabit speeds to customers. But nobody is talking about how to get the bandwidth to these neighborhoods. You are not going to be able to feed hundreds of 5G fixed wireless transmitters using the existing bandwidth that is available in most places.

Today the cellular companies are paying a lot of money to get gigabit pipes to the big cell towers. Most recent contracts include the ability for these connections to burst to 5 or 10 gigabits. Getting these connections is already a challenge. Picture multiplying that demand by hundreds or thousands of new cell sites. To use the earlier example of Harrisburg, PA – picture somebody trying to build a 100-node 5G network there, each node with gigabit connections to customers. This kind of network might initially work with a 10-gigabit backhaul connection, but as bandwidth demand keeps growing (doubling every three years), it won’t take long until this 5G network needs multiple 10-gigabit connections, up to perhaps 100 gigabits.
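The compounding arithmetic is simple but sobering. Using the assumptions from the paragraph above – a 10-gigabit starting point and demand doubling every three years:

    # Backhaul needed by the hypothetical 100-node 5G network in Harrisburg,
    # starting at 10 Gbps and doubling every three years.
    backhaul_gbps = 10
    doubling_period_years = 3

    for year in range(0, 13, doubling_period_years):
        need = backhaul_gbps * 2 ** (year / doubling_period_years)
        print(f"Year {year:2d}: ~{need:.0f} Gbps of backhaul")
    # Year 0: 10 Gbps, year 3: 20, year 6: 40, year 9: 80, year 12: 160 -
    # the jump from one 10-gigabit connection toward 100 gigabits and beyond.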

Today’s backhaul network is not ready to supply this kind of bandwidth. You could build all of the fiber you want locally in Harrisburg to feed the 5G nodes, but that won’t make any difference if you can’t feed that whole network with sufficient bandwidth to get back to an Internet POP.

Perhaps a few carriers will step up and build the needed backhaul network. But I don’t see that multi-billion dollar per year investment listed in anybody’s business plans today – all I hear about are plans to rush to capture the residential market with 5G. Even if carriers step up and bolster the major intercity routes (and somebody probably will), that is only a tiny portion of the backhaul network that stretches to all of the Harrisburg markets in the country.

The whole backhaul network is already getting swamped due to the continued geometric growth of broadband demand. Local networks and backhaul networks that were robust just a few years ago can get overwhelmed by a continuous doubling of traffic volume. If you look at any portion of our existing backhaul network you can already see the stress today, and that stress will turn into backhaul bottlenecks in the near future.

The Lack of Broadband Competition

There is one statistic from the FCC annual report on the state of broadband that I’ve been meaning to write about. There is still a massive lack of broadband competition at speeds that most households are coming to think of as broadband.

Here are the key statistics from that report:

  • 13% of all households can’t get broadband that meets the FCC’s definition of 25/3 Mbps
  • 31% of homes have access to 25/3 Mbps, but not speeds of 100 Mbps
  • 15% have access to 100 Mbps from more than one provider
  • 41% have access to 100 Mbps from only one provider

It’s the last statistic that I find astounding. The current FCC declared with this report that the state of broadband in the country is healthy and that the market is taking care of the country’s broadband needs. I’ve written numerous blogs about the households in the bottom 13% that have little or no broadband, but I want to look closer at the top two categories.

Households in the 15% category are in markets where there is a fiber provider in addition to the incumbent cable company. The biggest fiber provider is still Verizon FiOS, but there are numerous others building fiber like AT&T, CenturyLink, Google Fiber, smaller telcos, small fiber overbuilders and municipalities.

This means that 41% of households (51 million homes) only have one option for fast broadband – the cable company. I see numerous problems related to this huge monopoly that has been won by the big cable companies. Consider the following:

  • The US already has some of the most expensive broadband in the developed world. The high prices are directly the result of the lack of competition.
  • This lack of competition is likely the driving factor in why most of the big ISPs in the US are rated at the bottom of all US corporations in terms of customer service. We know that customer service improves in markets where there is broadband competition, but the big ISPs don’t make the same effort elsewhere.
  • We also know that competition between a cable company and a smaller fiber overbuilder lowers broadband prices. For example, there are markets where competitors like Google have set the price of a gigabit connection at $70, and the cable companies generally come close to matching that lower price. But preliminary pricing from Comcast and Charter for their new gigabit products in markets with no competitors is significantly north of $100 per month.
  • Even where there are competing networks, if both networks are owned by large ISPs we see duopoly competition where the big ISPs don’t push each other on price. For example, Comcast largely is able to offer the same prices when competing against Verizon FiOS as it does in markets where there is no fiber provider.
  • Industry analysts expect the big ISPs to start raising broadband rates for various reasons. The ISPs continue to lose telephone and cable customers and the national penetration rate for broadband is nearing a market saturation point. In order to satisfy Wall Street the big ISPs will have little choice other than raising broadband prices to maintain earnings growth.

I’m sure that the households in the bottom 13% of the market that can’t get good broadband are not sympathetic to those who can only buy fast broadband from one provider. But these statistics say that 41% of the whole market is dealing with a monopoly situation for fast broadband. Telecom is supposed to be a competitive business – but for the majority of the country the competitors have never shown up. For the FCC to declare that we have a healthy broadband market astounds me when so many households are hostage to a broadband monopoly.

There is always the chance that over the next decade fixed 5G will bring more broadband competition. My guess, however, is that at least for a few years this is going to be a lot more competition by press release than real competition. Deploying gigabit 5G like the big ISPs are all touting is going to require a lot more fiber than we have in place today. Deploying 5G without fiber backhaul might still result in decent broadband, but it’s not going to be the robust gigabit product that the ISPs are touting. But even poorly deployed 5G networks might bring 100+ Mbps broadband to a lot more homes after the technology gets a little more mature.

Unfortunately there is also the risk that 5G might just result in a lot more duopoly competition instead of real competition. If 5G is mostly deployed by big ISPs like Verizon and AT&T there is no reason to think that they will compete on price. Our only hope for real market competition is to see multiple non-traditional ISPs who will compete on price. However, it’s so tempting for ISPs to ride the coattails of the big ISPs in terms of pricing that 5G might bring more of the same high prices rather than real competition.

Charter’s Plans for 6G

It didn’t take long for somebody to say they will have a 6G cellular product. Somebody has jumped the gun every time there has been a migration to a new cellular standard, and I remember the big cellular companies making claims about having 4G LTE technology years before it was actually available.

But this time it’s not a cellular company talking about 6G – it’s Charter, the second largest US cable company. Charter is already in the process of launching LTE cellular through the resale of wholesale minutes from Verizon – so they will soon be a cellular provider. If the early success of Comcast’s cellular offering is any indication, they might do well, since Charter has almost 24 million broadband customers.

Tom Rutledge, the Charter CEO, made reference to 5G trials being done by the company, but also went on to tout a new Charter product as 6G. What Rutledge is really talking about is a new product that will put a cellular micro cell in homes that have Charter broadband. This hot spot would provide strong cellular coverage within the home and use the cable broadband network as backhaul for the calls.

Such a network would benefit Charter by collecting a lot of cellular minutes that Charter wouldn’t have to buy wholesale from Verizon. Outside of the home customers would roam on the Verizon network, but within the home all calls would route over the landline connection. Presumably, if the home cellular micro transmitters are powerful enough, neighbors might also be able to get cellular access if they are Charter cellular customers. This is reminiscent of the Comcast WiFi hotspots that broadcast from millions of their cable modems.

This is not a new idea. For years farmers have been buying cellular repeaters from AT&T and Verizon to boost their signal if they live near the edge of cellular coverage. These products also use the landline broadband connection as backhaul – but in those cases the calls route to one of the cellular carriers. But in this configuration Charter would intercept all cellular traffic and presumably route the calls themselves. There are also a number of cellular resellers who have been using landline backhaul to provide low-cost calling.

This would be the first time that somebody has ever contemplated this on a large scale. One can picture large volumes of Charter cellular micro sites in areas where they are the incumbent cable company. When enough homes have transmitters they might almost create a ubiquitous cellular network that is landline based – eliminating the need for cellular towers.

It’s an interesting concept. A cable company in some ways is already well positioned to implement a more traditional small-cell cellular network. Once they have upgraded to DOCSIS 3.1 they can place a small cell site at any pole that is already connected to the cable network. For now the biggest hurdle to such a deployment is the slow upload speeds of the first generation of DOCSIS 3.1, but CableLabs has already released a technology that will enable faster upload speeds, up to symmetrical connections. Getting faster upload speeds means finding more empty channel slots on the cable network, which could be a challenge in some networks.

The most interesting thing about this idea is that anybody with a broadband network could offer cellular service the same way, if they can make a deal to buy wholesale minutes. But therein lies the rub. While there are now hundreds of ‘cellular’ companies, only a few of them own their own cellular networks and everybody else is reselling. Charter is large enough to probably feel secure about having long-term access to wholesale cellular minutes from the big carriers. But very few other landline ISPs are going to get that kind of locked-in arrangement.

I’ve always advised clients to be wary of any resale opportunity, because the business can change on a dime when the underlying provider changes the rules of the game. Our industry is littered with examples of companies that went under when the large resale businesses they had built lost their wholesale product. The biggest such company that comes to mind was Talk America, which had amassed over a million telephone customers on resold lines from the big telcos. But there are many other examples of paging resellers, long distance resellers and resellers of other telco products that only lasted as long as the underlying network providers agreed to supply the commodity. Still, this is such an intriguing idea that many landline ISPs are going to look at what Charter is doing and wonder why they can’t do the same.

Spectrum and 5G

All of the 5G press has been talking about how 5G is going to bring gigabit wireless speeds everywhere. But that is only going to be possible with millimeter-wave spectrum, and even then it requires a reasonably short distance between sender and receiver as well as bonding together more than one signal using multiple MIMO antennas.

It’s a shame that we’ve let the wireless marketeers equate 5G with gigabit speeds, because that’s what the public is going to expect from every 5G deployment. As I look around the industry I see a lot of other uses for 5G that are going to produce speeds far slower than a gigabit. 5G is a standard that can be applied to any wireless spectrum and brings some benefits over earlier standards. 5G makes it easier to bond multiple channels together to reach one customer. It also increases the number of connections that can be made from any given transmitter – with the biggest promise being that the technology will eventually allow connections to large quantities of IoT devices.

Anybody who follows the industry knows about the 5G gigabit trials. Verizon has been loudly touting its gigabit 5G connections using the 28 GHz frequency and plans to launch the product in up to 28 markets this year. They will likely use this as a short-haul fiber replacement to allow them to more quickly add a new customer to a fiber network or to provide a redundant data path to a big data customer. AT&T has been a little less loud about their plans and is going to launch a similar gigabit product using 39 GHz spectrum in three test markets soon.

But there are also a number of announcements about using 5G with other spectrum. For example, T-Mobile has promised to launch 5G nationwide using its 600 MHz spectrum. This is traditional cellular spectrum that is great for carrying signals for several miles and for going around and through obstacles. T-Mobile has not announced the speeds it hopes to achieve with this spectrum. But the data capacity of 600 MHz is limited, and bonding numerous signals together for one customer will create something faster than LTE, but not spectacularly so. It will be interesting to see what speeds they can achieve in a busy cellular environment.

Sprint is taking a different approach and is deploying 5G using its 2.5 GHz spectrum. They have been testing massive MIMO antennas that contain 64 transmit and 64 receive channels. This spectrum doesn’t travel far, so the technology is going to be used best with small-cell deployments. The company claims to have achieved speeds as fast as 300 Mbps in trials in Seattle, but that would require bonding together a lot of channels, so a commercial deployment is going to be a lot slower in a congested cellular environment.

Outside of the US there seems to be a growing consensus to use 3.5 GHz – the band known in the US as the Citizens Broadband Radio Service (CBRS). That raises the interesting question of which frequencies will end up winning the 5G race. In every new wireless deployment the industry needs to reach an economy of scale in the manufacture of both the radio transmitters and the cellphones or other receivers. Only then can equipment prices drop to the point where a 5G-capable phone will be similar in price to a 4G LTE phone. So the industry at some point soon will need to reach a consensus on the frequencies to be used.

In the past we rarely saw such a consensus; rather, some manufacturer and wireless company won the race to get customers and dragged the rest of the industry along. This has practical implications for early adopters of 5G. For instance, somebody buying a 600 MHz phone from T-Mobile is only going to be able to use that data function when near a T-Mobile tower or mini-cell. Until industry consensus is reached, phones that use a unique spectrum are not going to be able to roam on other networks the way they do today with LTE.

Even phones that use the same spectrum might not be able to roam on other carriers if those carriers use the frequency differently. There are now 5G standards, but we know from practical experience with past wireless deployments that true portability between networks often takes a few years as the industry works out the bugs. This interoperability might be sped up a bit this time because it looks like Qualcomm has an early lead in the manufacture of 5G chipsets. But there are other chip manufacturers entering the game, so we’ll have to watch this race as well.

The word of warning for buyers of first-generation 5G smartphones is that they are going to have issues. For now it’s likely that the MIMO antennas are going to use a lot of power and will drain cellphone batteries quickly. And the ability to reach a 5G data signal is going to be severely limited for a number of years as the cellular providers extend their 5G networks. Unless you live and work in the heart of one of the trial 5G markets, it’s likely that these phones will be a bit of a novelty for a while – but they will still give a user bragging rights for the ability to get a fast data connection on a cellphone.