The Licensed Wireless Dilemma

As we head into the final set of map challenges leading up to BEAD grants, State broadband offices are going to be wrestling with a host of sticky issues. One of the stickiest is how to recognize the service areas of ISPs that use licensed spectrum to deliver rural broadband.

The licensed wireless issue stems from a ruling by the NTIA that, for purposes of BEAD grants, fixed wireless networks using unlicensed spectrum are deemed to be unreliable. That means that locations served by WISPs using unlicensed spectrum are treated as unserved – regardless of the speeds being delivered. This ruling set off a firestorm of comments for and against the NTIA decision, but the original ruling still stands for purposes of determining BEAD grant-eligible locations.

The corollary to that decision is that any area covered by wireless technology using licensed spectrum is considered to be served if the ISP can deliver speeds of at least 100/20 Mbps. There are two sets of ISPs using licensed spectrum to deliver broadband – cellular carriers and WISPs using licensed CBRS spectrum. The sticky question for a State broadband office is how to verify the service area of an existing wireless ISP using licensed spectrum. It’s not a straightforward answer.

Can a Customer Really Be Served? ISPs are only supposed to claim locations in the FCC maps where they can install service within ten working days. It’s fairly easy to determine if a wireline ISP can serve a location just by looking for the presence of a physical wired network. It’s much harder to apply the ten-day test to define the coverage area of a wireless network. The starting assumption for BEAD grants is that claims of coverage made by ISPs in the latest FCC maps are considered accurate unless somebody challenges them.

There is a natural distance limitation set by physics for how far a given band of spectrum can deliver a strong wireless signal. All wireless transmissions get weaker as the distance from a tower increases, and there is some distance beyond which a radio can’t deliver a guaranteed 100/20 Mbps signal. It doesn’t take much searching through the FCC mapping to find wireless ISPs that claim coverage across large expanses. A cellular carrier tends to only be on the tall towers in rural areas, but a WISP might cover a large area by using secondary towers on grain elevators, monopoles, or other structures to increase coverage. A broadband office needs to know the location of the radios as a starting point to understand coverage.
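The physics behind that distance limit can be illustrated with free-space path loss, which grows with both distance and frequency. This is a simplified sketch – real-world attenuation from trees and terrain is much worse than free space – and the 3.55 GHz CBRS frequency and distances shown are illustrative assumptions, not figures from any specific WISP deployment:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare a 3.55 GHz CBRS signal at increasing distances from the tower.
for d in (1, 5, 10, 15):
    loss = free_space_path_loss_db(d, 3550.0)
    print(f"{d:2d} km: {loss:.1f} dB path loss")
```

Every doubling of distance adds about 6 dB of loss even under ideal conditions, which is why claimed coverage across a large expanse from a single radio deserves scrutiny.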

The more challenging issue is to know if a specific customer can be covered. Physical impediments like hills can make it impossible to reach customers from a given tower site. While radios have gotten better at going through trees, heavy woods can still knock down the speeds below the 100/20 Mbps threshold required to be considered as served. WISPs that operate in challenging topographies will tell you that they often don’t know if they can serve a customer until they visit the customer and try to find a signal.

WISPs might point to wireless propagation studies to prove their claimed coverage area, but anywhere other than a flat, treeless plain, a propagation study is more theoretical than real. If there is a map challenge, a State broadband office still ought to ask for any propagation studies. It would be a lot easier to believe claimed coverage areas (for all technologies) if the FCC had kept the rule that FCC map reporting must be certified by a licensed engineer – unfortunately, that requirement has been waived at a time when we’re striving to get the maps right for BEAD.

The Stickiest Issue. The biggest challenge for State broadband offices is how to apply the NTIA rules for licensed and unlicensed spectrum. Most WISPs using licensed 3.5 GHz CBRS are also using unlicensed 2.4 GHz, unlicensed 5.7 GHz, and unlicensed CBRS spectrum. WISPs are also anticipating the upcoming ability to use the unlicensed 6 GHz spectrum that brings gigantic channels and promises much faster speeds.

This raises a lot of questions about how to apply the NTIA ruling on unlicensed spectrum. I think all of the following questions can validly be asked, but since the NTIA hasn’t provided any specific guidance, I haven’t the slightest idea how to answer any of them:

  • Does a WISP that claims to be licensed have to serve every customer using licensed spectrum? Does a WISP using licensed spectrum meet the NTIA rule if it also serves some customers with unlicensed spectrum?
  • Is there some threshold percentage of licensed versus unlicensed customers that must be met to be considered as licensed? Is there a minimum threshold? What if a WISP serves only a few customers with licensed, or even just one – would it still be considered as using licensed spectrum? What if the WISP owns the spectrum license but serves all customers with unlicensed spectrum?
  • In defining what is served today, can the WISP only claim customers as served with licensed spectrum that can also reach a speed of 100/20 Mbps?
  • What about a tower using licensed spectrum that delivers broadband to customers outside of the spectrum license footprint – are those customers considered to be served if the WISP is violating its license?

Why This Matters. BEAD has an obligation to bring broadband to every unserved location in each state. It’s incredibly difficult for a State broadband office to verify the claimed coverage footprint of an ISP that is using licensed spectrum. It’s hard to define the distance from a radio where speeds of 100/20 Mbps can’t be reached since that distance will vary according to obstacles in the signal path like trees. It’s hard to identify homes that don’t have a good enough line-of-sight to be served. And it feels overwhelming to know what to do about a WISP that mixes licensed and unlicensed spectrum. I don’t envy a State broadband office that gets challenges on this issue because it’s not an easy issue to understand or resolve.

Are There Superior Technologies?

It’s easy to fall into the lazy mental habit of saying that some technologies are better than others. I know I tend to do this. It’s easy to say that fiber is better than cable technology or fixed wireless, when in real life, broadband customers make this decision for themselves.

My firm does a lot of broadband surveys every year, and we find customers who are happy with most broadband technologies. I say most because I don’t think I’ve ever found a customer who praised their cellular hotspot or high-orbit satellite service from HughesNet or Viasat. But other broadband technologies and the ISPs that deploy them have their fans.

As an example, we recently talked to a bunch of businesses in a community that buy broadband from the cable company. This is a small rural town where the cable company is still using the older DOCSIS 3.0 technology. Speed tests show download speeds at a maximum of 150 Mbps and upload speeds of less than 10 Mbps. Most of the businesses complained about the cable company. They said that service was spotty – sometimes good and sometimes bad. They complained about the inability to perform functions that needed upload speeds, such as using cloud software, making Zoom calls, or using VoIP.

But there were several businesses that were happy with the cable company. They said they rarely had problems and had nothing negative to say about the cable company. There are two possible reasons for this. First, the happy customers might not be using the broadband in the same way as other businesses. However, the satisfied customers included a law firm and an insurance agent, who both said they worked all day with cloud software.

There is another possible reason why these customers are happier. There is a chance that the network in their part of town performs better than other parts of the network. We tend to think of networks as ubiquitous, but that is not the case. The neighborhood with satisfied customers might have fewer customers sharing a node. It might have newer coaxial cable. It might not be configured with a lot of amplifiers. It might have a faster fiber connection feeding the node. It might have suffered fewer cable cuts over the years. It might be superior in a number of ways to the parts of the network serving the businesses that complained about service.

Fiber is not always great. I have an ISP client that built one of the first fiber BPON networks. BPON delivered 622 Mbps of download and 155 Mbps of upload bandwidth to share among up to 32 customers. Over time, this network filled up and many PONs were completely subscribed. Before this ISP finally upgraded, network performance had become terrible. PONs with business customers delivered terrible speeds in the daytime, and residential PONs bogged down badly in the evenings. Fiber does not automatically mean a great network – any network where there is more demand for broadband than is being delivered will see big problems and unhappy customers.
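The arithmetic behind that BPON congestion is simple. Here is a quick sketch of the worst-case per-customer share on a fully subscribed PON – an illustration of the math, not measurements from the client’s network:

```python
# BPON capacity is shared by all customers on a PON (up to 32).
down_mbps, up_mbps, customers = 622, 155, 32

# Worst-case share when every customer on a full PON is busy at once.
print(f"Per-customer share: {down_mbps / customers:.1f} Mbps down, "
      f"{up_mbps / customers:.1f} Mbps up")
```

Oversubscription works fine until usage peaks, but on a full PON at a busy hour every customer is contending for a slice of that same 622 Mbps – which is exactly what this ISP’s business and residential customers experienced.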

I still find business customers who are happy with DSL. They live close to the DSLAM, and their part of the network isn’t oversubscribed. Telcos are delivering as much as 100 Mbps download speeds to selected DSL customers for a decent price.

The same goes for fixed wireless. I run across customers who hate the technology and others who love it. A lot of this difference comes down to the underlying philosophy and customer service of the local WISP. Some WISPs do everything right, while others oversubscribe sectors, try to sell more bandwidth than is available, or try to serve customers who are too far away from the tower.

I’ve found the same with Starlink. There are customers who love the service and others who tried and dropped it. I’m already starting to see this same dichotomy with FWA cellular wireless, with customers who love it or hate it.

The bottom line is that any broadband technology or ISP that a customer likes is good for them. For a customer to remain happy for a long time requires technology that works, customer service that is responsive, and a price that customers are happy with. ISPs often build a new network and wonder why they don’t instantly get a huge penetration rate. Some of this is due to customers who don’t want to put any effort into changing ISPs – but other customers are happy with the quality, service, or price of the existing broadband.

Are there superior technologies? Some networks clearly outperform competitors in a given neighborhood. But the superior technology for any given customer is the one they choose to buy that they are satisfied with. Who am I to argue with a happy customer?

Fixed Wireless in Cities

I am often asked by cities about the option of building a municipal fixed wireless broadband network. As a reminder, fixed wireless in this case is not a cellular system but is the point-to-multipoint technology used by WISPs. My response has been that it’s possible but that the resulting network is probably not going to satisfy the performance goals most cities have in mind.

There are several limitations of fixed wireless technology in an urban environment that must be considered. The first is the spectrum to be used. Cities tend to be saturated with unlicensed WiFi signals, and the amount of interference will make it a massive challenge to use unlicensed WiFi for broadband purposes. Most folks don’t realize that cellular carriers can snag a lot of the free WiFi spectrum in cities to supplement their cellular data networks – meaning that the free public spectrum is even more saturated than might be expected.

Licensed spectrum can provide better broadband results. But in cities of any size, most of the licensed spectrum is already spoken for and belongs to cellular companies or somebody else that plans to use it. It never hurts to see if there is spectrum that can be leased, but often there will not be any.

Even if licensed spectrum is available, there are other factors that affect performance of fixed wireless in highly populated areas. The first is that most fixed wireless radios can only serve a relatively small number of customers. Cities are probably not going to be willing to make an investment that can only serve a limited number of people.

Another issue to consider is line-of-sight. In practical terms, this means that neighbor A’s home might block the signal to reach neighbor B. In the typical city, there are going to be a lot of homes that cannot be connected to a fixed wireless network unless there are a lot of towers – and most cities are averse to building more towers.

Even when there is decent line-of-sight, an urban wireless signal can be disturbed by many routine features of city life, such as seasonal foliage, bad weather, and even traffic. One of the more interesting phenomena of spectrum in an urban setting is how the signal will reflect and scatter in unexpected ways as it bounces off buildings. These factors tend to cause a lot more problems in a dense neighborhood than in a rural setting.

A point-to-multipoint fixed wireless system is also not a great solution for multi-tenant buildings. These networks are designed to provide bandwidth connections to individual users, and there is not enough bandwidth to deliver broadband from one connection to serve multiple tenants. There are also challenges in where to place antennas for individual apartments.

The combination of these issues means that fixed wireless can only serve a relatively small number of customers in an urban area. The speeds are going to bounce around due to urban interference. Speeds are not likely going to be good enough to compete with cable technology.

There is a good analogy to understand the limitations on wireless technologies in cities. Cell carriers have one advantage over many WISPs by owning licensed spectrum. But even with licensed spectrum there are usually numerous small dead spots in cities where the signals can’t reach due to line-of-sight. Cellular radios can serve a lot more customers than fixed wireless radios, but there are still limitations on the number of customers who can buy cellular FWA broadband in a given neighborhood. Any issues faced by cellular networks are worse for a point-to-multipoint network.

The bottom line is that there are a lot of limitations on urban fixed wireless networks that make it a risky investment. Tower space is usually at a premium in cities, and it’s hard to build a network that will reach many customers. There is a lot more interference and line-of-sight issues in a city that makes it hard to maintain a quality connection.

But this doesn’t mean there are no applications that make sense. For example, a fixed wireless network might be ideal for creating a private network for connecting to city locations that don’t need a lot of broadband, like sensor monitoring. That makes a lot more sense than trying to use the technology as an alternative ISP connection for residences and businesses.

Let’s Stop Talking About Technology Neutral

A few weeks ago, I wrote a blog about the misuse of the term overbuilding. Big ISPs use the term to give politicians a phrase to use to shield the big companies from competition. The argument is always phrased about how federal funds shouldn’t be used to overbuild where an ISP is already providing fast broadband. What the big ISPs really mean is that they don’t want to have competition anywhere, even where they still offer outdated technologies or where they have neglected networks.

Today I want to take on the phrase ‘technology neutral’. This phrase is being used to justify building technologies that are clearly not as good as fiber. The argument has been used a lot in recent years to say that grants should be technology neutral so as not to favor only fiber. The phrase was used a lot to justify allowing Starlink into the RDOF reverse auction. The phrase has been used a lot to justify allowing fixed wireless technology to win grants, and lately, it’s being used more specifically to allow fixed wireless using unlicensed spectrum into the BEAD grants.

The argument justifies allowing technologies like satellite or fixed wireless using unlicensed spectrum to get grants since the technologies are ‘good enough’ when compared to the requirement of grant rules.

I have two arguments to counter that justification. The only reason the technology neutral argument can be raised is that politicians set the speed requirements for grants at ridiculously low levels. Consider all of the current grants that set the speed requirement for technology at 100/20 Mbps. The 100 Mbps speed requirement is an example of what I’ve recently called underbuilding – it allows for building a technology that is already too slow today. At least 80% of folks in the country today can buy broadband from a cable company or fiber company. Almost all of the cable companies offer download speeds as fast as a gigabit. Even in older cable systems, the maximum speeds are faster than 100 Mbps. Setting a grant speed requirement of only 100 Mbps download is saying to rural folks that they don’t deserve broadband as good as what is available to the large majority of people in the country.

The upload speed requirement of 20 Mbps was a total political sellout. This was set to appease the cable companies, many of which struggle to beat that speed. Interestingly, the big cable companies all recognize that their biggest market weakness is slow upload speeds, and most of them are working on plans to implement a mid-split upgrade or else some early version of DOCSIS 4.0 to significantly improve upload speed. Within just a few years, the 20 Mbps upload speed limit is going to feel like ancient history.

The BEAD requirement of only needing to provide 20 Mbps upload is ironic for two reasons. First, in cities, the cable companies will have much faster upload speeds implemented by the time that anybody builds a BEAD network. Second, the cable companies that are pursuing grants are almost universally using fiber to satisfy those grants. Cable companies are rarely building coaxial copper plant for new construction. This means the 20 Mbps speed was set to protect cable companies against overbuilding – not set as a technology neutral speed that is forward looking.

The second argument against the technology neutral argument is that some technologies are clearly not good enough to justify receiving grant dollars. Consider Starlink satellite broadband. It’s a godsend to folks who have no alternatives, and many people rave about how it has solved their broadband problems. But the overall speeds are far slower than what was promised before the technology was launched. I’ve seen a huge number of speed tests for Starlink that don’t come close to the 100/20 Mbps speed required by the BEAD grants.

The same can be said for FWA wireless using cellular spectrum. It’s pretty decent broadband for folks who live within a mile or two of a tower, and I’ve talked to customers who are seeing speeds significantly in excess of 100/20 Mbps. But customers just a mile further away from a tower tell a different story, with download speeds far under 100 Mbps. A technology that has such a small coverage area does not meet the technology neutral test unless a cellular company promises to pepper an area with new cell towers.

Finally, and a comment that always gets pushback from WISPs, is that fixed wireless technology using unlicensed spectrum has plainly not been adequate in most places. Interference from the many users of unlicensed spectrum means the broadband speeds vary depending on whatever is happening with the spectrum at a given moment. Interference on the technology also means higher latency and much higher packet losses than landline technologies.

I’ve argued until I am blue in the face that grant speed requirements should be set for the speeds we expect a decade from now and not for the bare minimum that makes sense today. It’s ludicrous to award grant funding to a technology that barely meets the 100/20 Mbps grant requirement when that network probably won’t be built until 2025. The real test for the right technology for grant funding is what the average urban customer will be able to buy in 2032. It’s hard to think that speed won’t be something like 2 Gbps/200 Mbps. If that’s what will be available to a large majority of households in a decade, it ought to be the technology neutral definition of speed to qualify for grants.

No Home Broadband Option

We spend a lot of time arguing policy questions, such as asking if 25/3 Mbps is adequate broadband. What policymakers should really be talking about are the huge numbers of homes with dreadful broadband. The worst thing about the deceptive FCC maps is that they often give the perception that most rural areas have at least some broadband options when many rural residents will tell you they have no real broadband options.

Policymakers don’t grasp the lousy choices in many rural areas. The FCC maps might show the availability of DSL, but if it’s even available (often it’s not), the speeds can be incredibly slow. Rural households refuse to pay for DSL that might deliver only 1 or 2 Mbps download and practically no upload.

I think the FCC assumes that everybody has access to satellite broadband. But I’ve talked to countless rural residents who tried satellite broadband and rejected it. Real speeds are often much slower than advertised speeds since trees and hills can quash a satellite signal. The latency can be crippling, and in places where the speeds are impaired, the high latency means a household will struggle with simple real-time tasks like keeping a connection to a shopping site. Satellite plans also come with tiny data caps. I’d like to put a few Washington DC policymakers on a monthly data plan with a 40 GB or 60 GB cap so they can understand how quickly that is used up in a month. But the real killer with satellite broadband is the cost. HughesNet told investors last year that its average revenue per customer was over $93 per month. Many rural homes refuse to pay that much for a broadband product that doesn’t work.
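The data-cap point is easy to quantify. Here is a sketch of how quickly a 40 GB monthly cap disappears under ordinary usage – the per-activity consumption figures are rough, commonly cited estimates, not measurements:

```python
cap_gb = 40  # the smaller satellite data cap mentioned above

# Rough, commonly cited data consumption estimates (assumptions).
usage_gb_per_hour = {
    "HD video streaming": 3.0,
    "Zoom video call": 1.0,
    "SD video streaming": 0.7,
}

for activity, gb_per_hour in usage_gb_per_hour.items():
    hours = cap_gb / gb_per_hour
    print(f"{activity}: cap exhausted after about {hours:.0f} hours")
```

A family streaming HD video would burn through the whole cap in roughly 13 hours – less than half an hour a day over a month – before counting any schoolwork, software updates, or video calls.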

We hear a lot of stories about how fixed wireless technology is getting better to the point where we’re hearing preposterous conversations about bringing gigabit fixed wireless to rural areas. There are still a lot of places with woods and hills where fixed wireless is a poor technology choice. I worked with one county recently that gathered thousands of speed tests for fixed wireless that showed average download speeds under 5 Mbps and upload speeds below 1 Mbps. There are still a lot of WISPs that are cramming too many customers on towers, chaining too many towers together with wireless backhaul, and selling to customers who are too far from towers. This is not to say that there aren’t great WISPs, but in too many rural places the fixed wireless choices are bleak.

Rural residents have also suffered with cellular hotspots. These are the plans that cellular companies have offered for years that basically price home broadband at the same prices and data caps as cellular broadband. During the pandemic, I heard from families who were spending $500 to $1,000 per month in order to enable home-schooling. This product is not available in huge parts of rural America because of poor or nonexistent cellular coverage. We complain about the FCC’s broadband maps, but those are far better than the cellular company coverage maps, which massively overstate rural cellular availability.

There is some relief in sight for some rural homes. I recently talked to farmers who are thrilled with the T-Mobile fixed cellular product – but they said distance from cell sites is key and that many of their neighbors are out of range of the few cell sites found in most rural counties. There are rural folks who are happy with Starlink. But there are a lot of people now into the second year on the waiting list to get Starlink. Starlink also has reported problems with trees and hills and also comes with a steep $99 per month price tag.

When a rural household says they have no broadband connection, I’ve learned that you have to believe them. They will have already tried the DSL, fixed wireless, satellite, and cellular hotspots, and decided that none of the options work well enough to justify paying for them. The shame is that the FCC maps might give the impression that residents have two, three, or four broadband options when they really have none.

AT&T’s 5G Strategy

AT&T recently described their long-term 5G strategy using what they call the 3 pillars of 5G – the three areas where the company is putting their 5G focus. The first pillar is a concentration on 5G cellular, and the company’s goal is to launch a 5G-based cellular service, with some cities coming on board in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. This admission that there won’t be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to hopefully launch two phones later this year that have some 5G capability. As always with a new generation of wireless technology, the bottleneck will be in handsets. The cell phone makers can’t just make generic 5G phones – they have to work with the carriers to be ready to support the specific subset of 5G features that are released. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first generation of 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will become quickly obsolete. A 5G phone sold in late 2019 probably won’t include all of the 5G features that will be on the market by late 2020 – and this is likely to be true for the next 3 or 4 years as the carriers roll out incremental 5G improvements. It’s also a gamble for customers because anybody that buys an early 5G cellphone will have early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and will wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher because they are talking about the fixed cellular product they’ve already been using for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. This is not the same as Verizon’s product that delivers hundreds of megabits per second but is instead a product that delivers speeds up to 50 Mbps depending upon how far a customer lives from a cell tower – with reports that most households are getting 15 Mbps at best. This is the product that AT&T is mostly using to satisfy its CAF II requirements in rural America. All of the engineers I’ve talked to don’t think that 5G is going to materially improve this product.

The final pillar of AT&T’s strategy is edge computing. What AT&T means by this is to put fast processors at customer sites when there is the need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G and AT&T has already been deploying edge routers. However, 5G will enhance this ability at customer sites that need to connect a huge number of devices simultaneously. 5G can make it easier to connect to a huge number of IoT devices in a hospital or to 50,000 cell phones in a stadium. The bottom line is that the migration to more edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement, and AT&T has been talking about all three applications for some time – but the announcement does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.