Fixed Wireless in Cities

I am often asked by cities about the option of building a municipal fixed wireless broadband network. As a reminder, fixed wireless in this case is not a cellular system but is the point-to-multipoint technology used by WISPs. My response has been that it’s possible but that the resulting network is probably not going to satisfy the performance goals most cities have in mind.

There are several limitations of fixed wireless technology in an urban setting that must be considered. The first is the spectrum to be used. Cities tend to be saturated with unlicensed WiFi signals, and the amount of interference will make it a massive challenge to use unlicensed WiFi spectrum for broadband purposes. Most folks don’t realize that cellular carriers can snag a lot of the free WiFi spectrum in cities to supplement their cellular data networks – meaning that the free public spectrum is even more saturated than might be expected.

Licensed spectrum can provide better broadband results. But in cities of any size, most of the licensed spectrum is already spoken for and belongs to cellular companies or somebody else that plans to use it. It never hurts to see if there is spectrum that can be leased, but often there will not be any.

Even if licensed spectrum is available, there are other factors that affect performance of fixed wireless in highly populated areas. The first is that most fixed wireless radios can only serve a relatively small number of customers. Cities are probably not going to be willing to make an investment that can only serve a limited number of people.
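The capacity ceiling comes down to simple division: every subscriber on a sector shares the same fixed pool of bandwidth. A minimal sketch of the math, where the sector capacity, plan speed, and oversubscription ratio are all illustrative assumptions rather than specs for any particular radio:

```python
# Rough illustration of why a single fixed wireless sector can only
# serve a small number of customers. All numbers are illustrative
# assumptions, not specs for any real radio.

def max_subscribers(sector_capacity_mbps: float,
                    plan_speed_mbps: float,
                    oversubscription: float) -> int:
    """Subscribers a sector can carry at a given oversubscription ratio."""
    return int(sector_capacity_mbps * oversubscription / plan_speed_mbps)

# A 500 Mbps sector selling 100 Mbps plans at a conservative 4:1
# oversubscription tops out at just 20 subscribers.
print(max_subscribers(500, 100, 4))
```

Even with several sectors per tower, that arithmetic caps a tower at a few hundred subscribers at best – a small fraction of an urban neighborhood.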

Another issue to consider is line-of-sight. In practical terms, this means that neighbor A’s home might block the signal to reach neighbor B. In the typical city, there are going to be a lot of homes that cannot be connected to a fixed wireless network unless there are a lot of towers – and most cities are averse to building more towers.
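Line-of-sight for a radio link is stricter than visual line-of-sight: the link also needs clearance around the direct path, known as the first Fresnel zone. A quick sketch using the standard midpoint formula (the 5.8 GHz frequency and 1 km link distance are illustrative assumptions):

```python
import math

# Midpoint radius of the first Fresnel zone -- the clearance a wireless
# link needs beyond simple visual line-of-sight. Standard formula:
# r = 0.5 * sqrt(wavelength * distance), wavelength = c / frequency.

C = 299_792_458.0  # speed of light, m/s

def fresnel_midpoint_radius_m(link_distance_m: float, freq_hz: float) -> float:
    wavelength = C / freq_hz
    return 0.5 * math.sqrt(wavelength * link_distance_m)

# A 1 km link at 5.8 GHz needs roughly 3.6 m of clearance at midpoint --
# easily blocked by a neighboring rooftop or a mature tree.
print(round(fresnel_midpoint_radius_m(1000, 5.8e9), 1))
```

This is why a home that can technically "see" the tower can still get a degraded signal when a neighbor's roofline clips the path.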

Even when there is decent line-of-sight, an urban wireless signal can be disturbed by many routine factors in the city, such as seasonal foliage, bad weather, and even traffic. One of the more interesting phenomena of spectrum in an urban setting is how the signal will reflect and scatter in unexpected ways as it bounces off buildings. These factors tend to cause a lot more problems in a dense neighborhood than in a rural setting.

A point-to-multipoint fixed wireless system is also not a great solution for multi-tenant buildings. These networks are designed to provide bandwidth connections to individual users, and there is not enough bandwidth to deliver broadband from one connection to serve multiple tenants. There are also challenges in where to place antennas for individual apartments.

The combination of these issues means that fixed wireless can only serve a relatively small number of customers in an urban area. The speeds are going to bounce around due to urban interference and are not likely to be good enough to compete with cable technology.

There is a good analogy to understand the limitations on wireless technologies in cities. Cell carriers have one advantage over many WISPs by owning licensed spectrum. But even with licensed spectrum there are usually numerous small dead spots in cities where the signals can’t reach due to line-of-sight. Cellular radios can serve a lot more customers than fixed wireless radios, but there are still limitations on the number of customers who can buy cellular FWA broadband in a given neighborhood. Any issues faced by cellular networks are worse for a point-to-multipoint network.

The bottom line is that there are a lot of limitations on urban fixed wireless networks that make them a risky investment. Tower space is usually at a premium in cities, and it’s hard to build a network that will reach many customers. A city also has a lot more interference and line-of-sight issues, which makes it hard to maintain a quality connection.

But this doesn’t mean there are no applications that make sense. For example, a fixed wireless network might be ideal for creating a private network for connecting to city locations that don’t need a lot of broadband, like sensor monitoring. That makes a lot more sense than trying to use the technology as an alternative ISP connection for residences and businesses.

Let’s Stop Talking About Technology Neutral

A few weeks ago, I wrote a blog about the misuse of the term overbuilding. Big ISPs use the term to give politicians a phrase they can use to shield the big companies from competition. The argument is always phrased as a claim that federal funds shouldn’t be used to overbuild where an ISP is already providing fast broadband. What the big ISPs really mean is that they don’t want to have competition anywhere, even where they still offer outdated technologies or where they have neglected networks.

Today I want to take on the phrase ‘technology neutral’. This phrase is being used to justify building technologies that are clearly not as good as fiber. The argument has been used a lot in recent years to say that grants should be technology neutral so as not to favor only fiber. The phrase was used a lot to justify allowing Starlink into the RDOF reverse auction. The phrase has been used a lot to justify allowing fixed wireless technology to win grants, and lately, it’s being used more specifically to allow fixed wireless using unlicensed spectrum into the BEAD grants.

The argument justifies allowing technologies like satellite or fixed wireless using unlicensed spectrum to get grants since the technologies are ‘good enough’ when compared to the requirements of the grant rules.

I have two arguments to counter that justification. The first is that the only reason the technology neutral argument can be raised is that politicians set the speed requirements for grants at ridiculously low levels. Consider all of the current grants that set the speed requirement for technology at 100/20 Mbps. The 100 Mbps speed requirement is an example of what I’ve recently called underbuilding – it allows for building a technology that is already too slow today. At least 80% of folks in the country today can buy broadband from a cable company or fiber company. Almost all of the cable companies offer download speeds as fast as a gigabit. Even in older cable systems, the maximum speeds are faster than 100 Mbps. Setting a grant speed requirement of only 100 Mbps download is saying to rural folks that they don’t deserve broadband as good as what is available to the large majority of people in the country.

The upload speed requirement of 20 Mbps was a total political sellout. This was set to appease the cable companies, many of which struggle to beat that speed. Interestingly, the big cable companies all recognize that their biggest market weakness is slow upload speeds, and most of them are working on plans to implement a mid-split upgrade or else some early version of DOCSIS 4.0 to significantly improve upload speed. Within just a few years, the 20 Mbps upload speed limit is going to feel like ancient history.

The BEAD requirement of only needing to provide 20 Mbps upload is ironic for two reasons. First, in cities, the cable companies will have much faster upload speeds implemented by the time that anybody builds a BEAD network. Second, the cable companies that are pursuing grants are almost universally using fiber to satisfy those grants. Cable companies are rarely building coaxial copper plant for new construction. This means the 20 Mbps speed was set to protect cable companies against overbuilding – not set as a technology neutral speed that is forward looking.

The second argument against the technology neutral argument is that some technologies are clearly not good enough to justify receiving grant dollars. Consider Starlink satellite broadband. It’s a godsend to folks who have no alternatives, and many people rave about how it has solved their broadband problems. But the overall speeds are far slower than what was promised before the technology was launched. I’ve seen a huge number of speed tests for Starlink that don’t come close to the 100/20 Mbps speed required by the BEAD grants.

The same can be said for FWA wireless using cellular spectrum. It’s pretty decent broadband for folks who live within a mile or two of a tower, and I’ve talked to customers who are seeing speeds significantly in excess of 100/20 Mbps. But customers just a mile further away from a tower tell a different story, where download speeds are far under 100 Mbps. A technology that has such a small coverage area does not meet the technology neutral test unless a cellular company promises to pepper an area with new cell towers.
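The rapid falloff with distance isn't a quirk of any one carrier – it follows from basic propagation physics. A minimal sketch using the textbook free-space path loss formula (the 2.5 GHz frequency is an illustrative mid-band assumption; real-world cellular propagation loses signal even faster, so this understates the problem):

```python
import math

# Free-space path loss (FSPL) in dB for distance d (meters) and
# frequency f (Hz): FSPL = 20*log10(d) + 20*log10(f) - 147.55.
# A textbook lower bound -- terrain and clutter make real losses worse.

def fspl_db(distance_m: float, freq_hz: float) -> float:
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Doubling the distance from the tower costs ~6 dB of signal, which the
# radio recovers by falling back to slower, more robust modulation.
one_mile = fspl_db(1609, 2.5e9)
two_miles = fspl_db(3218, 2.5e9)
print(round(two_miles - one_mile, 2))
```

Every 6 dB lost pushes the connection down the modulation ladder, which is why the customer two miles out sees a fraction of the speed of the customer one mile out.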

Finally, and this is a comment that always gets pushback from WISPs, fixed wireless technology using unlicensed spectrum has plainly not been adequate in most places. Interference from the many users of unlicensed spectrum means that broadband speeds vary depending on whatever is happening with the spectrum at a given moment. Interference also means higher latency and much higher packet losses than landline technologies.

I’ve argued until I am blue in the face that grant speed requirements should be set for the speeds we expect a decade from now and not for the bare minimum that makes sense today. It’s ludicrous to award grant funding to a technology that barely meets the 100/20 Mbps grant requirement when that network probably won’t be built until 2025. The real test for the right technology for grant funding is what the average urban customer will be able to buy in 2032. It’s hard to think that speed won’t be something like 2 Gbps/200 Mbps. If that’s what will be available to a large majority of households in a decade, it ought to be the technology neutral definition of speed to qualify for grants.
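The 2032 projection is a straightforward compound-growth calculation. A back-of-the-envelope sketch, where the growth rates are assumptions in the spirit of Nielsen's law (top-tier speeds have historically grown roughly 20% or more per year), not guarantees:

```python
# Back-of-the-envelope check on where urban speeds land in a decade.
# Growth rates are assumptions, not guarantees.

def projected_speed(current_mbps: float, annual_growth: float, years: int) -> float:
    """Compound growth: speed * (1 + rate)^years."""
    return current_mbps * (1 + annual_growth) ** years

# A 1 Gbps urban cable tier growing 21%/year for a decade lands near
# 6.7 Gbps; even a conservative 10%/year yields ~2.6 Gbps -- either way,
# far past a 100/20 Mbps grant floor.
print(round(projected_speed(1000, 0.21, 10)))
print(round(projected_speed(1000, 0.10, 10)))
```

Under almost any plausible growth assumption, a network engineered to barely clear 100/20 Mbps will be obsolete before the grant network is even finished.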

No Home Broadband Option

We spend a lot of time arguing policy questions, such as asking if 25/3 Mbps is adequate broadband. What policymakers should really be talking about are the huge numbers of homes with dreadful broadband. The worst thing about the deceptive FCC maps is that they often give the perception that most rural areas have at least some broadband options when many rural residents will tell you they have no real broadband options.

Policymakers don’t grasp the lousy choices in many rural areas. The FCC maps might show the availability of DSL, but if it’s even available (often it’s not), the speeds can be incredibly slow. Rural households refuse to pay for DSL that might deliver only 1 or 2 Mbps download and practically no upload.

I think the FCC assumes that everybody has access to satellite broadband. But I’ve talked to countless rural residents who tried satellite broadband and rejected it. Real speeds are often much slower than advertised speeds since trees and hills can quash a satellite signal. The latency can be crippling, and in places where the speeds are impaired, the high latency means a household will struggle with simple real-time tasks like keeping a connection to a shopping site. Satellite plans also come with tiny data caps. I’d like to put a few Washington DC policymakers on a monthly data plan with a 40 GB or 60 GB cap so they can understand how quickly that is used in a month. But the real killer with satellite broadband is the cost. HughesNet told investors last year that its average revenue per customer was over $93 per month. Many rural homes refuse to pay that much for a broadband product that doesn’t work.
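The data cap arithmetic is easy to work out. A minimal sketch of how quickly a 40 GB monthly cap disappears, assuming an illustrative HD streaming bitrate of 5 Mbps (actual bitrates vary by service and resolution):

```python
# How quickly a monthly data cap disappears under ordinary streaming.
# The 5 Mbps HD bitrate is an illustrative assumption.

def hours_until_cap(cap_gb: float, stream_mbps: float) -> float:
    """Hours of continuous streaming before hitting the cap."""
    gb_per_hour = stream_mbps * 3600 / 8 / 1000  # Mbps -> GB per hour
    return cap_gb / gb_per_hour

# 5 Mbps video consumes 2.25 GB per hour, so a 40 GB cap is exhausted
# in under 18 hours of viewing -- less than 40 minutes per day.
print(round(hours_until_cap(40, 5), 1))
```

And that is one stream of video only – add a second TV, video calls, or school work and the cap evaporates in the first week of the month.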

We hear a lot of stories about how fixed wireless technology is getting better to the point where we’re hearing preposterous conversations about bringing gigabit fixed wireless to rural areas. There are still a lot of places with woods and hills where fixed wireless is a poor technology choice. I worked with one county recently that gathered thousands of speed tests for fixed wireless that showed average download speeds under 5 Mbps and upload speeds below 1 Mbps. There are still a lot of WISPs that are cramming too many customers on towers, chaining too many towers together with wireless backhaul, and selling to customers who are too far from towers. This is not to say that there aren’t great WISPs, but in too many rural places the fixed wireless choices are bleak.

Rural residents have also suffered with cellular hotspots. These are the plans that cellular companies have had for years that basically price home broadband at the same prices and data caps as cellular broadband. During the pandemic, I heard from families who were spending $500 to $1,000 per month to enable home-schooling. This product is not available in huge parts of rural America because of the poor or nonexistent cellular coverage. We complain about the FCC’s broadband maps, but those are head and shoulders better than the cellular company coverage maps, which massively overstate rural cellular availability.

There is some relief in sight for some rural homes. I recently talked to farmers who are thrilled with the T-Mobile fixed cellular product – but they said distance from cell sites is key and that many of their neighbors are out of range of the few cell sites found in most rural counties. There are rural folks who are happy with Starlink. But there are a lot of people now into the second year on the waiting list to get Starlink. Starlink also has reported problems with trees and hills and also comes with a steep $99 per month price tag.

When a rural household says they have no broadband connection, I’ve learned that you have to believe them. They will have already tried the DSL, fixed wireless, satellite, and cellular hotspots, and decided that none of the options work well enough to justify paying for them. The shame is that the FCC maps might give the impression that residents have two, three, or four broadband options when they really have none.

AT&T’s 5G Strategy

AT&T recently described their long-term 5G strategy using what they call the 3 pillars of 5G – the three areas where the company is putting their 5G focus. The first pillar is a concentration on 5G cellular, and the company’s goal is to launch a 5G-based cellular service, with some cities coming on board in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. This admission that there won’t be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to hopefully launch two phones later this year that have some 5G capability. As always with a new generation of wireless technology, the bottleneck will be in handsets. The cell phone makers can’t just make generic 5G phones – they have to work with the carriers to be ready to support the specific subset of 5G features that are released. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first generation 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will become quickly obsolete. A 5G phone sold in late 2019 probably won’t include all of the 5G features that will be on the market by late 2020 – and this is likely to be true for the next 3 or 4 years as the carriers roll out incremental 5G improvements. It’s also a gamble for customers because anybody who buys an early 5G cellphone will have early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and will wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher because they are talking about the fixed cellular product they’ve already been using for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. This is not the same as Verizon’s product that delivers hundreds of megabits per second but is instead a product that delivers speeds up to 50 Mbps depending upon how far a customer lives from a cell tower – with reports that most households are getting 15 Mbps at best. This is the product that AT&T is mostly using to satisfy its CAF II requirements in rural America. All of the engineers I’ve talked to don’t think that 5G is going to materially improve this product.

The final pillar of AT&T’s strategy is edge computing. What AT&T means by this is to put fast processors at customer sites when there is the need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G and AT&T has already been deploying edge routers. However, 5G will enhance this ability at customer sites that need to connect a huge number of devices simultaneously. 5G can make it easier to connect to a huge number of IoT devices in a hospital or to 50,000 cell phones in a stadium. The bottom line is that the migration to more edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement, and AT&T has been talking about all three applications for some time – but the announcement does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.