Packet Loss and Broadband Performance

In a recent FierceWireless article, Joe Madden looked at the various wireless technologies he has used at his home in rural central California. Over time he has subscribed to a fixed wireless network using WiFi spectrum, cellular LTE broadband, Starlink, and a fixed wireless provider using CBRS spectrum. A lot of rural folks can describe a similar path where they have tried all of the broadband technologies available to them.

Since Joe is a wireless expert who works at Mobile Experts, he was able to analyze his broadband performance in ways that are not easily understood by the average subscriber. Joe came to an interesting conclusion – the difference in performance between various broadband technologies has less to do with speed than with the consistency of the broadband signal.

The average speed tests on the various products varied from 10/2 Mbps on fixed wireless using WiFi spectrum to 117/13 Mbps on Starlink. But what Joe found was a huge difference in consistency as measured by packet loss. Fixed wireless on WiFi had packet loss of 8.5%, while packet loss on fixed wireless using CBRS spectrum dropped to 0.1%. The difference is stark and is due to the interference that plagues unlicensed spectrum compared to the cleaner signal on licensed spectrum.

But just measuring packet loss is not enough to describe the difference in the performance of the various broadband connections. Joe also looked at the share of lost packets that took more than 250 milliseconds to be re-delivered. That requires some explanation. Packet loss in general describes the percentage of data packets that are not delivered on time. In an Internet transmission, some packets are always lost somewhere in the routing to customers – although most packets are lost due to the local technology at the user end.

When a packet doesn’t show up as expected, the Internet routing protocols ask for that packet to be sent again. If the second packet gets to the user quickly enough, it’s the same, from a user perspective, as if that packet was delivered on time. Joe says that re-sent packets that don’t arrive until after 250 milliseconds are worthless because by then, the rest of the transmission has already been delivered to the user. The easiest way to visualize this is to look at the performance of Zoom calls for folks using rural technologies. Packets that don’t make it on time leave a gap in the video stream that shows up as fuzziness and poor resolution in the picture.
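To make the metric concrete, here is a minimal sketch in Python (my own illustration, not Joe’s methodology) that computes the two numbers discussed here: the overall packet-loss rate and the share of re-sent packets that arrive after the 250 ms cutoff. The function names and sample figures are hypothetical.

    # Minimal sketch of the two metrics discussed above: the overall packet-loss
    # rate and the share of re-sent packets that arrive after the 250 ms cutoff,
    # when they are too late to be useful for a real-time stream. The sample
    # numbers below are made up.

    LATE_CUTOFF_MS = 250   # re-sent packets slower than this are effectively useless

    def packet_loss_rate(packets_sent, packets_lost):
        """Share of packets that did not arrive on time and had to be re-sent."""
        return packets_lost / packets_sent

    def late_resend_share(resend_delays_ms):
        """Fraction of re-sent packets that arrived after the cutoff."""
        if not resend_delays_ms:
            return 0.0
        late = sum(1 for delay in resend_delays_ms if delay > LATE_CUTOFF_MS)
        return late / len(resend_delays_ms)

    # Hypothetical connection: 10,000 packets sent, 850 lost, with assumed
    # arrival delays (in ms) for a handful of the re-sent copies.
    sample_resend_delays = [120, 310, 95, 480, 260, 70, 900, 150]
    print(f"packet loss: {packet_loss_rate(10_000, 850):.1%}")
    print(f"re-sent packets arriving too late: {late_resend_share(sample_resend_delays):.0%}")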

Packet loss is the primary culprit for poor Zoom calls. Not receiving all of the video packets on time is why somebody on a Zoom call looks fuzzy or pixelated. If the packet loss is high enough, the user is booted from the Zoom call.

The difference in the percentage of packets that are delivered late between the different technologies is eye-opening. On the fixed wireless connection using WiFi spectrum, an astounding 65% of re-sent packets took longer than 250 ms. Cellular LTE broadband was almost as bad at 57%. Starlink was better at 14%, while fixed wireless using CBRS was lowest at 5%.

Joe is careful to point out that these figures only represent his home and not the technologies as deployed everywhere. But with that said, there are easily explainable technology reasons for the different levels of packet delay. General interference plays havoc with broadband networks using unlicensed spectrum. Starlink has delay just from the extra time it takes signals to travel between the ground and the satellites in each direction. The low packet loss on the CBRS network might be due to having very few other neighbors using the new service.

Joe’s comparison doesn’t include other major broadband technologies. I’ve seen some cable networks with high packet loss due to years of accumulated repairs and unresolved issues in the network. The winner of the packet loss comparison is fiber, which typically has an incredibly low packet loss and also a quick recovery rate for lost packets.

The bottom line from the article is that speed isn’t everything. It’s just one of the characteristics that define a good broadband connection, but we’ve unfortunately locked onto speed as the only important characteristic.

Unlicensed Spectrum and BEAD Grants

There is a controversy brewing over the NTIA’s decision to declare that fixed wireless technology using only unlicensed spectrum is unreliable and not worthy of funding from the BEAD grants. WISPA, the lobbying arm of the fixed wireless industry, released a press release saying that the NTIA has made a big mistake in excluding WISPs that use only unlicensed spectrum.

I’m not a wireless engineer, so before I wrote this blog, I consulted with several engineers and several technicians who work with rural wireless networks. The one consistent message I got from all of them is that interference can be a serious issue for WISPs deploying only unlicensed spectrum. I’m just speculating, but I have to think that was part of the reason for the NTIA decision – interference can mean that the delivered speeds are not reliably predictable.

A lot of the interference comes from the way that many WISPs operate. The biggest practical problem with unlicensed spectrum is that it is unregulated, meaning there is no agency that can force order in a chaotic wireless situation. I’ve heard numerous horror stories about some of the practices in rural areas where there are multiple WISPs. There are WISPs that grab all of the available channels of spectrum in a market to block out competitors. WISPs complain about competitors that cheat by rigging radios to operate above the legal power limit, which swamps their competitors. And bad behavior begets bad behavior in a vicious cycle where WISPs try to outmaneuver each other for enough spectrum to operate. The reality is that the WISP market using unlicensed spectrum is a free-for-all – it’s the Wild West. Customers bear the brunt of this, since performance varies day by day as WISPs rearrange their networks. Unless there is only a single WISP in a market, the performance of networks using unlicensed spectrum is unreliable, almost by definition.

There are other issues that nobody, including WISPA, wants to address. There are many WISPs that provide terrible broadband because they deploy wireless technology in ways that push past the physical limits of the wireless signals. Many of these same criticisms apply to cellular carriers as well, particularly with the new cellular FWA broadband. Wireless broadband can be high-quality when done well and can be almost unusable if deployed poorly.

There are a number of reasons for poor fixed wireless speeds. Some WISPs are still deploying lower-quality and/or older radios that are not capable of the best speeds – the same complaint that has been leveled for years against DSL providers. ISPs often pile too many customers into a radio sector and overload it, which greatly dilutes the quality of the broadband that can reach any one customer. Another common issue is WISPs that deploy networks with inadequate backhaul. They will string together multiple wireless backhaul links to the point where each wireless transmitter is starved for bandwidth. But the biggest issue that I see in real practice is that some WISPs won’t say no to customers even when the connection is poor. They will gladly install customers who live far past the reasonable range of the radios or who have restricted line-of-sight. These practices are okay if customers knowingly accept the degraded broadband – but too often, customers are given poor broadband at full price with no explanation.
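To show why oversubscribed sectors and daisy-chained backhaul translate into poor customer speeds, here is a back-of-the-envelope sketch under assumed capacities and an assumed peak concurrency rate – none of these numbers come from a real network.

    # Back-of-the-envelope sketch of how oversubscription and daisy-chained
    # wireless backhaul starve individual customers. All capacities, customer
    # counts, and the 30% peak-concurrency figure are assumptions for
    # illustration, not measurements from any real network.

    def effective_backhaul_mbps(link_capacities_mbps):
        """A chain of wireless backhaul hops is limited by its weakest link."""
        return min(link_capacities_mbps)

    def peak_per_customer_mbps(sector_capacity_mbps, backhaul_mbps,
                               customers, concurrency=0.3):
        """Rough per-customer share at peak time."""
        usable = min(sector_capacity_mbps, backhaul_mbps)
        active = max(1, int(customers * concurrency))
        return usable / active

    # Example: a 150 Mbps sector fed through three daisy-chained backhaul links.
    backhaul = effective_backhaul_mbps([300, 120, 80])            # weakest hop: 80 Mbps
    print(peak_per_customer_mbps(150, backhaul, customers=100))   # about 2.7 Mbps each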

Don’t take this to mean that I am against WISPs. I was served by a WISP for a decade that did a great job. I know high-quality WISPs that don’t engage in shoddy practices and that are great ISPs. But I’ve worked in many rural counties where residents lump WISPs in with rural DSL as something they will only purchase if there is no alternative.

Unfortunately, some of these same criticisms can be leveled against some WISPs that use licensed spectrum. Having licensed spectrum doesn’t overcome issues of oversubscribed transmitters, poor backhaul, or serving customers with poor line-of-sight or out of range of the radios. I’m not a big fan of giving grant funding to WISPs who put profits above signal quality and customer performance – but I’m not sure how a grant office would know this.

I have to think that the real genesis for the NTIA’s decision is the real-life practices of WISPs that do a poor job. It’s something that is rarely talked about – but it’s something that any high-quality WISP will bend your ear about.

By contrast, it’s practically impossible to deploy a poor-quality fiber network – it either works, or it doesn’t. I have no insight into the discussions that went on behind the scenes at the NTIA, but I have to think that a big part of the NTIA’s decision was based upon the many WISPs that are already unreliable. The NTIA decision means unlicensed-spectrum WISPs aren’t eligible for grants – but they are free to compete for broadband customers. WISPs that offer a high-quality product at a good price will still be around for many years to come.

The NTIA Preference for Fiber

As might be expected when there is $42.5 billion in grant funds available, we are probably not done with the rules for the BEAD grants. There are several areas where heavy lobbying is occurring to change some of the rules established by the NTIA in the NOFO for the grants.

One of the areas with the most lobbying is coming from WISPs that are complaining that the NTIA has exceeded its statutory authority by declaring a strong preference for fiber. The NTIA went so far as to declare that fixed wireless technology that doesn’t use licensed spectrum is not a reliable source of broadband and isn’t eligible for BEAD grants. The wireless industry says that the NTIA is out of bounds and not sticking to a mandate to be technology neutral.

I decided to go back to the Infrastructure Investment and Jobs Act (IIJA) and compare it with the NOFO to see if that is true. Let’s start with the enabling language in the legislation. The IIJA makes it clear that the NTIA must determine the technologies that are eligible for the BEAD grants. One of the criteria the NTIA is instructed to use is that grant-funded technologies must be deemed to be reliable. Reliable is defined in the Act using factors other than speed: the Act says that the term “reliable broadband service” means broadband service that meets performance criteria for service availability, adaptability to changing end-user requirements, length of serviceable life, or other criteria, other than upload and download speeds.

I interpret ‘adaptability to changing end-user requirements’ to mean that a grant-eligible technology must have some degree of what the industry has been calling future-proofing. A grant-funded technology must be able to meet future broadband needs and not just the needs of today.

‘Length of serviceable life’ refers to how long a grant investment might be expected to last. Historically, broadband electronics of all types typically don’t have a useful life of much more than a decade. Electronics that sit outside in the elements have an even shorter expected life, with components like outdoor receivers for wireless not usually lasting more than seven years. The broadband assets with the longest useful lives are fiber, huts, and new wireless towers. If you weigh together the average life of all of the components in a broadband network, the average useful life of a fiber network will be several times higher than the useful life of a wireless network.
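A quick worked example shows how that weighting plays out. The cost shares and component lives below are my own assumptions, not figures from the NTIA or the legislation; different assumptions shift the results, but fiber-heavy builds come out well ahead under any reasonable set of numbers.

    # A worked example of a cost-weighted average useful life. The cost shares
    # and component lives below are assumptions for illustration only.

    def weighted_life(components):
        """components: list of (cost_share, useful_life_years) tuples."""
        return sum(share * life for share, life in components)

    # Assumed fiber build: most of the cost sits in long-lived outside plant.
    fiber_network = [(0.70, 40),   # fiber cable and conduit
                     (0.15, 30),   # huts and other structures
                     (0.15, 10)]   # electronics

    # Assumed fixed wireless build: towers last, outdoor radios do not.
    wireless_network = [(0.35, 30),   # towers
                        (0.45, 7),    # outdoor radios and customer receivers
                        (0.20, 10)]   # other electronics and backhaul gear

    print(weighted_life(fiber_network))      # 34.0 years
    print(weighted_life(wireless_network))   # about 15.7 years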

The NTIA then used the reliable service criteria to classify only four technologies as delivering a reliable signal – fiber, hybrid fiber-coaxial cable modem technology, DSL over copper, and terrestrial fixed wireless using licensed spectrum. Since DSL cannot deliver the speeds required by the grants, that leaves only three technologies eligible for BEAD grants.

The legislation allows the NTIA to consider other factors. It appears that one of the other factors the NTIA chose is the likelihood that a strong broadband signal will reach a customer. I speculate that fixed wireless using only unlicensed spectrum was eliminated because interference on unlicensed spectrum can degrade the signal to customers. It’s a little harder to understand which factors were used to eliminate satellite broadband. The high-orbit satellites are eliminated by not being able to meet the 100-millisecond latency requirement established by the legislation. I would speculate that low-orbit satellites are not eligible for grants because the average life of a satellite is touted as being about seven years – but I’m sure there are other reasons, such as not yet having any proof of the speeds that can be delivered once a satellite network fills with customers.

From the short list of technologies deemed to be reliable, the NTIA has gone on to say several times in the NOFO that there is a preference for fiber. When looking at the factors defined by the legislation, fiber is the most future-proofed because speeds can be increased drastically by upgrading electronics. Fiber also has a much longer expected useful life than wireless technology.

The accusations against the NTIA seem to imply that the agency had a preference for fiber even before being handed the BEAD grants. But in the end, the NTIA’s preference for fiber comes from ranking the eligible technologies in terms of how well they meet the criteria of the legislation. It’s worth noting that there are other parts of the NOFO that do not promote fiber. For example, state broadband offices are encouraged to consider other alternatives when the cost of construction is too high. I think it’s important to note that any NTIA preference for fiber does not restrict a state from making substantial awards to fixed wireless technology using licensed spectrum – that’s a call for each state to make.

There is a lot of lobbying going on to expand the NTIA’s list to include fixed wireless using unlicensed spectrum and satellite broadband. I’ve even heard rumors of lawsuits to force the expansion of the eligible technologies. That’s the primary reason I wrote this blog – as a warning that lobbying and/or lawsuits might delay the BEAD grants. I think the NTIA has done what the legislation required, but obviously, anybody who is being excluded from the grants has nothing to lose by trying to get reinstated. When there is this much money at stake, I don’t expect those who don’t like the NTIA rules to go away quietly.

LTE-U

Recently, the NCTA asked the FCC to make sure that wireless carriers don’t interfere with WiFi spectrum. I wrote a blog a few weeks ago talking about all of the demands on WiFi, and the threat that the NCTA is warning about is another use of the already busy WiFi spectrum.

Cellular carriers use LTE technology to deliver 4G data and voice over spectrum for which they have paid billions (at least in the US and Europe). But in urban areas the LTE spectrum is already stressed, and the demand for the existing spectrum is growing far faster than the carriers can find new spectrum to offload the extra demand.

The cellular carriers have had their eye on the 5 GHz unlicensed band of spectrum that is used for WiFi. This is a big swath of spectrum that in some markets is larger than the band that some carriers have for LTE. Recently, various carriers have been experimenting with using this public spectrum to deliver LTE. Huawei and NTT demonstrated the capability last August; Qualcomm showed it at the CES show earlier this year. It’s rumored that T-Mobile plans to run a trial of the technology this year.

This new technology is being called LTE-U (for Unlicensed). The NCTA filed at the FCC on behalf of its cable company members, who use this WiFi spectrum for purposes such as distributing data wirelessly around a home or bringing data to set-top boxes. They are worried that if the cellular companies start using the spectrum, they will swamp it and make WiFi useless for everybody else, particularly in urban areas where WiFi is under the most pressure.

That certainly is a valid concern. As my recent blog noted, the list of companies and technologies that are planning on using WiFi spectrum is large and growing. And there is already notable stress on WiFi around crowded places like large hotels, convention centers, and stadiums. The fear is that if cellular carriers start using the spectrum this same crowding will spread to more places, making the spectrum useless to everyone.

The cellular carriers argue that the WiFi swath is large enough to allow them to use it without hurting other users. They argue that nobody can use all of the 400 MHz of spectrum in that band at once. While that is true, it doesn’t take a huge pile of LTE-U customers at one time to locally overload the WiFi spectrum in the same manner that they are overloading the cellular spectrum today.

Engineers tell me that LTE uses spectrum more efficiently than most WiFi technologies do. This is because the LTE specifications tightly limit the bandwidth that any one customer can draw, while most WiFi applications will let a user grab all of the bandwidth if it’s available. This means you can fit a lot more LTE customers into the spectrum that might be assigned to one WiFi customer.

There is a characteristic of WiFi that makes it incompatible with the way that LTE works. WiFi has been designed to share spectrum. When one customer is using WiFi, they can grab a huge swath of spectrum, but when another customer demands bandwidth, the system dynamically throttles the first customer to make room for the second one. LTE works very differently, more like a telephone network: if there is enough bandwidth available to handle a customer, it assigns a band to that customer, and otherwise it delivers a ‘busy signal’ (no bars). The problem with mixing these two approaches is that LTE would continually grab spectrum until it’s all used and the WiFi users are shut out, much like what you might see in a busy hotel in the evening.

The LTE providers say they have handled this by introducing a new protocol called LAA (Licensed Assisted Access) which introduces the idea of coexistence into the LTE network. If it works properly, LAA ought to be able to coexist with WiFi in the same manner that multiple WiFi customers coexist. Without this change in protocol LTE would quickly gobble all of the free WiFi spectrum.

But this still doesn’t answer the concern that even with LAA there could be a lot of people trying to grab bandwidth in environments where the WiFi is already stressed. A sharing network like WiFi never shuts anybody out the way an LTE system will; instead it just keeps subdividing the bandwidth until the amount each customer gets is too small to use.
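A toy model makes the contrast easier to see. The sketch below (in Python, with made-up capacity numbers) contrasts WiFi-style sharing, which keeps subdividing the channel among everyone who shows up, with LTE-style admission control, which serves a fixed number of users and blocks the rest. It illustrates the idea only, not the actual protocols.

    # Toy model of the two behaviors, with made-up numbers: WiFi-style sharing
    # keeps admitting users and subdivides the channel, while LTE-style
    # admission control serves a fixed number of users and blocks the rest.
    # This illustrates the idea only; it is not how either protocol is coded.

    CHANNEL_MBPS = 100        # assumed usable capacity of the shared band
    LTE_PER_USER_MBPS = 10    # assumed fixed allocation per LTE user

    def wifi_share(active_users):
        """Every active user gets an equal slice, however small it becomes."""
        return CHANNEL_MBPS / active_users if active_users else CHANNEL_MBPS

    def lte_admit(requested_users):
        """Admit users at a fixed rate until capacity runs out; block the rest."""
        capacity = CHANNEL_MBPS // LTE_PER_USER_MBPS
        admitted = min(requested_users, capacity)
        return admitted, requested_users - admitted   # (served, blocked)

    for users in (5, 20, 50):
        served, blocked = lte_admit(users)
        print(f"{users} users: WiFi ~{wifi_share(users):.1f} Mbps each; "
              f"LTE serves {served} and blocks {blocked}")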

It will be interesting to see what the FCC says about this. This was discussed years ago and the FCC never intended to let licensed cellular holders snatch the public WiFi spectrum. I will also be curious to see if wireless carriers try to charge customers for data usage when that data is being delivered over a free, unlicensed swath of spectrum. And how will customers even know that is where they are getting their data?

I hope the FCC doesn’t let the wireless carriers run rampant with this, because I think it’s inevitable that this is going to cause huge problems. There are already places today where WiFi is overloaded, and this new kind of data traffic could swamp the spectrum in a lot more places. The wireless carriers can make promises all day about how this won’t cause problems, but it doesn’t take a huge number of LTE-U users at a cell site to start causing problems.

Why Not 3.65 GHz?

Any company thinking about deploying point-to-multipoint wireless data services ought to be considering the 3.65 GHz spectrum. Unless you happen to own other licensed spectrum, this is probably your best alternative to the normal unlicensed spectrum. In many places the normal unlicensed bands of 900 MHz, 2.4 GHz, and 5.8 GHz are congested, and are getting more so every day. I’ve written earlier blogs talking about how all of the cable companies and telcos now place unlicensed-spectrum routers at almost every home. And the Internet of Things is going to pile a ton of new uses onto unlicensed spectrum everywhere.

The FCC authorized the 3.65 GHz – 3.70 GHz band for public use in 2006, with usage rules intended to maximize the utility of the spectrum. The rules are aimed at providing the most benefit to smaller markets and less densely populated areas. This can mean a cleaner signal for any carrier deploying point-to-multipoint wireless service. A few of the rules include:

Restricted Locations. The spectrum cannot be used close to existing government installations or satellite earth stations that use the spectrum. So you can’t deploy around some of the larger air force bases or around a handful of remaining satellite earth stations. The FCC maintains a list of the restricted locations. It should be noted that the earth station market has been consolidating, and over the last few years a number of older earth stations have been decommissioned. This restriction does not block the spectrum in very many places.

Licensed Use. You can license the spectrum for a $280 fee. However, such a license is not exclusive, and every holder of the spectrum is expected to coordinate with other users. This is not like a normal FCC license, and it is not first come, first served. Everyone using the spectrum in a given area is expected to work with others to minimize interference, with the FCC acting as the arbiter if parties can’t work things out. I would point out that in a point-to-multipoint deployment it is fairly easy to keep interference to a minimum.

Contention. There are different rules for using the spectrum depending upon how you deploy it. The rules favor radios that can use other spectrum in addition to 3.65 GHz. Radios that only use this spectrum are limited to the 25 MHz band between 3.650 and 3.675 GHz, while radios that can shift to other frequencies when there is contention can use the full 50 MHz channel.
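As a small illustration of the contention rule, here is a sketch (in Python, with an invented function name) that returns the band a radio may use depending on whether it can shift to other spectrum.

    # Minimal sketch of the contention rule described above. The function name
    # and tuple format are invented for illustration.

    def usable_band_ghz(can_shift_to_other_spectrum):
        """Return the (low, high) band edges in GHz a radio may use."""
        if can_shift_to_other_spectrum:
            return (3.650, 3.700)   # full 50 MHz channel
        return (3.650, 3.675)       # restricted 25 MHz band

    for agile in (False, True):
        low, high = usable_band_ghz(agile)
        print(f"frequency-agile={agile}: {low}-{high} GHz "
              f"({(high - low) * 1000:.0f} MHz)")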

The frequency can support a theoretical 37 Mbps download on one channel, but real-life deployments deliver somewhere around 25 Mbps close to the transmitter.

Radios for this frequency are readily available from most of the major point-to-multipoint radio manufacturers. The prices of base stations and customer CPE are very much in line with the cost of radios in the unlicensed bands.

One advantage of this spectrum is that it can carry a significant distance. It can theoretically work to the horizon, but the throughput diminishes with distance. Like most wireless bandwidth, you can engineer the system to get good bandwidth at the outer edge of your range by sacrificing bandwidth close to the antenna, or you can go for big bandwidth close to the tower that decreases with distance. It’s easy to engineer a system that can deliver a 10 Mbps download at five miles. We’ve seen 3 Mbps at 9 miles.

This frequency is best used in a rural deployment, because the bandwidth from a given sector of a base station is shared by all of the customers using that sector. As with any shared-bandwidth technology, the more customers you cram onto the system, the less bandwidth is available to each customer, particularly at peak times.