Categories
Technology

Fixed Cellular Broadband Performance

One of the first in-depth reviews I’ve found of T-Mobile’s fixed cellular broadband was published in The Verge. It’s not particularly flattering to T-Mobile – this particular customer found the performance to be unreliable, fast at some times and barely functioning at others. But I’ve seen other T-Mobile customers raving about the speeds they are receiving.

We obviously can’t draw any conclusions based upon a single review by one customer, but his experience and the contrasting good reviews by others prompted me to talk about why performance on cellular broadband networks can vary so significantly.

I’ve always used the word wonky to describe cellular performance. It’s something I’ve tracked at my own house, and for years the reception of the cellular signal in my home office has varied hour-by-hour and day-by-day. This is a basic characteristic of cellular networks that you’ll never find the cellular carriers talking about or admitting.

The foremost issue with cellular signal strength is the distance between a customer and the local cellular tower. All wireless transmissions weaken with distance, and it’s easy to understand why: a wireless signal spreads after it leaves the transmitter. The traditional way of depicting a wireless transmission, shown in the diagram below, demonstrates the spread. If two customers have the same receiver, the customer who is closer to the tower will capture more of the signal than somebody farther away, where the signal has spread. The customer in the bad review admitted he wasn’t particularly close to a cell tower, and somebody in his own neighborhood who lives closer to the cell site might have a stronger signal and a better opinion of the product.
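
Engineers put a number on that spread with the standard free-space path loss formula, which says the loss grows with the logarithm of distance. Here’s a quick sketch – the distances and frequency below are just illustrative numbers, not anything specific to T-Mobile’s network:

```python
import math

def free_space_path_loss_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# A hypothetical mid-band signal at 2500 MHz:
near = free_space_path_loss_db(1, 2500)   # customer 1 km from the tower
far = free_space_path_loss_db(5, 2500)    # customer 5 km from the tower
print(f"1 km: {near:.1f} dB, 5 km: {far:.1f} dB")
# Every doubling of distance costs roughly another 6 dB of signal.
```

Real-world losses are worse than this idealized formula because of terrain, foliage, and buildings, but the basic shape holds: the customer five times farther out starts with a meaningfully weaker signal.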

There are other factors that create variability in a cellular signal. One is basic physics and the way radio waves behave outdoors. The cellular signal emanating from your local cell tower varies with the conditions in the atmosphere – the temperature, humidity, precipitation, and even wind. Things that stir up the air will affect the cellular signal. A wireless signal in the wild is unpredictable and variable.

Another issue is interference. Cellular companies that use licensed spectrum don’t want to talk about interference, but it exists everywhere. Some interference comes from natural sources like sunspots. But the biggest source of interference is the signal from other cell towers. Interference occurs any time there are multiple sources of the same frequency being used in the same area.

The customer in the review talks about the performance differing by the time of day. That is a phenomenon that can affect all broadband networks and, in this case, reflects the local robustness of the T-Mobile network. Performance drops when networks get too busy. Every DSL or cable broadband customer has witnessed the network slowing at certain times of the day. This can be caused by too many customers sharing the local network – in this case, the number of customers using a cell tower at the same time. The problem can also be caused by high regional usage if multiple cell towers share the same underlying broadband backbone.

The final issue that is somewhat unique to cellular networks is carrier priority. It’s highly likely that T-Mobile is giving first priority to customers using cell phones. That’s the company’s primary source of revenue, so cell phones get first dibs at the bandwidth. That means that at busy times the bandwidth left over for fixed cellular customers might be greatly pinched. As T-Mobile and other carriers sell more of the fixed product, I predict that second-priority status will become a familiar phenomenon.
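
The arithmetic of second priority is easy to sketch. All of the numbers below – tower capacity, phone demand, customer counts – are purely hypothetical; the point is how fast the leftover shrinks at the busy hour:

```python
def fixed_wireless_mbps(tower_capacity_mbps, phone_demand_mbps, fixed_customers):
    """If cell phones get first priority, fixed customers split whatever is left."""
    leftover = max(0, tower_capacity_mbps - phone_demand_mbps)
    return leftover / fixed_customers

# A hypothetical tower with 1,000 Mbps of capacity and 50 fixed customers:
print(fixed_wireless_mbps(1000, 300, 50))  # quiet hour: 14.0 Mbps each
print(fixed_wireless_mbps(1000, 900, 50))  # busy hour: 2.0 Mbps each
```

A fixed customer on that hypothetical tower would swing from perfectly usable broadband to barely functional broadband without anything changing at their house – which matches the wonky behavior in the review.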

This blog is not intended to be a slam against fixed cellular broadband. The customer who wrote the review switched to cellular broadband to get a less expensive connection than the one from his cable company. But he clearly bought into the T-Mobile advertising hype, because a cellular broadband signal will never be as reliable as a signal delivered through wires.

We can’t forget the real promise of fixed cellular broadband – bringing broadband to folks who have no alternatives. Somebody that switched to T-Mobile from a 1 Mbps rural DSL product would have written a different and more glowing review of the same product. The bottom line is that anybody buying cellular broadband should recognize that it’s a wireless product – and that means the product comes with the quirks and limitations that are inherent with wireless broadband. I imagine that we’re going to continue to see bad reviews from customers who want to save money but still want the performance that comes with wired broadband. This is another reminder that it’s a mistake to judge a broadband product strictly by the download speed – a 100 Mbps cellular broadband product is not the same as a 100 Mbps cable company connection.

Categories
Uncategorized

Broadband Interference

Jon Brodkin of Ars Technica published an amusing story about how the DSL service in a 400-resident village in Wales went out each morning at 7:00 am. It turns out that one of the residents turned on an ancient television that interfered with the DSL signal to the extent that the network collapsed. The ISP finally figured this out by looking around the village in the morning with a spectrum analyzer until it found the source of the interference.

It’s easy to think that the story points out another weakness of old DSL technology, but interference can be a problem for a lot of other technologies.

This same problem is common on cable company hybrid fiber-coaxial networks. The easiest way to understand this is to think back to the old days when we all watched analog TV. Anybody who watched programming on channels 2 through 5 remembers times when the channels got fuzzy or even became unwatchable. It turns out that a lot of different devices interfere with the frequencies used for these channels, including microwave ovens, the motors in power tools and lawnmowers, and other devices like blenders. It was a common household occurrence for one of these channels to go fuzzy when somebody in the house, or even in a neighboring home, used one of these devices.

This same interference carries forward into cable TV networks. Cable companies originally used the same frequencies for TV channels inside the coaxial wires that were used over the air, and the low TV channels sat between 5 MHz and 42 MHz. It turns out that long stretches of coaxial wire on poles act as a great antenna, so cable systems pick up the same kinds of interference that happen in homes. It was pretty routine for channels 2 and 3, in particular, to be fuzzy in an analog cable network.

You’d think that this interference might have gone away when cable companies converted TV signals to digital. The TV transmissions for channels 2 through 5 got crystal clear because cable companies relocated the digital versions of these channels to better frequencies. But when broadband was added to cable systems, the cable companies continued to use the low frequencies – CableLabs elected to use them for the upload portion of broadband. There is still plenty of interference in cable networks today – probably even more than years ago, as coaxial networks have aged and have more points for interference to seep into the wires. Until the pandemic, we didn’t care much about upload bandwidth, but it turns out that one of the major reasons that cable companies struggle to deliver reliable upload speeds is that they are using the noisiest spectrum for the upload function.

The DSL in the village suffered from the same issue since the telephone copper wires also act as a big outdoor antenna. In this village, the frequency emanating from the old TV exactly matched the frequencies used for DSL.

Another common kind of interference is seen in fixed wireless networks where multiple ISPs use the same frequencies in a given rural footprint. I know of counties where there are as many as five or six different wireless ISPs, and most use the same frequencies since most WISPs rely on a handful of channels in the traditional WiFi bands at 2.4 GHz and 5 GHz. I’ve heard of situations where the WiFi spectrum is so crowded that the performance of all of the WISPs suffers.

WiFi also suffers from local interference in the home. The WiFi standard says that all devices have an equal chance of using the frequencies. This means that a home WiFi router will cycle through all the signals from all devices trying to make a WiFi connection. When a WiFi router connects with an authorized device inside the home it allows for a burst of data, but then the router disconnects that signal and tries the next signal – cycling through all of the possible sources of WiFi.

This is the same issue that is seen by people using WiFi in a high-rise apartment building or a hotel, where many users are trying to connect to WiFi at the same time. Luckily, this problem ought to improve. The FCC has authorized the use of 6 GHz spectrum for home broadband, which opens up numerous new channels. Interference will still occur between devices trying to share a channel, but with more channels there will be far fewer cases of interference than today.
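
The back-of-the-envelope math shows why the new band helps. The band sizes below are rough figures (the 6 GHz order opened roughly 1,200 MHz of new spectrum, versus a few hundred MHz previously available at 5 GHz):

```python
def channel_count(band_mhz, channel_width_mhz):
    """How many non-overlapping channels of a given width fit in a band."""
    return band_mhz // channel_width_mhz

# Roughly 1,200 MHz of new spectrum at 6 GHz:
print(channel_count(1200, 20))   # 60 narrow channels
print(channel_count(1200, 160))  # 7 wide channels for big-bandwidth uses
```

With dozens of channels available instead of a handful, neighboring apartments are far less likely to land on the same channel and fight over the same airtime.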

The technology that has no such interference is fiber. Nothing interferes with the light signal between a fiber hub and a customer. However, once customers connect the broadband signal to their home WiFi network, the same interference issues arise. I looked recently and could see over twenty other home WiFi networks from my office – a setup ripe for interference. Before making too much fun of the folks in the Welsh village, consider that there is a good chance you are subject to significant interference in your home broadband today.

Categories
Technology

Wi-FM

Anytime there are too many WiFi networks in close proximity, contention between networks is inevitable. Such contention will slow a WiFi network down, since the current WiFi standard tells a WiFi device to back off whenever it sees interference – meaning that two neighboring WiFi networks will both back off when there is contention. People with slow WiFi tend to blame their router for their problems, but often it is this contention that is slowing them down.
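
That back-off behavior works roughly like the binary exponential backoff in the 802.11 standard: after each collision a device doubles its waiting window and picks a random delay inside it. A simplified sketch of the idea:

```python
import random

def backoff_slots(retry, cw_min=15, cw_max=1023):
    """802.11-style binary exponential backoff: the contention window
    doubles after each failed attempt, capped at cw_max, and the device
    waits a random number of slots drawn from [0, window]."""
    window = min((cw_min + 1) * 2 ** retry - 1, cw_max)
    return random.randint(0, window)

# Two neighboring networks that keep colliding wait longer and longer:
for retry in range(5):
    print(retry, backoff_slots(retry))
```

The randomness is the whole point – if both networks waited the same fixed time after a collision, they would simply collide again. But all of that waiting is dead airtime, which is why heavy contention shows up to users as a slow network.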

Using research first done at MIT and recently revived at Northwestern University, engineers have figured out a way to greatly reduce the contention between neighboring WiFi networks using a technology they are dubbing Wi-FM, since it uses a tiny slice of FM frequency to resolve conflicts.

It’s not hard to imagine situations where WiFi can become congested. For instance, consider somebody living in an apartment building who has other WiFi routers over, under and on all sides, all relatively close. We tend to think of WiFi as being a pretty reliable transmission medium, but when there are many networks all trying to work at the same time there can be a tremendous amount of interference, and a major degradation of throughput.

The Wi-FM technology uses the tiny slice of FM radio spectrum that is reserved for the Radio Data System (RDS). This is the spectrum that is used to transmit the content information about the FM radio programming and is used in your car radio, for example, to tell you the name of the song and the artist you are listening to.

Along with the broadcast information, the RDS system also utilizes a time-slot technology that allows it to sync the broadcast information with songs as they change. The Wi-FM technology takes advantage of these time slots and uses the quiet times, when no broadcast information is being sent, to monitor the WiFi signals and to direct packets so that they don’t interfere.

WiFi utilizes multiple channels, and if all of the channels are used efficiently, much of the interference between neighboring networks can be avoided. But there is no technique for directing WiFi to change channels on the fly from inside the WiFi spectrum without eating up a lot of the available spectrum in the effort. Using the slice of FM frequency as an external traffic cop allows for the rapid routing of contending packets to different channels and can greatly reduce contention and interference.

This technology would probably be best used today in places like apartment buildings where there are multiple WiFi networks. But we are moving into a future where there is likely to be a lot more WiFi interference. For example, there are plans to use a continuous WiFi signal to power cellphones and small IoT sensors. And the cellular industry wants to use WiFi as overflow for LTE calls.

So however busy WiFi is today, the chances are that it’s going to get a lot busier in the future. And that means there will be a lot more interference between packets. Wi-FM is just one of many techniques that are probably going to be needed if we want to keep the public spectrum usable in busy places. Otherwise, the interference will accumulate until it shuts the spectrum down at the busiest times of the day.

Categories
Technology

Are We Expecting too Much from WiFi?

I don’t think that a week goes by when I don’t see somebody proposing a new use for WiFi. This leads me to ask if we are starting to ask too much from WiFi, at least in urban areas.

Like all spectrum, WiFi is subject to interference. Most licensed spectrum has strict rules against interference, and there are generally very specific procedures for handling contention if somebody is interfering with a licensed spectrum-holder. But WiFi is the wild west of spectrum, and it’s assumed there is going to be interference between users. There is no recourse for such interference – every user has an equal right to the spectrum, and everybody has to live with the consequences.

I look at all of the different uses for WiFi and it’s not too hard to foresee problems developing in real world deployments. Consider some of the following:

  • Just about every home broadband connection now uses WiFi as the way to distribute data around the house between devices.
  • Comcast has designed their home routers to have a second public transmitter in addition to the home network, so these routers initiate two WiFi networks at the same time.
  • There is a lot of commercial outdoor WiFi being built that can bleed over into home networks. For example, Comcast has installed several million hotspots that act to provide convenient connections outside for their landline data customers.
  • Many cities are contemplating building citywide WiFi networks that will provide WiFi for their citizens. There are numerous network deployments by cities, but over the next few years I think we will start seeing the first citywide WiFi networks.
  • Cable companies and other carriers are starting to replace the wires to feed TVs with WiFi. And TVs require a continuous data stream when they are being used.
  • Virtual reality headsets are likely to use WiFi to feed the VR headsets. There are already game consoles using WiFi to connect to the network.
  • There is a new technology that will use WiFi to generate the power for small devices like cellphones. For this technology to be effective the WiFi has to beam continuously.
  • And while not big bandwidth users at this point, a lot of IoT devices are going to count on WiFi to connect to the network.

On top of all of these uses, the NCTA sent a memo to the FCC on June 11 that warned of possible interference with WiFi spectrum from the LTE-U and LAA technologies used by cellular carriers. Outside interference is always possible, and in spectrum where interference is expected, it might be hard for the average user to detect or notice. There is generally nobody monitoring the WiFi spectrum for interference in the way that wireless carriers monitor their licensed spectrum.

All of these various uses of the spectrum raise several different concerns:

  • One concern is just plain interference – if you cram too many different WiFi networks into one area, each trying to grab the spectrum, you run into traditional radio interference which cuts down on the effectiveness of the spectrum.
  • WiFi has an interesting way of using spectrum. It is a good spectrum for sharing applications, but that is also its weakness. When there are multiple networks trying to grab the WiFi signal, and multiple user streams within those networks, each gets a ‘fair’ portion of the spectrum, as negotiated among the various devices and networks. This is a good thing in that it means a lot of simultaneous streams can happen at the same time on WiFi, but it also means that under a busy load the spectrum gets chopped into tiny little streams that can be too small to use. Anybody who has tried to use WiFi in a busy hotel knows what that’s like.
  • All WiFi is channelized, or broken down into channels instead of being one large block of spectrum. The new 802.11ac that is being deployed has only two 160 MHz channels, and once those are filled with a big bandwidth draw, say a virtual reality headset, there won’t be room for a second large-bandwidth application. So forget using more than one VR headset at the same time, or in general trying to run more than one large bandwidth-demanding application.

It’s going to be interesting to see what happens if these problems manifest in homes and businesses. I am imagining a lot of finger-pointing between the various WiFi device companies – when the real problem will be plain old physics.

Categories
Current News Technology

The 600 MHz Incentive Auction

The FCC has again delayed the incentive auction for the 600 MHz spectrum. In a recent public notice, FCC 14-191, the agency is seeking comments on bidding procedures for the upcoming auction. Most of the document deals with the non-technical aspects of the auction, such as bid pricing and procedures.

For those not familiar with this spectrum, today much of it is used by UHF television stations. The upcoming auction is being called an incentive auction because TV stations willing to give up their public spectrum or to be relocated within the spectrum will share in the proceeds of the sale of their spectrum.

But stations aren’t being mandated to leave this spectrum, and the recent public notice discusses for the first time what might happen to stations that elect to remain on the public airwaves. The FCC proposes to ‘repack’ a station’s frequency, placing it anywhere within the 600 MHz range in a way that optimizes the 600 MHz band in a given market.

The controversial part of the idea is that stations could be placed into spectrum that is used by somebody else. For instance, a TV station could be put into spectrum that is reserved today for wireless microphones. Or, even more controversially, a station could be placed into what is called the duplex gap, which is a spectrum buffer that sits between major pieces of spectrum and is used to reduce interference between different technologies. The easiest way to think of the duplex gap is to envision it as a buffer channel that nobody gets to use.

The FCC’s ideas aren’t pleasing anybody. TV stations are worried that they will end up in parts of the spectrum that are polluted by other traffic, which will mar transmission quality. And the wireless carriers are unhappy since the TV stations might end up interfering with cellular calls. It’s going to be interesting to read the comments that the FCC gets on this issue and to see how it can be resolved. The auction will quickly fall apart if the stations all decide not to participate.

There are many other interesting parts to this auction. The FCC would like to assign some of the 600 MHz band as unlicensed spectrum for use for WiFi. The 600 MHz band is one of the more useful spectrum bands around in terms of transmission characteristics. It can go long distances and can travel easily through walls and buildings (just think back to the ease of receiving UHF channels on your TV in the basement). The FCC also wants to create more room for ‘white space devices’ that can use the spectrum for high-speed wireless data transmission.

But not everybody is enthusiastic about the ways the FCC plans to do this. The FCC’s plan is to aggressively squeeze as much use as possible out of the spectrum and to allow white space devices to operate in the guard bands at power levels that might impair licensed spectrum. AT&T has said that it might not participate in the auction if it believes that the spectrum it buys will be compromised.

The fear expressed by radio engineers is that the current proposal will cause noticeable interference. For example, they say that a device using the white space, say a tablet, and a cellphone using a licensed portion of the 600 MHz might interfere with each other when used together in the same room.

There are already a lot of devices using this frequency today. In addition to low-power TV stations, it’s used widely by wireless microphones, medical telemetry, and radio astronomy, and there is fear that the repacking is going to harm all of these uses.

I don’t know if the FCC has anything harder to solve than our shortage of spectrum. The demand for spectrum has grown rapidly and many of the existing bands get easily congested with traffic at peak times. The wireless carriers are clamoring for more spectrum while at the same time there are dozens of other uses of the spectrum including public safety and the military that must be considered in any wireless plan.

I don’t know if it would be possible to develop a good spectrum allocation plan if you started from scratch today, but it seems nearly impossible to satisfy everybody as we try to fit new uses of spectrum over top of a spectrum allocation that was made in a very different time. I don’t envy the FCC the task of figuring this out.