How Much Better is 802.11ax?

The new WiFi standard 802.11ax is expected to be ratified and released as a standard sometime next year. Under the new industry nomenclature this will be called WiFi 6. Many of the bandwidth woes we have in our homes today are due to the current 802.11ac standard that it will replace. 802.11ax will introduce a number of significant improvements that ought to improve home WiFi performance.

To understand why these improvements are important we need to first understand the shortcomings of the current WiFi protocols. The industry groups that developed the current WiFi standards had no idea that WiFi would become so prevalent and that the average home might have dozens of WiFi capable devices. The current problems all arise from a WiFi router trying to satisfy multiple demands for a data stream from multiple devices. Unlike cellular technologies, WiFi has no central traffic cop and every device in the environment can make an equal claim for connectivity. When a WiFi router has more demands for usage than it has available channels it pauses and interrupts all data streams until it chooses how to reallocate bandwidth. In a busy environment these stops and restarts can be nearly continuous.

The improvements from 802.11ax will all come from smarter ways to handle requests for connectivity from multiple devices. There is only a small improvement in overall bandwidth, with a raw physical data rate of 500 Mbps compared to 422 Mbps for 802.11ac. Here are the major new innovations:

Orthogonal Frequency-Division Multiple Access (OFDMA). This improvement will likely have the biggest impact in a home. OFDMA slices the few big existing WiFi channels into smaller channels, called resource units. A router will be able to make multiple smaller-bandwidth connections using resource units, avoiding the packet collisions and the start/stop cycle caused by each device asking for primary connectivity.
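
The resource-unit idea can be sketched in a few lines of Python. This is purely an illustration of the scheduling concept, not real router code; the nine 26-tone resource units per 20 MHz channel follow the 802.11ax channelization, but the device names and the round-robin policy are invented for the example.

```python
# Illustrative sketch: carving one 20 MHz 802.11ax channel into 26-tone
# resource units and giving each pending device its own slice, so small
# transmissions can run in parallel instead of every device contending
# for the whole channel.

RUS_PER_20MHZ = 9          # a 20 MHz channel splits into nine 26-tone RUs

def assign_resource_units(devices):
    """Round-robin pending devices onto the available resource units."""
    schedule = {}
    for i, device in enumerate(devices):
        ru = i % RUS_PER_20MHZ
        schedule.setdefault(ru, []).append(device)
    return schedule

pending = ["laptop", "phone", "thermostat", "doorbell"]
print(assign_resource_units(pending))
# with nine or fewer devices, each one lands on its own resource unit
```

With more than nine devices the scheduler starts doubling devices up on resource units, which is where an 802.11ax router would fall back to taking turns.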

Bi-Directional Multi-User MIMO. In the last few years we’ve seen home WiFi routers introduce MIMO, which uses multiple antennas to make connections to different devices. This solves one of the problems of WiFi by allowing multiple devices to download separate data streams at the same time without interference. But today’s WiFi MIMO still has one big problem: it only works for downloading. Whenever any device requests a channel for uploading, today’s MIMO pauses all the downloading streams. Bi-Directional MIMO will allow for 2-way data streams, meaning that a request to upload won’t kill downstream transmissions.

Spatial Frequency Reuse. This will have the most benefit in apartments or in homes that have networked multiple WiFi routers. Today a WiFi transmission will pause for any request for connection, even for connections made to a neighbor’s router from the neighbor’s devices. Spatial Frequency Reuse doesn’t fix that problem, but it allows neighboring 802.11ax routers to coordinate and to adjust the power of transmission requests, increasing the chance that a device can connect to, and stay connected to, the proper router.

Target Wake Time. This will allow small devices to remain silent most of the time and only communicate at specific, pre-set times. Today a WiFi router can’t distinguish between a request from a smart blender and a smart TV, and requests from multiple small devices can badly interfere with the streams we care about to big devices. This feature will reduce, and distribute over time, the requests for connectivity from the ever-growing horde of small devices we all have.
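
A rough sketch of the Target Wake Time idea follows, with invented device names and a made-up 60-second service period; real TWT negotiation is far more elaborate, but the core trick is just staggering the check-in times.

```python
# Illustrative sketch of Target Wake Time: instead of letting every
# small device contend whenever it likes, the router hands each one a
# staggered, pre-agreed wake offset within a service period, so their
# transmissions are spread out rather than colliding.

SERVICE_PERIOD_S = 60      # each device checks in once per minute

def schedule_wake_times(devices):
    """Spread device wake offsets evenly across the service period."""
    slot = SERVICE_PERIOD_S / len(devices)
    return {dev: round(i * slot, 1) for i, dev in enumerate(devices)}

iot = ["blender", "thermostat", "door_lock", "camera"]
print(schedule_wake_times(iot))
# → {'blender': 0.0, 'thermostat': 15.0, 'door_lock': 30.0, 'camera': 45.0}
```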

There’s no rush to go out and buy an 802.11ax router, although tech stores will soon be pushing them. Like all generations of WiFi they will be backwards compatible with earlier WiFi standards, but for a few years they won’t do anything differently than your current router. This is because all of the above features require updated WiFi edge devices that also support the new 802.11ax standard. There won’t be many devices manufactured with the new standard even in 2019. Even after we introduce 802.11ax devices into our homes we’ll continue to be frustrated, since our older WiFi edge devices will continue to communicate in the same inefficient way as today.

The Future of WiFi

There are big changes coming over the next few years with WiFi. At the beginning of 2017 a study by Parks Associates showed that 71% of broadband homes now use WiFi to distribute the signal – a percentage that continues to grow. New home routers now use the 802.11ac standard, although there are still plenty of homes running the older 802.11n technology.

But there is still a lot of dissatisfaction with WiFi and many of my clients tell me that most of the complaints they get about broadband connections are due to WiFi issues. These ISPs deliver fast broadband to the home only to see WiFi degrading the customer experience. But there are big changes coming with the next generation of WiFi that ought to improve the performance of home WiFi networks. The next generation of WiFi devices will be using the 802.11ax standard and we ought to start seeing devices using the standard by early 2019.

There are several significant changes in the 802.11ax standard that will improve the customer WiFi experience. First is the use of a wider spectrum channel at 160 MHz, which is four times larger than the channels used by 802.11ac. A bigger channel means that data can be delivered faster, which will solve many of the deficiencies of current WiFi home networks. This will improve the network performance using the brute strength approach of pushing more data through a connection faster.

But probably more significant is the use in 802.11ax of 4X4 MIMO (multiple input / multiple output) antennas. These new antennas will be combined with orthogonal frequency division multiple access (OFDMA). Together these new technologies will provide for multiple and separate data streams within a WiFi network. In layman’s terms, think of the new technology as operating four separate WiFi networks simultaneously. By distributing the network load to separate channels, the interference on any given channel will decrease.

Reducing interference is important because that’s the cause of a lot of the woes of current WiFi networks. The WiFi standard allows for unlimited access to a signal and every device within the range of a WiFi network has an equal opportunity to grab the WiFi network. It is this open sharing that lets us connect lots of different devices easily to a WiFi network.

But the sharing has a big downside. A WiFi network shares signals by shutting down when it gets more than one request for a signal. The network pauses for a short period of time and then bursts energy to the first device it notices when it restarts. In a busy WiFi environment the network stops and starts so often that the total throughput on the network drops significantly.

But with four separate networks running at the same time there will be far fewer stops and starts, and a user on any one channel should have a far better experience than today. Further, with the OFDMA technology the data from multiple devices can coexist better, meaning that a WiFi router can better handle more than one device at the same time, further reducing the negative impact of competing signals. The technology lets the network smoothly mix signals from different devices to avoid network stops and starts.

The 802.11ax technology ought to greatly improve the home WiFi experience. It will have bigger channels, meaning it can send and receive data to WiFi connected devices faster. And it will use the MIMO antennas to make separate connections with devices to limit signal collision.

But 802.11ax is not the last WiFi improvement we will see. Japanese scientists have made recent breakthroughs using what is called the terahertz range of frequencies – spectrum above 300 GHz. They’ve used the 500 GHz band to create a 34 Gbps WiFi connection. Until now, work in these higher frequencies has been troublesome because transmission distances have been limited to extremely short distances of a few centimeters.

But the scientists have created an 8-array antenna that they think can extend the practical reach of fast WiFi to as much as 30 feet – more than enough to create blazingly fast WiFi in a room. These frequencies will not pass through barriers and would require a small transmitter in each room. But the scientists believe the transmitters and receivers can be made small enough to fit on a chip – making it possible to affordably put the chips into any device, including cell phones. Don’t expect multi-gigabit WiFi for a while. But it’s good to know that scientists are working a generation or two ahead on technologies that we will eventually want.

Why is my WiFi Slow?

One of the universal complaints in the broadband world is that WiFi networks operate poorly. So today I thought I’d talk a bit about how WiFi functions. I think it’s probably different than what most people expect.

Most people know that there are two frequencies used for WiFi today – 2.4 GHz and 5 GHz. The 2.4 GHz band covers 80 megahertz of total bandwidth and is divided into 11 channels in the US. That may sound like a lot, but one 802.11 connection requires five consecutive channels. In practical terms this means that almost all WiFi gear in the US is preset to only offer channels 1, 6, and 11 and that means that only three non-overlapping transmissions can occur at the same time. The WiFi in Japan covers a wider spectrum footprint, up to channel 14, meaning they can use four non-overlapping signals simultaneously.
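
The channel arithmetic above can be checked with a few lines of Python (an illustration, not networking code): channel centers sit 5 MHz apart, but a transmission occupies roughly 22 MHz, which is why only channels 1, 6, and 11 coexist cleanly in the US.

```python
# A quick check of the 2.4 GHz channel arithmetic: two channels
# interfere unless their center frequencies are at least one
# transmission width apart, i.e. their channel numbers differ by 5+.
CHANNEL_SPACING_MHZ = 5
TRANSMISSION_WIDTH_MHZ = 22

def overlaps(ch_a, ch_b):
    """True if the two channels' transmissions would interfere."""
    gap = abs(ch_a - ch_b) * CHANNEL_SPACING_MHZ
    return gap < TRANSMISSION_WIDTH_MHZ

# greedily pick non-overlapping channels from the 11 available in the US
clear = []
for ch in range(1, 12):
    if all(not overlaps(ch, picked) for picked in clear):
        clear.append(ch)
print(clear)   # → [1, 6, 11]
```

Running the same loop up to channel 14, as in Japan, yields a fourth usable channel.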

In practical use, if you can see three or more WiFi networks you are experiencing interference, meaning that more than one network is trying to use the same channel at the same time. It is the nature of this interference that causes the most problems with WiFi performance. When two signals are both trying to use the same channel, the WiFi standard causes all competing devices to go quiet for a short period of time, and then both restart and try to grab an open channel. If the two signals continue to interfere with each other, the delay time between restarts increases exponentially in a phenomenon called backoff. As there are more and more collisions between competing networks, the backoff increases and the performance of all devices trying to use the spectrum decays. Your data is transmitted in short bursts each time your device grabs a channel, before the restart cycle repeats.
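
The backoff behavior can be sketched as a toy model. The slot time and contention-window bounds below follow the usual 802.11 values, but this is an illustration of the doubling pattern, not a protocol implementation:

```python
import random

# Toy model of 802.11 exponential backoff: after each collision the
# contention window doubles (up to a cap), so the average wait before
# a retry grows quickly while the channel stays busy.

SLOT_TIME_US = 9           # one backoff slot, in microseconds
CW_MIN, CW_MAX = 15, 1023  # contention window bounds, in slots

def backoff_delay_us(collisions):
    """Pick a random wait after the given number of collisions."""
    cw = min(CW_MAX, (CW_MIN + 1) * (2 ** collisions) - 1)
    return random.randint(0, cw) * SLOT_TIME_US

# the window after each successive collision: 15, 31, 63, 127, 255 slots
for n in range(5):
    cw = min(CW_MAX, (CW_MIN + 1) * (2 ** n) - 1)
    print(f"after {n} collisions a station may wait up to {cw * SLOT_TIME_US} µs")
```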

If you’ve ever been in a hotel where you can see ten or more other WiFi signals, the reason for slow speeds is that there are huge conflicts between competing devices. People generally assume that the hotel has a poor Internet connection, but it could have a fast connection while the slow speeds are due to so many devices trying to connect simultaneously. Each WiFi device is rapidly turning on and off, repeatedly trying to get open access to a channel. Your device will grab a channel for a short time and then get kicked off due to interference. Congestion has become so bad on the 2.4 GHz band that AT&T and Comcast no longer use 2.4 GHz for video or voice. Almost all smartphone makers no longer recommend using their smartphones at 2.4 GHz.

WiFi has improved dramatically with the introduction of the 5 GHz spectrum. In North America this spectrum swath has 24 non-overlapping channels, although more than half of them are reserved for weather and military radar. This still provides a lot more potential paths to add to the three provided by the 2.4 GHz spectrum. Unfortunately the 5 GHz band shares the same WiFi characteristics as the 2.4 GHz spectrum and has the identical interference issues. But with more open channels there is still an increased chance of finding a free channel to use.

And interference between devices is not the only culprit of poor WiFi speeds. The network configuration can also contribute to poor performance. Some of the biggest sources of interference are range extenders or mesh networks that are used to try to get better signals. Range extenders listen to all WiFi transmissions and then retransmit them at a higher power level, usually on a different channel. This creates even more WiFi signals in the immediate environment competing for an open channel. When you can see your neighbor’s WiFi network, and they are using range extenders, they might be trying to use most of the available WiFi channels at all times.

In a lot of the US we now also see a lot of public hotspots. For example, Comcast serves my neighborhood and I can walk around and maintain a WiFi signal in most places from the public WiFi signals that are transmitted from every Comcast home WiFi router. These public signals are always on, meaning that the WiFi router is using at least one channel at all times.

Probably the biggest new culprit for poor WiFi performance comes from our quest for greater speeds. The 802.11ac standard operates by merging together a lot of WiFi channels, dividing the whole WiFi spectrum into just two 160 MHz-wide channels. This means that only two devices using 802.11ac can use up all of your home WiFi bandwidth. The standard was intended to operate in short high-bandwidth bursts, but as people use it for gaming or watching 4K video the channels stay occupied all of the time.

Unfortunately the demands for WiFi are only increasing. The cellular carriers are still pestering the FCC to allow LTE-U, which would use WiFi spectrum to complete cellular calls. There are currently tests of the technology underway. We can also expect increasing demand for WiFi from IoT devices. While most WiFi devices won’t use spectrum continuously, they still place demands on the channels and cause interference. There is also increasing use of devices that are always on, such as video surveillance cameras or smart home controllers like the Amazon Echo. A lot of experts look out five or ten years and expect WiFi to be unusable in a lot of places.

Comcast and Real Competition

It’s really interesting to see how Comcast is reacting to Google Fiber in Atlanta. The company has had competition from fiber in the past in the form of Verizon FiOS. But the footprint for that competition hasn’t changed for years. Comcast and Verizon have competed with very similar data speeds and there was not a lot to distinguish one from the other from a product standpoint. Each company has bested the other in some markets, although Verizon seems to have gotten the upper hand in more places.

But now Comcast is facing Google Fiber for the first time and their reaction is interesting. From what I can see they are doing the following:

  • Comcast is offering a gigabit of speed for $70 per month. But it comes with a very ugly 3-year contract. For those who don’t take the 3-year contract the price will be $139.95 per month, plus Comcast will impose a 300 gigabyte monthly data cap that could add up to $35 per month for anybody who actually uses the data.
  • Comcast is using negative advertising against Google’s WiFi router, saying that Google’s WiFi speeds are 30 Mbps while their own is 725 Mbps.
  • And Comcast is widely distributing flyers that tell people in Atlanta not to fall for the Google hype.

So how do these claims stack up and will they be effective?

I think Comcast’s speed comparison is quite silly and that the public will see through it. The general public has been trained for a decade that fiber is better. Not that upload speeds matter to most people, but Google’s speeds are symmetrical while Comcast will have a relatively slow, perhaps 35 Mbps upload. On a fiber network it’s not too hard to engineer to deliver a true gigabit download almost all of the time. But Comcast is going to have the same issues it’s always had with its HFC network. If it sells too many gigabit customers, then its nodes will slow down for everybody on the node. I don’t believe that there are many homes today that really need a gigabit, but once Google is up and running it ought to win the speed test battle in the market.

There is some truth to Comcast’s claim about WiFi, although their numbers are quite skewed. For some reason Google Fiber is still using an 802.11n WiFi router. At best their WiFi routers are going to deliver about 300 Mbps – but in Kansas City the Google routers are reported on consumer websites to deliver about 80 Mbps on average. Comcast is offering 802.11ac routers, and while they are theoretically capable of the speeds they tout, in real life use they deliver between 200 Mbps and 300 Mbps.

The fact is that both companies (and most ISPs) are doing a very poor job with WiFi. Almost all of them offer a one-WiFi-router solution, which is not acceptable in today’s big bandwidth homes. I have a Comcast WiFi router and it delivers really low speeds to our offices, which are at opposite ends of the house from the central router. Until a carrier is willing to cross the threshold and install a WiFi network with multiple linked WiFi routers in a home, all of their solutions are going to be poor in real-life practice.

It appears that Comcast is relying on negative advertising against Google, and I seriously doubt this is going to work. Comcast has one of the most hated customer service experiences in the country and Google has been touted – so far – for offering outstanding customer service. It seems like a bad tactic to advertise negatively about somebody that will have a better network product and a better customer experience.

I think Comcast is really missing the point. It seems like they are spending their energy advertising against Google’s gigabit product. But Google announced that it is entering Atlanta with two data products – the gigabit at $70 and a 100 Mbps product at $50. My bet is that the slower product is likely to cut most into Comcast’s penetration rate unless they decide to scrap the 300 gigabyte monthly data cap. Where Comcast says that only a small percentage of customers use more data than that per month, my clients tell me otherwise. Once any customer has been charged extra for a data cap overage on Comcast, they will most likely change to Google and are likely to never come back.

Are We Expecting too Much from WiFi?

I don’t think that a week goes by when I don’t see somebody proposing a new use for WiFi. This leads me to ask if we are starting to ask too much from WiFi, at least in urban areas.

Like all spectrum, WiFi is subject to interference. Most licensed spectrum has strict rules against interference and there are generally very specific rules about how to handle contention if somebody is interfering with a licensed spectrum-holder. But WiFi is the wild west of spectrum and it’s assumed there is going to be interference between users. There is no recourse to such interference – it’s fully expected that every user has an equal right to the spectrum and everybody has to live with the consequences.

I look at all of the different uses for WiFi and it’s not too hard to foresee problems developing in real world deployments. Consider some of the following:

  • Just about every home broadband connection now uses WiFi as the way to distribute data around the house between devices.
  • Comcast has designed their home routers to have a second public transmitter in addition to the home network, so these routers initiate two WiFi networks at the same time.
  • There is a lot of commercial outdoor WiFi being built that can bleed over into home networks. For example, Comcast has installed several million hotspots that act to provide convenient connections outside for their landline data customers.
  • Many cities are contemplating building citywide WiFi networks that will provide WiFi for their citizens. There are numerous network deployments by cities, but over the next few years I think we will start seeing the first citywide WiFi networks.
  • Cable companies and other carriers are starting to replace the wires to feed TVs with WiFi. And TVs require a continuous data stream when they are being used.
  • Virtual reality headsets are likely to use WiFi to feed the VR headsets. There are already game consoles using WiFi to connect to the network.
  • There is a new technology that will use WiFi to generate the power for small devices like cellphones. For this technology to be effective the WiFi has to beam continuously.
  • And while they are not big bandwidth users at this point, a lot of IoT devices are going to count on WiFi to connect to the network.

On top of all of these uses, the NCTA sent a memo to the FCC on June 11 that warned of possible interference with WiFi spectrum from the LTE-U or LAA technologies used for cellphones. Outside interference is always possible, and in a spectrum that is expected to have interference it might be hard for the average user to detect or notice. There is generally nobody monitoring the WiFi spectrum for interference in the way that wireless carriers monitor their licensed spectrum.

All of these various uses of the spectrum raise several different concerns:

  • One concern is just plain interference – if you cram too many different WiFi networks into one area, each trying to grab the spectrum, you run into traditional radio interference which cuts down on the effectiveness of the spectrum.
  • WiFi has an interesting way of using spectrum. It is a good spectrum for sharing applications, but that is also its weakness. When there are multiple networks trying to grab the WiFi signal, and multiple user streams within those networks, each gets a ‘fair’ portion of the spectrum, somehow decided among the various devices and networks. This is a good thing in that a lot of simultaneous streams can happen at the same time on WiFi, but it also means that under a busy load the spectrum gets chopped into tiny little streams that can be too small to use. Anybody who has tried to use WiFi in a busy hotel knows what that’s like.
  • All WiFi is channelized, or broken down into channels instead of being one large block of spectrum. The new 802.11ac that is being deployed has only two 160 MHz channels, and once those are filled by a big bandwidth draw, say a virtual reality headset, there won’t be room for a second large bandwidth application. So forget using more than one VR headset at the same time, or in general trying to run more than one large bandwidth-demanding application.

It’s going to be interesting to see what happens if these problems manifest in homes and businesses. I am imagining a lot of finger-pointing between the various WiFi device companies – when the real problem will be plain old physics.

Maybe Finally a Faster WiFi

The first wave of 802.11ac WiFi routers is starting to show up in use and already there is something faster on the horizon. The IEEE has announced that it is starting to work on a new standard named 802.11ax, and it looks like the new standard might be able to deliver on some of the hype and promises that were mistakenly made about 802.11ac. This new standard probably is not going to be released until 2018.

I call the hype unfortunate because 802.11ac has widely been referred to as gigabit WiFi, but it is not even close to that. In real-world applications of the technology it’s been reported that the ac routers improve performance over today’s 802.11n routers by between 50% and 100%. That is a significant improvement, and it is a shame that the marketing hype of the companies pushing the technology has created an unfulfillable expectation for these routers. I refer you to my earlier blog that compares the reality to the hype.

The gigabit name given to 802.11ac has more to do with the increased capacity of the router to handle large bandwidth than it does with the connection speeds to any given device. But the 802.11ax standard is going to turn its attention to increasing the connections to users. The early goal of the new standard is to increase bandwidth to devices by as much as 4 times over what can be delivered with 802.11ac.

This improvement is going to come through the use of MIMO-OFDA. MIMO is multiple input / multiple output and refers to a system that has multiple antennas in the router. Devices can also have multiple antennas, although that’s not required. OFDA stands for orthogonal frequency division multiplexing, a standard used in 4G wireless networks today.

The combination of those two techniques means that more bits can be forced through a single connection to one device using a single receiving antenna. Making each individual connection from the router more efficient will improve the overall efficiency of the base router.

Interestingly, Huawei is already using these techniques in the lab and is seeing raw data rates as fast as 10 gigabits from a router. Huawei is one of the leaders of the 802.11ax standards process, and they don’t believe these routers will be market ready until at least 2018.

What I find most puzzling in today’s environment is that a lot of vendors have bought hook, line and sinker into the 802.11ac hype. For example, it’s been reported that a number of FTTH vendors and set-top box vendors are touting the use of 802.11ac instead of cabling to route TV signals around a home. This might work for single-family homes on large lots where there won’t be a lot of interference, but I can foresee many situations where this is going to be a challenge.

Certainly there is a lot of chance for interference when you try to do this in an urban environment where living units are crammed a lot closer together. I highlighted some of the forms of WiFi interference in another earlier blog. But there are other situations where WiFi will not be a great solution for transmitting cable signals between multiple sets. For example, there are plenty of older homes built in the fifties or earlier that have plaster walls with wire mesh lath, which can stop a WiFi signal dead. And there are homes that are larger than the range of the WiFi signal once walls and impediments are considered.

But it looks like the 802.11ax standard will finally create enough bandwidth to individual devices to enable WiFi as a reliable alternative to cabling within a house. My fear is that there are going to be so many cases where 802.11ac is a problem that WiFi will get a bad name before then. I fear the vendors who are relying on WiFi instead of wires might have been a generation too premature. I hope I’m wrong, but 802.11ac does not look to be enough of an improvement over our current WiFi to act as a reliable alternative to wires.

Scratching My Head Over Gigabit Wireless

Over the last few weeks I have seen numerous announcements from companies that plan to deliver gigabit wireless speeds using unlicensed spectrum. For example, RST announced plans to deliver gigabit wireless all over the state of North Carolina. Vivant announced plans to do the same in Utah. And I just scratch my head at these claims.

These networks plan to use the 5 GHz portion of the unlicensed spectrum that we have all come to collectively call WiFi. And these firms will be using equipment that meets the new WiFi standard of 802.11ac. That technology has the very unfortunate common name of gigabit WiFi, surely coined by some marketing guru. I say unfortunate, because in real life it isn’t going to deliver speeds anywhere near to a gigabit. There are two ways to deploy this technology to multiple customers, either through hotspots like they have at Starbucks or on a point-to-multipoint basis. Let’s look at the actual performance of 802.11ac in these two cases.

There is no doubt that an 802.11ac WiFi hotspot is going to perform better than the current hotspots that use 802.11n. But how much better in reality? A number of manufacturers have tested the new technology in a busy environment, and with multiple users the new 802.11ac looks to be between 50% and 100% better than the older 802.11n standard. That is impressive, but it is nowhere near gigabit speeds.

But let’s look deeper at the technology. One of the biggest improvements is that the transmitters can bond multiple WiFi channels to make one data path of up to 160 MHz. The downside is that there are only five such channels in the 5 GHz range, so only a tiny handful of devices can use that much spectrum at the same time. When there are multiple users the channel size automatically steps down until it ends up at the same 40 MHz channels as 802.11n.

The most important characteristic of 5 GHz in this application is how fast the spectrum dies with distance. In a recent test with a Galaxy S4 smartphone, the phone could get 238 Mbps at 15 feet, 193 Mbps at 75 feet, 154 Mbps at 150 feet and very little at 300 feet. This makes the spectrum ideal for inside applications, but an outdoor hotspot isn’t going to carry very far.
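
Much of that falloff is plain physics. As a rough check, here is the standard free-space path loss formula evaluated at 5 GHz for the same distances; free space is the best case, and real rooms with walls and people lose considerably more.

```python
import math

# Free-space path loss in dB, with distance in meters and frequency in
# MHz: FSPL = 20*log10(d) + 20*log10(f) - 27.55. This is a lower bound
# on the real loss; obstructions at 5 GHz make matters worse.

def fspl_db(distance_m, freq_mhz):
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

FEET_TO_M = 0.3048
for feet in (15, 75, 150, 300):
    loss = fspl_db(feet * FEET_TO_M, 5000)   # the 5 GHz band
    print(f"{feet:>3} ft: about {loss:.0f} dB of free-space loss")
```

Each quadrupling of distance adds roughly 12 dB of loss, which is why the Galaxy S4 numbers decay so quickly and why 5 GHz works best indoors at short range.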

So why do they call this gigabit WiFi if the speeds above are all that you can get? The answer is that the hotspot technology can include something called beamforming and can combine multiple data paths to a device (assuming that the device has multiple receiving antennas). In theory one 160 MHz channel can deliver 433 Mbps. However, in the real world there are overheads in the data path, and about the fastest speed that has been achieved in a lab is about 310 Mbps. Combine three of those (the most that can be combined) and a device that is right next to the hotspot could get about 900 Mbps. But again, the speeds listed above for the Galaxy S4 test are more representative of the speeds that can be obtained in a relatively empty environment. Put a bunch of users in the room and the speeds drop from there.
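
To see where the gigabit label comes from, here is the arithmetic from the paragraph above laid out explicitly, using the figures already quoted:

```python
# The marketing number multiplies the theoretical per-stream rate by
# the maximum of three bonded streams; the lab-measured per-stream
# rate tells a more modest story, and both assume a device sitting
# right next to the hotspot with multiple receiving antennas.

THEORETICAL_PER_STREAM_MBPS = 433
LAB_PER_STREAM_MBPS = 310
MAX_STREAMS = 3

marketing_rate = THEORETICAL_PER_STREAM_MBPS * MAX_STREAMS
realistic_best = LAB_PER_STREAM_MBPS * MAX_STREAMS

print(f"marketing math:  {marketing_rate} Mbps")   # → 1299 Mbps
print(f"best lab result: {realistic_best} Mbps")   # → 930 Mbps
```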

But when companies talk about delivering rural wireless they are not talking about hotspots, but about point-to-multipoint networks. How does this spectrum do on those networks? When designing a point-to-multipoint network the engineer has two choices. They can open up the spectrum to deliver the most bandwidth possible, but if they do that, the point-to-multipoint network won’t do any better than the hotspot. Or, through techniques known as wave shaping, they can design the whole system to maximize the bandwidth at the furthest point in the network. In the case of 5 GHz, about the best that can be achieved is to deliver just under 40 Mbps at 3 miles. You can get a larger throughput by shortening that to one or two miles, but anybody who builds a tower wants to reach as far as they can, so 3-mile networks are what will likely be built.

However, once you engineer for the furthest point, that is then the same amount of bandwidth that can be delivered anywhere, even right next to the transmitter. Further, that 40 Mbps is total bandwidth and that has to be divided into an upload and download path. This makes a product like 35 Mbps download and 5 Mbps upload a possibility for rural areas.

If this is brought to an area that has no broadband it is a pretty awesome product. But this is nowhere near the bandwidth that can be delivered with fiber, or even with cable modems. It’s a nice rural solution, but one that is going to feel really tiny five years from now when homes are looking for 100 Mbps speeds at a minimum.

So it’s unfortunate that these companies are touting gigabit wireless. This technology only has this name because it’s theoretically possible in a lab environment to get that much output to one device. But it creates a really terrible public expectation to talk about selling gigabit wireless and then delivering 35 Mbps, or 1/28th of a gigabit.