FCC – Please Do the Right Thing with RDOF

The $42.5 billion in federal BEAD broadband grants being funded through the Infrastructure Investment and Jobs Act should be a gamechanger for rural broadband. Many hundreds of millions of dollars in grants will go to each state to fund the construction of broadband networks. This is likely once-in-a-generation funding, so there will only be one chance to do this right.

There is one pending issue that could really gum up the BEAD grants – RDOF awards that are still pending and that should not be funded. These fall into three categories.

First are RDOF auction winners that have probably bitten off more than they can chew. An example of this might be LTD Broadband. I don’t have any inside knowledge of the company, but I’ve seen estimates that the company would need to raise something north of $7 billion to go along with the $1 billion RDOF award. There are likely other similar companies in the auction. The FCC has had almost a year to determine the financial ability of grant winners to fund the rest of the projects they won. If these companies don’t have the needed funding, it’s time for the FCC to cut them loose. This shouldn’t be a hard determination.

The second category is unique. Starlink won nearly a billion dollars of RDOF funding. There are still a lot of unknowns about the company’s capabilities. I know some of the RDOF areas won by Starlink are heavily wooded, and from what I hear, that’s a big problem for the technology. There are also still questions about the ability of Starlink to serve every home in a grant area – which is what the RDOF requires. I have nothing against Starlink, and if I lived in a rural area, I would have been first in line for the beta test. But the company’s technology is still unproven in terms of being able to serve everybody. The company is also still a start-up with no guarantee of success or longevity. At the end of the day, Starlink doesn’t meet the basic requirement that federal funding should only go to companies that can guarantee to meet the requirements of the award.

Finally, there are the RDOF auction winners that claim to be able to deliver gigabit wireless technology. Like Starlink, these are not field-proven technologies and likely will never deliver what is being promised. Over the last year, I haven’t talked to a single engineer who thinks it’s possible to deliver a wireless gigabit to every customer in rural Census blocks. I have no doubt that the new wireless technologies have the capability of being a lot faster than current fixed wireless technology. But these grants weren’t awarded to deliver a few hundred megabits per second. These grant winners should be tossed for overclaiming the technology, since doing so gave them an unfair advantage in the auction. If they had bid with the ability to deliver 200 Mbps, the auction results would have been very different. These companies gamed the auction rules, and that alone should have invalidated the awards. Unfortunately, the FCC might be ready to make these awards, having recently awarded funding to Resound Networks to provide gigabit wireless broadband.

It’s obvious that the FCC is already wrestling with all of these issues because it’s been eleven months since the RDOF winners filed their long-form information. But the FCC must know that the BEAD grants change everything. If it had known that BEAD grants were coming, the FCC probably would not have held the reverse auction. This new federal grant money changes the equation and brings a new paradigm that should make it easier for the FCC to make up its mind about questionable RDOF awards.

If the FCC gets this wrong, then the RDOF areas in question won’t be seeing the same broadband solutions that are coming everywhere else. The BEAD grants make it easy for the FCC to reject applicants that have not demonstrated the financial wherewithal to fund the promised RDOF solution. The BEAD grants should make it easy to reject Starlink – the company is still free to market broadband to all of rural America, and it already has a huge waiting list of people willing to buy service. The BEAD grants should make it easier for the FCC to admit it erred in letting bidders overclaim technology.

It’s not going to be easy for the FCC to publicly admit that it made some big mistakes in the RDOF auction. Most of these issues could have been avoided if the FCC had pre-screened applicants. Any technology that was not already proven to work in the real world should have been excluded from the auction. Applicants should have been given a dollar limit for participation in the auction based on their balance sheet. But the FCC has a chance to set this right by rejecting the questionable awards and letting the folks that live in these areas have a chance for a better and more permanent broadband solution through BEAD grants. FCC – please do the right thing.

Google Looking at Wireless Drops

In an interview with Re/code, Craig Barrett, the CEO of Access for Alphabet, said that Google is looking at wireless last-mile technologies. Google is not the only one looking at this. The founder of Aereo has announced a new wireless initiative to launch this summer in Boston under the brand name of Starry. And Facebook says it is also investigating the technology.

The concept is not new. I remember visiting an engineer in Leesburg, Virginia back in the 90s who had developed a wireless local loop technology. He had working prototypes that could beam a big data pipe for the time (I’m fuzzily remembering a hundred Mbps back when DSL was still delivering 1 Mbps). His technology was premature in that there wasn’t any good technology at the time for bringing fast broadband to the curb.

As usual, there will be those who jump all over this news and declare that we no longer need to build fiber. But even if one of these companies develops and perfects the best imaginable wireless technology, there is still going to have to be a lot of fiber built. All of these new attempts to develop wireless last-mile technologies share a few common traits that are dictated by the nature of wireless spectrum.

First, to get the kind of big bandwidth that Google wants to deliver, the transmitter and the customer have to be fairly close together. Starry is talking about a quarter-mile delivery distance. One characteristic of any wireless signal is that the signal weakens with distance. And the higher the frequency of the spectrum used, the faster the signal deteriorates.
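Both effects fall out of the standard free-space path loss formula, which grows with both distance and frequency. Here is a minimal sketch (the frequencies are illustrative choices, not anyone’s actual deployment parameters):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 3e8  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

quarter_mile_m = 402  # roughly Starry's quoted delivery distance, in meters
for freq_ghz in (2.4, 5.0, 28.0, 60.0):
    loss = fspl_db(quarter_mile_m, freq_ghz * 1e9)
    print(f"{freq_ghz:>4} GHz at a quarter mile: {loss:.0f} dB of free-space loss")
```

Every doubling of the frequency adds about 6 dB of loss, which is why the higher frequencies fade so much faster.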

Second, unless there is some amazing breakthrough, a given transmitter will have a fixed and limited number of possible paths that can be established to customers. This characteristic makes it very difficult to connect to a lot of customers in a densely populated area and is one of the reasons that wireless today is more often used in less densely populated places.

Third, the connection for this kind of point-to-multipoint network must be line-of-sight. In an urban environment every building creates a radio ‘shadow’ and blocks access to customers sitting behind it. This can be overcome to a small degree with technologies that bounce the signal from one customer to another – but such retransmission of a signal cuts both the strength of the signal and the associated bandwidth.

However, Google has already recognized that there are a lot of people unwilling or unable to buy a gigabit of bandwidth from them on fiber. In Atlanta the company is not just selling a gigabit connection but is also hitting the street with a 100 Mbps connection for $50. A good wireless system that had access to the right kind of spectrum could deliver that kind of bandwidth to a fairly reasonable number of customers around a given transmitter. But it would be technically challenging to try to do the same with gigabit bandwidth unless each transmitter served fewer customers (and had to be even closer to the customer). A gigabit wireless network would start looking a lot like the one I saw years ago in Virginia where there was a transmitter for just a few nearby customers – essentially fiber to the curb with gigabit wireless local loops.
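Some back-of-the-envelope math shows why a 100 Mbps product stretches so much further than a gigabit one. This is a hypothetical sketch; the transmitter capacity and oversubscription ratio are assumptions for illustration, not figures from Google or Starry:

```python
def customers_supported(transmitter_gbps: float, tier_mbps: float,
                        oversubscription: float) -> int:
    """Subscribers one transmitter can carry at a given speed tier,
    assuming the usual ISP practice of oversubscribing shared capacity."""
    capacity_mbps = transmitter_gbps * 1000
    return int(capacity_mbps * oversubscription / tier_mbps)

# Assume a 2 Gbps transmitter and a 20:1 oversubscription ratio (illustrative).
for tier_mbps in (100, 1000):
    count = customers_supported(2, tier_mbps, 20)
    print(f"{tier_mbps} Mbps tier: ~{count} customers per transmitter")
```

With the same assumed transmitter, the gigabit tier supports a tenth as many customers, which is why gigabit wireless pushes toward a transmitter for every handful of homes.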

But if Starry can do what they are shooting for, the delivery of a few hundred Mbps of bandwidth at an affordable price will be very welcome today and would provide real competition to the cable companies that have monopolies in most urban neighborhoods. But, and here is where many might disagree with me, the time is going to come in a decade or two when 200 Mbps of bandwidth will be just as obsolete as first-generation DSL has become in the twenty years since it was developed.

Over the next twenty years we can expect the full development of virtual and augmented reality so that real telepresence is available – holographic images of people and places brought to the home. This kind of technology will require the kind of bandwidth that only fiber can deliver. I think we’ll start seeing this just a few years from now. I can already imagine a group of teenagers gathering at one home, each with their own headset to play virtual reality games with people somewhere else. That application will very easily require a gigabit pipe just a few years from now.

I welcome the idea of the wireless last mile if it serves to break the cable monopoly and bring some real price competition into broadband. It’s a lot less appealing if the wireless companies decide instead to charge the same high prices as the incumbents. It sounds like the connections that Starry is shooting for are going to be fast by today’s standards, but I’m betting that within a few decades the technology will fall by the wayside – like every technology that doesn’t bring a fast wire to the home.

The 5G Hype

Both AT&T and Verizon have had recent press releases about how they are currently testing 5G cellular data technology, and touting how wonderful it’s going to be. The AT&T Press release on 5G included the following statements:

Technologies such as millimeter waves, network function virtualization (NFV), and software-defined networking (SDN) will be among the key ingredients for future 5G experiences. AT&T Labs has been working on these technologies for years and has filed dozens of patents connected with them. . . . We expect 5G to deliver speeds 10-100 times faster than today’s average 4G LTE connections. Customers will see speeds measured in gigabits per second, not megabits.

AT&T went on to say that they are testing the technology now and plan to start applying it in a few applications this year in Austin, TX.

This all sounds great, but what are the real facts about 5G? Consider some of the following:

Let’s start with the standard for 5G. It has not yet been finalized and isn’t expected until 2018. The Next Generation Mobile Network Alliance (the group that will be developing the standard) states that the standard is aimed at enabling the following:

  • Data rates of several tens of megabits per second supported for tens of thousands of users;
  • 1 gigabit per second offered simultaneously to workers on the same office floor;
  • Several hundreds of thousands of simultaneous connections supported for massive sensor deployments.

How does this stack up against AT&T’s claims? First, let’s talk about how 4G does today. According to OpenSignal (which studies the speeds from millions of cellular connections), the average LTE download speeds in the 3rd quarter of last year for the major US carriers were 6 Mbps for Sprint, 8 Mbps for AT&T, and 12 Mbps for both Verizon and T-Mobile.

The standard aims to improve average speeds for regular outdoor usage to ‘several tens of megabits per second’, which means speeds of maybe 30 Mbps. That is a great data speed on a cellphone, but it is not 10 to 100 times faster than today’s 4G speeds; it’s a nice incremental bump upward.
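Putting numbers on that gap using the OpenSignal averages above makes the point:

```python
# Average measured LTE download speeds (Mbps), from the OpenSignal figures above.
avg_lte_mbps = {"Sprint": 6, "AT&T": 8, "Verizon": 12, "T-Mobile": 12}

outdoor_goal_mbps = 30  # 'several tens of megabits per second'
for carrier, speed in avg_lte_mbps.items():
    ratio = outdoor_goal_mbps / speed
    print(f"{carrier}: the outdoor 5G goal is ~{ratio:.1f}x today's average, "
          f"versus a claimed 10-100x ({10 * speed}-{100 * speed} Mbps)")
```

A three-to-five-fold bump is real progress, but it is nowhere near the marketing claim.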

Where the hype comes from is the part of the standard that talks about delivering speeds within an office. With 5G that is going to be a very different application, and that very well might achieve gigabit speeds. This is where the millimeter waves come into play. As it turns out, AT&T and Verizon are talking about two totally different technologies and applications, but are purposefully making people think there will be gigabit cellular data everywhere.

The 5G standard is going to allow for the combination of multiple very high frequencies to be used together to create a very high-bandwidth data path of a gigabit or more. But there are characteristics of millimeter wavelengths that limit this to indoor usage inside the home or office. For one, these frequencies will hardly pass through anything and are killed by walls, curtains, and to some extent even clear windows. And the signal from these frequencies can only carry large bandwidth a very short distance – at the highest bandwidth perhaps sixty feet. This technology is really going to be a competitor to WiFi, but using cellular frequencies and standards. It will allow the fast transfer of data within a room or an office and would provide a wireless way to transmit something like Google’s gigabit broadband around an office without wires.
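The reason wide millimeter-wave channels can hit a gigabit is basic information theory: the Shannon capacity limit scales linearly with channel width. A rough sketch (the channel widths and the signal-to-noise ratio here are illustrative assumptions, not values from the draft standard):

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon limit C = B * log2(1 + SNR), with SNR converted from dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

snr_db = 15  # plausible for a short indoor link; an assumption
for bandwidth_mhz in (20, 160, 800):  # LTE-sized, WiFi-sized, bonded mmWave-sized
    cap = shannon_capacity_mbps(bandwidth_mhz, snr_db)
    print(f"{bandwidth_mhz:>4} MHz channel: ~{cap:,.0f} Mbps theoretical ceiling")
```

The wide channels only exist up at millimeter-wave frequencies, and those frequencies only work across a room – which is why the gigabit application is an indoor one.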

But these millimeter waves are not going to bring the same benefits outdoors that they bring indoors. There certainly can be places where somebody could get much faster speeds from 5G outdoors – if they are close to a tower and there are not many other users. But these much faster speeds are not going to work, for example, for somebody in a moving car. The use of multiple antennas for multiple high frequencies is going to require an intricate and complicated antenna array at both the transmitter and the receiver. And in any case, the distance limitations and the poor penetration ability of millimeter frequencies mean this application will never be of much use for widespread outdoor cellphone coverage.

So 5G might mean that you will be able to get really fast speeds inside your home, at a convention center or maybe a hotel, assuming that those places have a very fast internet backbone connection. But the upgrade to what you think of as cellular data is going to be a couple-fold increase in data speeds for the average user. And even that is going to mean slightly smaller coverage circles from a given cell tower than 4G.

The problem with this kind of hype is that it convinces non-technical people that we don’t need to invest in fiber because gigabit cellular service is coming very soon. And nothing could be further from the truth. There will someday be gigabit speeds, just not in the way that people are hoping for. And both big companies make this sound like it’s right around the corner. There is no doubt that the positive press over this is great for AT&T and Verizon. But don’t buy the hype – because they are not promising what people think they are hearing.

Scratching My Head Over Gigabit Wireless

Over the last few weeks I have seen numerous announcements of companies that plan to deliver gigabit wireless speeds using unlicensed spectrum. For example, RST announced plans to deliver gigabit wireless all over the state of North Carolina. Vivant announced plans to do the same in Utah. And I just scratch my head at these claims.

These networks plan to use the 5 GHz portion of the unlicensed spectrum that we have all come to collectively call WiFi. And these firms will be using equipment that meets the new WiFi standard of 802.11ac. That technology has the very unfortunate common name of gigabit WiFi, surely coined by some marketing guru. I say unfortunate, because in real life it isn’t going to deliver speeds anywhere near to a gigabit. There are two ways to deploy this technology to multiple customers, either through hotspots like they have at Starbucks or on a point-to-multipoint basis. Let’s look at the actual performance of 802.11ac in these two cases.

There is no doubt that an 802.11ac WiFi hotspot is going to perform better than the current hotspots that use 802.11n. But how much better in reality? A number of manufacturers have tested the new technology in a busy environment, and with multiple users the new 802.11ac looks to be between 50% and 100% better than the older 802.11n standard. That is impressive, but it is nowhere near gigabit speeds.

But let’s look deeper at the technology. One of the biggest improvements is that the transmitters can bond multiple WiFi channels into one data path of up to 160 MHz. The downside is that there are only five channels in the 5 GHz range, and so only a tiny handful of devices can use that much spectrum at the same time. When there are multiple users, the channel size automatically steps down until it ends up at the same 40 MHz channels as 802.11n.
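A simplified model of that step-down behavior (this is a toy illustration of the trend, not the actual 802.11ac channel-arbitration logic):

```python
def usable_channel_mhz(active_devices: int) -> int:
    """Toy model: wide bonded channels are only available when few devices
    contend for the spectrum; under load, widths fall back to 40 MHz."""
    if active_devices <= 1:
        return 160
    if active_devices <= 3:
        return 80
    return 40  # back to 802.11n-sized channels

for n in (1, 2, 5, 20):
    print(f"{n:>2} active devices: ~{usable_channel_mhz(n)} MHz channel each")
```

The headline ‘gigabit’ number assumes one device with the whole 160 MHz to itself, which almost never describes a busy hotspot.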

The most important characteristic of 5 GHz in this application is how fast the spectrum dies with distance. In a recent test with a Galaxy S4 smartphone, the phone could get 238 Mbps at 15 feet, 193 Mbps at 75 feet, 154 Mbps at 150 feet and very little at 300 feet. This makes the spectrum ideal for inside applications, but an outdoor hotspot isn’t going to carry very far.

So why do they call this gigabit WiFi if the speeds above are all that you can get? The answer is that the hotspot technology can include something called beamforming and can combine multiple data paths to a device (assuming that the device has multiple receiving antennas). In theory one 160 MHz channel can deliver 433 Mbps. However, in the real world there are overheads in the data path, and about the fastest speed that has been achieved in a lab is around 310 Mbps. Combine three of those (the most that can be combined), and a device that is right next to the hotspot could get roughly 900 Mbps. But again, the speeds listed above for the Galaxy S4 test are more representative of the speeds that can be obtained in a relatively empty environment. Put a bunch of users in the room and the speeds drop from there.
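Here is the arithmetic behind the ‘gigabit’ label, using the numbers above:

```python
theoretical_per_stream_mbps = 433  # one 160 MHz channel, in theory
lab_per_stream_mbps = 310          # about the fastest achieved in a lab
max_streams = 3                    # the most that can be combined

overhead = 1 - lab_per_stream_mbps / theoretical_per_stream_mbps
best_case_mbps = lab_per_stream_mbps * max_streams  # ~930, i.e. roughly 900 Mbps

print(f"Protocol overhead eats ~{overhead:.0%} of the theoretical rate")
print(f"Best case, right next to the hotspot: ~{best_case_mbps} Mbps")
```

Even that best case requires a device with three receiving antennas sitting next to an otherwise idle hotspot.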

But when companies talk about delivering rural wireless they are not talking about hotspots, but about point-to-multipoint networks. How does this spectrum do on those networks? When designing a point-to-multipoint network the engineer has two choices. They can open up the spectrum to deliver the most bandwidth possible, but if they do that, the network won’t do any better than the hotspot. Or, through techniques known as wave shaping, they can design the whole system to maximize the bandwidth at the furthest point in the network. In the case of 5 GHz, about the best that can be achieved is to deliver just under 40 Mbps at 3 miles. You can get a larger throughput if you shorten that to one or two miles, but anybody who builds a tower wants to go as far as they can reach, and so 3-mile networks are what will likely be built.

However, once you engineer for the furthest point, that is the same amount of bandwidth that can be delivered anywhere, even right next to the transmitter. Further, that 40 Mbps is total bandwidth and has to be divided into an upload and a download path. This makes a product like 35 Mbps download and 5 Mbps upload a possibility for rural areas.
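The product math is then simple division, sketched here with the 40 Mbps figure above:

```python
total_capacity_mbps = 40   # engineered to the farthest point, about 3 miles out
upload_mbps = 5            # the operator's chosen split
download_mbps = total_capacity_mbps - upload_mbps

gigabit_mbps = 1000
print(f"Product: {download_mbps}/{upload_mbps} Mbps")
print(f"That download is 1/{gigabit_mbps // download_mbps} of a marketed 'gigabit'")
```

That fraction is the gap between the marketing name and the delivered product.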

If this is brought to an area that has no broadband it is a pretty awesome product. But this is nowhere near the bandwidth that can be delivered with fiber, or even with cable modems. It’s a nice rural solution, but one that is going to feel really tiny five years from now when homes are looking for 100 Mbps speeds at a minimum.

So it’s unfortunate that these companies are touting gigabit wireless. This technology only has this name because it’s theoretically possible in a lab environment to get that much output to one device. But it creates a really terrible public expectation to talk about selling gigabit wireless and then delivering 35 Mbps, or 1/28th of a gigabit.
