Wireless Networks Need Fiber

As I examine each of the upcoming wireless technologies, it looks like future wireless technology is still going to rely heavily on an underlying fiber network. While supporting wireless networks will require less fiber than building fiber to every customer premises, robust wireless networks are still going to require significant construction of new fiber.

This is already true today for the traditional cellular network, and most existing towers are fiber-fed, although some still use microwave backhaul. The amount of bandwidth needed at traditional cell sites is already outstripping the 1 or 2 Gbps capacity of wireless backhaul technologies. Urban cell sites today are fed with as much as 5 – 10 Gbps pipes, and most rural ones have (or would like to have) a gigabit feed. I’ve seen recent contractual negotiations for rural cell sites asking for as much as 5 Gbps of backhaul within the next 5 – 10 years.

The specifications for future 5G cell sites make it clear that fiber will soon be the only practical backhaul solution. They require that a single cell site be capable of as much as 20 Gbps download and 10 Gbps upload. The cellular world is currently exploring mini-cell sites, although that effort has slowed down to some degree due to the issues with placing these devices closer to customers. To be practical these small cell sites must be placed on poles (existing or newly built), on rooftops and on other locations near areas with high usage demand. The majority of these small sites will require new fiber construction. Today these sites can probably use millimeter wave radio backhaul, but as bandwidth needs increase, this is going to mean bringing fiber to poles and rooftops.

Millimeter wave radios are also being touted as a way to bring gigabit speeds to consumers. But delivering fast speeds means getting the radios close to customers. These radios use extremely high frequencies, and those signals travel only short distances. As a hot spot a millimeter wave radio is only good for a little over 100 feet. Even when formed into a tight beam the reach is a little over a mile – and that requires true line-of-sight. These radios will be vying for the same transmitter locations as mini-cell sites.
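The range penalty of high frequencies falls directly out of the free-space path loss formula. Here is an illustrative sketch (it ignores rain fade and oxygen absorption, which make millimeter wave performance even worse) comparing a 1.9 GHz PCS signal to a hypothetical 39 GHz millimeter wave link over one mile – the higher frequency loses about 26 dB more, roughly a factor of 400 in power:

```python
import math

def fspl_db(freq_ghz, dist_m):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_mhz) + 32.45."""
    return 20 * math.log10(dist_m / 1000) + 20 * math.log10(freq_ghz * 1000) + 32.45

MILE_M = 1609
# Hypothetical comparison: a 1.9 GHz PCS signal vs a 39 GHz millimeter wave link
for f_ghz in (1.9, 39.0):
    print(f"{f_ghz:5.1f} GHz at one mile: {fspl_db(f_ghz, MILE_M):.1f} dB")
```

The frequencies here are examples only; the point is that path loss grows with the square of frequency, which is why millimeter wave links need short, clear line-of-sight paths.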

Because of the short distances covered by millimeter wave radios, this technology is initially going to be of most interest in the densest urban areas. Perhaps as the radios get cheaper there will be more of a model for suburban areas. But the challenge of deploying wireless in urban areas is that this is where fiber is the most expensive to build. It’s not unusual to see new fiber construction costs of $150,000 to $200,000 per mile in downtown areas. Urban wireless deployment faces the challenge of getting both fiber and power to poles, rooftops and the sides of buildings. This is the issue that has already stymied the deployment of mini-cell sites, and it’s going to become more of an issue as numerous companies want to build competing wireless networks in our cities. I’m picturing the four major cellular companies and half a dozen wireless ISPs all wanting access to the same prime transmitter sites. All of these companies will have to deal with the availability of fiber, or will need to build expensive fiber to support their networks.
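To put a rough number on what that construction cost means per transmitter site, here is a back-of-the-envelope sketch. Only the per-mile cost comes from the figures above; the site density and lateral run lengths are my assumptions:

```python
# Assumed inputs, for illustration only – just the per-mile cost comes from the text
fiber_cost_per_mile = 175_000       # midpoint of the $150k–$200k urban figure
sites_per_mile = 10                 # hypothetical pole/rooftop density along a route
lateral_feet_per_site = 150         # hypothetical average lateral run per site

FEET_PER_MILE = 5280
cost_backbone = fiber_cost_per_mile                     # one mile of backbone route
cost_laterals = sites_per_mile * (lateral_feet_per_site / FEET_PER_MILE) * fiber_cost_per_mile
cost_per_site = (cost_backbone + cost_laterals) / sites_per_mile
print(f"Rough fiber cost per transmitter site: ${cost_per_site:,.0f}")
```

Even under these generous assumptions the fiber cost lands north of $20,000 per site before any radio, power or attachment costs are counted.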

Even rural wireless deployments need a lot of fiber. A quality point-to-point wireless network today needs fiber at each small tower. Where that is available the current technologies can deliver speeds between 20 Mbps and 100 Mbps. But using wireless backhaul instead of fiber drastically cuts the performance of these networks, and there are scads of rural WISPs delivering bandwidth products of 5 Mbps or less. As the big telcos tear down their remaining rural copper, the need for rural fiber is going to intensify. But it is often difficult to justify the business case for building fiber to supply bandwidth to only a small number of potential wireless or wireline customers.

All of the big companies telling Wall Street about their shift to wireless technologies are conveniently not talking about this need for lots of fiber. But when they go to deploy these technologies at any scale they are going to run smack into the current lack of fiber. And until the fiber issue is solved, these wireless technologies are not going to deliver the kinds of speeds, or be available everywhere as quickly, as is implied by the many press releases and articles talking about our wireless future. I have no doubt that there will eventually be a lot of customers using wireless last mile – but only after somebody first makes the investment in the fiber networks needed to support the wireless networks.

More on 5G Standards

I wrote a blog last week about the new 5G standard being developed by the International Telecommunications Union (ITU). The standard is expected to be approved this November. However, it is not the end of the standards process, but rather the beginning. The ITU IMT-2020 standard defines the broad targets for a fully developed 5G product. Basically it’s the wish list – a fully-compliant 5G product will meet the full standard.

But within 5G there are already a number of specific use cases being developed. The most immediate three are eMBB (enhanced mobile broadband, or better functioning cellphones), URLLC (ultra-reliable low-latency communications, for critical data connections) and mMTC (massive machine-type communications, to communicate with hordes of IoT devices). Each use case requires a unique set of standards to define how those parts of the 5G network will operate. And there will be other use cases.

The primary body working on these underlying standards is the 3GPP (3rd Generation Partnership Project). This group brings together seven other standards bodies – ARIB, ATIS, CCSA, ETSI, TSDSI, TTA, TTC – which demonstrates how complicated it is to develop a new wireless technology that will be accepted worldwide. I could talk about what each group does, but that would take a whole blog. Each standards group looks at specific aspects of radio communications, such as the modulation schemes to be used or the format of information to be passed so that devices can talk to each other. The involvement of this many standards groups explains a bit about why it takes so long to go from a new technology concept like 5G to functioning wireless products.

There is currently a lot of work being done to create the specific standards for different portions of a 5G network. This includes the Radio Access Network (RAN), Services and System Aspects (SA) and Core Network and Terminals (CT).

The 5G RAN group, which looks at radio architecture, began work in 2015. Its first phase of work (referred to as Release 15) looks at both the eMBB and URLLC use cases. The goal is to define the specific architecture and feature set needed to meet the 5G specification. This first phase is expected to be finished in the fourth quarter of 2018. The 5G RAN group is also working on Release 16, which looks more specifically at radios that can comply with all aspects of IMT-2020 and is targeted for completion in December 2019.

The 5G SA group has already been actively working on the services and systems aspects of 5G. The preliminary work from this group was finished last year, and its phase 1 work received final approval at the Mobile World Congress. But the SA group and the RAN group worked independently, and it’s expected that work will be needed at the end of each RAN phase to bring the two groups into sync.

The work on the core network has begun with some preliminary testing and concepts, but most of their work can’t be started until the RAN group finishes its work in 2018 and 2019.

The reason I am writing about this is to demonstrate the roadblocks that still remain to rolling out any actual 5G products. Manufacturers will not commit to making any mass-produced hardware until they are sure it’s going to be compatible with all parts of the 5G network. And it doesn’t look like any real work can be done in that area until about 2020.

Meanwhile there is a lot of talk from AT&T, Verizon and numerous vendors about 5G trials, and these press releases always make it sound like 5G products will quickly follow. But for the most part these trials are breadboard tests of some of the concepts of the 5G architecture. These tests provide valuable feedback on problems encountered in the field and on what works and what doesn’t.

And these companies are also making 5G claims about some technologies that aren’t really 5G yet. Most of the press releases these days are talking about point-to-point or point-to-multipoint radios using millimeter wave frequencies. But in many cases these technologies have been around for a number of years and the ‘tests’ are attempts to use some of the 5G concepts to goose more bandwidth out of existing technology.

And that’s not a bad thing. AT&T, Verizon, Google and Starry, among others, are looking for ways to use high-bandwidth wireless technologies in the last mile. But as you can see by the progress of the standards groups defining 5G, the radios we see in the next few years are not going to be 5G radios, no matter what the marketing departments of those companies call them.

AT&T’s Broadband Trials

John Donovan, the chief strategy officer for AT&T, spoke at the Mobile World Congress recently and said that the company is trialing five different technologies for the last mile: WLL (wireless local loop), G.Fast, 5G, AirGig and fiber-to-the-premises. He said the company would be examining the economics of all of the different technologies. Let me look at each one in relation to AT&T.

Wireless Local Loop (WLL). This technology uses the company’s LTE spectrum but utilizes a point-to-multipoint network configuration. By using a small dish on the house to receive the signal, the company gets better bandwidth than can be received from normal broadcast cellular. The company has been trialing various versions of the technology for many years, but a few recent trials use the newest technology that AT&T will deploy for much of rural America as part of the CAF II plan. That plan requires the ISP to deliver at least 10/1 Mbps. AT&T says the technology is delivering speeds of 15 to 25 Mbps, and that even at the edge of a cellular network a customer can get 10 Mbps about 90% of the time.

G.Fast. This is a technology that uses high frequencies to put more bandwidth on telephone copper wire. Speeds are reported to be as high as 500 Mbps, but only for very short distances under 200 feet. AT&T recently announced a G.Fast trial in an apartment building in Minneapolis. The technology is also being tested by CenturyLink and Windstream. All of these trials are using existing telephone copper inside of existing apartment buildings to deliver broadband. So this is not really a last mile technology. AT&T brings fiber to the apartment complex and then uses G.Fast as an inside wire technology. If they find it to be reliable this would be a great alternative to rewiring apartments with fiber.

5G. AT&T recently announced a few trials of early 5G technologies in Austin. They are looking at several technology ideas such as carrier aggregation (combining many frequencies). But these are just trials, and AT&T is one of the companies helping to test pre-5G ideas as part of the worldwide effort to define the 5G specifications. These are not tests of market-ready technologies, but field trials of various concepts needed to make 5G work. There is no doubt that AT&T will eventually replace LTE with 5G, but that transition is still many years in the future. The company is claiming to be testing 5G for the press release benefits – but these are not tests of a viable last mile technology, just tests moving lab concepts to early field trials.

AirGig. This one remains a mystery. AT&T says it will begin trialing the technology later this year with two power companies. There has been a little bit of clarification of the technology since the initial press release. This is not a broadband-over-powerline technology – it’s completely wireless and uses the open lines-of-sight along the tops of power poles to create a clear path for millimeter wave radios. The company has also said that it doesn’t yet know which wireless technology will be used to go from the poles into the home – the whole range of licensed spectrum is under consideration, including the LTE frequencies. If that’s the case, then AirGig is a fiber replacement, and delivery to homes would be about the same as WLL.

FTTP. Donovan referred to fiber-to-the-home as a trial, but by now the company understands the economics of fiber. AT&T stretches the truth a bit about its fiber deployments. It keeps saying it has deployed fiber to 4 million homes, with 8 million more coming in the next three years. But in fact, as disclosed on its own web site, it has only passed the 4 million homes that it can market to. The twelve-million-home target was dictated by the FCC as part of the settlement allowing the company to buy DirecTV.

We don’t know how many fiber customers AT&T has. They are mostly marketing this to apartment buildings, although there are residential customers around the country saying they have it. But they have not sold big piles of fiber connections like Verizon FiOS. This can be seen by looking at the steady drop in total AT&T data customers – 16.03 million in 2014, 15.78 million in 2015 and 15.62 million at the end of the third quarter of 2016. AT&T’s fiber is not really priced to be super-competitive, except in markets where they compete with Google Fiber. Their normal prices elsewhere on fiber are $70 for 100 Mbps, $80 for 300 Mbps and $99 for a gigabit.

Standards for 5G

Despite all of the hype that 5G is right around the corner, it’s important to remember that there is not yet a complete standard for the new technology.

The industry just took a big step on February 22 when the ITU released a draft of what it hopes is the final specification for 5G. The document is heavy in engineering detail and is not written for the layman. You will see that the draft talks about a specification for ‘IMT-2020’ which is the official name of 5G. The goal is for this draft to be accepted at a meeting of the ITU-R Study Group in November.

This latest version of the standard defines 13 metrics that are the ultimate goals for 5G. A full 5G deployment would include all of these metrics. What we will undoubtedly see are commercial deployments from vendors claiming to have 5G but which actually meet only parts of a few of these metrics. We saw this before with 4G – the recent deployment of LTE-Advanced is the first 4G product that actually meets most of the original 4G standard. We probably won’t see a cellular deployment that meets any of the 13 5G metrics until at least 2020, and it might be five to seven more years after that until fully compliant 5G cellular is deployed.

The metric that is probably the most interesting is the one that establishes the goal for cellular speeds: 100 Mbps download and 50 Mbps upload. Hopefully this puts to bed the exaggerated press articles that keep talking about gigabit cellphones. And even if the technology meets these target speeds, in real-life deployment the average user will probably only receive half those speeds, because cellular speeds decrease rapidly with distance from a cell tower. Somebody standing right next to a cell tower might get 100 Mbps, but even as close as a mile away the speeds will be considerably less.

Interestingly, these speed goals are not much faster than what LTE-Advanced achieves today. But the new 5G standard should provide for more stable and guaranteed data connections. The standard calls for a 5G cell site to be able to connect to up to 1 million devices per square kilometer (a square kilometer is a little more than a third of a square mile). This, plus several other metrics, ought to result in stable 5G connections – quite different from what we are used to with 4G. The real goal of the 5G standard is to provide connections to piles of IoT devices.
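The density target is easy to sanity-check with a quick unit conversion – one square kilometer is about 0.39 square miles, so the 1-million-device goal works out to roughly 2.6 million devices per square mile:

```python
KM2_PER_SQ_MILE = 2.58999               # square kilometers in one square mile

devices_per_km2 = 1_000_000             # IMT-2020 connection-density target
sq_miles_per_km2 = 1 / KM2_PER_SQ_MILE  # ~0.39, "a little more than a third"
devices_per_sq_mile = devices_per_km2 * KM2_PER_SQ_MILE

print(f"1 square km = {sq_miles_per_km2:.2f} square miles")
print(f"Target density: {devices_per_sq_mile:,.0f} devices per square mile")
```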

The other big improvement over 4G is the expectation for latency. Today’s 4G connections have data latencies as high as 20 ms, which accounts for many of the problems in loading web pages or watching video on cellphones. The new standard calls for 4 ms latency, which would bring cellular latency to around the level we see today on fiber connections. The new 5G standard for handing off calls between adjoining cell sites is 0 ms – zero delay.

The standard also increases the potential capacity of cell sites, with a goal for a site to process peak data rates of 20 Gbps down and 10 Gbps up. Of course, that means bringing a lot more bandwidth to cell towers, and only extremely busy urban towers will ever need that much capacity. Today the majority of fiber-fed cell towers are fed with 1 Gbps backbones that satisfy upload and download combined. We are seeing cellular carriers inquiring about 10 Gbps backbones, and we need a lot more growth to meet the capacity built into the standard.
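The arithmetic on the backhaul gap is stark. A site meeting the full peak targets would need roughly 30 times the 1 Gbps feed common today, since the backhaul carries both directions:

```python
import math

peak_down_gbps, peak_up_gbps = 20, 10   # IMT-2020 per-site peak rate targets
typical_feed_gbps = 1                   # common fiber feed at cell towers today

total_peak = peak_down_gbps + peak_up_gbps        # backhaul carries both directions
growth_factor = total_peak / typical_feed_gbps
links_10g = math.ceil(total_peak / 10)            # 10 Gbps circuits to carry the peak
print(f"Backhaul growth needed: {growth_factor:.0f}x")
print(f"10 Gbps links to carry the full peak: {links_10g}")
```

In practice no site runs at peak in both directions at once, so real provisioning would be lower – but the direction of the trend is clear.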

There are a number of other metrics. Included is a standard requiring greater energy efficiency, which ought to help save on handset batteries – the new standard allows handsets to go to ‘sleep’ when not in use. There is a standard for peak spectral efficiency that would enable 5G to make much better use of existing spectrum. There are also specifications for mobility that set a goal of working with vehicles moving as fast as 500 kilometers per hour – meaning high-speed trains.

Altogether the 5G standard improves almost every aspect of cellular technology. It calls for more robust cell sites, improved quality of the data connections to devices, lower energy requirements and more efficient hand-offs. But interestingly, contrary to the industry hype, it does not call for gigantic increases in cellular handset data speeds compared to a fully-compliant 4G network. The real improvements from 5G are to make sure that people can get connections at busy cell sites while also providing for huge numbers of connections to smart cars and IoT devices. A 5G connection is going to feel faster because you ought to almost always be able to make a connection, even in busy locations, and the connection will have low latency and be stable, even in moving vehicles. It will be a noticeable improvement.

Time for a New Telecom Act, Part 2

Yesterday’s blog postulated that we would see a new telecom act this year from Congress. That blog looked at what was accomplished by the last Telecommunications Act of 1996. Today I’m looking ahead at the issues that a new Act needs to address.

Last week we learned more about how the process will probably work. A new telecom act would likely be spearheaded by the Energy and Commerce Subcommittee on Communications and Technology. Rep. Marsha Blackburn, who chairs that subcommittee, told the press that she favored giving the new FCC a shot at fixing the things under its purview before the House would tackle a new Act. The FCC doesn’t have the authority to make many of the needed changes in telecom regulation, but it does have considerable power. This probably means a new act is at least a year away.

Here are some of the things that I think the FCC and Congress need to address to modernize telecom:

Need for More Spectrum. It’s becoming clear that a lot of big ISPs are thinking of deploying 5G and various other millimeter wave technologies. The FCC needs to continue to open up more spectrum for broadband. There is still a lot of spectrum that has been reserved for government use, and there need to be more attempts to share frequencies when possible. There also needs to be a fresh look at how frequency is used. Historically many bands of frequency had narrow channels aimed at accommodating voice traffic or a single channel of television. From an engineering perspective we can get a lot more out of spectrum if we can make wider channels in the bands already in use.

Tackling Cybersecurity. 2016 was a year when security breaches led the industry news weekly. There is no easy fix for security issues, but there are big steps that can be taken. For example, we are flooding the world with IoT devices that are easily hacked and which can now be used to launch coordinated denial of service attacks. With Congressional backing the FCC could create standards to make IoT devices more secure. The government will never make us free from hacking, but there are a lot of sensible standards and fixes needed for IoT devices.

Expanding Access to Fast Broadband. As somebody who works regularly in rural America I know that lack of broadband there is now one of the biggest problems identified by rural households. We need to find ways to get good broadband to more places, and we have to do this smartly by building infrastructure that will last for decades. We’ve already seen how not to do this with the CAF II program that is being used to expand DSL and LTE wireless – two technologies that are already inadequate today.

Unless fiber is built everywhere, this is going to be an ongoing major issue. For example, if we fix broadband for those that have none but ignore the bigger swathe of the country that has only marginally acceptable broadband today, we will be back in a decade looking at how to fix broadband in those places.

We also need rules that unleash anybody willing to spend money on fiber. I see numerous rural counties and towns that are ready to spring for bond issues to get fiber. We need rules that allow anybody willing to invest in fiber to do so – be that local governments, electric cooperatives, rural telcos or anybody else.

Infrastructure Issues. There are still a lot of infrastructure roadblocks to deploying fiber. We have never done a good job of fulfilling the mandate from the 1996 Act to provide access to poles and conduit. And we are now looking at deploying a fiber-fed wireless network that is going to mean bringing both fiber and power to buildings, rooftops, poles and other infrastructure. We need to find a way to get this done without also trampling over the legitimate concerns of local jurisdictions. For example, the FCC can’t just demand that cities allow free and quick fiber construction if that means digging up newly paved streets or overburdening poles – we need to find rules that work. And we need to do a much better job of this than we have done so far.

Programming. It’s now clear that online video content is a competitive alternative to traditional cable TV. We need rules that unleash cable companies and anybody else to sell programming that people really want to buy. That means stepping away from the current rigid cable rules that mandate giant channel lineups. Companies need to be free to create programming bundles that people want to buy, which might mean allowing a la carte programming. And there must be rules that require content providers to sell to everybody in an unbiased manner.

I don’t know how many of these big issues the current FCC is going to be willing to tackle. It seems like a lot of their agenda for the first six months will be to undo things ordered by the previous FCC. While I understand the desire to mold the FCC to the political persuasion of whatever party is in power, most of the issues on my list above are not partisan. They are just things that we all need to solve if we are to have a telecom infrastructure that serves us all well.

The Challenges of Fixed Gigabit Wireless

We got a preview this week of what fixed wireless service might look like in urban environments. Google announced it is aggressively expanding the footprint of Webpass, the wireless ISP that Google purchased last year. The company has been operating in six cities and will now be expanding to nine more markets. These will all be downtown urban deployments.

The deployment uses high-capacity microwave links to serve high-rise buildings. Webpass already has 20,000 residential customers in the six markets, all of whom live in downtown high-rises, though the company focuses more on serving business customers. This business plan has been around for years – I helped launch a business with the same plan that died in the 2000 telecom crash.

The network consists of microwave shots to each building on the network. The first hurdle in getting this to work is to get enough quality radio sites to see buildings. As I noted in a blog last week, access to this kind of real estate is at a premium in urban areas, as cellphone providers have found when trying to deploy small cell sites.

The radios required to make the links are not gigantic, but you need a full radio and a dish at both ends of every link. This means that from any one hub building there will be a limited number of links that can be made to other buildings, just due to space limitations. If you imagine half a dozen companies trying to do this same thing (this will be the same basic deployment method for urban 5G), you can picture a proliferation of companies fighting over available radio space on roofs.

Webpass has in the past limited its deployment to buildings that are already wired with either category 5 cable or fiber. They face the same issue that any broadband provider faces in bringing broadband into older buildings – only they are starting on the roof rather than in a basement wiring closet like other ISPs. There are very few ISPs yet willing to tackle the rewiring effort needed in large older residential buildings. As you will see from the pricing below, Webpass and other ISPs are a lot more willing to tackle business buildings and absorb some rewiring costs.

The primary thing for the public to understand about this new roll-out is that it’s very limited. This won’t go to single family homes. It will go to downtown residential high-rises, but only to those that are pre-wired or easy to wire. And even in those buildings Webpass won’t go unless they get at least 10 customers. However, they will contract with landlords to serve whole buildings.

The Webpass pricing is interesting. For residential customers the price is $60 per month regardless of the speed achieved. Webpass says they deliver speeds between 100 Mbps and 500 Mbps, but in reading numerous reviews, there are complaints that speeds can get slower at peak evening time in some buildings (as one would expect when there are a lot of customers sharing one radio link).

Webpass’ pricing for businesses varies according to the number of other customers they get in a building. For example, if there are 10 or more business customers in a building they will sell a 100 – 200 Mbps connection for $250 per month with a 10 TB monthly data cap. But prices are much higher for customers in buildings with fewer than 10 customers:

Speed        Cost      Data Cap    Price with No Cap
10 Mbps      $125      1 TB        $375
20 Mbps      $250      2 TB        $750
50 Mbps      $500      5 TB        $1,500
100 Mbps     $1,000    10 TB       $2,000
250 Mbps     –         –           $2,500
500 Mbps     –         –           $4,000
1 Gbps       –         –           $5,500

From a technical perspective Webpass is deploying in line with the way the technology works. The radios are too expensive to deploy for smaller customers or smaller buildings. A building also needs to be within a mile of the base transmitter (and hopefully closer) to get good speeds. That largely means downtown deployments.

We know there are a number of other companies considering a similar plan. Starry announced almost two years ago that they were deploying something similar in Boston, but they have yet to launch. We know AT&T and Verizon are both exploring something similar to this Google product using 5G radios. But all of these companies are going to be fighting over the same limited markets.

The cellular companies keep hinting in their press releases that they will be able to use 5G to bring gigabit speeds. When they say that, this is the kind of deployment they are talking about. The only way they are going to be able to bring gigabit wireless speeds to single family homes and to suburbs is if they can develop some sort of mini transmitters to go onto utility poles. That technology is going to require building fiber close to each house and the radios are going to replace fiber drops. The above deployment by Webpass is not hype – they already have customers in six markets. But this technology is not the panacea for fast broadband for everyone that you might believe from reading the press releases.

Catching Up On Small Cell Deployment

I remember going to a presentation at a trade show a few years back where there was great enthusiasm for the future of small cell sites for cellular networks. The panel, made up mostly of vendors, was predicting that within five years there would be hundreds of millions of small cells deployed throughout all of the urban areas of the US.

Small cells are supposed to relieve congestion from the larger existing cellular towers. They can be hung anywhere such as on light poles, rooftops, and even in manholes. They have a relatively small coverage area ranging from 30 to 300 feet depending upon the local situation.

But I recently saw that MoffettNathanson estimated that only 30,000 small cells have been deployed so far. That’s obviously far short of the original projections, and it’s an interesting study in the dynamics of the telecom industry to see why this didn’t go as planned. We’ve seen other new technologies that didn’t pan out as promised, so it’s a familiar story to those of us who have been following the industry for a while.

There are a number of different issues that have slowed down small cell deployment. One of the key ones is price since it can cost between $35,000 and $65,000 to get a small cell in place. That’s a steep price to pay for a small coverage area unless that area is full of people much of the day.
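Those per-site costs, combined with the small coverage radii, suggest why urban-wide coverage is so expensive. Here is a rough sketch of blanketing one square mile at the best-case 300-foot radius; the hexagonal-packing overhead factor is my assumption, and real deployments would place cells by traffic demand rather than blanket coverage:

```python
import math

SQ_FT_PER_SQ_MILE = 5280 ** 2

def cells_per_sq_mile(radius_ft, packing_overhead=1.21):
    """Idealized cell count to blanket one square mile.
    packing_overhead ~1.21 approximates hexagonal-packing overlap (my assumption)."""
    coverage_sq_ft = math.pi * radius_ft ** 2
    return math.ceil(SQ_FT_PER_SQ_MILE / coverage_sq_ft * packing_overhead)

cells = cells_per_sq_mile(300)                 # best-case 300 ft coverage radius
low, high = cells * 35_000, cells * 65_000     # per-site cost range cited above
print(f"~{cells} cells per square mile: ${low / 1e6:.1f}M to ${high / 1e6:.1f}M")
```

Even at the generous end of the coverage range, blanketing a single urban square mile runs into the millions of dollars, which goes a long way toward explaining the slow deployment.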

Another problem is that small cells need to be fiber fed and also need to have a source of reliable continuous power. Not surprisingly, that turns out to be a big issue in the crowded urban areas where the small cells make the most sense. It’s not easy, for example, to bring fiber to an existing light pole. And it’s often not even easy to bring reliable power to some of the best-suited cell locations.

The problems that surprised the cellular industry the most are the problems with getting permits to place the cell sites. Remember that these sites are deployed in the densest parts of big cities and many of those cities have a lot of rules about running new fiber or power lines in those areas. Some of the cellular companies have cited waits as long as two years for permitting in some locations.

Yet another problem is that the big cellular companies are having a hard time figuring out how to incorporate the new technology into their processes. The whole industry has grown up dealing with big cell towers and all of the work flows and processes are geared towards working in the tower environment. I can’t tell you how many times I’ve seen big companies have trouble dealing with something new. It was the inability to change the existing workflows, for example, that led Verizon to basically start a whole new company from scratch when they launched FiOS.

And like any new technology, the small cells have not always delivered the expected performance. This has a few companies stepping back to assess if small cells are the right way to go. For instance, AT&T has largely stopped new small cell deployment for now.

The FCC recently took a stab at some new regulations that might make the permitting process easier. And the FCC just released a policy paper that promised to look at further easing the rules for deploying wireless technology and for getting onto poles.

The main reason that I'm following small cells is that the industry is on the cusp of implementing two new technologies that are going to face all of the same issues. It's clear that 5G is going to need small cells if it is to handle the number of devices in a local area that the cellular companies have hyped. And Google, AT&T and others are looking at wireless local loop technologies that will also require that small fiber-fed devices be spread throughout a service area. My gut feeling is that the same problems that have plagued small cell deployment are going to be a thorn in the side of these new technologies as well – and that might mean it's going to take a lot longer to deploy them than the industry is touting.

The Battle for IoT Connectivity

There is a major battle brewing for control of the connections to the Internet of Things. Today, in the early stage of home IoT, most devices are being connected using WiFi. But there is going to be a huge push to have those connections made instead through 5G cellular.

I saw an article this week in which Qualcomm said that it is excited about 5G and that it will be a world-changing technology. The part of 5G the company is most excited about is the possibility of using it to connect IoT devices together. Qualcomm's CEO Stephen Mollenkopf spoke about 5G at the recent CES show, describing a future where 5G is used for live-streaming virtual reality, autonomous cars and connected cities where street lamps are networked together.

Of course, Qualcomm and the cellular vendors are most interested in the potential for making money using 5G technology. Qualcomm wants to make the hundreds of millions of chips they envision in a 5G connected world. And Verizon and AT&T want to sell data connections to all of the 5G connected devices. It’s an interesting vision of the world. Some of that vision makes sense and 5G is the obvious way to connect outdoors for things like street lights.

But at this early stage of IoT it's not clear to me that either 5G or WiFi is the obvious winner of the battle for IoT connectivity in the home. There are pros and cons for each technology.

WiFi has an upper hand today because it’s already in almost every home. People are comfortable using WiFi because it doesn’t cost anything extra to connect an IoT device. But WiFi has some natural limitations that might make it a harder choice in the future if our homes get filled with IoT devices. As I’ve discussed in some recent blogs, the way that WiFi shares data can be a big problem when there is a lot of steady and continuous demand for the bandwidth. WiFi is probably a great choice for IoT devices that only occasionally need to make a connection or that need short-burst connections to share information.

But the WiFi standard doesn't include quality of service or any prioritization of which connections are most important. WiFi instead always does its best to share bandwidth, regardless of the number of devices asking to connect. When a WiFi router gets multiple demands it shuts down for a short period and then tries to reinitiate connections again. If too many devices demand connections, a WiFi system goes into a mode of continuously stopping and restarting, and none of the connections are satisfactory. Even if there is enough bandwidth in the network to handle most of the requests, too many simultaneous requests simply blows the brains out of WiFi. The consequence of this is that having a lot of small and inconsequential connections can ruin the important connections like video streaming or gaming.
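The effect of that equal sharing can be sketched with a toy model. This is a simplification of my own, not the actual 802.11 contention algorithm, and the router throughput and video bitrate numbers are hypothetical – but it shows how a pile of minor IoT connections can starve one important stream when everything gets the same slice:

```python
# Toy model: WiFi splits available bandwidth evenly among every active
# device, with no notion of which connection matters most.

def per_device_share(total_mbps: float, num_devices: int) -> float:
    """Bandwidth each device gets under equal sharing."""
    return total_mbps / num_devices

ROUTER_MBPS = 100.0       # assumed usable WiFi throughput
VIDEO_NEEDS_MBPS = 25.0   # assumed bitrate for one high-quality video stream

for iot_devices in (3, 10, 50):
    devices = iot_devices + 1  # the IoT devices plus the video stream
    share = per_device_share(ROUTER_MBPS, devices)
    video_ok = share >= VIDEO_NEEDS_MBPS
    print(f"{iot_devices:>2} IoT devices -> {share:5.1f} Mbps each; video OK: {video_ok}")
```

With three idle-chattering IoT devices the video stream still gets its 25 Mbps, but at fifty devices each connection is down to about 2 Mbps, even though the IoT devices individually need almost nothing.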

But cellular data is also not an automatic answer. Certainly today there is no way to cope with IoT using 4G cellular networks. Each cell site has a limited number of connections. A great example of this is that I often talk to a buddy of mine in DC while he commutes, and he usually loses his cellular signal when crossing the bridge between Maryland and Virginia. This is due to there not being enough cellular connections available in the limited area around the American Legion Bridge. 5G will supposedly solve this problem and promises to expand the number of connections from a cell site by a factor of 50 or so – meaning there will be a lot more possible connections. But you still have to wonder if that will be sufficient in a world where every IoT device wants a connection. LG just announced that every appliance it sells will now come with an IoT connection, and I imagine this will soon be true of all appliances, toys and almost anything else you buy in the future that has any electronics.

Of bigger concern to me is that 5G connections are not going to be free. With WiFi, once I've bought my home broadband connection I can add devices at will (until I overload my router). But I think Verizon and AT&T are excited about IoT because they want to charge a small monthly fee for every device you connect through them. It may not be a lot – perhaps a dollar per device per month – but the next thing you know every home will be sending them an additional $50 or more per month to keep IoT devices connected. It's no wonder they are salivating at the possibility. And it's no wonder that the big cable companies are talking about buying T-Mobile.
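The back-of-the-envelope math behind that worry is simple to sketch. Both the per-device fee and the device count here are my own hypothetical numbers, not anything the carriers have announced:

```python
# Hypothetical: a small monthly 5G fee charged per connected IoT device,
# multiplied across a houseful of connected appliances, locks, bulbs, etc.

ASSUMED_FEE_PER_DEVICE = 1.00  # dollars per device per month (a guess)

def monthly_iot_bill(num_devices: int, fee: float = ASSUMED_FEE_PER_DEVICE) -> float:
    """Total monthly carrier charge for connecting num_devices via cellular."""
    return num_devices * fee

# A home with 50 connected devices pays $50 every month on top of
# existing cellular service; the same devices on home WiFi cost nothing extra.
print(monthly_iot_bill(50))
```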

I'm also concerned, from a security perspective, about sending the data from all of my IoT devices to the same core routers at Verizon or AT&T. Since it's likely that the recent privacy rules for broadband will be overturned or weakened, I am concerned about having one company know so much about me. If I use a WiFi network my feeds will still go out through my data ISP, but if I'm concerned about security I can encrypt my network and make it harder for them to know what I'm doing. That is going to be impossible to do with a cellular connection.

But one thing is for sure: this is going to be a huge battle. And it's likely to be fought behind the scenes as the cellular companies try to make deals with device manufacturers to use 5G instead of WiFi. WiFi has the early lead today and it's still going to be a while until there are functional 5G cellular networks. But once those are in place it's going to be a war worth watching.

Wireless Trends for 2017

Today I look at wireless trends for 2017. While most of my clients are small landline carriers, the wireless industry has a lot of impact on every ISP these days.

New Spectrum for Rural Broadband. The FCC should release spectrum at the end of the current Incentive Auction that can be used for rural broadband. This is a slice of spectrum that used to be occupied by UHF television stations and that is being referred to as 'white space' spectrum. The beauty of this spectrum for rural broadband is that it travels a significant distance from a tower and penetrates most obstacles that stop other spectrum. This spectrum has been allotted to only a few carriers under experimental licenses, so it might be a few years until affordable gear is ready for the market – but this would be a great tool for reaching remote customers.

New WiFi. The FCC should also finally release new WiFi in the 3.5 GHz band. This bandwidth will be available through a new spectrum sharing arrangement that makes it available to carriers while giving first priority to existing government and satellite users of the spectrum. But it's a broad swath of 150 MHz and within a few years will add to the capacity of wireless point-to-multipoint networks. If the spectrum-sharing rules being used for 3.5 GHz work well, expect to start seeing sharing in other spectrum bands. This would be a great change for everybody and would put spectrum owners on notice that they have to either use or share spectrum – they can't sit on it and let it go unused.

LTE Replaces Rural Copper. This is the year when we will start to really see AT&T and Verizon tearing down rural copper networks and forcing rural customers onto 4G LTE. What never stops amazing me is that the FCC is paying for a lot of this from the CAF II fund.

Zero Rating Will Be Big. Expect all of the cellular carriers to aggressively adopt zero-rating, which is where they will provide their own video products to customers without it counting against cellular data caps. Zero-rating is not allowed under net neutrality rules, but it’s clear that the new FCC will soon reverse those rules.

Zero-rating is a really mixed bag. It will certainly be a boon to customers who don’t mind getting locked into big company bundles – for instance, an AT&T cellular customer might be able to watch unlimited DirecTV Now (but not Netflix) on their cellphone. But zero-rating also is glaring proof that wireless data caps are all about the extra revenue and not about bandwidth issues since the wireless carriers will open up wireless data pipes wide for those willing to pay them a lot of money.

There Will Be Huge 5G Hype. Expect the wireless companies and the press to talk about nothing but 5G. We will hear all year how the technology is being tested and how it’s right on the horizon. And all of the press releases won’t make any distinction between 5G cellular and 5G indoor gigabit wireless. So the general public will end 2017 mistakenly thinking that they will soon have gigabit cellphones.

There Will Be New Wireless Choices. Expect Comcast to launch their wireless product in a few test markets this year. Charter will also be closely watching those trials. Also don’t be surprised if Sprint or T-Mobile are bought by companies wanting to get into the cellular business. A really crazy rumor I read had Verizon merging with Comcast – but honestly nothing would surprise me any more with big company mergers.

WiFi Calling from Cellphones. There will be a big shift this year as more and more calls are made from cellphones directly over WiFi networks. Google Project Fi and Republic Wireless started this trend in 2016, and many others, including the big cellular providers, will join them.

Please Stop Hinting at Gigabit Cellular

Last week there were several press releases announcing that AT&T was working with a major corporation to provide a test of 5G technology. A few days later the industry found out that the company taking part in the test is Intel, which will be making the chips involved. Intel will apparently be beta testing early units for providing high-speed bandwidth at one of its locations.

It really bothers me every time I see one of these announcements, because the whole industry seems to have bought into the hype from companies like AT&T that conflate two totally different technologies under the name of 5G. The AT&T and Intel test is going to be for a technology to provide faster indoor wireless connections using millimeter wave spectrum in competition with WiFi.

But most of the world sees the term ‘5G’ and assumes it means the next generation of cellular technology. And that means that most people reading about the AT&T press release think that we are just a few years away from having gigabit cell phones. And we are not.

I don’t know who decided to use the term 5G for two drastically different technologies. My guess is that the confusion has been purposefully sown by AT&T and Verizon. Certainly the average consumer is more likely to pay attention if they think their cell phones will soon be blazingly fast.

But this kind of confusion has real-life negative consequences. Politicians and decision makers read these articles and assume that a fast cellular alternative is coming in a few years – and this lets them take the issue of faster landline broadband off their plate. It's not a hard mistake to make, and I've even seen this same confusion from smaller telco and cable company owners who see the headlines but don't dig deeper. I assume one reason this confusion is being promoted is that both AT&T and Verizon benefit if fewer companies are investing in fiber last-mile networks to compete with them.

The millimeter wave technology that Intel is going to test is meant to provide gigabit-speed wireless connections over very short distances. It's a technology that can distribute gigabit-speed connections around an office suite, for example. The gigabit speeds are good for about 60 feet from a transmitter, which fits the indoor environment and the desire for speed. But even in that environment the technology has a major limitation in that these frequencies won't pass through much of anything. Even a wall, or possibly a cubicle divider, can kill the signal. And so these early tests are probably aimed at finding the best way to scatter the bandwidth around the office to reach all the nooks and crannies found in the real world.

This technology is being called 5G because the technology will use the 5G standard, even though that standard is not yet developed. But we already know that the 5G standard will have one major benefit over WiFi. WiFi is a bandwidth sharing protocol which gives equal preference to every transmission. If one WiFi device in an office is demanding a large amount of bandwidth and another data-hungry device comes online the protocol automatically shares the bandwidth between the two devices. 5G will allow the router to guarantee the bandwidth at different levels to each device without sharing.
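The difference between the two approaches can be illustrated with a rough sketch. The allocation logic below is my own illustration of guaranteed-rate scheduling, not the actual 5G QoS mechanism, which hasn't been finalized; all the bandwidth numbers are hypothetical:

```python
# Contrast: WiFi-style equal sharing vs. a 5G-style scheme where the
# router can guarantee a minimum rate to specific devices.

def wifi_share(total: float, demands: list) -> list:
    """WiFi: every active device gets an equal slice, regardless of need."""
    share = total / len(demands)
    return [min(d, share) for d in demands]

def qos_share(total: float, guarantees: list, demands: list) -> list:
    """5G-style sketch: satisfy each device's guaranteed rate first
    (capped at its actual demand), then split any leftover equally."""
    granted = [min(g, d) for g, d in zip(guarantees, demands)]
    leftover = max(0.0, total - sum(granted))
    extra = leftover / len(demands)
    return [min(d, g + extra) for g, d in zip(granted, demands)]

total = 100.0
demands = [60.0, 5.0, 5.0, 5.0]      # one data-hungry device + three small ones
guarantees = [50.0, 1.0, 1.0, 1.0]   # the big device is guaranteed 50 Mbps

print(wifi_share(total, demands))            # the big device is capped at 25 Mbps
print(qos_share(total, guarantees, demands)) # the big device gets its guarantee
```

Under equal sharing the data-hungry device is throttled to a quarter of the pipe it needs; with guaranteed rates it gets its full demand while the small devices still get everything they asked for.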

But this millimeter wave trial at Intel has almost nothing in common with cellular data transmission other than the shared standard. Cellular networks use much lower frequencies, chosen because they travel a decent distance from a cell tower, and for the most part cellular frequencies are good at penetrating walls and trees and other obstacles.

Cellular networks are not going to use millimeter wave frequencies to get to cellphones. To make that work would require mini-cell sites of some sort every hundred feet or so. That can be made to work, but really is a totally impractical application in the real world unless we someday find a way to put little cell sites literally everywhere. Using these frequencies for cellular would be a niche application that might only work in a place like a conference center and the cellphone companies are not going to automatically build this technology into cellphones. It takes chip space, extra power and new antennae to add another frequency and nobody is going to add that extra cost to a cellphone until most of the world can use it – and that literally could take many decades, if ever.

Instead, the 5G standard will be used in cellphones to improve data speeds – but not to anything near gigabit speeds. The early versions of the 5G specification have a goal of delivering 50 Mbps data speeds to large numbers of phones out of a cell site. That's a 4 – 5 times increase in cellular speeds from today and is going to make it a lot more enjoyable to browse the web from a cellphone. But 50 Mbps is very different from gigabit cellular speeds. The big companies really need to stop implying there is going to be gigabit cellular. That is extremely misleading and very far from the truth.