The Proliferation of Small Wireless Devices

Cities nationwide are suddenly seeing requests to place small wireless devices in public rights-of-way. Most of the requests today are for placing mini-cell sites, but in the near future a plethora of other outdoor wireless devices will arrive to support 5G broadband and wireless loops.

Many cities are struggling with how to handle these requests. I think that once they understand the potential magnitude of future requests it’s going to become even more of an issue. Following are some of the many issues involved with outdoor wireless electronics placement:

Franchising. One of the tools cities have always used to control and monitor placement of things in rights-of-way is through the use of franchise agreements that specifically spell out how any given company can use the right-of-way. But FCC rules have prohibited franchises for cellular carriers for decades – rules that were first put into place to promote the expansion of cellular networks. Those rules made some sense when cities only had to deal with large cellular towers that are largely located outside of rights-of-way, but make a lot less sense for devices that can be placed anywhere in a city.

Aesthetics. These new wireless devices are not going to be placed in the traditional locations like large cellular towers, water towers and rooftops of buildings. Instead the wireless providers will want to place them on existing telephone poles and light poles. Further, I’ve heard of requests for the placement of new, taller poles as tall as 100 feet that would be used just for the wireless network.

The devices that will be used are going to vary widely in size and requirements, making it difficult to come up with any one-size-fits-all rules. The devices might range in size from a laptop computer up to a small dorm refrigerator. And some of the devices will be accompanied by support struts and other gear that together make for a fairly large new structure. The vast majority of these devices will need an external power feed (some might be solar powered) and many are also going to need a fiber feed.

It’s also expected that 5G devices are going to want relatively clear line-of-sight and this means a lot more tree-trimming, including trimming at greater heights than in the past. I can picture this creating big issues in residential neighborhoods.

Proliferation. I doubt that any city is prepared for the possible proliferation of wireless devices. Not only are there four major cellular companies, but these devices are going to be deployed by the cable companies that are now entering the cellular market along with a host of ISPs that want to deliver wireless broadband. There will also be significant demand for placement for connecting private networks as well as for the uses by the cities themselves. I remember towns fifty years ago that had unsightly masses of telephone wires. Over the next decade or two it’s likely that we will see wireless devices everywhere.

Safety. One of the concerns for any city and the existing utilities that use poles and rights-of-way is the safety of technicians that work on poles. Adding devices to poles always makes it more complicated to work on a pole. But add live electric feeds to devices (something that is fairly rare on poles) and new fiber wires, and the complexity increases again – particularly for technicians trying to make repairs in storm conditions.

Possible Preemption of City Rights. Even after considering all these issues, it’s possible that the choice might soon be moot for cities. At the federal level both the FCC and Congress are contemplating rules that make it easier for cellular companies to deploy these devices. There are also numerous bills currently in state legislatures that are looking at the same issues. In both cases most of the rules being contemplated would override local control and would institute the same rules everywhere. And as you might imagine, almost all of these laws are being pushed by the big cellular companies and largely favor them over cities.

It’s easy to understand why the cellular companies want universal rules. It would be costly for them to negotiate this city by city. But local control of rights-of-way has been an effective tool for cities to use to control haphazard proliferation of devices in their rights-of-way. This is gearing up to be a big battle – and one that will probably come to a head fairly soon.

Is our Future Mobile Wireless?

I had a conversation last week with somebody who firmly believes that our broadband future is going to be 100% mobile wireless. He works for a big national software company that you would recognize and he says the company believes that the future of broadband will be wireless and they are migrating all of their software applications to work on cellphones. If you have been reading my blog you know I take almost the opposite view, but there are strong proponents of a wireless future, and it’s a topic worth continually revisiting.

Certainly we are doing more and more things by cellphone. But I think those that view future broadband as mobile are concentrating on faster mobile data speeds but are ignoring the underlying overall data capacity of cellular networks. I still think that our future is going to become even more reliant on fiber in order to handle the big volumes of bandwidth we will all need. This doesn’t mean that I don’t love cellphone data – but I think it’s a complement for landline broadband and not an equivalent substitute. Cellphone networks have major limitations and they are not going to be able to keep up with our need for bandwidth capacity. Even today the vast majority of cellphone data is handed off to landline networks through WiFi. And in my mind that just makes a cellphone into another terminal on your landline network.

Almost everybody understands the difference in quality between using your cellphone in your home using WiFi versus doing the same tasks using only the cellular network. I largely use my cellphone for reading news articles. And while this is a lot lighter application than watching video, I find that I usually have problems opening articles on the web when I’m out of the house. Today’s 4G speeds are still pretty poor and the national average download speed is reported to be just over 7 Mbps.

I think all of the folks who think cellphones are the future are counting on 5G to make a huge difference. But as I’ve written many times, it will be at least a decade before we see a mature 5G cellular network – and even then the speeds are not likely to be hugely faster than the 4G specification today. 5G is really intended to increase the stability of broadband connections (fewer dropped connections) and the number of connections (able to connect to a lot of IoT devices). The 5G specifications are not even shooting for a huge speed increase, with the specification calling for 100 Mbps download cellular speeds, which translates into an average of perhaps 50 Mbps connections for all of the customers within a cell site. Interestingly, that’s the same target speed as the 4G specification.

And those greater future speeds sound great. Since a cellphone connection is by definition for one user, a faster speed means that a cellular connection will eventually support a 4K video stream. But what this argument ignores is that a home a decade from now is going to be packed with devices wanting to make simultaneous connections to the Internet. It is the accumulated volume of usage from all of those devices that is going to add up to huge broadband demand for homes.

Already today homes are packed with broadband hungry devices. We have smart TVs, cellphones, laptops, desktops and tablets all wanting to connect to the network. We have other bandwidth hungry applications like gaming boxes and surveillance cameras. More and more of us are cutting the cord and watching video online. And then there are going to be piles of new devices with smaller broadband demands, but which in total will add up to significant bandwidth. Further, a lot of applications we use are now in the cloud. My home uses a lot of bandwidth every day just backing up my data files, connecting to software in the cloud, making VoIP calls, and automatically updating software and apps.
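To see how quickly simultaneous household connections add up, here’s a toy tally in Python – every figure below is a hypothetical illustration I chose for the sketch, not a measurement:

```python
# Hypothetical concurrent demand in one home; the per-device
# figures are illustrative assumptions, not measured values.
streams_mbps = {
    "4K video stream": 25,
    "HD video stream": 8,
    "video call": 3,
    "cloud backup": 10,
    "gaming": 5,
    "smart devices (aggregate)": 2,
}

total = sum(streams_mbps.values())
print(f"Concurrent household demand: {total} Mbps")
```

Even this modest mix lands well above what a typical 4G connection delivers, and that is before adding more family members or more devices.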

I’ve touted a statistic many times that you might be tired of hearing, but I think it’s at the heart of the matter. The amount of bandwidth used by homes has been doubling every three years since 1980, and there is no end in sight to that trend. Already today a 4G connection is inadequate to support the average home. If you don’t think that’s true, talk to the homes now using AT&T’s fixed LTE connections that deliver 10 Mbps. That kind of speed is not adequate today to provide enough bandwidth to use the many broadband services I discussed above. Cellular connections are already too slow today to provide a reasonable home broadband, even as AT&T is planning to foist these connections on millions of rural homes.
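The doubling claim compounds faster than intuition suggests. A back-of-the-envelope sketch, using only the 1980 baseline and the three-year doubling period from the statistic above:

```python
# Household bandwidth demand doubling every 3 years since 1980.
# This computes the multiplier relative to the 1980 level; the
# trend, not any absolute number, is the point.

def demand_multiplier(year, base_year=1980, doubling_years=3):
    """How many times demand has grown since the base year."""
    return 2 ** ((year - base_year) / doubling_years)

for year in (2017, 2020, 2029):
    print(f"{year}: demand is ~{demand_multiplier(year):,.0f}x the 1980 level")
```

By this arithmetic, demand a dozen years out is roughly sixteen times today’s demand – which is why a connection that feels merely slow today becomes unusable later.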

There is no reason to think that 5G will be able to satisfy the total broadband needs of a home. The only way it might do that is if we end up in a world where we have to buy a small cellular subscription for every device in our home – I know I would prefer to instead connect all of my devices to WiFi to avoid such fees. Yes, 5G will be faster, but a dozen years from now, when 5G is finally a mature cellular technology, homes will need a lot more bandwidth and a 5G connection will feel just as inadequate then as 4G feels today.

Unless we get to a future point where the electronics get so cheap that there will be a ‘cell site’ for every few homes, it’s hard to figure that cellular can ever be a true substitute for landline broadband. And even if such a technology develops you still have to ask if it would make any sense to deploy. Those small cell sites are largely going to have to be fiber fed to deliver the needed bandwidth and backhaul. And in that case small cell sites might not be any cheaper than fiber directly to the premise, especially when considering the lifecycle costs of the cell site electronics. Even if we end up with that kind of network – it would not really be a cellular network as much as it would be using wireless loops as the last few feet of a landline network – something that for years we have called fiber-to-the-curb. Such a network would still require us to build fiber almost everywhere.

The Challenges of 5G Deployment

The industry is full of hype right now about the impending roll-out of 5G cellular. This is largely driven by the equipment vendors who want to stir up excitement among their stockholders. But not everybody in the industry thinks that there will be a big rush to implement 5G. For example, a group called RAN Research issued a report last year that predicted a slow 5G implementation. They think that 4G will be the dominant wireless technology until at least 2030 and maybe longer.

They cite a number of reasons for this belief. First, 4G isn’t even fully developed yet and the standards and implementation coalition 3GPP plans to continue to develop 4G until at least 2020. There are almost no 4G deployments in the US that fully meet the 4G standards, and RAN Research expects the wireless carriers to continue to make incremental upgrades, as they have always done, to improve cellular along the 4G path.

They also point out that 5G is not intended as a forklift upgrade to 4G, but is instead intended to coexist alongside. This is going to allow a comfortable path for the carriers to implement 5G first in those places that most need it, but not rush to upgrade places that don’t. This doesn’t mean that the cellular carriers won’t be claiming 5G deployments sometime in the next few years, much in the way that they started using the name 4G LTE for minor improvements in 3G wireless. It took almost five years after the first marketing rollout of 4G to get to what is now considered 3.5G. We are just now finally seeing 4G that comes close to meeting the full standard.

But the main hurdle that RAN Research sees with a rapid 5G implementation is the cost. Any wireless technology requires a widespread and rapid deployment in order to achieve economy of scale savings. They predict that the cost of producing 5G-capable handsets is going to be a huge impediment to implementation. Very few people are going to be willing to pay a lot more for a 5G handset unless they can see an immediate benefit. And they think that is going to be the big industry hurdle to overcome.

Implementing 5G is going to require a significant expenditure on small, dense cell-sites in order to realize the promised quality improvements. It turns out that implementing small cell sites is a lot more costly and complicated than the cellular companies had hoped. It also turns out that the technology will only bring major advantages to those areas where there is the densest concentration of customers. That means big city business districts, stadiums, convention centers and hotel districts – but not many other places.

That’s the other side of the economy of scale implementation issue. If 5G is only initially implemented in these dense customer sites, then the vast majority of people will see zero benefit from 5G since they don’t go to these densely packed areas very often. And so there are going to be two economy of scale issues to overcome – making enough 5G equipment to keep the vendors solvent while also selling enough more-expensive phones to use the new 5G cell sites. And all of this will happen as 5G is rolled out in dribs and drabs, as happened with 4G.

The vendors are touting that software defined networking will lower the cost to implement 5G upgrades. That is likely to become true with the electronics after they are first implemented. It will be much easier to make the tiny incremental 5G improvements to cell sites after they have first been upgraded to 5G capability. But RAN Research thinks it’s that initial deployment that is going to be the hurdle. The wireless carriers are unlikely to rush to implement 5G in suburban and rural America until they see overwhelming demand for it – enough demand that justifies upgrading cell sites and deploying small cell sites.

There are a few trends that are going to affect the 5G deployment. The first is the IoT. The cellular industry is banking on cellular becoming the default way to communicate with IoT devices. Certainly that will be the way to communicate with mobile things like smart cars, but there will be a huge industry struggle over whether to instead use WiFi, including the much-faster indoor millimeter wave radios, for IoT. My first guess is that most IoT users are going to prefer to dump IoT traffic into their landline data pipe rather than buy separate cellular data plans. For now, residential IoT is skewing toward WiFi and toward smart devices like the Amazon Echo, which provide a voice interface for the IoT.

Another trend that could help 5G would be some kind of government intervention to make it cheaper and easier to implement small cell sites. There are rule changes being considered at the FCC and in several state legislatures to find ways to speed up implementation of small wireless transmitters. But we know from experience that there is a long way to go after a regulatory rule change until we see change in the real world. It’s been twenty years now since the Telecommunications Act of 1996 required that pole owners make their poles available to fiber overbuilders – and yet the resistance of pole owners is still one of the biggest hurdles to fiber deployment. Changing the rules always sounds like a great idea, but it’s a lot harder to change the mindset and behavior of the electric companies that own most of the poles – the same poles that are going to be needed for 5G deployment.

I think RAN Research’s argument about achieving 5G economy of scale is convincing. Vendor excitement and hype aside, they estimated that it would cost $1,800 today to build a 5G capable handset, and the only way to get that price down would be to make hundreds of millions of 5G capable handsets. And getting enough 5G cell sites built to drive that demand is going to be a significant hurdle in the US.

Wireless Networks Need Fiber

As I examine each of the upcoming wireless technologies it looks like future wireless technology is still going to rely heavily on an underlying fiber network. While the amount of needed fiber will be less than building fiber to every customer premise, supporting robust wireless networks is still going to require significant construction of new fiber.

This is already true today for the traditional cellular network and most existing towers are fiber-fed, although some have microwave backhaul. The amount of bandwidth needed at traditional cell sites is already outstripping the 1 or 2 Gbps capacity of wireless backhaul technologies. Urban cell sites today are fed with as much as 5 – 10 Gbps pipes and most rural ones have (or would like to have) a gigabit feed. I’ve seen recent contractual negotiations for rural cell sites asking for as much as 5 Gbps of backhaul within the next 5 – 10 years.

Looking at the specifications for future 5G cell sites, fiber will soon be the only backhaul solution. The specifications require that a single cell site be capable of as much as 20 Gbps download and 10 Gbps upload. The cellular world is currently exploring mini-cell sites (although that effort has slowed to some degree due to the issues with placing these devices closer to customers). To be practical these small cell sites must be placed on poles (existing or newly built), on rooftops and in other locations near areas with high usage demand. The majority of these small sites will require new fiber construction. Today these sites can probably use millimeter wave radio backhaul, but as bandwidth needs increase, this is going to mean bringing fiber to poles and rooftops.

Millimeter wave radios are also being touted as a way to bring gigabit speeds to consumers. But delivering fast speeds means getting the radios close to customers. These radios use extremely high frequencies, and their signals therefore travel only short distances. As a hot spot a millimeter wave radio is only good for a little over 100 feet. Even when formed into a tight beam the reach is only a little over a mile – and that also requires true line-of-sight. These radios will be vying for the same transmitter locations as mini-cell sites.
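The short reach comes straight from the physics: free-space path loss grows with the square of frequency, so a millimeter wave signal sheds far more strength over the same distance than lower-band WiFi or cellular. A quick sketch using the standard free-space path loss formula (the 2.4 GHz and 28 GHz frequencies are my illustrative picks, not figures from any specific deployment):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458  # speed of light in m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Compare a 2.4 GHz WiFi-band signal with a 28 GHz millimeter
# wave signal over the same 100 meters.
for f in (2.4e9, 28e9):
    print(f"{f/1e9:.1f} GHz at 100 m: {fspl_db(100, f):.1f} dB path loss")
```

The 28 GHz signal loses about 21 dB more than the 2.4 GHz signal over the same path – more than a hundredfold difference in received power, before counting foliage and rain, which hit millimeter waves even harder.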

Because of the short distances that can be covered by millimeter wave radios, this technology is going to initially be of most interest in the densest urban areas. Perhaps as the radios get cheaper there will be more of a model for suburban areas. But urban areas are also where fiber is the most expensive to build. It’s not unusual to see new fiber construction costs of $150,000 to $200,000 per mile in downtown areas. The urban wireless deployment faces the challenge of getting both fiber and power to poles, rooftops and sides of buildings. This is the issue that has already stymied the deployment of mini-cell sites, and it’s going to become more of an issue as numerous companies want to build competing wireless networks in our cities. I’m picturing the four major cellular companies and half a dozen wireless ISPs all wanting access to the same prime transmitter sites. All of these companies will have to deal with the availability of fiber, or will need to build expensive fiber to support their networks.

Even rural wireless deployments need a lot of fiber. A quality point-to-point wireless network today needs fiber at each small tower. When that is available the current technologies can deliver speeds between 20 Mbps and 100 Mbps. But using wireless backhaul instead of fiber drastically cuts the performance of these networks, and there are scads of rural WISPs delivering bandwidth products of 5 Mbps or less. As the big telcos tear down their remaining rural copper, the need for rural fiber is going to intensify. But the business case is often difficult to justify for building fiber to supply bandwidth to only a small number of potential wireless or wireline customers.

All of the big companies that are telling Wall Street about their shift to wireless technologies are conveniently not talking about this need for lots of fiber. But when they go to deploy these technologies on any scale they are going to run smack into the current lack of fiber. And until the fiber issue is solved, these wireless technologies are not going to deliver the speeds, or arrive as quickly and universally, as the many press releases and articles about our wireless future imply. I have no doubt that there will eventually be a lot of customers using wireless last mile – but only after somebody first makes the investment in the fiber networks needed to support the wireless networks.

More on 5G Standards

I wrote a blog last week about the new 5G standard being developed by the International Telecommunications Union (ITU). This standard is expected to be passed this November. However this standard is not the end of the standards process, but rather the beginning. The ITU IMT-2020 standard sets out the broad targets for a fully developed 5G product. Basically it’s the wish list – a fully-compliant 5G product will meet the full standard.

But within 5G there are already a number of specific use cases being developed. The most immediate three are eMBB (enhanced mobile broadband, or better functioning cellphones), URLLC (ultra-reliable low-latency communications, to enhance data connectivity) and mMTC (massive machine type communications, to communicate with hordes of IoT devices). Each use case requires a unique set of standards to define how those parts of the 5G network will operate. And there will be other use cases.

The primary body working on these underlying standards is the 3GPP (3rd Generation Partnership Project). This group brings together seven other standards bodies – ARIB, ATIS, CCSA, ETSI, TSDSI, TTA, TTC – which demonstrates how complicated it is to develop a new wireless technology that will be accepted worldwide. I could talk about what each group does, but that would take a whole blog. Each standards group looks at specific aspects of radio communications such as the modulating schemes to be used, or the format of information to be passed so that devices can talk to each other. But the involvement of this many different standards groups explains a bit about why it takes so long to go from a new technology concept like 5G to functioning wireless products.

There is currently a lot of work being done to create the specific standards for different portions of a 5G network. This includes the Radio Access Network (RAN), Services and System Aspects (SA) and Core Network and Terminals (CT).

The 5G RAN group, which looks at radio architecture, began work in 2015. Their first phase of work (referred to as Release 15) is looking at both the eMBB and the URLLC use cases. The goal is to define the specific architecture and feature set that is needed to meet the 5G specification. This first phase is expected to be finished in the fourth quarter of 2018. The 5G RAN group is also working on Release 16, which looks more specifically at getting radios that can comply with all of the aspects of IMT-2020 and is targeted to be completed in December of 2019.

The 5G SA group has already been actively working on the services and systems aspects of 5G. The preliminary work from this group was finished last year and their phase 1 work was just approved at the Mobile World Congress. But the SA group and the RAN group worked independently, and it’s expected that work will remain at the end of each phase of the RAN effort to bring the two groups into sync.

The work on the core network has begun with some preliminary testing and concepts, but most of their work can’t be started until the RAN group finishes its work in 2018 and 2019.

The reason I am writing about this is to demonstrate the roadblocks that still remain to rolling out any actual 5G products. Manufacturers will not commit to making any mass-produced hardware until they are sure it’s going to be compatible with all parts of the 5G network. And it doesn’t look like any real work can be done in that area until about 2020.

Meanwhile there is a lot of talk from AT&T, Verizon and numerous vendors about 5G trials, and these press releases always make it sound like 5G products will quickly follow these trials. But for the most part these trials are breadboard tests of some of the concepts of the 5G architecture. These tests provide valuable feedback on problems encountered in the field and on what works and what doesn’t.

And these companies are also making 5G claims about some technologies that aren’t really 5G yet. Most of the press releases these days are talking about point-to-point or point-to-multipoint radios using millimeter wave frequencies. But in many cases these technologies have been around for a number of years and the ‘tests’ are attempts to use some of the 5G concepts to goose more bandwidth out of existing technology.

And that’s not a bad thing. AT&T, Verizon, Google and Starry, among others, are looking for ways to use high-bandwidth wireless technologies in the last mile. But as you can see by the progress of the standards groups defining 5G, the radios we see in the next few years are not going to be 5G radios, no matter what the marketing departments of those companies call them.

AT&T’s Broadband Trials

John Donovan, the chief strategy officer for AT&T, spoke at the Mobile World Congress recently and said that the company was trying five different technologies for the last mile: WLL (wireless local loop), G.Fast, 5G, AirGig and fiber-to-the-premises. He said the company would be examining the economics of all of the different technologies. Let me look at each one, in relation to AT&T.

Wireless Local Loop (WLL). The technology uses the company’s LTE spectrum but in a point-to-multipoint network configuration. By using a small dish on the house to receive the signal the company gets better bandwidth than normal broadcast cellular can deliver. The company has been running trials of various versions of the technology for many years, and there are a few recent trials of the newest version, which AT&T will use for much of its deployment in rural America as part of the CAF II plan. That plan requires the ISP to deliver at least 10/1 Mbps. AT&T says that the technology is delivering speeds of 15 to 25 Mbps. The company says that even at the edge of a cellular network a customer can get 10 Mbps about 90% of the time.

G.Fast. This is a technology that uses high frequencies to put more bandwidth on telephone copper wire. Speeds are reported to be as high as 500 Mbps, but only for very short distances under 200 feet. AT&T recently announced a G.Fast trial in an apartment building in Minneapolis. The technology is also being tested by CenturyLink and Windstream. All of these trials are using existing telephone copper inside of existing apartment buildings to deliver broadband. So this is not really a last mile technology. AT&T brings fiber to the apartment complex and then uses G.Fast as an inside wire technology. If they find it to be reliable this would be a great alternative to rewiring apartments with fiber.

5G. AT&T recently announced a few trials of early 5G technologies in Austin. They are looking at several technology ideas such as carrier aggregation (combining many frequencies). But these are just trials, and AT&T is one of the companies helping to test pre-5G ideas as part of the worldwide effort to define the 5G specifications. These are not tests of market-ready technologies, but are instead field trials for various concepts needed to make 5G work. There is no doubt that AT&T will eventually replace LTE wireless with 5G wireless, but that transition is still many years in the future. The company is claiming to be testing 5G for the press release benefits – but these are not tests of a viable last mile technology, just tests that are moving lab concepts to early field trials.

AirGig. This one remains a mystery. AT&T says it will begin trialing the technology later this year with two power companies. There has been a little bit of clarification of the technology since the initial press release. This is not a broadband over powerline technology – it’s completely wireless and is using the open lines-of-sight on top of power poles to create a clear path for millimeter wave radios. The company has also said that they don’t know yet which wireless technology will be used to go from the poles into the home – they said the whole range of licensed spectrum is under consideration including the LTE frequencies. And if that’s the case then the AirGig is a fiber-replacement, but the delivery to homes would be about the same as WLL.

FTTP. Donovan referred to fiber-to-the-home as a trial, but by now the company understands the economics of fiber. AT&T keeps stretching the truth a bit about its fiber deployments, saying that it has deployed fiber to 4 million homes, with 8 million more coming in the next three years. But the fact is the company has actually only passed the 4 million homes that it can market to, as disclosed on its own web site. The twelve million home target was dictated by the FCC as part of the settlement allowing the company to buy DirecTV.

We don’t know how many fiber customers AT&T has. They are mostly marketing this to apartment buildings, although there are residential customers around the country saying they have it. But they have not sold big piles of fiber connections like Verizon FiOS. This can be seen by looking at the steady drop in total AT&T data customers – 16.03 million in 2014, 15.78 million in 2015 and 15.62 million at the end of the third quarter of 2016. AT&T’s fiber is not really priced to be super-competitive, except in markets where they compete with Google Fiber. Their normal prices elsewhere on fiber are $70 for 100 Mbps, $80 for 300 Mbps and $99 for a gigabit.

Standards for 5G

Despite all of the hype that 5G is right around the corner, it’s important to remember that there is not yet a complete standard for the new technology.

The industry just took a big step on February 22 when the ITU released a draft of what it hopes is the final specification for 5G. The document is heavy in engineering detail and is not written for the layman. You will see that the draft talks about a specification for ‘IMT-2020’ which is the official name of 5G. The goal is for this draft to be accepted at a meeting of the ITU-R Study Group in November.

This latest version of the standard defines 13 metrics that are the ultimate goals for 5G. A full 5G deployment would include all of these metrics. What we know that we will see is commercial deployments from vendors claiming to have 5G, but which will actually meet only some parts of a few of these metrics. We saw this before with 4G, and the recent deployment of LTE-U is the first 4G product that actually meets most of the original 4G standard. We probably won’t see a cellular deployment that meets any of the 13 5G metrics until at least 2020, and it might be five to seven more years after that until fully compliant 5G cellular is deployed.

The metric that is probably the most interesting is the one that establishes the goal for cellular speeds. The goals of the standard are 100 Mbps download and 50 Mbps upload. Hopefully this puts to bed the exaggerated press articles that keep talking about gigabit cellphones. And even should the technology meet these target speeds, in real-life deployment the average user is probably only going to receive half those speeds, because cellular speeds decrease rapidly with distance from a cell tower. Somebody standing right next to a cell tower might get 100 Mbps, but even as close as a mile away the speeds will be considerably less.

Interestingly, these speed goals are not much faster than what LTE-Advanced is delivering today. But the new 5G standard should provide for more stable and guaranteed data connections. The standard calls for a 5G cell site to be able to connect to up to 1 million devices per square kilometer (a little more than a third of a square mile). This, plus several other metrics, ought to result in stable 5G cellular connections – quite different from what we are used to with 4G. The real goal of the 5G standard is to provide connections to piles of IoT devices.
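To put that density metric in concrete terms, here is the unit conversion spelled out. This is simple illustrative arithmetic only – the 1 million figure is the draft standard’s target as described above:

```python
# Converting the draft IMT-2020 connection-density target
# (1 million devices per square kilometer) to devices per square mile.

SQ_MILES_PER_SQ_KM = 0.386102  # a square kilometer is ~0.39 square miles

devices_per_sq_km = 1_000_000
devices_per_sq_mile = devices_per_sq_km / SQ_MILES_PER_SQ_KM

print(f"{devices_per_sq_mile:,.0f} devices per square mile")
# Roughly 2.6 million devices per square mile.
```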

The other big improvement over 4G is the expectation for latency. Today’s 4G connections have data latencies as high as 20 ms, which accounts for many of the problems in loading web pages or watching video on cellphones. The new standard is 4 ms latency, which would bring cellular latency to around the level we see today on fiber connections. The new 5G standard for handing off calls between adjoining cell sites is 0 ms – zero delay.

The standard also increases the expected capacity of cell sites, with a goal for a cell site to process peak data rates of 20 Gbps down and 10 Gbps up. Of course, that means bringing a lot more bandwidth to cell towers, and only extremely busy urban towers will ever need that much capacity. Today the majority of fiber-fed cell towers are fed with 1 Gbps backbones that carry upload and download combined. We are seeing cellular carriers inquiring about 10 Gbps backbones, and we need a lot more backhaul growth to meet the capacity built into the standard.
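To show the scale of that backhaul gap, here is the arithmetic using the figures above. The 1 Gbps feed is the typical tower backbone mentioned in the text, a round figure rather than a measured average:

```python
# Comparing the draft 5G peak-rate goals for a cell site against
# a typical 1 Gbps fiber backbone feeding a tower today.

peak_down_gbps = 20    # draft standard: peak downstream per cell site
peak_up_gbps = 10      # draft standard: peak upstream per cell site
typical_feed_gbps = 1  # common tower feed today, shared both directions

total_peak_gbps = peak_down_gbps + peak_up_gbps
shortfall = total_peak_gbps / typical_feed_gbps

print(f"Peak demand would be {shortfall:.0f}x today's typical feed")
# A fully loaded 5G site would need 30 times a 1 Gbps backbone.
```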

There are a number of other metrics in the standard. Included is a requirement for greater energy efficiency, which ought to help save on handset batteries – the new standard allows handsets to go to ‘sleep’ when not in use. There is a standard for peak spectral efficiency, which would enable 5G to make much better use of existing spectrum. There are also specifications for mobility that set a goal of working with vehicles moving as fast as 500 kilometers per hour – meaning high-speed trains.

Altogether the 5G standard improves almost every aspect of cellular technology. It calls for more robust cell sites, improved quality of the data connections to devices, lower energy requirements and more efficient hand-offs. But interestingly, contrary to the industry hype, it does not call for gigantic increases in cellular handset data speeds compared to a fully compliant 4G network. The real improvements from 5G are to make sure that people can get connections at busy cell sites while also providing for huge numbers of connections to smart cars and IoT devices. A 5G connection is going to feel faster because you ought to almost always be able to make a connection, even in busy locations, and because the connection will have low latency and stay stable, even in moving vehicles. It will be a noticeable improvement.

Time for a New Telecom Act, Part 2

Yesterday’s blog postulated that we would see a new telecom act this year from Congress. That blog looked at what was accomplished by the Telecommunications Act of 1996. Today I’m looking ahead at the issues that a new Act needs to address.

Last week we learned more about how the process will probably work. A new telecom act would likely be spearheaded by the Energy and Commerce Subcommittee on Communications and Technology. Last week Rep. Marsha Blackburn, who chairs that subcommittee, told the press that she favors giving the new FCC a shot at fixing the things under its purview before the House tackles a new Act. The FCC doesn’t have the authority to make many of the needed changes in telecom regulation, but it does have considerable power. This probably means a new act is at least a year away.

Here are some of the things that I think the FCC and Congress need to address to modernize telecom:

Need for More Spectrum. It’s becoming clear that a lot of big ISPs are thinking of deploying 5G and various other millimeter-wave technologies. The FCC needs to continue to open up more spectrum for broadband. There is still a lot of spectrum that has been reserved for government use, and there need to be more attempts to share frequency when possible. There also needs to be a fresh look at how frequency is used. Historically, many bands of spectrum had narrow channels aimed at accommodating voice traffic or a single channel of television. From an engineering perspective we can get a lot more out of spectrum if we can make wider channels in the bands that are already in use.

Tackling Cybersecurity. 2016 was a year when security breaches led the industry news weekly. There is no easy fix for security issues, but there are big steps that can be taken. For example, we are flooding the world with IoT devices that are easily hacked and which can now be used to launch coordinated denial of service attacks. With Congressional backing the FCC could create standards to make IoT devices more secure. The government will never make us free from hacking, but there are a lot of sensible standards and fixes needed for IoT devices.

Expanding Access to Fast Broadband. As somebody who works regularly in rural America I know that lack of broadband there is now one of the biggest problems identified by rural households. We need to find ways to get good broadband to more places, and we have to do this smartly by building infrastructure that will last for decades. We’ve already seen how not to do this with the CAF II program that is being used to expand DSL and LTE wireless – two technologies that are already inadequate today.

Unless fiber is built almost everywhere, this is going to remain a major issue. For example, if we fix broadband for those who have none but ignore the bigger swath of the country that has only marginally acceptable broadband today, we will be back in a decade looking at how to fix broadband in those places.

We also need rules that unleash anybody willing to spend money on fiber. I see numerous rural counties and towns that are ready to spring for bond issues to get fiber. We need rules that allow anybody willing to invest in fiber to do so – be that local governments, electric cooperatives, rural telcos or anybody else.

Infrastructure Issues. There are still a lot of infrastructure roadblocks to deploying fiber. We have never done a good job of fulfilling the mandate from the 1996 Act to provide access to poles and conduit. And we are now looking at deploying a fiber-fed wireless network that is going to mean bringing both fiber and power to buildings, rooftops, poles and other infrastructure. We need to find a way to get this done without also trampling over the legitimate concerns of local jurisdictions. For example, the FCC can’t just demand that cities allow free and quick fiber construction if that means digging up newly paved streets or overburdening poles – we need to find rules that work. And we need to do a much better job of this than we have done so far.

Programming. It’s now clear that online video content is a competitive alternative to traditional cable TV. We need rules that free cable companies and anybody else to sell programming that people really want to buy. That means stepping away from the current rigid cable rules that mandate giant channel lineups. Companies need to be free to create programming bundles that people want to buy, which might mean allowing a la carte programming. And there must be rules that require content providers to sell to everybody in an unbiased manner.

I don’t know how many of these big issues the current FCC is going to be willing to tackle. It seems like a lot of their agenda for the first six months will be to undo things ordered by the previous FCC. While I understand the desire to mold the FCC to the political persuasion of whatever party is in power, most of the issues on my list above are not partisan. They are just things that we all need to solve if we are to have a telecom infrastructure that serves us all well.

The Challenges of Fixed Gigabit Wireless

We got a preview this week of what fixed wireless service might look like in urban environments. Google announced it is aggressively expanding the footprint of Webpass, the wireless ISP that Google purchased last year. The company has been operating in six cities and will now be expanding to nine more markets. These will all be downtown urban deployments.

The deployment uses high-capacity microwave links to serve high-rise buildings. Webpass already has 20,000 residential customers in the six markets, all of whom live in downtown high-rises, although the company focuses more on serving business customers. This business plan has been around for years – I was actually helping to launch a company with the same plan years ago, and it died in the 2000 telecom crash.

The network consists of microwave shots to each building on the network. The first hurdle in getting this to work is to get enough quality radio sites that can see the target buildings. As I noted in a blog last week, access to this kind of real estate is at a premium in urban areas, as cellphone providers have found when trying to deploy small cell sites.

The radios required to make the links are not gigantic, but you need a full radio and a dish at both ends of every link. This means that from any one hub building there will be a limited number of links that can be made to other buildings, just due to space limitations. If you imagine half a dozen companies trying to do this same thing (this will be the same basic deployment method for urban 5G), then you can picture a proliferation of companies fighting over available radio space on roofs.

Webpass has in the past limited its deployment to buildings that are already wired with either category 5 cable or fiber. They face the same issue that any broadband provider faces in bringing broadband into older buildings – only they are starting on the roof rather than from a basement wiring closet like other ISPs. There are very few ISPs yet willing to tackle the rewiring effort needed in large older residential buildings. As you will see from the pricing below, Webpass and other ISPs are a lot more willing to tackle business buildings and absorb some rewiring costs.

The primary thing for the public to understand about this new roll-out is that it’s very limited. This won’t go to single family homes. It will go to downtown residential high-rises, but only to those that are pre-wired or easy to wire. And even in those buildings Webpass won’t go unless they get at least 10 customers. However, they will contract with landlords to serve whole buildings.

The Webpass pricing is interesting. For residential customers the price is $60 per month regardless of the speed achieved. Webpass says they deliver speeds between 100 Mbps and 500 Mbps, but in reading numerous reviews, there are complaints that speeds can get slower at peak evening time in some buildings (as one would expect when there are a lot of customers sharing one radio link).

Webpass’ pricing for businesses varies according to the number of other customers they get in a building. For example, if there are 10 or more business customers in a building they will sell a 100 – 200 Mbps connection for $250 per month with a 10 TB monthly data cap. But prices are much higher for customers in buildings with fewer than 10 customers:

Speed          Cost          Data Cap      Price with No Cap
10 Mbps        $125          1 TB          $375
20 Mbps        $250          2 TB          $750
50 Mbps        $500          5 TB          $1,500
100 Mbps       $1,000        10 TB         $2,000
250 Mbps       –             –             $2,500
500 Mbps       –             –             $4,000
1 Gbps         –             –             $5,500
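One way to read the small-building price list is in dollars per Mbps, which makes the volume discount visible. A quick sketch using the capped and uncapped prices from the table above:

```python
# Webpass business pricing for buildings with fewer than 10 customers
# (figures from the table above), expressed as dollars per Mbps.

# (speed in Mbps, price with data cap, price with no cap)
tiers = [
    (10, 125, 375),
    (20, 250, 750),
    (50, 500, 1_500),
    (100, 1_000, 2_000),
]

for mbps, capped, uncapped in tiers:
    print(f"{mbps:>4} Mbps: ${capped / mbps:>5.2f}/Mbps capped, "
          f"${uncapped / mbps:>5.2f}/Mbps uncapped")
# The capped price per Mbps barely falls with speed ($12.50 down to
# $10.00), while the uncapped price drops from $37.50 to $20.00.
```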

From a technical perspective Webpass is deploying in line with the way the technology works. The radios are too expensive to deploy to smaller customers or smaller buildings. A building also needs to be within a mile of the base transmitter (and preferably closer) to get good speeds. That largely means downtown deployments.

We know there are a number of other companies considering a similar plan. Starry announced almost two years ago that it was deploying something similar in Boston but has yet to launch. We know AT&T and Verizon are both exploring something similar to this Google product using 5G radios. But all of these companies are going to be fighting over the same limited markets.

The cellular companies keep hinting in their press releases that they will be able to use 5G to bring gigabit speeds. When they say that, this is the kind of deployment they are talking about. The only way they are going to be able to bring gigabit wireless speeds to single family homes and to suburbs is if they can develop some sort of mini transmitters to go onto utility poles. That technology is going to require building fiber close to each house and the radios are going to replace fiber drops. The above deployment by Webpass is not hype – they already have customers in six markets. But this technology is not the panacea for fast broadband for everyone that you might believe from reading the press releases.

Catching Up On Small Cell Deployment

I remember going to a presentation at a trade show a few years back where there was great enthusiasm for the future of small cell sites for cellular networks. The panel, made up mostly of vendors, was predicting that within five years there would be hundreds of millions of small cells deployed throughout all of the urban areas of the US.

Small cells are supposed to relieve congestion on the larger existing cellular towers. They can be hung almost anywhere – on light poles, on rooftops, even in manholes. They have a relatively small coverage area, ranging from 30 to 300 feet depending upon the local situation.

But I recently saw that MoffettNathanson estimates there have only been 30,000 small cells deployed so far. That’s a far cry from the original projections, and it’s an interesting study in the dynamics of the telecom industry to ask why this didn’t go as planned. We’ve seen other new technologies that didn’t pan out as promised, so it’s a familiar story to those of us who have been following the industry for a while.

There are a number of different issues that have slowed down small cell deployment. One of the key ones is price since it can cost between $35,000 and $65,000 to get a small cell in place. That’s a steep price to pay for a small coverage area unless that area is full of people much of the day.
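Those two figures – a 30-to-300-foot coverage radius and a $35,000-to-$65,000 unit cost – imply steep per-square-mile economics. Here is a back-of-envelope sketch; it assumes idealized, non-overlapping circular coverage, which understates the cell count a real deployment would need:

```python
import math

# Back-of-envelope: how many small cells to blanket one square mile,
# and what that costs, using the ranges cited above. Assumes ideal
# non-overlapping circular coverage (real deployments need more cells).

SQ_FT_PER_SQ_MILE = 5280 ** 2

def cells_per_sq_mile(radius_ft):
    coverage_sq_ft = math.pi * radius_ft ** 2
    return math.ceil(SQ_FT_PER_SQ_MILE / coverage_sq_ft)

# (coverage radius in feet, assumed cost per small cell in dollars)
for radius_ft, unit_cost in [(300, 35_000), (100, 50_000)]:
    n = cells_per_sq_mile(radius_ft)
    print(f"radius {radius_ft} ft: {n} cells, "
          f"${n * unit_cost / 1e6:.1f}M per square mile")
```

Even at the most generous end of both ranges the cost runs to millions of dollars per square mile, which helps explain why deployment has lagged the projections.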

Another problem is that small cells need to be fiber fed and also need to have a source of reliable continuous power. Not surprisingly, that turns out to be a big issue in the crowded urban areas where the small cells make the most sense. It’s not easy, for example, to bring fiber to an existing light pole. And it’s often not even easy to bring reliable power to some of the best-suited cell locations.

The problem that surprised the cellular industry the most is getting permits to place the cell sites. Remember that these sites are deployed in the densest parts of big cities, and many of those cities have a lot of rules about running new fiber or power lines in those areas. Some of the cellular companies have cited waits as long as two years for permitting in some locations.

Yet another problem is that the big cellular companies are having a hard time figuring out how to incorporate the new technology into their processes. The whole industry has grown up dealing with big cell towers and all of the work flows and processes are geared towards working in the tower environment. I can’t tell you how many times I’ve seen big companies have trouble dealing with something new. It was the inability to change the existing workflows, for example, that led Verizon to basically start a whole new company from scratch when they launched FiOS.

And like any new technology, the small cells have not always delivered the expected performance. This has a few companies stepping back to assess if small cells are the right way to go. For instance, AT&T has largely stopped new small cell deployment for now.

The FCC recently took a stab at some new regulations that might make the permitting process easier. And the FCC just released a policy paper that promised to look at further easing the rules for deploying wireless technology and for getting onto poles.

The main reason I’m following small cells is that the industry is on the cusp of implementing two new technologies that are going to face all of the same issues. It’s clear that 5G is going to need small cells if it is to handle the number of devices in a local area that the cellular companies have hyped. And Google, AT&T and others are looking at wireless local loop technologies that will also require small fiber-fed devices spread throughout a service area. My gut feeling is that the same problems that have plagued small cell deployment are going to be a thorn for these new technologies as well – and that might mean it’s going to take a lot longer to deploy them than the industry is touting.