Clearing Mid-range Spectrum

The FCC is in the process of trying to free up mid-range spectrum for 5G. They just opened a Notice of Proposed Rulemaking looking at 2.5 GHz spectrum, located in the contiguous block between 2495 and 2690 MHz. Overall this is the largest contiguous block of mid-range spectrum. Over half of the spectrum sits idle today, particularly in rural America. The history of this spectrum demonstrates the complications involved in trying to reposition spectrum for broadband and cellular use.

The spectrum was first assigned by the FCC in 1963, when it was made available to school systems to transmit educational TV between multiple schools. The band was called Instructional Television Fixed Service (ITFS). It was divided into twenty channels and could transmit a TV signal up to about 35 miles. I grew up in a school system that used the technology, and from elementary school onward we had a number of classes taught over the TV. Implementing the technology was expensive and much of the spectrum was never claimed.

In 1972 the FCC recognized the underuse of the spectrum and allowed commercial operators to use the bands of 2150 to 2162 MHz on an unlicensed basis for pay-TV transmissions to rooftop antennas. The spectrum could only carry a few TV channels and in the 1970s was used in many markets to transmit the early version of HBO and Nickelodeon. This spectrum band was known as Multipoint Distribution Service (MDS) and also was good for about 35 miles.

Reacting to pressure from cellular companies, the FCC reallocated eight additional channels of the spectrum for commercial use. Added to the MDS spectrum this became known as Multichannel Multipoint Distribution Service (MMDS). At the time this displaced a few school systems, and anybody claiming the spectrum had to pay to move a displaced school system to another workable channel. This spectrum was granted upon request to operators for specific markets.

In 1991 the FCC changed the rules for MMDS and allowed the channels to be used to transmit commercial TV signals. In 1995 any unused MMDS spectrum was sold in one of the first FCC auctions, which was the first to divide the country into the geographic service areas known as Basic Trading Areas (BTAs) that are still used today. Before this auction, the spectrum was awarded in 35-mile circles called Geographic Service Areas (GSAs). The existing GSAs were left in place and the spectrum sold at auction had to work around them.

The FCC started getting pressure from wireless companies to allow two-way transmission of data in the band (until then it had been all one-way delivery to a customer site). In 2005 the FCC changed the rules and renamed the block of spectrum as Broadband Radio Service (BRS). This added significant value to licenses since the spectrum could now be repositioned for cellular usage.

At this point, Clearwire entered the picture and saw the value of the spectrum. They offered to buy or lease the spectrum from school systems at prices far lower than market value and were able to amass the right to use a huge amount of the spectrum nationwide. Clearwire never activated much of the spectrum and was in danger of losing the rights to use it. In 2013 Sprint purchased Clearwire, and Sprint is the only cellular company using the spectrum band today.

Today the spectrum band has all sorts of users. There are still school districts using the spectrum to transmit cable TV. There are still license holders who never stopped using the 35-mile GSA areas. There are still MMDS license holders who found a commercial use for the spectrum. And Sprint holds much of the spectrum not held by these other parties.

The FCC is wrestling in the NPRM with how to undo the history of the spectrum to make it more valuable to the industry. Education advocates are still hoping to play in the space since much of the spectrum sits idle in rural America (as is true with a lot of cellular and other mid-range spectrum). The other cellular carriers would like to see chunks of the spectrum sold at auction. Other existing license holders are fighting to extract the biggest value out of any change of control of the spectrum.

Repositioning this spectrum is complicated because its deployment differs widely today by market. The FCC is struggling to find an easy set of rules to put the genie back in the bottle and start over again. In terms of value for 5G, this spectrum sits in a sweet spot for coverage characteristics. Using the spectrum for cellular data is probably its best use, but the FCC has to step carefully to avoid spending years in court defending any order. Reallocating spectrum is probably the most difficult thing the FCC does, and it’s not hard to see why when you look at the history of this particular block of spectrum and realize that every block of spectrum has a similarly messy past.

AT&T and Augmented Reality

Lately it seems like I find a news article almost every week talking about new ways that people are using broadband. The latest news is an announcement that AT&T is selling Magic Leap augmented reality headsets in six cities plus online.

The AT&T launch is being coordinated with the release of an augmented reality immersive experience that brings Game of Thrones into people’s homes with a themed gaming experience called The Dead Must Die, previewed in a teaser trailer.

Augmented reality differs from virtual reality in that augmented reality overlays images onto the local environment. A user sees characters in their own living room, as opposed to being immersed in a completely imaginary environment with virtual reality.

Magic Leap is one of the most interesting tech start-ups. The company launched in 2014 with a $542 million investment and has since raised over $2.3 billion. Its investors and advisors include people like Alibaba executive vice chair Joe Tsai and director Steven Spielberg. There have been rumors over the years of an impending product, but until now the company has never brought one to market. AT&T will be selling Magic Leap’s first headset, the Magic Leap One Creator Edition, for a price of $2,295. The eventual mass-market headset will surely cost a lot less.

AT&T’s interest in the technology extends past selling the headsets. Magic Leap recently signed a deal with the NBA and its broadcast partner Turner, which is now owned by AT&T, and the partners will obviously be looking at augmented reality broadcasts of basketball games.

AT&T’s interest goes even further than that, and the company is looking at the Magic Leap technology as the entry point into the spatial Internet – moving today’s web experience into three dimensions. AT&T sees the Magic Leap headset as a way to bring this technology to industries like healthcare, retail, and manufacturing. They envision people shopping in 3D, doctors getting 3D computer assistance for visualizing a patient during an operation, and manufacturing workers aided by 3D blueprints overlaid on the factory floor.

While the Magic Leap headset will work on WiFi today, AT&T is promoting Magic Leap as part of their 5G Innovation Program. AT&T is touting this as a technology that will benefit greatly from 5G, which will allow users to go mobile and use the augmented reality technology anywhere.

I couldn’t find any references to the amount of bandwidth used by this first-generation headset, but it has to be significant. Looking at the Game of Thrones application, a user is immersed in a 3D environment and can move and interact with elements in the augmented reality. That means a constant transmission of the elements of the 3D environment, which I have to think is at least equivalent to several simultaneous video transmissions. Regardless of the bandwidth used today, you can bet that as augmented reality becomes mainstream, content makers will find ways to use even more bandwidth.
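
As a rough illustration of that back-of-the-envelope reasoning, the sketch below multiplies an assumed number of simultaneous video-quality streams by an assumed per-stream bitrate. Both numbers are purely illustrative guesses, not measurements from the Magic Leap headset.

```python
# Back-of-envelope estimate of augmented reality bandwidth, treating an AR
# session as several simultaneous video-quality streams. All numbers are
# illustrative assumptions, not measurements of the Magic Leap One.

PER_STREAM_MBPS = 5.0      # assumed HD-video-equivalent bitrate per stream
SIMULTANEOUS_STREAMS = 4   # assumed number of concurrent environment feeds

estimated_mbps = PER_STREAM_MBPS * SIMULTANEOUS_STREAMS
gb_per_hour = estimated_mbps / 8 * 3600 / 1000  # Mbps -> GB per hour

print(f"Estimated sustained demand: {estimated_mbps:.0f} Mbps")
print(f"Data consumed per hour of play: {gb_per_hour:.1f} GB")
```

Even with those conservative guesses, an hour of AR play would consume several gigabytes of data, and that’s before content makers start pushing richer environments.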

We are already facing a big increase in bandwidth that is needed to support gaming from the cloud – as is now being pushed by the major game vendors. Layering augmented reality on top of that big data stream will increase bandwidth needs by another major increment.

Broadband and Food Safety

I recently saw a presentation that showed how food safety is starting to rely on good rural broadband. I’ve already witnessed many other ways that farmers use broadband like precision farming, herd monitoring, and drone surveillance, but food safety was a new concept for me.

The presentation centered around the romaine lettuce scare of a few months ago. The food industry was unable to quickly identify the source of the contaminated produce and the result was a nationwide recall of all romaine. It turns out the problem came from one farm in California with E. coli contamination, but farmers everywhere paid a steep price as all romaine was yanked from store shelves and restaurants, which also resulted in cancellations of upcoming orders.

Parts of the food industry have already implemented the needed solution. You might have noticed that the meat industry is usually able to identify the source of problems relatively quickly and can usually track problems back to an individual rancher or packing house. Cattle farmers are probably the most advanced at tracking the history of herd animals, but all meat producers track products to some extent.

The ideal solution to the romaine lettuce problem is to document every step of the farming process and to make that information available to retailers and eventually to consumers. In the case of romaine that might mean tracking and recording the basic facts of each crop at each farm. That would mean recording the strain of seeds used. It would mean logging the kinds of fertilizer and insecticide applied to a given field. It would mean recording the date when the romaine was picked. The packing and shipping process would then be tracked so that the tracking number on each box or crate, along with the dates and identity of every intermediate shipper between farm and grocery store, would be recorded.
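
To make the idea concrete, here is a minimal sketch of what one tracking record might look like. The field names and structure are hypothetical illustrations; no federal standard for this data exists yet.

```python
# A hypothetical sketch of a farm-to-store tracking record for one lot of
# produce. Field names are illustrative only; there is no federal standard.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ShipmentLeg:
    shipper: str        # identity of the intermediate shipper
    picked_up: date     # date the lot left the previous handler
    delivered: date     # date the lot reached the next handler

@dataclass
class ProduceLot:
    lot_id: str                     # tracking number on the box or crate
    farm: str                       # originating farm
    crop: str                       # e.g. "romaine"
    seed_strain: str                # strain of seeds used
    field_treatments: List[str]     # fertilizers and insecticides applied
    harvest_date: date              # date the crop was picked
    shipping_history: List[ShipmentLeg] = field(default_factory=list)

# A recall could then be traced back to a single farm by walking the
# shipping history, instead of pulling every crate off the shelf.
lot = ProduceLot(
    lot_id="LOT-0001", farm="Example Farm", crop="romaine",
    seed_strain="example-strain", field_treatments=["fertilizer A"],
    harvest_date=date(2019, 4, 1),
)
lot.shipping_history.append(
    ShipmentLeg("Example Carrier", date(2019, 4, 2), date(2019, 4, 4))
)
print(lot.lot_id, lot.farm, lot.harvest_date)
```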

Initially this would be used to avoid large blanket recalls like the one that happened with romaine. Ultimately, this kind of information could be made available to consumers. We could wave our smartphones at produce and find out where it was grown, when it was picked, and how long it’s been sitting in the store. There are a whole lot of steps that have to happen before the industry can reach that ultimate goal.

The process needs to start with rural broadband. The farmer needs to be able to log the needed information in the field. The day may come when robots can automatically log everything about the growing process, and that will require even more intensive and powerful broadband. The farmer today needs an easy data entry system that allows data to be scanned into the cloud as they work during the growing, harvesting, and packing process.

There also needs to be some sort of federal standards so that every farmer is collecting the same data, and in a format that can be used by every grocery store and restaurant. There is certainly a big opportunity for any company that can develop the scanners and the software involved in such a system.

In many places this can probably be handled with robust cellular data service that extends into the fields. However, there is a lot of rural America that doesn’t have decent, or even any cell service out in the fields. Any farm tracking data is also going to need adequate broadband to upload data into the cloud. Farms with good broadband are going to have a big advantage over those without. We already know this is true today for cattle and dairy farming where detailed records are kept on each animal. I’ve talked to farmers who have to drive every day to find a place to upload their data into the cloud.

In the many counties where I work today the farmers are among those leading the charge for better broadband. If selling produce or animals requires broadband we are going to see farmers move from impatience to insistence when lack of connectivity means loss of profits.

I know as a consumer that I would feel better knowing more about the produce I buy. I’d love to buy more produce that was grown locally or regionally, but it’s often nearly impossible to identify in the store. I’d feel a lot safer knowing that the batch of food I’m buying has been tracked and certified as safe. Just in the last year there have been recalls on things like romaine, avocados, spring onions, and packaged greens mixes. I don’t understand why any politician who serves a farming district is not screaming loudly for a national solution for rural broadband.

Millimeter Wave Cellular Service

Verizon is claiming to have the first real-world deployment of fast 5G cellular service. They launched an early version of what they are calling 5G in downtown Chicago and Minneapolis. This launch involves the deployment of millimeter wave spectrum.

A review of the cellular performance in FierceWireless showed exactly what was to be expected. This new service will only be available from a few cell sites in each city. For now the service can only be received using a Motorola Z3 handset that has been modified with a 5G Moto Mod adapter.

As would be expected, the millimeter wave broadband was fast, with peak speed measured at 500 Mbps. But also as expected, the coverage area is small, and millimeter wave spectrum is easily blocked by almost any impediment. Walking inside a building or around the corner of a building killed the broadband signal. The speed was cut in half when the signal was received through a window. When not in range of the millimeter wave signal the phone reverts to 4G, because Verizon is not yet close to implementing any actual 5G standards. This was not a trial of 5G technology – it’s a trial that shows that millimeter wave spectrum can carry a lot of data. That is especially easy to demonstrate when there are only one or two users on a given cell site.

Verizon announced a fee of $10 per month for the faster data speed, but almost immediately said the fee will be waived. This launch is another marketing gimmick letting Verizon get headlines proclaiming 500 Mbps cellular data speeds. The reviewer noted that the Verizon store in downtown Chicago was not ready to provide the product to anybody.

There are big issues with using millimeter wave spectrum for cellular service. I first ask what a cellphone user can do with that kind of speed. A cellphone can already be used to stream a video on a decent 4G connection. Other than software updates there isn’t any real need to download big files on a cellphone. It’s unlikely that the cellular carriers are going to let you tether speeds of that magnitude to a computer.

The other big issues will be the real-life limitations of millimeter wave spectrum outdoors. Since the frequency won’t pass through walls, this is strictly going to be an outdoor walking technology. As the FierceWireless review showed, it’s extremely easy to walk out of coverage. A cellular carrier will need to provide multiple cell sites in very close proximity in order to cover a given area.

It’s hard to think that there will ever be many subscribers willing to pay $10 more per month for a product with these limitations. How many people care about getting faster data speed outside, and only in areas of a city that are close to 5G transmitters? Would many cellular customers pay more so that they could save a few minutes per month to download software updates?

It’s hard to envision that the incremental revenues from customers will ever justify the cost of deploying multiple cell sites within close proximity of each other. T-Mobile already announced that they don’t plan to charge extra for 5G data when it’s available – there is no incentive to offer the product if there is no additional revenue.

What I found interesting is that Verizon also announced that they will be launching this same product in 20 additional urban markets soon, with 30 markets by the end of the year. The company will be using this launch to promote the new Galaxy S10 5G phone that will be able to utilize the millimeter wave spectrum. Verizon is touting the new service by saying that it will provide access to faster streaming, augmented-reality, gaming, and consumer and business applications.

If anything, this launch is a gimmick to sell more of the expensive 5G handsets. I wonder how many people will buy this phone hoping for faster service, only to realize that they have to stand outside close to a downtown millimeter wave cell site to use it. How many people want to go outside to enjoy faster gaming or augmented reality?

This is not to say that millimeter wave spectrum doesn’t have value, but that value will manifest when Verizon or somebody else offers an indoor 5G modem connected to a landline broadband connection. That would enable a cellphone to connect to faster gaming or augmented reality. That has some definite possibilities, but it’s not cellular service; it’s an indoor broadband connection that uses a cellphone as the receiver.

I’m really starting to hate these gimmicks. Verizon and AT&T are both painting a false picture of 5G by making everybody think it will provide gigabit speeds everywhere – something that is not even listed as a goal of the 5G specifications. These gimmicks are pure marketing hype. The companies want to demonstrate that they are cutting edge. The gimmicks are aimed even more at the politicians the carriers are courting to support deregulation of broadband in the name of 5G. In the case of this particular gimmick, Verizon might sell more Samsung 5G phones. But the gimmicks are just gimmicks and this trial is not a real product.

The Fourth Industrial Revolution

There is a lot of talk around the world among academics and futurists that we have now entered into the beginnings of the fourth industrial revolution. The term industrial revolution is defined as a rapid change in the economy due to technology.

The first industrial revolution came from steam power that drove the creation of the first large factories to create textiles and other goods. The second industrial revolution is called the age of science and mass production and was powered by the simultaneous development of electricity and oil-powered combustion engines. The third industrial revolution was fairly recent and was the rise of digital technology and computers.

There are differing ideas of what the fourth industrial revolution means, but every prediction involves using big data and emerging technologies to transform manufacturing and the workplace. The fourth industrial revolution means mastering and integrating an array of new technologies including artificial intelligence, machine learning, robotics, IoT, nanotechnology, biotechnology, and quantum computing. Some technologists are already predicting that the shorthand description for this will be the age of robotics.

Each of these new technologies is in its infancy, but all are progressing rapidly. Take the most esoteric technology on the list – quantum computing. As recently as three or four years ago this was mostly an academic concept, and we now have first-generation quantum computers. I can’t recall where I read it, but I remember a quote that said that if we think of the fourth industrial revolution in terms of a 1,000-day process, we are now only on day three.

The real power of the fourth industrial revolution will come from integrating the technologies. The technology that is the most advanced today is robotics, but robotics will change drastically when robots can process huge amounts of data quickly and can use AI and machine learning to learn and cope with the environment in real time. Robotics will be further enhanced in a factory or farm setting by integrating a wide array of sensors to provide feedback from the surrounding environment.

I’m writing about this because all of these technologies will require the real-time transfer of huge amounts of data. Futurists and academics who talk about the fourth industrial revolution seem to assume that the needed telecom technologies already exist – but they don’t exist today and need to be developed in conjunction with the other new technologies.

The first missing element needed to enable the other technologies is computer chips that can process huge amounts of data in real time. Current chip technology has a built-in choke point where data is queued and fed into and out of a chip for processing. Scientists are exploring a number of ways to move data faster. For example, light-based computing has the promise to move data at speeds up to 50 Gbps. But even that’s not fast enough, and there is research being done using lasers to beam data directly into the chip processor – a process that might increase processing speeds 1,000 times over current chips.

The next missing communications element is a broadband technology that can move data fast enough to keep up with the faster chips. While fiber can be blazingly fast, a fiber is far too large to use at the chip level, and so data has to be converted at some point from fiber to some other transmission path.

The amount of data that will have to be passed in some future applications is immense. I’ve already seen academics bemoaning that millimeter wave radios are not fast enough, so 5G will not provide the solution. Earlier this year the first worldwide meeting was held to officially start collaborating on 6G technology using terahertz spectrum. Transmissions at those super-high frequencies only stay coherent for a few feet, but these frequencies can carry huge amounts of data. It’s likely that 6G will play a big role in providing the bandwidth to the robots and other big-data needs of the fourth industrial revolution. From the standpoint of the telecom industry, we’re no longer talking about the last mile and are starting to address the last foot!

Are You Ready for 6G?

The first 6G summit convenes this coming weekend in Levi, Lapland, Finland, sponsored by the University of Oulu. The summit will end with a closed-door, invitation-only assembly of wireless researchers and vendors with the goal to create a draft vision statement defining the goals of 6G research. Attendees include all of the major wireless vendors like Huawei, Ericsson, Samsung, and NTT, along with researchers from numerous universities and groups like Nokia Bell Labs.

As you would expect, even as 5G standards were being finalized there were already private and academic research labs working on what will come next. So far, some of the vision for 6G includes concepts like:

  • Use of higher frequencies between 100 GHz and 1 THz, introducing the world to the idea of terahertz spectrum. The upper end of this range lies between radio waves and infrared light. The FCC just approved research above 95 GHz.
  • Researchers believe this next generation of wireless will be needed to finally enable the 3D holograms required for lifelike telepresence.
  • The higher frequencies would also allow for densification and for the simultaneous transmission of multiple large-bandwidth streams. Researchers already believe that with the higher frequencies the capacity of a wireless network could be as much as 1,000 times that of 5G – but even 10 times faster would be a major breakthrough.
  • Scientists anticipate that within a decade we’ll have advanced far enough with artificial intelligence to enable AI-powered routing that will choose the best path in real time for each packet and significantly decrease latency.
  • Various researchers from Brown University and universities in Australia have said that 5G will be inadequate to satisfy our future needs for both bandwidth and the overall number of IoT connections. One of the goals of 6G will be to increase the number of connected devices served by a given transmitter by one to two orders of magnitude.

The higher frequencies will allow for even faster data transmission, as much as 10 times faster than the gigabit speeds envisioned for point-to-multipoint 5G using millimeter wave radios.

There are a number of issues to be overcome with the higher frequencies, the primary one being that radio waves at those frequencies won’t pass through any barrier. However, scientists already think there might be strategies for bouncing the waves around obstacles.

The other shortcoming of the frequencies is the short distance before the signal dissipates. This is likely to limit the higher frequencies to indoor use, allowing for indoor wireless networks with speeds as fast as 10 Gbps.

Interestingly, researchers in China say that this vision of 6G is the end of the line in terms of major platform upgrades and that there will never be a 7G. After 6G the goal over time will be to improve the performance of the various aspects of the technologies involved. Apparently, the Chinese researchers have never met any AT&T or Verizon marketing staff.

Many of the groups researching these topics are already talking about having a set of 6G standards by 2030. But there is a lot of research to be done, including fundamental steps like developing chips capable of handling the higher speeds. We will also hit regulatory barriers – governments all regulate the use of radio waves, but it might be harder to regulate the use of the light-like frequencies at the base of the infrared spectrum.

Finally – A Whitebox Solution for Small ISPs

A few years ago I wrote about the new industry phenomenon where the big users of routers and switches like Facebook, Google, Microsoft, and Amazon were saving huge amounts of capital by buying generic routers and switches and writing their own operating software. Since those early days these companies have also worked to make these devices far more energy efficient. At the time of that blog, I noted that it was impractical for smaller ISPs to take advantage of the cheaper gear because of the difficulty and risk of writing their own operating software.

That’s all changed and there now is a viable way for smaller ISPs to realize the same huge savings on routers and switches. As you would expect, vendors stepped into the process to match whitebox hardware and operating software to create carrier-class routers and switches for a fraction of the cost of buying name brand gear.

There are a few new terms associated with this approach. Whitebox refers to network hardware that uses commodity silicon and disaggregated software. Britebox refers to similar hardware that is built by mainstream hardware vendors like Dell. Commodity hardware refers to whitebox hardware matched with mainstream software.

There are a number of vendors of whitebox hardware including Edge-core Networks, Supermicro, Facebook, and FS.Com. Much of the gear is built to match specifications provided by the big data center operators, meaning that hardware from different vendors is now becoming interchangeable.

The potential savings are eye-opening. One way to look at the cost of switches is to compare the cost per 10-gigabit MPLS port. In looking at list prices there is a whitebox switch available from FS.Com for $92 per port; Dell britebox hardware is $234 per port; Juniper is priced at $755 per port and Cisco at $2,412 per port. To be fair, a lot of buyers get discounts from the list prices of name brand hardware – but a 96% savings over list price is something that everybody needs to investigate.

As I mentioned, the whitebox hardware is also more energy efficient – saving money on air conditioning is the issue that led the big data center companies to look for a better solution. The FS.Com 10-gigabit switch draws about 200 watts of power; the Dell britebox draws 234 watts; the Juniper switch draws 650 watts and the Cisco switch 300 watts. There is no question that a whitebox solution is greener and less expensive to operate.
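
For anyone who wants to run the comparison, the short sketch below reproduces the per-port prices and power draws cited above and computes the savings relative to the Cisco list price. The percentages are simple arithmetic on those published figures, nothing more.

```python
# Per-port list price and power draw for a 10-gigabit switch, using the
# figures cited in this post. Savings are relative to the Cisco list price.
switches = {
    "FS.Com whitebox": {"price_per_port": 92,   "watts": 200},
    "Dell britebox":   {"price_per_port": 234,  "watts": 234},
    "Juniper":         {"price_per_port": 755,  "watts": 650},
    "Cisco":           {"price_per_port": 2412, "watts": 300},
}

cisco_price = switches["Cisco"]["price_per_port"]
for name, spec in switches.items():
    savings = 1 - spec["price_per_port"] / cisco_price
    print(f"{name:16s} ${spec['price_per_port']:>5}/port  "
          f"{spec['watts']:>3} W  ({savings:.0%} below Cisco list)")
```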

There is also off-the-shelf software that has been created to operate the whitebox hardware. The most commonly used packages are IP Infusion OcNOS and Cumulus Linux. This disaggregated software also costs far less than the software embedded in the price of mainstream hardware.

Probably the biggest concern of any ISP who is considering a whitebox solution is configuring their system and getting technical assistance when they have problems. The good news is that there are now vendors who have assembled a team to provide this kind of end-to-end support. One such vendor is IPArchiTechs (IPA). They have engineers that will configure and install a new system and a 24/7 helpdesk that provides the same kind of support available with name brand gear.

There are other advantages to using whitebox hardware. Should an ISP ever want to upgrade or change software, the hardware can be reused and reprogrammed. The same goes for the disaggregated software – an ISP licenses the software and can transfer it to a different box without having to ‘buy’ the software again. The whitebox approach also avoids the upgrade fees often charged by vendors to increase speeds or to unlock unused ports.

There is whitebox gear available for most ISP functions. In some cases the same gear could be used in the core, in aggregation points or in the last mile just by changing the software – but there is whitebox hardware sized for the various uses. There are still a few network functions that the whitebox software hasn’t mastered, like BGP edge routing – but the hardware/software combination can handle the needs of most ISPs I work with.

Whitebox hardware and software have come of age. Anybody buying expensive Cisco or Juniper gear needs to consider the huge savings available from a whitebox solution. The big vendors have been successful by forcing customers to pay for numerous features and capabilities they never use – it makes more sense to buy more efficient hardware and pay for only the features you need.

The Impending Cellular Data Crisis

There is one industry statistic that isn’t getting a lot of press – the fact that cellular data usage is more than doubling every two years. You don’t have to plot that growth rate very many years into the future to realize that existing cellular networks will be inadequate to handle the increased demand in just a few years. What’s even worse for the cellular industry is that this growth rate is a nationwide average. I have many clients who tell me there isn’t nearly that much growth at rural cellular towers – meaning there is likely even faster growth at some urban and suburban towers.
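
To see why that growth rate is so alarming, here is a quick projection of demand that doubles every two years. The starting point is just an index of today’s usage, not an actual traffic figure.

```python
# Project cellular data demand that doubles every two years, expressed as a
# multiple of today's usage (an index, not an actual traffic figure).
def demand_multiple(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

for years in (2, 4, 6, 8, 10):
    print(f"In {years:2d} years: {demand_multiple(years):5.1f}x today's usage")
```

A network engineered to handle ten times today’s load would be caught up to in under seven years.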

Much of this growth is a self-inflicted wound by the cellular industry. They’ve raised monthly data allowances and are often bundling in free video with cellular service, thus driving up usage. The public is responding to these changes by using the extra bandwidth made available to them.

There are a few obvious choke points that will be exposed by this kind of growth. Current cellphone technology limits the number of simultaneous connections that can be made from any given tower. As customers watch more video they eat up slots on the cell tower that otherwise could have been used to process numerous short calls and text messages. The other big chokepoint is going to be the broadband backhaul feeding each cell site. When usage grows this fast it’s going to get increasingly expensive to buy leased backbone bandwidth – which explains why Verizon and AT&T are furiously building fiber to cell sites to avoid huge increases in backhaul costs.

5G will fix some, but not all, of these issues. The growth is so explosive that cellular companies need to use every technique possible to make cell towers more efficient. Probably the best fix is to use more spectrum. Adding an additional band of spectrum to a cell site immediately adds capacity. However, this can’t happen overnight. Any new spectrum is only useful if customers’ handsets can use it, and it takes a number of years to modify cell sites and cellphones to work on a new band. The need to meet growing demand is the primary reason that the CTIA recently told the FCC the industry needs an eye-popping 400 MHz of new mid-range spectrum for cellular use. The industry painted that as being needed for 5G, but it’s needed now for 4G LTE.

Another fix for cell sites is to use existing frequencies more efficiently. The most promising way to do this is with MIMO antenna arrays – a technology that deploys multiple antennas in cellphones to combine multiple spectrum bands into a larger data pipe. MIMO technology can make it easier to respond to a request from a large-bandwidth user – but it doesn’t relieve the overall pressure on a cell tower. If anything, it might do the exact opposite and let cell towers prioritize those who want to watch video over smaller users, who might then be blocked from making voice calls or sending text messages. MIMO is also not an immediate fix and also needs to work through the cycle of getting the technology into cellphones.

The last strategy is what the industry calls densification, which means adding more cell sites. This is the driving force behind placing small cell sites on poles in areas with big cellular demand. However, densification might create as many problems as it solves. Most of the current frequencies used for cellular service travel a decent distance, and placing cell sites too close together will create a lot of interference and noise between neighboring towers. While adding new cell sites adds additional local capacity, it also decreases the efficiency of all nearby cell sites using traditional spectrum – so the overall improvement from densification is going to be a lot less than might be expected. The worst thing about this is that interference is hard to predict and is very much a local issue. This is the primary reason that the cellular companies are interested in millimeter wave spectrum for cellular – the spectrum travels a short distance and won’t interfere as much between cell sites placed closely together.

5G will fix some of these issues. The ability of 5G to do frequency slicing means that a cell site can provide just enough bandwidth for every user – a tiny slice of spectrum for a text message or IoT signal and a big pipe for a video stream. 5G will vastly expand the number of simultaneous users that can share a single cell site.
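
The toy calculation below illustrates why that matters. It is a deliberately simplified sketch, not the actual 5G air interface or scheduler, and the cell capacity and per-user demands are made-up numbers chosen only to show how sizing allocations to demand stretches a fixed amount of capacity.

```python
# Toy illustration of why sizing allocations to demand lets more users share
# a cell site. This is NOT the real 5G air interface; the capacity and the
# per-user demands below are made-up numbers for illustration only.
CELL_CAPACITY_MBPS = 1000

demands_mbps = {
    "text/IoT message": 0.05,
    "voice call": 0.1,
    "web browsing": 2.0,
    "HD video stream": 5.0,
}

# One-size-fits-all: every active user gets a channel big enough for video.
fixed_channel = max(demands_mbps.values())
fixed_users = int(CELL_CAPACITY_MBPS / fixed_channel)

# Demand-sized slices: a typical mix of users gets only what each one needs.
mix = (["text/IoT message"] * 70 + ["voice call"] * 20 +
       ["web browsing"] * 8 + ["HD video stream"] * 2)   # per 100 users
avg_demand = sum(demands_mbps[u] for u in mix) / len(mix)
sliced_users = int(CELL_CAPACITY_MBPS / avg_demand)

print(f"One-size-fits-all channels: ~{fixed_users} simultaneous users")
print(f"Demand-sized slices:        ~{sliced_users} simultaneous users")
```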

However, 5G doesn’t provide any additional advantages over 4G in terms of the total amount of backhaul bandwidth needed to feed a cell site. And that means that a 5G cell site will get equally overwhelmed if people demand more bandwidth than a cell site has to offer.

The cellular industry has a lot of problems to solve over a relatively short period of time. I expect that in the middle of the much-touted 5G roll-out we are going to start seeing some spectacular failures in the cellular networks at peak times. I feel sympathy for cellular engineers because it’s nearly impossible to have a network ready to handle data usage that doubles every two years. Even should engineers figure out strategies to handle five or ten times more usage, in only a few years the usage will catch up to those fixes.

I’ve never believed that cellular broadband can be a substitute for landline broadband. Every time somebody at the FCC or a politician declares that the future is wireless I’ve always rolled my eyes, because anybody that understands networks and the physics of spectrum can easily demonstrate that there are major limitations on the total bandwidth capacity at a given cell site, along with a limit on how densely cell sites can be packed in an area. The cellular networks are only carrying 5% of the total broadband in the country and it’s ludicrous to think that they could be expanded to carry most of it.

5G For Rural America?

FCC Chairman Ajit Pai recently addressed the membership of NTCA–The Rural Broadband Association and said that he saw a bright future for 5G in rural America. He sees 5G as a fixed-wireless deployment that fits in well with the fiber deployments already made by NTCA members.

The members of NTCA are rural telcos and many of these companies have upgraded their networks to fiber-to-the-home. Some of these telcos tackled building fiber a decade or more ago and many more are building fiber today using money from the ACAM program – part of the Universal Service Fund.

Chairman Pai was talking to companies that largely have been able to deploy fiber, and since Pai is basically the national spokesman for 5G it makes sense that he would try to make a connection between 5G and rural fiber. However, I’ve thought through every business model for marrying 5G and rural fiber and none of them make sense to me.

Consider the use of millimeter wave spectrum in rural America. I can’t picture a viable business case for deploying millimeter wave spectrum where a telco has already deployed fiber drops to every home. No telco would spend money to create wireless drops where they have already paid for fiber drops. One of the biggest benefits from building fiber is that it simplifies operations for a telco – mixing two technologies across the same geographic footprint would add unneeded operational complications that nobody would tackle on purpose.

The other business plan I’ve heard suggested is to sell wholesale 5G connections to other carriers as a new source of income. I also can’t imagine that happening. Rural telcos are going to fight hard to keep out any competitor that wants to use 5G to compete with their existing broadband customers. I can’t imagine a rural telco agreeing to provide fiber connections to 5G transmitters that would sit outside homes and compete with their existing broadband customers, and a telco that lets in a 5G competitor would be committing economic suicide. Rural business plans are precarious, by definition, and most rural markets don’t generate enough profits to justify two competitors.

What about using 5G in a competitive venture where a rural telco is building fiber outside of their territory? There may come a day when wireless loops have a lower lifecycle cost than fiber loops. But for now, it’s hard to think that a wireless 5G connection with electronics that need to be replaced at least once a decade can really compete over the long-haul with a fiber drop that might last 50 or 75 years. If that math flips we’ll all be building wireless drops – but that’s not going to happen soon. It’s probably going to take tens of millions of installations of millimeter wave drops until telcos trust 5G as a substitute for fiber.

Chairman Pai also mentioned mid-range spectrum in his speech, specifically the upcoming auction for 3.5 GHz spectrum. How might mid-range spectrum create a rural 5G play that works with existing fiber? It might be a moot question since few rural telcos are going to have access to licensed spectrum.

But assuming that telcos could find mid-range licensed spectrum, how would that benefit from their fiber? As with millimeter wave spectrum, a telco is not going to deploy this technology to cover the same areas where they already have fiber connections to homes. The future use of mid-range spectrum will be the same as it is today – to provide wireless broadband to customers that don’t live close to fiber. The radios will be placed on towers, the taller the better. These towers will then make connections to homes using dishes that can communicate with the tower.

Many of the telcos in the NTCA are already deploying this fixed wireless technology today outside of their fiber footprint. The technology benefits from having towers fed by fiber, but this is rarely the same fiber that a telco is using to serve customers. In most cases this business plan requires extending fiber outside of the existing service footprint – yet Chairman Pai said specifically that he saw an advantage for 5G from existing fiber.

Further, it’s a stretch to label mid-range spectrum point-to-multipoint radio systems as 5G. From what numerous engineers have told me, 5G is not going to make big improvements over the way that fixed wireless operates today. 5G will add flexibility for the operator to fine-tune the wireless connection to any given customer, but the 5G technology won’t inherently increase the speed of the wireless broadband connection.

I just can’t find any business plan that is going to deliver 5G in rural America that takes advantage of the fiber that the small telcos have already built. I would love to hear from readers who might see a possibility that I have missed. I’ve thought about this a lot and I struggle to find the benefits for 5G in rural markets that Chairman Pai has in mind. 5G clearly needs a fiber-rich environment – but companies who have already built rural fiber-to-the-home are not going to embrace a second overlay technology or openly allow competitors onto their networks.

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures with their fiber network. They are giving customers two months of free service and sending them back to the incumbent ISPs in the city. The company used a construction technique called micro-trenching, where they cut a tiny slit in the road, one inch wide and a few inches deep, to carry the fiber. Only a year after construction the fiber is popping out of the micro-trenches all over the city.

Everybody I’ve talked to is guessing that it’s a simple case of ice heaving. While a micro-trench is sealed, it’s likely that small amounts of moisture seep into the sealed micro-trench and freeze when it gets cold. The first freeze would create tiny cracks, and with each subsequent freeze the cracks would get a little larger until the trench finally fills up with water, fully freezes, and ejects the fill material. The only way to stop this would be to find a permanent seal that never lets in moisture. That sounds like a tall task in a city like Louisville that might freeze and thaw practically every night during the winter.

Nobody other than AT&T or Charter can be happy about this. The reason that Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block Google Fiber from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google’s favor, while the Charter suit is still in court. Perhaps Google Fiber should have just waited out the lawsuits – but the business pressure was there to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson learned is not to launch a new network using an untried and untested construction technique. In this case, the micro-trenches didn’t just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix for the problem would be to build the network again from scratch, which makes no financial sense.

Certainly, the whole industry is going to be extremely leery about micro-trenching now, but there is a larger lesson to be learned from this. For example, I’ve heard from several small ISPs who are ready to leap into the 5G game and build networks using millimeter wave radios installed on poles. That is every bit as new and untested a technology as micro-trenching was. I’m not predicting that anybody pursuing that business plan will fail – but I can assuredly promise that they will run into unanticipated problems.

Over my career, I can’t think of a single example where an ISP that took a chance on a cutting-edge technology didn’t have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. For example, I can remember half a dozen companies that tried to deploy broadband networks using the LMDS spectrum. I remember one case where the radios literally never worked and the venture lost their $2 million investment. I remember several others where the radios had glitches that caused major customer outages and were largely a market disaster.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generation of other technologies including the first generation of DSL, WiFi mesh networks, PON fiber-to-the-home and IPTV. The companies that were the first pioneers deploying these technologies had costly and sometimes deadly problems. So perhaps the lesson learned is that pioneers pay a price. I’m sure that this failure of micro-trenching will result in changing or abandoning the technique. Perhaps we’ll learn to not use micro-trenches in certain climates. Or perhaps they’ll find a way to seal the micro-trenches against humidity. But none of those future solutions will make up for Google Fiber’s spectacular failure.

The real victims of this situation are the households in Louisville that had switched to Google Fiber – and everybody else in the city. Because of Google Fiber’s lower prices, both Charter and AT&T lowered prices everywhere in the city. You can bet it’s not going to take long to get the market back to full prices. Any customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no real incentive to give them a low-price deal. As a whole, every household in the city is going to be spending $10 or $20 more per month for broadband – which is a significant penalty on the local economy.