Using Wireless Backhaul

Mike Dano of Light Reading reports that Verizon is considering using wireless backhaul to reach as many as 20% of small cell sites. Verizon says they will use wireless backhaul for locations where they want to provide 5G antennas but can’t get fiber easily or affordably. The article cites an example of using wireless backhaul to provide connectivity where it’s hard to get the rights-of-way to cross railroad tracks.

This prompts me today to write about the issues involved with wireless backhaul. Done well it can greatly expand the reach of a network. Done poorly it can degrade performance or cause other problems. This is not an anti-Verizon blog because they are one of the more disciplined carriers in the industry and are likely to deploy wireless backhaul the right way.

Dano says that Verizon has already addressed one issue that is of concern today to municipalities that are seeing small cell deployments. Cities are worried about small cell devices that are large and unsightly. There are already pictures on the web of small cells gone awry where a mass of different electronics are pole-mounted to create an unsightly mess. Verizon describes their solution as integrated, meaning that no additional external antennas are needed – implying that the backhaul is likely using the same frequencies being used to reach customers. The small cell industry would do well to take heed of Verizon’s approach. It looks like courts are siding with municipalities in terms of being able to dictate aesthetic considerations for small cells.

Another issue to consider is the size of the wireless backhaul link. For instance, if Verizon uses millimeter wave backhaul there is a limitation today of being able to deliver about 1-gigabit links for 2 miles or 2-gigabit links for about a mile. The amount of bandwidth and the distance between transmitters differ according to the frequency used – but none of the wireless backhaul delivery technologies deliver as much bandwidth as fiber. Verizon has been talking about supplying 10-gigabit links to cell sites using next-generation PON technology. Wireless backhaul is going to be far less robust than fiber. This is likely not an issue today where many cell sites are using less than 2 gigabits of bandwidth. However, as the amount of broadband used by cellular networks keeps doubling every few years it might not take long for many cell sites to outgrow a wireless backhaul link.

The primary issue with wireless backhaul is the bandwidth dilution that comes from feeding multiple wireless sites from one fiber connection. Consider an example where one cell site is fed with a 10-gigabit fiber backhaul. If that site then makes 2-gigabit wireless connections to four other cell sites, each of the five sites is now limited to no more than 2 gigabits of usage. The bandwidth of the four secondary sites is capped by the 2-gigabit link feeding each one, and the core site loses whatever bandwidth is being used by the other sites.
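The arithmetic behind that example is simple enough to sketch (the 10-gigabit feed and 2-gigabit links are the numbers from the example above):

```python
# Bandwidth dilution when one fiber-fed site feeds four others wirelessly.
FIBER_FEED_GBPS = 10       # fiber backhaul at the core site
WIRELESS_LINK_GBPS = 2     # each wireless link to a secondary site
SECONDARY_SITES = 4

# A secondary site can never use more than its wireless link carries.
secondary_cap_gbps = WIRELESS_LINK_GBPS

# If all four secondary sites run at full tilt, the core site keeps only:
core_remaining_gbps = FIBER_FEED_GBPS - SECONDARY_SITES * WIRELESS_LINK_GBPS

print(f"Cap at each secondary site: {secondary_cap_gbps} Gbps")
print(f"Bandwidth left at the core site at peak: {core_remaining_gbps} Gbps")
```

At peak usage, every one of the five sites is effectively a 2-gigabit site.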

That’s probably a poor example because today most cell sites use less than 2 gigabits of bandwidth. Verizon’s use of 10-gigabit fiber backhaul moves them ahead of the rest of the industry, which has cell sites with 1- to 5-gigabit backhaul connections today. The weaknesses of wireless backhaul are a lot more apparent when the wireless network begins at a site that only has a 1- or 2-gigabit fiber connection.

I’m sure that over time Verizon plans to build additional fiber to relieve network congestion. Their use of wireless backhaul is going to push off the need for fiber by a decade or more and is a sensible way to preserve capital today.

The issues with wireless backhaul are far more critical for carriers that don’t have Verizon’s deep pockets, fiber networks, or discipline. It’s not hard today to find wireless networks that have overdone wireless backhaul. I’ve talked to numerous rural customers who are buying fixed wireless links from WISPs that deliver only a few Mbps of bandwidth. Some of these customers are getting low speeds because they live too far away from the transmitting tower. Sometimes speeds are low because a WISP oversold the local antenna and is carrying more customers than the technology can comfortably serve.

But many rural wireless systems have slow speeds because of overextended wireless backhaul. In many cases in rural America, there are no fiber connections available for fixed wireless transmitters, which are often installed on grain elevators, water towers, church steeples or tall poles. I’ve seen networks that are making multiple wireless hops from a single gigabit fiber connection.

I’ve also seen preliminary designs for wireless ‘mesh’ networks where pole-mounted transmitters will beam wireless broadband into homes. Every wireless hop in these networks cuts the bandwidth in half at both radio sites (as bandwidth is split and shared). If you feed a mesh wireless network with a gigabit of bandwidth, then after four hops a transmitter sees only about 62 Mbps of raw bandwidth – and even that is overstated because it doesn’t account for overhead. It’s not hard to do the math to see why some rural wireless customers only see a few Mbps of bandwidth.
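A quick sketch of the halving arithmetic, ignoring protocol overhead (so real-world numbers would be lower still):

```python
# Bandwidth remaining after each hop in a mesh fed with 1 gigabit,
# assuming each wireless hop splits the shared bandwidth in half.
feed_mbps = 1000.0

bandwidth = feed_mbps
for hop in range(1, 5):
    bandwidth /= 2  # each hop halves what's available downstream
    print(f"After hop {hop}: {bandwidth:.1f} Mbps")
```

After four hops the raw number is already down to 62.5 Mbps, before any overhead is subtracted.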

I’m sure that Verizon understands that many of the cell sites they serve today wirelessly will eventually need fiber, and I’m sure they’ll eventually build the needed fiber. But I also expect that there will be networks built with inadequate wireless backhaul that will barely function at inception and that will degrade over time as customer demand grows.

5G is Fiber-to-the-Curb

The marketing from the wireless companies has the whole country buzzing with speculation that everything is going to go wireless with the introduction of 5G. There is a good chance that within five years a good, reliable pole-mounted technology could become the preferred way to go from the curb to homes and businesses. When that happens we will finally have wireless fiber-to-the-curb – something I’ve heard talked about for at least 25 years.

I remember visiting an engineer in the horse country of northern Virginia in the 1990s who had developed a fiber-to-the-curb wireless technology that could deliver more than 100 Mbps from a pole to a house. His technology was limited in that there had to be one pole-mounted transmitter per customer, and there was a distance limitation of a few hundred feet for the delivery. But he was clearly on the right track and was twenty years ahead of his time. At that time we were all happy with our 1 Mbps DSL and 100 Mbps sounded like science fiction. But I saw his unit functioning at his home, and if he had caught the attention of a big vendor we might have had wireless fiber-to-the-curb a lot sooner than now.

I have to laugh when I read people talking about our wireless future, because it’s clear that this technology is going to require a lot of fiber. There is a lot of legislative and lobbying work going on to make it easier to mount wireless units on poles and streetlights, but I don’t see the same attention being put into making it easier to build fiber – and without fiber this technology is not going to work as promised.

It’s easy to predict that there are going to be a lot of lousy 5G deployments. ISPs are going to come to a town, connect to a single gigabit fiber and then serve the rest of the town from that one connection. This will be the cheap way to deploy this technology and those without capital are going to take this path. The wireless units throughout the town will be fed with wireless backhaul, with many of them on multiple wireless hops from the source. In this kind of network the speeds will be nowhere near the gigabit capacity of the technology, the latency will be high and the network will bog down in the evenings like any over-subscribed network. A 5G network deployed in this manner will not be a killer app that will kill cable networks.

However, a 5G fiber-to-the-curb network built the right way is going to be as powerful as an all-fiber network. That’s going to mean having neighborhood wireless transmitters to serve a limited number of customers, with each transmitter fed by fiber. When Verizon and AT&T talk about the potential for gigabit 5G this is what they are talking about. But they are not this explicit because they are not likely today to deploy networks this densely. The big ISPs still believe that people don’t really need fast broadband. They will market this new technology by stressing that it’s 5G while building networks that will deliver far less than a gigabit.

There are ISPs who will wait for this technology to mature before switching to it, and they will build networks the right way. In a network with fiber everywhere this technology makes huge sense. One of the problems with a FTTH network that doesn’t get talked about a lot is abandoned drops. Fiber ISPs build drops to homes, and over time a substantial number of premises no longer use the network for various reasons. I know of some ten-year-old networks where as many as 10% of fiber drops have been abandoned by homes that now buy service from somebody else. A fiber-to-the-curb network solves this problem by only serving those who have active service.

I also predict that the big ISPs will make every effort to make this a customer-provisioned technology. They will mail customers a receiver kit to save on a truck roll, because saving money is more important to them than quality. This will work for many customers, but others will stick the receiver in the wrong place and never get the speed they might have gotten if the receiver was mounted somewhere else in the home.

There really are no terrible broadband technologies, but there are plenty of terrible deployments. Consider that there are a huge number of rural customers being connected to fixed wireless networks. When those networks are deployed properly – meaning customers are not too far from the transmitter and each tower has a fiber feed – the speeds can be great. I have a colleague who is four miles from a wireless tower and is getting nearly 70 Mbps download. But there are also a lot of under-capitalized ISPs that are delivering speeds of 5 Mbps or even far less using the same technology. They can’t afford to get fiber to towers and instead use multiple wireless hops to get to neighborhood transmitters. This is a direct analogue of what we’ll see in poorly deployed 5G networks.

I think it’s time that we stop using the term 5G as a shortcut for meaning gigabit networks. 5G is going to vary widely depending upon the frequencies used and will vary even more widely depending on how the ISP builds their network. There will be awesome 5G deployments, but also a lot of so-so and even lousy ones. I know I will be advising my clients on building wireless fiber-to-the-curb – and that means networks that still need a lot of fiber.

A Hybrid Model for Rural America

Lately I’ve looked at a lot of what I call a hybrid network model for bringing broadband to rural America. The network involves building a fiber backbone to support wireless towers while also deploying fiber to any pockets of homes big enough to justify the outlay. It’s a hybrid between point-to-multipoint wireless and fiber-to-the home.

I’ve not yet seen a feasible business model for building rural FTTP without some kind of subsidy. There are multiple small telcos building fiber to farms using some subsidy funding from the A-CAM portion of the Universal Service Fund. And there are state broadband grant programs that are helping to build rural fiber. But otherwise it’s hard to justify building fiber in places where the cost per passing is $10,000 per household or higher.

The wireless technology I’m referring to is a point-to-multipoint wireless network using a combination of frequencies including WiFi and 3.65 GHz. The network consists of placing transmitters on towers and beaming signals to dishes at a customer location. In areas without massive vegetation or other impediments this technology can now reliably deliver 25 Mbps download for 6 miles and higher bandwidth closer to the tower.

A hybrid model makes a huge difference in financial performance. I’ve now seen an engineering comparison of the costs of all-fiber and a hybrid network in half a dozen counties, and the costs for building a hybrid network are in the range of 20% – 25% of the cost of building fiber to everybody. That cost reduction can result in a business model with a healthy return that creates significant positive cash over time.
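As a rough illustration, here is what that range looks like for a hypothetical county of 4,000 passings (the county size and the $10,000-per-passing figure are assumptions for the sketch; the 20% – 25% range comes from the engineering comparisons mentioned above):

```python
# Rough all-fiber vs. hybrid cost comparison for a hypothetical county.
PASSINGS = 4000                  # assumed number of rural passings
FTTP_COST_PER_PASSING = 10_000   # dollars; high-cost rural fiber
HYBRID_FRACTION_LOW, HYBRID_FRACTION_HIGH = 0.20, 0.25

fttp_cost = PASSINGS * FTTP_COST_PER_PASSING
hybrid_low = fttp_cost * HYBRID_FRACTION_LOW
hybrid_high = fttp_cost * HYBRID_FRACTION_HIGH

print(f"All-fiber build: ${fttp_cost:,}")
print(f"Hybrid build:    ${hybrid_low:,.0f} - ${hybrid_high:,.0f}")
```

A $40 million fiber project becomes an $8 – $10 million hybrid project – the difference between a business plan that can’t be financed and one that can.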

There are numerous rural WISPs that are building wireless networks using wireless backhaul rather than fiber to get bandwidth to the towers. That solution might work at first, although I often see new wireless networks of this sort that can’t deliver the 25 Mbps bandwidth to every customer due to backhaul constraints. It’s guaranteed that the bandwidth demands from customers on any broadband network will eventually grow to be larger than the size of the backbone feeding the network. Generally, over a few years a network using wireless backhaul will bog down at the busy hour while a fiber network can keep up with customer bandwidth demand.

One key component of the hybrid network is to bring fiber directly to customers that live close to the fiber. This means bringing fiber to any small towns or even small pockets of 20 or more homes that are close together. It also means bringing fiber to farms and rural customers that happen to live along the fiber routes. Serving some homes with fiber helps to hold down customer density on the wireless portion of the network – which improves wireless performance. Depending on the layout of a rural county, a hybrid model might bring fiber to as much as 1/3 of the households in a county while serving the rest with wireless.

Another benefit of the hybrid model is that it moves fiber deeper into rural areas. This can provide the basis for building more fiber in the future or else upgrading wireless technologies over time for rural customers.

A side benefit of this business plan is that it often involves building a few new towers. Areas that need towers typically already have poor or nonexistent cellular coverage. The new towers can make it easier for the cellular companies to fill in their footprint and get better cellular service to everybody.

One reason the hybrid model can succeed is the high customer penetration rate that comes when building the first real broadband network into a rural area that’s never had it. I’ve now seen the customer numbers from numerous rural broadband builds and I’ve seen customer penetration rates range between 65% and 85%.

Unfortunately, this business plan won’t work everywhere, due to the limitations of wireless technology. It’s much harder to deploy a wireless network of this type in an area with heavy woods or lots of hills. This is a business plan for the open plains of the Midwest and West, and anywhere else with large areas of open farmland.

County governments often ask me how they can get broadband to everybody in their county. In areas where the wireless technology will work, a hybrid model seems like the most promising solution.

Facebook’s Gigabit WiFi Experiment

Facebook and the city of San Jose, California have been trying for several years to launch a gigabit WiFi network in the downtown area of the city. Branded as Terragraph, the Facebook technology is a deployment of 60 GHz WiFi hotspots that promises data speeds as fast as a gigabit. The delays in the project are a good example of the challenges of launching a new technology and a warning to anybody working on the cutting edge.

The network was first slated to launch by the end of 2016 but is now over a year late. Neither the City nor Facebook will commit to a launch date, and they are no longer making any guarantees about the speeds that will be achieved.

This delayed launch highlights many of the problems faced by a first-generation technology. Facebook first tested an early version of the technology on their Menlo Park campus, but has been having problems making it work in a real-life deployment. The deployment on light and traffic poles has gone much slower than anticipated, and Facebook is having to spend time after each deployment to make sure that traffic lights still work properly.

There are also business factors affecting the launch. Facebook has had turnover on the Terragraph team. The company has also gotten into a dispute over payments with an installation vendor. It’s not unusual to have business-related delays on a first-generation technology launch since the development team is generally tiny and subject to disruption and the distribution and vendor chains are usually not solidified. There is also some disagreement between the City and Facebook on who pays for the core electronics supporting the network.

Facebook had touted that the network would be significantly less expensive than deploying fiber. But the 60 GHz spectrum gets absorbed by oxygen and water vapor, so Facebook is having to deploy transmitters no more than 820 feet apart – a dense network deployment. Without fiber feeding each transmitter the backhaul is being done using wireless spectrum, which is likely to be contributing to the complication of the deployment as well as the lower expected data speeds.
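A back-of-envelope look at what 820-foot spacing means for density, assuming a simple straight-line chain of transmitters along a street (my simplification for the sketch, not Facebook’s actual design):

```python
import math

FEET_PER_MILE = 5280
MAX_SPACING_FT = 820  # transmitters no more than 820 feet apart

# Along one street-mile you need this many spans, plus one end node.
spans = math.ceil(FEET_PER_MILE / MAX_SPACING_FT)
nodes_per_street_mile = spans + 1
print(f"Nodes needed per street-mile: {nodes_per_street_mile}")
```

Roughly eight nodes per street-mile is a very different cost picture than a handful of towers covering a whole neighborhood.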

For now, this deployment is in the downtown area and involves 250 pole-mounted nodes to serve a heavy-traffic business district which also sees numerous tourists. The City hopes to eventually find a way to deploy the technology citywide since 12% of the households in the City don’t currently have broadband access – mostly attributed to affordability. The City was hoping to get Google Fiber, but Google canceled plans last year to build in the City.

Facebook says they are still hopeful that they can make the technology work as planned, but that there is still more testing and research needed. At this point there is no specific planned launch date.

This experiment reminds me of other first-generation technology trials in the past. I recall several cities including Manassas, Virginia that deployed broadband over powerline. The technology never delivered speeds much greater than a few Mbps and never was commercially viable. I had several clients that nearly went bankrupt when trying to deploy point-to-point broadband using the LMDS spectrum. And I remember a number of failed trials to deploy citywide municipal WiFi, such as a disastrous trial in Philadelphia, and trials that fizzled in places like Annapolis, Maryland.

I’ve always cautioned my smaller clients to never be guinea pigs for a first-generation technology deployment. I can’t recall a time when a first-generation deployment did not come with scads of problems. I’ve seen clients suffer through first-generation deployments of all of the technologies that are now common – PON fiber, voice softswitches, IPTV, you name it. Vendors are always in a hurry to get a new technology to market and the first few ISPs that deploy a new technology have to suffer through all of the problems that crop up between a laboratory and a real-life deployment. The real victims of a first-generation deployment are often the customers using the network.

The San Jose trial won’t have all of the issues experienced by commercial ISPs since the service will be free to the public. But the City is not immune from the public spurning the technology if it doesn’t work as promised.

The problems experienced by this launch also provide a cautionary tale for the many 5G technology launches promised in 2018 and 2019. Every new launch is going to experience significant problems which is to be expected when a wireless technology bumps up against the myriad of issues experienced in a real-life deployment. If we have learned anything from the past, we can expect a few of the new launches to fizzle and die while a few of the new technologies and vendors will plow through the problems until the technology works as promised. But we’ve also learned that it’s not going to go smoothly and customers connected to an early 5G network can expect problems.

The WISP Dilemma

For the last decade I have been working with many rural communities seeking better broadband. For the most part these are places that the large telcos have neglected and never provided with any functional DSL. Rural America has largely rejected the current versions of satellite broadband because of the low data caps and because the latency won’t support streaming video or other real-time activities. I’ve found that lack of broadband is at or near the top of the list of concerns in communities without it.

But a significant percentage of rural communities have access today to WISPs (wireless ISPs) that use unlicensed frequency and point-to-multipoint radios to bring a broadband connection to customers. The performance of WISPs varies widely. There are places where WISPs are delivering solid and reliable connections that average between 20 – 40 Mbps download. But unfortunately there are many other WISPs that are delivering slow broadband in the 1 – 3 Mbps range.

The WISPs that have fast data speeds share two characteristics. They have a fiber connection directly to each wireless transmitter, meaning that there are no bandwidth constraints. And they don’t oversubscribe customers. Anybody who was on a cable modem five or ten years ago understands oversubscription. When there are too many people on a network node at the same time the performance degrades for everybody. A well-designed broadband network of any technology works best when there are not more customers than the technology can optimally serve.
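A toy sketch of why oversubscription matters. The numbers are assumptions for illustration only – a 1-gigabit tower feed and an assumed 5 Mbps average busy-hour draw per customer:

```python
TOWER_FEED_MBPS = 1000
PEAK_DEMAND_PER_CUSTOMER_MBPS = 5   # assumed average busy-hour draw

def busy_hour_satisfaction(subscribers):
    """Fraction of busy-hour demand the tower feed can actually satisfy."""
    demand = subscribers * PEAK_DEMAND_PER_CUSTOMER_MBPS
    return min(1.0, TOWER_FEED_MBPS / demand)

for subs in (100, 200, 400):
    print(f"{subs} subscribers: {busy_hour_satisfaction(subs):.0%} of demand met")
```

With these assumed numbers the tower is fine up to 200 subscribers, but at 400 subscribers every customer sees only half their expected speed at the busy hour.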

But a lot of rural WISPs are operating in places where there is no easy or affordable access to a fiber backbone. That leaves them with no alternative but to use wireless backhaul. This means using point-to-point microwave radios to get bandwidth to and from a tower.

Wireless backhaul is not in itself a negative issue. If an ISP can use microwave to deliver enough bandwidth to a wireless node to satisfy the demand there, then they’ll have a robust product and happy customers. But the problems start happening when networks include multiple ‘hops’ between wireless towers. I often see WISP networks where the bandwidth goes from tower to tower to tower. In that kind of configuration all of the towers and all of the customers on those towers are sharing whatever bandwidth is sent to the first tower in the chain.

Adding hops to a wireless network also adds latency and each hop means it takes longer for the traffic to get to and from customers at the outer edges of one of these wireless chains. Latency, or time lag, in signal is an important factor in being able to perform real-time functions like data streaming, voice over IP, gaming, or functions like maintaining connections to an on-line class or a distant corporate WAN.

Depending upon the brand of the radios and the quality of the internet backbone connection, a wireless transmitter that is connected directly to fiber can have a latency similar to that of a cable or DSL network. But when chaining multiple towers together the latency can rise significantly, and real-time applications start to suffer at latencies of 100 milliseconds or greater.
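The accumulation is easy to see with illustrative numbers – the 30 ms base and 20 ms per hop below are assumptions for the sketch, not measurements from any particular network:

```python
BASE_LATENCY_MS = 30      # assumed latency at a fiber-fed first tower
PER_HOP_MS = 20           # assumed latency added per extra wireless hop
REALTIME_LIMIT_MS = 100   # where real-time applications start to suffer

for extra_hops in range(6):
    latency_ms = BASE_LATENCY_MS + extra_hops * PER_HOP_MS
    status = "suffering" if latency_ms >= REALTIME_LIMIT_MS else "ok"
    print(f"{extra_hops} extra hops: {latency_ms} ms ({status})")
```

Under these assumptions, a customer four towers down the chain is already past the point where VoIP and gaming degrade.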

WISPs also face other issues. One is the age of the wireless equipment. There is no part of our industry that has made bigger strides over the past ten years than the manufacturing of subscriber microwave radios. The newest radios have significantly better operating characteristics than radios made just a few years ago. WISPs are for the most part relatively small companies and have a hard time justifying upgrading equipment until it has reached its useful life. And unfortunately there is not much opportunity for small incremental upgrades of equipment. The changes in the technologies have been significant enough that upgrading a node often means replacing the transmitters on towers as well as subscriber radios.

The final dilemma faced by WISPs is that they often are trying to serve customers in locations that are not ideally situated to receive a wireless signal. The unlicensed frequencies require good line-of-sight and also suffer degraded signals from foliage, rain and other impediments, and it’s hard to reliably serve customers who are surrounded by trees or who live in places that are somehow blocked by the terrain.

All of the various issues mean that reviews of WISPs vary as widely as you can imagine. I was served by a WISP for nearly a decade and since I lived a few hundred feet from the tower and had a clear line-of-sight I was always happy with the performance I received. I’ve talked to a few people recently who have WISP speeds as fast as 50 Mbps. But I have also talked to a lot of rural people who have WISP connections that are slow and have high latency that provides a miserable broadband experience.

It’s going to be interesting to see what happens to some of these WISPs as rural telcos deploy CAF II money and provide a faster broadband alternative that will supposedly deliver at least 10 Mbps download. WISPs who can beat those speeds will likely continue to thrive while the ones delivering only a few Mbps will have to find a way to upgrade or will lose most of their customers.

Wireless Networks Need Fiber

As I examine each of the upcoming wireless technologies it looks like future wireless technology is still going to rely heavily on an underlying fiber network. While the amount of needed fiber will be less than building fiber to every customer premise, supporting robust wireless networks is still going to require significant construction of new fiber.

This is already true today for the traditional cellular network, and most existing towers are fiber-fed, although some have microwave backhaul. The amount of bandwidth needed at traditional cell sites is already outstripping the 1- or 2-gigabit capacity of wireless backhaul technologies. Urban cell sites today are fed with pipes as large as 5 – 10 gigabits, and most rural ones have (or would like to have) a gigabit feed. I’ve seen recent contractual negotiations for rural cell sites asking for as much as 5 gigabits of backhaul within the next 5 – 10 years.

The specifications for future 5G cell sites make it clear that fiber will soon be the only backhaul solution for them. The specifications call for a single cell site to be capable of as much as 20-gigabit download and 10-gigabit upload. The cellular world is currently exploring mini-cell sites (although that effort has slowed down to some degree due to the issues with placing these devices closer to customers). To be practical these small cell sites must be placed on poles (existing or newly built), on rooftops and in other locations near areas with high usage demand. The majority of these small sites will require new fiber construction. Today these sites can probably use millimeter wave radio backhaul, but as bandwidth needs increase, this is going to mean bringing fiber to poles and rooftops.
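The gap between the spec and wireless backhaul is easy to quantify. The 2-gigabit figure below is the optimistic short-distance millimeter wave number from earlier in this series:

```python
SPEC_DOWNLOAD_GBPS = 20     # 5G cell-site download specification
MMWAVE_BACKHAUL_GBPS = 2    # optimistic short-distance wireless link

links_needed = SPEC_DOWNLOAD_GBPS / MMWAVE_BACKHAUL_GBPS
print(f"Wireless backhaul links needed to match the spec: {links_needed:.0f}")
```

Ten parallel wireless links per cell site is not a practical design, which is why the spec effectively forces fiber.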

Millimeter wave radios are also being touted as a way to bring gigabit speeds to consumers. But delivering fast speeds means getting the radios close to customers. These radios use extremely high frequencies, and as such travel only short distances. As a hot spot a millimeter wave radio is only good for a little over 100 feet. Even when formed into a tight beam the reach is only a little over a mile – and that requires true line-of-sight. These radios will be vying for the same transmitter locations as mini-cell sites.

Because of the short distances that can be covered by millimeter wave radios, this technology is going to initially be of most interest in the densest urban areas. Perhaps as the radios get cheaper there will be more of a model for suburban areas. But the challenge of deploying wireless in urban areas is that that’s where fiber is the most expensive to build. It’s not unusual to see new fiber construction costs of $150,000 to $200,000 per mile in downtown areas. The urban wireless deployment faces the challenge of getting both fiber and power to poles, rooftops and sides of buildings. This is the issue that has already stymied the deployment of mini-cell sites, and it’s going to become more of an issue as numerous companies want to build competing wireless networks in our cities. I’m picturing the four major cellular companies and half a dozen wireless ISPs all wanting access to the same prime transmitter sites. All of these companies will have to deal with the availability of fiber, or will need to build expensive fiber to support their networks.

Even rural wireless deployments need a lot of fiber. A quality point-to-multipoint wireless network today needs fiber at each small tower. When that is available the current technologies can deliver speeds between 20 Mbps and 100 Mbps. But using wireless backhaul instead of fiber drastically cuts the performance of these networks, and there are scads of rural WISPs delivering bandwidth products of 5 Mbps or less. As the big telcos tear down their remaining rural copper, the need for rural fiber is going to intensify. But the business case is often difficult to justify when building fiber to supply bandwidth to only a small number of potential wireless or wireline customers.

All of the big companies that are telling Wall Street about their shift to wireless technologies are conveniently not talking about this need for lots of fiber. But when they go to deploy these technologies on any scale they are going to run smack into the current lack of fiber. And until the fiber issue is solved, these wireless technologies are not going to deliver the kinds of speeds and won’t be quickly available everywhere as is implied by the many press releases and articles talking about our wireless future. I have no doubt that there will eventually be a lot of customers using wireless last mile – but only after somebody first makes the investment in the fiber networks needed to support the wireless networks.