Five years ago, one of the most talked about trends in the broadband industry was the upcoming explosion in the deployment of small cell sites. The industry expectation in 2017 was that there would be half a million new small cell sites added within a few years. That expectation was bolstered by an FCC order in 2018 that basically let cellular companies place small cell sites anywhere, with an accelerated timeline and with a small cap on permitting fees.
After the FCC order, municipalities braced themselves to process large numbers of small cell site permits. The public got up in arms because there were some examples of horrendously ugly small cell deployments that were widely covered on social media. The public also got riled by the idea of placing small cell sites directly adjacent to their homes – with the most widely discussed deployment I recall in Philadelphia that placed a small cell site ten feet from a baby’s bedroom window. Anybody building a last-mile fiber network was hoping to sell transport to small cell sites scattered throughout neighborhoods.
My blog today is prompted by several cities and a university that recently asked me why they have never received any small cell site permit requests. I don’t have an answer for any specific location, but overall, the number of small cell sites deployed has been a lot smaller than was anticipated in 2017. The hard question to answer is how much smaller – how many cell sites are there in the U.S.?
That turns out to be a difficult number to count, mostly because of disagreement over what counts as a cell site. In 2022, the CTIA, the industry group for cellular carriers, claimed that there are around 419,000 total operational cell sites, including tall tower sites. I saw another estimate in 2022 that put the number closer to 350,000.
One of the differences in the numbers is the definition of small cell site. For example, there are a lot of cellular devices deployed to enhance cellular coverage inside tall or large buildings. Some people count these as cellular repeaters, while others claim them as small cell sites. Similarly, there are cellular boosters deployed at places like stadiums and convention centers that are functionally something less than a full standalone small cell site.
But back to the original question. Why aren’t there as many small cell sites as touted in 2017? The answer to that question means looking back at cellular networks in 2017. While it wasn’t discussed much publicly at the time, the cellular carriers had big problems in 2017. The proliferation of folks using cell phone broadband swamped the cellular networks, particularly in busy places like shopping districts and busy commuter routes.
In 2017, the cellular carriers envisioned a multi-pronged approach to relieve overcrowding on overstressed cellular networks. One prong was to deploy more cell sites, the second was to introduce new frequency bands, and the third was to improve cellular speeds.
The FCC reacted to the need for spectrum and made several major new swaths of mid-range spectrum available to cellular carriers. Carriers rolled out the new spectrum and labeled it as 5G, even though they are still using 4G technology. But the 5G label gave the handset makers a good reason to market a whole new line of phones, and handsets that used the new frequencies decreased the demand on existing frequencies – to the point where the traditional 4G frequency bands are now sometimes faster than the 5G bands, due to the number of people using each.
The carriers improved cellular speed dramatically by modernizing cell sites with the latest technologies, and cellular speeds on both the existing 4G and the newly labeled 5G network got faster. As I wrote in a blog the other day, the median cellular download speed nationwide measured by Ookla in 2017 was 22.6 Mbps, and at the end of 2022 climbed to 193.7 Mbps. Faster speed means that the time that a customer needs to use the network is reduced, freeing the network for other customers.
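To put rough numbers on that last point, here is a back-of-the-envelope sketch: at the Ookla medians cited above, moving the same payload occupies the network for roughly an eighth of the time it did in 2017. The 100 MB payload is an arbitrary example, not a figure from any study.

```python
# Back-of-the-envelope illustration: time to move a fixed payload at the
# Ookla median cellular speeds cited above (22.6 Mbps in 2017 vs. 193.7
# Mbps in 2022). The payload size is a hypothetical example.
PAYLOAD_MB = 100

def transfer_seconds(payload_mb: float, speed_mbps: float) -> float:
    """Seconds to move payload_mb megabytes at speed_mbps megabits/second."""
    return payload_mb * 8 / speed_mbps

t_2017 = transfer_seconds(PAYLOAD_MB, 22.6)   # ~35.4 seconds
t_2022 = transfer_seconds(PAYLOAD_MB, 193.7)  # ~4.1 seconds
print(f"2017: {t_2017:.1f} s, 2022: {t_2022:.1f} s")
print(f"Airtime per download shrank by a factor of {t_2017 / t_2022:.1f}")
```

The airtime freed up by the speed increase is capacity that other customers can use, which is why faster speeds relieved congestion even without new towers.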
There was a fourth benefit to cellular networks that was not in the cellular carriers’ plans. During the last five years, WiFi has become ubiquitous. A huge amount of cellular data traffic has been offloaded to landline networks through WiFi.
There have been plenty of small cell deployments to lighten the traffic load on tall tower cell sites. The early deployments of small cells have been in areas where the customer demand was greatest. But the carriers didn’t deploy half a million small cells because it wasn’t needed. The overcrowding on cellular networks has been mitigated through the new spectrum, faster cellular networks, and WiFi.
Cellular companies will eventually need to widely deploy the expected small cell sites. As the demand for cellular broadband continues to grow at a torrid pace, the networks will get busier again. There are still additional frequency bands that can be introduced to spread the traffic, but small cell sites are still a part of the long-term solution needed to keep cellular networks healthy in the coming decades.
The FCC recently opened a docket, at the prompting of federal legislation, that asks for examples of digital discrimination. The docket asks folks to share stories about how they have had a hard time obtaining or keeping broadband, specifically due to issues related to zip code, income level, ethnicity, race, religion, or national origin.
The big cable companies and telcos are all going to swear they don’t discriminate against anybody for any reason, and every argument they make will be pure bosh. Big corporations, in general, favor more affluent neighborhoods over poor ones. Neighborhoods that don’t have the best broadband networks are likely going to be the same neighborhoods that don’t have grocery stores, gas stations, retail stores, restaurants, banks, hotels, and a wide variety of other kinds of infrastructure investment from big corporations. The big cable companies and telcos are profit-driven and focused on stock prices, and they make many decisions based on the expected return to the bottom line – just like other large corporations.
There is clearly discrimination by ISPs by income level. It’s going to be a lot harder to prove discrimination by ethnicity, race, religion, or national origin, although it’s likely that some stories of this will surface in this docket. But discrimination based on income is everywhere we look. There are two primary types of broadband discrimination related to income – infrastructure discrimination and price discrimination.
Infrastructure discrimination in broadband has been happening for a long time. It doesn’t take a hard look to see that telecom networks in low-income neighborhoods are not as good as those in more affluent neighborhoods. Any telecom technician or engineer can point out a dozen differences in the quality of the infrastructure between neighborhoods.
The first conclusive evidence of this came years ago from a study that overlaid upgrades for AT&T DSL over income levels, block by block in Dallas. The study clearly showed that neighborhoods with higher incomes got the upgrades to faster DSL during the early 2000s. The differences were stark, with some neighborhoods stuck with first-generation DSL that delivered 1-2 Mbps broadband while more affluent neighborhoods had been upgraded to 20 Mbps DSL or faster.
It’s not hard to put ourselves into the mind of the local AT&T managers in Dallas who made these decisions. The local manager would have been given an annual DSL upgrade budget and would have decided where to spend it. Since there wasn’t enough budget to upgrade everywhere, the local manager would have made the upgrades in neighborhoods where faster cable company competition was taking the most DSL customers – likely the more affluent neighborhoods that could afford the more expensive cable broadband. There were probably fewer customers fleeing the more affordable DSL option in poor neighborhoods where the price was a bigger factor for consumers than broadband speeds.
These same kinds of economic decisions have been played out over and over, year after year by the big ISPs until affluent neighborhoods grew to have better broadband infrastructure than poorer neighborhoods. Consider a few of the many examples of this:
I’ve always noticed that there are more underground utilities in newer and more affluent neighborhoods than in older and poorer ones. This puts broadband wires safely underground and out of reach from storm damage – which over time makes a big difference in the quality of the broadband being delivered. Interestingly, the decision of where to force utilities to be underground is done by local governments, and to some degree, cities have contributed to the difference in infrastructure between affluent and low-income neighborhoods.
Like many people in the industry, when I go to a new place, I automatically look up at the condition of the poles. While every place is different, there is a clear trend toward taller and less cluttered poles in more affluent parts of a city. This might be because competition brought more wires to a neighborhood, which meant more make-ready work to upgrade poles. But I’ve spotted many cases where poles in older and poorer neighborhoods are the worst in a community.
It’s easy to find many places where the Dallas DSL story is being replayed with fiber deployment. ISPs of all sizes cherry-pick the neighborhoods that they perceive to have the best earnings potential when they bring fiber to a new market.
We are on the verge of having AI software that can analyze data in new ways. I believe that we’ll find that broadband discrimination against low-income neighborhoods runs a lot deeper than the way we’ve been thinking about it. My guess is that if we map all of the infrastructure related to broadband we’d see firm evidence of the infrastructure differences between poor and more affluent neighborhoods.
I am sure that if we could gather the facts related to the age of the wires, poles, and other infrastructure, we’d find the infrastructure in low-income neighborhoods is significantly older than in other neighborhoods. Upgrades to broadband networks are usually not done in a rip-and-replace fashion but are done by dozens of small repairs and upgrades over time. I also suspect that if you could plot all of the small upgrades done over time to improve networks, you’d find more of these small upgrades, such as replacing cable company power taps and amplifiers, to have been done in more affluent neighborhoods.
We tend to think of broadband infrastructure as the network of wires that brings fast Internet to homes, but modern broadband has grown to be much more than that, and there is a lot of broadband infrastructure that is not aimed at home broadband. Broadband infrastructure has also come to mean small cell sites, smart grid infrastructure, and smart city infrastructure. I believe that if we could map everything related to these broadband investments we’d see more examples of discrimination.
Consider small cell sites. Cellular companies have been building fiber to install small cell sites to beef up cellular networks. I’ve never seen maps of small cell installations, but I would wager that if we mapped all of the new fiber and small cell sites we’d find a bias against low-income neighborhoods.
I hope one day to see an AI-generated map that overlays all of these various technologies against household incomes. My gut tells me that we’d find that low-income neighborhoods will come up short across the board. Low-income neighborhoods will have older wires and older poles. Low-income neighborhoods will have fewer small cell sites. Low-income neighborhoods won’t be the first to get upgraded smart grid technologies. Low-income neighborhoods won’t get the same share of smart city technologies, possibly due to the lack of other infrastructure.
This is the subtle discrimination that the FCC isn’t going to find in their docket because nobody has the proof. I could be wrong, and perhaps I’m just presupposing that low-income neighborhoods get less of every new technology. I hope some smart data guys can find the data to map these various technologies because my gut tells me that I’m right.
Price discrimination has been around for a long time, but I think there is evidence that it’s intensified in recent years. I first noticed price discrimination in the early price wars between the big cable companies and Verizon FiOS. This was the first widespread example of ISPs going head-to-head with decent broadband products where the big differentiator was the price.
I think the first time I heard the term ‘win-back program’ was related to cable companies working hard not to lose customers to Verizon. There are stories in the early days of heavy competition of Comcast keeping customers on the phone for a long time when a customer tried to disconnect service. The cable company would throw all sorts of price incentives to stop customers from leaving to go to Verizon. Over time, the win-back programs grew to be less aggressive, but they are still with us today in markets where cable companies face stiff competition.
I think price competition has gotten a lot more subtle, as witnessed by a recent study in Los Angeles that showed that Charter offers drastically different online prices for different neighborhoods. I’ve been expecting to see this kind of pricing for several years. This is a natural consequence of all of the work that ISPs have done to build profiles of people and neighborhoods. Consumers have always been leery about data gathered about them, and the Charter marketing practices by neighborhood are the natural endgame of having granular data about the residents of LA.
From a purely commercial viewpoint, what Charter is doing makes sense. Companies of all sorts use pricing to reward good existing customers and to lure new customers. Software companies give us a lower price for paying for a year upfront rather than paying monthly. Fast food restaurants, grocery stores, and a wide range of businesses give us rewards for being regular customers.
It’s going to take a whistleblower to disclose what Charter is really doing. But the chances are it has a sophisticated software system that gives a rating for individual customers and neighborhoods based on the likelihood of customers buying broadband or churning to go to somebody else. This software is designed to offer a deeper discount in neighborhoods where price has proven to be an effective technique to keep customers – without offering lower prices everywhere.
I would imagine the smart numbers guy who devised this software had no idea that it would result in blatant discrimination – it’s software that lets Charter maximize revenue by fine-tuning the price according to a computer prediction of what a given customer or neighborhood is willing to pay. There has been a lot of speculation about how ISPs and others would integrate the mounds of our personal data into their businesses, and it looks like it has resulted in finely-tuned price discrimination by city block.
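Purely as a hypothetical sketch, and not a description of any real ISP’s system, the kind of rule being speculated about could be as simple as this: a churn model scores each neighborhood, and a discount is offered only where price is predicted to be the deciding factor.

```python
# Hypothetical sketch of rule-based price discrimination. No real ISP
# system, data, or thresholds are being described - all names and numbers
# here are invented for illustration.
from dataclasses import dataclass

@dataclass
class NeighborhoodProfile:
    churn_risk: float         # modeled probability a household switches ISPs
    price_sensitivity: float  # 0..1, how much price drives that decision

def offered_price(list_price: float, profile: NeighborhoodProfile) -> float:
    """Discount only where churn is likely AND driven by price."""
    if profile.churn_risk > 0.3 and profile.price_sensitivity > 0.5:
        return round(list_price * 0.7, 2)  # aggressive retention discount
    return list_price

competitive = NeighborhoodProfile(churn_risk=0.5, price_sensitivity=0.8)
captive = NeighborhoodProfile(churn_risk=0.1, price_sensitivity=0.8)
print(offered_price(80.0, competitive))  # 56.0 - discounted to retain
print(offered_price(80.0, captive))      # 80.0 - no discount offered
```

The point of the sketch is that nothing in such a rule mentions income, race, or ethnicity, yet the output can still track income block by block, because churn and price sensitivity correlate with who can afford the competition.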
Is There a Fix for Digital Discrimination?
The big news in the broadband industry is that we are in the process of throwing billions of dollars to solve the ultimate case of economic discrimination – the gap between urban and rural broadband infrastructure. The big telcos completely walked away from rural areas as soon as they were deregulated and could do so. The big cable companies never made investments in rural areas due to the higher costs. The difference between urban and rural broadband networks is so stark that we’ve decided to cure decades of economic discrimination by throwing billions of dollars to close the gap.
But nobody has been seriously looking at the more subtle manifestation of the same issue in cities. The FCC is only looking at digital discrimination because it was required by the Infrastructure Act. Does anybody expect that anything will come out of the stories of discrimination? ISPs are going to say that they don’t discriminate. If pinned down, they will say that what looks like discrimination is only the consequence of them making defensible economic decisions and that there was no intention to discriminate.
Most of the discrimination we see in broadband is due to the lack of regulation of ISPs. They are free to chase earnings as their top priority. ISPs have no regulatory mandate to treat everybody the same. The regulators in the country chose to deregulate broadband, and the digital discrimination we see in the market is the direct consequence of that choice. When AT&T was a giant regulated monopoly we required it to charge everybody the same prices and take profits from affluent customers to support infrastructure and prices in low-income neighborhoods and rural places. Regulation wasn’t perfect, but we didn’t have the current gigantic infrastructure and price gaps.
If people decide to respond to this FCC docket, we’ll see more evidence of discrimination based on income. We might even get some smoking gun evidence that some of the discrimination comes from corporate bias based on race and other factors. But discrimination based on income levels is so baked into the ways that corporations act that I can’t imagine that anybody thinks this docket is going to uncover anything we don’t already know.
I can’t imagine that this investigation is going to change anything. The FCC is not going to make big ISPs spend billions to clean up broadband networks in low-income neighborhoods. While Congress is throwing billions at trying to close the rural broadband gap, I think we all understand that wherever the big corporations take the rural grant funding, the infrastructure is not going to be maintained properly, and in twenty years we’ll be having this same conversation all over again. We know what is needed to fix this – regulation that forces ISPs to do the right thing. But I doubt we’ll ever have the political or regulatory will to force the big ISPs to act responsibly.
All of the discussion of the FCC maps lately made me start thinking about broadband connections. I realized that many of my clients are providing a lot of broadband connections that are not being considered by the FCC maps. That led me to think that the old definition of a broadband passing is quickly growing obsolete and that the FCC mapping effort is missing the way that America really uses broadband today.
Let me provide some real-life examples of broadband connections provided by my clients that are not being considered in the FCC mapping:
Broadband connections to farm irrigation systems.
Broadband to oil wells and mining locations.
Broadband to wind turbines and solar farms.
Fiber connections to small cell sites.
Broadband to electric substations. I have several electric company clients that are in the process of extending broadband to a huge number of additional field assets like smart transformers and reclosers.
Broadband to water pumps and other assets that control water and sewer systems.
Broadband to grain elevators, corn dryers, and other locations associated with processing or storing crops.
Broadband for smart-city applications like smart streetlights, smart parking, and smart traffic lights. I’m working with several clients who are extending broadband for these uses.
Broadband to smart billboards and smart road signs.
Broadband for train yards and train switching hubs.
There are many other examples, and this was just a quick list that came to mind.
The various locations described above have one thing in common – most don’t have a 911 street address. As such, these locations are not being considered when trying to determine the national need for broadband.
A lot of these locations are rural in nature – places like grain elevators, mines, oil wells, irrigation systems, wind turbines, and others. In rural areas, these locations are a key part of the economy, and in many places are unserved or underserved.
We are putting a huge amount of national energy into counting the number of homes and businesses that have or don’t have broadband. In doing so, we have deliberately limited the definition of a business to a place with a brick-and-mortar building and a 911 address. But the locations above are often some of the most important parts of the local economy.
I’ve read predictions that say in a few decades there will be far more broadband connections to devices than to people, and that rings true to me. I look around at the multiple devices in my home that use WiFi, and it’s not hard to envision that over time we will connect more and more locations and devices to broadband.
After a decade of talking about the inadequate FCC broadband maps, we finally decided to throw money at the issue and devise new maps. But in the decade it took to move forward, we’ve developed multiple non-traditional uses for broadband, a trend that is likely to expand. If we are really trying to define our national need for broadband, we need to somehow make sure that the locations that drive the economy are connected to broadband. And the only way to do that is to count these locations and put them on the broadband map, so somebody tries to serve them. The current maps are doing a disservice by ignoring the huge number of these non-traditional broadband connections.
There is a lot of public sentiment against placing small cell sites on residential streets. There is a particular fear of broadcasting the higher millimeter-wave frequencies near homes since these frequencies have never been in widespread use before. In the public’s mind, higher frequencies mean a higher danger of health problems related to exposure to radiofrequency emissions. The public’s fears are further stoked when they hear that Switzerland and Belgium are limiting the deployment of millimeter-wave radios until there is better proof that they are safe.
The FCC released a report and order on December 4 that is likely to add fuel to the fire. The agency rejected all claims that there is any public danger from radiofrequency emissions and affirmed the existing frequency exposure rules. The FCC said that none of the thousand filings made in the docket provided any scientific evidence that millimeter-wave and other 5G frequencies are dangerous.
The FCC is right in its assertion that there are no definitive scientific studies linking cellular frequencies to cancer or other health issues. However, the FCC misses the point that most of those asking for caution, including scientists, agree with that. The public has several specific fears about the new frequencies being used:
First is the overall range of new frequencies. In the recent past, the public was widely exposed to relatively low frequencies from radio and TV stations, to a fairly narrow range of cellular frequencies, and to two WiFi bands. The FCC is in the process of approving dozens of new frequency bands that will be widely used where people live and work. The fear is not so much that any given frequency is dangerous, but rather that being bombarded by a large range of frequencies will create unforeseen problems.
People are also concerned that cellular transmitters are moving from tall towers, which have normally been located away from housing, to small cell sites on poles located on residential streets. The fear is that these transmitters generate a lot of radiation close to the transmitter – which is true. The amount of radiated energy that strikes a given area decreases rapidly with distance from a transmitter. The anecdote that I’ve seen repeated on social media is of placing a cell site fifteen feet from the bedroom of a child. I have no idea if there is a real small cell site that is the genesis of this claim – but there could be. In dense urban neighborhoods, there are plenty of streets where telephone poles are within a few feet of homes. I admit that I would be leery about having a small cell site directly outside one of my windows.
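The distance effect is easy to quantify: in free space, power density falls off with the square of the distance from the transmitter. Here is a minimal sketch of that relationship, using a hypothetical 10-watt transmitter and an idealized isotropic model; real small cells use directional antennas and real streets have reflections, so these numbers only illustrate the trend.

```python
import math

def power_density(eirp_watts: float, distance_m: float) -> float:
    """Free-space power density S = EIRP / (4 * pi * d^2), in W/m^2.

    Idealized isotropic model - it only illustrates how exposure
    falls off with distance, not any real small cell deployment.
    """
    return eirp_watts / (4 * math.pi * distance_m ** 2)

# Hypothetical 10-watt transmitter: exposure at 5 meters vs. 20 meters
s_near = power_density(10, 5)
s_far = power_density(10, 20)
print(f"Power density is {s_near / s_far:.0f}x higher at 5 m than at 20 m")
```

Quadrupling the distance cuts the power density by a factor of sixteen, which is why a pole a few feet from a window raises concerns that the same radio a block away does not.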
The public worries when they know that there will always be devices that don’t meet the FCC guidelines. As an example, the Chicago Tribune tested eleven smartphones in August and found that a few of them were issuing radiation at twice the FCC maximum-allowable limit. The public understands that vendors play loose with regulatory rules and that the FCC largely ignores such violations.
The public has no particular reason to trust this FCC. The FCC under Chairman Pai has sided with the large carriers on practically every issue in front of the Commission. This is not to say that the FCC didn’t give this docket the full consideration that should be given to all dockets – but the public perception is that this FCC would side with the cellular carriers even if there was a public health danger.
The FCC order is also not particularly helped by citing the Food and Drug Administration’s buy-in on the safety of radiofrequency emissions. The FDA has licensed dozens of medicines that later proved to be harmful, so that agency also doesn’t garner a lot of public trust.
The FCC made a few changes with this order. They have mandated a new set of warning signs to be posted around transmitters. It’s doubtful that anybody outside of the industry will understand the meaning of the color-coded warnings. The FCC is also seeking comments on whether exposure standards should be changed for frequencies below 100 kHz and above 6 GHz. The agency is also going to exempt certain kinds of transmitters from FCC testing.
I’ve read extensively on both sides of the issue and it’s impossible to know the full story. For example, a majority of scientists in the field signed a petition to the United Nations warning against using higher frequencies without more testing. But it’s also easy to be persuaded by other scientists who say that higher frequencies don’t even penetrate the skin. I’ve not heard of any studies that look at exposing people to a huge range of different low-power frequencies.
This FCC is in a no-win position. The public, not without reason, perceives the agency as pro-carrier, and anything the FCC says is not going to persuade those worried about radiation risks. I tend to side with the likelihood that the radiation is not a big danger, but I also have to wonder if there will be any impact after a tenfold expansion of the range of frequencies we’re exposed to. The fact is that we’re not likely to know until after we’ve all been exposed for a decade.
I don’t know about the rest of you, but I’ve noticed a lot of degradation in the cellular voice network over the last year or two, and the situation is noticeably worsening over time. For a decade or more the cellular network has been a bastion of strength and reliability. I rely heavily on my cellphone all day for work and for years I haven’t given the cellular network a thought because calls worked. Occasionally I’d get a bad voice connection that could be easily remedied by reinitiating a call. But that happened so infrequently that I barely noticed it – it was never something I considered as a problem.
Over the last year, this all changed. I’ve often had a problem making a call and have had to try the same number half a dozen times to make a connection. Calls mysteriously drop mid-call, or even stranger, half of the call goes silent and only one party can be heard. Possibly the worst problem is that there are a lot more calls with poor voice quality – something that I thought was a decade behind us.
I happen to work in a small city and it’s not hard to understand why my cell site would be stressed. Half of the homes in my neighborhood have at least one person working from home, and most spend a lot of time on the phone. Our street is only one block from a busy traffic corridor and is also full of businesses. We also have a significant number of teenagers. I would not be surprised to find that the busy hour on our local cellular network is during the afternoon.
However, this is not just a problem with urban cell sites. I’ve lately been asking others about their cellular calling, and at least half of the people I’ve asked tell me that the quality of the cellular network in their own neighborhood has gotten worse. Many of these folks live in small rural towns.
It’s not hard to understand why this is happening. The cellular companies have embraced ‘unlimited’ data plans, which while not truly unlimited, have encouraged folks to use their cellular data plans. According to Cisco and OpenVault, the amount of data on cellular networks is now doubling every two years – a scorching growth rate that compounds to a 32-fold increase in data usage on the cellular networks in a decade. No network can sustain that kind of traffic growth for very long without first becoming congested and eventually collapsing under the load.
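The compounding arithmetic is worth spelling out, since it shows how quickly a two-year doubling period overwhelms a network:

```python
# Compounding a doubling every two years: total traffic multiplier
# after a given number of years.
def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    """Traffic multiplier after `years` at the stated doubling rate."""
    return 2 ** (years / doubling_period)

for y in (2, 4, 6, 8, 10):
    print(f"after {y:2d} years: {growth_factor(y):4.0f}x")
# A decade of doubling every two years is a 32-fold increase.
```

Five doublings in ten years means a network engineered with comfortable headroom today is badly congested well before the decade is out.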
The cellular companies don’t want to openly talk about this crisis. I guess that the first cellular company to use the word ‘crisis’ will see their stock tank, so none of them are talking about why cellular performance is degrading. Instead, the cellular carriers have taken the tactic of saying that we need to remove barriers to 5G and that we need to win the 5G race – but what they want is to find solutions to fix the 4G networks before they crash.
The cellular companies have a three-pronged approach to fixing the problem. First, they are deploying small cell sites to relieve the pressure on the big cellular towers. One small cell site in my neighborhood would likely eliminate most of the problems I’ve been having, at least for a little while. Unfortunately, in a network where traffic is doubling every two years, this is a temporary solution.
The cellular companies also have been screaming for new mid-range spectrum, because adding spectrum to cell sites and cellphones expands the data capacity at each cell site. Unfortunately, working new spectrum into the cellular networks takes time. The FCC continues to slog through the approval process for new cellular spectrum, with the best example being the mess happening with C-Band spectrum. Even after new spectrum is approved, there is a significant market delay until it has been installed in cell sites and phones.
Finally, the cellular carriers are counting on 5G. There are a few aspects of 5G that will significantly improve cellular service. The most important is frequency slicing, which will right-size the data path to each customer and get rid of today’s practice of providing a full channel to a customer who is doing some minor broadband task. 5G will also allow a customer to be connected to a different cell site if the closest site is full. Finally, the 5G specifications call for a major expansion of the number of customers that can be served simultaneously from a cell site. Unfortunately for the cellular carriers, most of the major 5G improvements are still five years in the future. And as with new spectrum, there will be a market delay with each 5G breakthrough as updates make it into enough smartphones to make a difference.
There is a fourth issue that is a likely component of the degrading cellular networks. It’s likely with expanding broadband needs that the backhaul links to cell sites are overloaded at peak times and under stress. It doesn’t matter if all of the above changes are implemented if the backhaul is inadequate – because poor backhaul will degrade any broadband network. The big cellular carriers have been working furiously to build fiber to cell sites to eliminate leased backhaul. But much of the backhaul to cell sites is still leased, and the lease costs are one of the major expenses for cellular companies. The cellular companies are reluctant to pay a lot more for bandwidth, and so it’s likely that at the busiest times of the day that many backhaul routes are now overloaded.
The cellular companies need all of these fixes just to keep up with cellular demand growth. They need many more small cell sites, more spectrum, 5G upgrades, and robust backhaul. What I find scary is that all of these fixes might not be enough if cellular demand continues to grow at the same torrid pace. I’ve been thinking about buying a landline for my office – something I got rid of 20 years ago – because I don’t know if I can wait for the cellular companies to solve their crisis.
The US Court of Appeals for the D.C. Circuit ruled last week that the FCC had gone too far when it ruled that 5G cell site placement could bypass environmental and historic preservation review. The specific ruling looked at whether the FCC has the authority to bypass these kinds of reviews for sites of religious and cultural importance to federally recognized Indian Tribes. But the ruling has a far larger significance and applies to these kinds of reviews everywhere.
This type of court ruling seemed inevitable because of the brashness of the original FCC order. That order declared that the deployment of 5G is so important that all of the rules in the country applying to the deployment of new infrastructure don’t apply. For courts to buy that argument, they must be convinced that 5G deployment is so important that it is indeed a national emergency.
I think everybody who understands the benefits of 5G understands that it is an important new technology – one that will create huge benefits for the country. But it’s hard to make an argument that 5G deployment is an emergency.
The biggest benefits of 5G are only going to manifest with the introduction of frequency slicing into the cellular network, and that looks to be 3 – 4 years away. The deployments that the cellular carriers are labeling as 5G today are mostly marketing gimmicks, and customers are not yet seeing any of the real benefits of 5G.
I blame the original FCC 5G order on a poorly chosen strategy by the cellular carriers, abetted by the FCC. We are facing a cellular emergency in the country, but it’s a crisis of 4G and not 5G. Our existing 4G network is in serious trouble, and it seems that the cellular carriers don’t want to admit it. Cellular data networks are swamped because customer data usage is now doubling every two years. I have seen big problems in my local AT&T network. There have been many days when it’s hard to make or hold a call – something that never happened before last year.
The explosive growth of cellular traffic is partially the fault of the cellular carriers – it comes as a result of ‘unlimited’ data plans that encourage people to watch video and use cellphone data. It wasn’t that long ago when it cost a lot to buy a data plan that exceeded 1 or 2 gigabytes of usage per month. The average customer with an unlimited plan now uses 6 GB per month, and that number is growing rapidly.
The other cause of the increased demand on cellular networks comes from the industry’s success in convincing everybody to use a smartphone. A recent Pew poll showed that 95% of teens and young adults now have a smartphone. The sheer number of customers is swamping the networks.
There is a path out of the current data crisis for cellular networks. It’s a three-pronged approach that involves building more cell sites, adding more bands of frequency onto cellphones, and finally layering on the frequency slicing capabilities of 5G.
It takes 3 – 5 years to introduce a new frequency into the cellular network. That involves upgrading cell sites, but more importantly, it means building the capability into handsets and then getting the new phones into the hands of enough people to make a difference.
With real 5G benefits still a few years off, the only immediate way to relieve pressure on the cellular network is to add small cell sites. Each small cell site grabs local callers and keeps them off the big tall cell towers. All of the hectic small cell site construction we see is not being done for 5G – it’s being done to take the pressure off the 4G network.
The big cellular companies seem unwilling to admit that their networks are hurting and are in danger of overload – the first company brave enough to say that probably loses customers. Instead, the cellular industry elected to push the 5G narrative as the reason for bypassing the normal way that we build infrastructure. In this case, the courts didn’t buy that 5G is an emergency, and the court is right because 5G isn’t even here yet. If the cellular carriers and the FCC had declared a 4G emergency, I think everybody would have gotten it. We all want our cellphones to work.
The courts are still reviewing the appeal of an issue with potentially even more dire consequences for the cellular carriers. Probably the most important aspect of the FCC’s 5G ruling is that cities have little say about the placement of small cell sites and must also expedite permitting for new small cell sites. That ruling was challenged by numerous cities and is being reviewed by the US Court of Appeals for the Ninth Circuit. That issue also boils down to the question of whether deploying 5G is an emergency. I wonder if it’s too late for the cellular carriers to fess up and admit that the emergency is really for 4G – even appeals court judges would likely understand that.
A lot of my clients make money by selling transport to the big traditional cell sites. Except for a few of them that operate middle-mile networks, the extra money from cell site transport adds a relatively high-margin product into the last-mile network.
Companies are now wondering how hard they should pursue small cell sites. They keep reading about the real-estate grab in the major cities where a number of companies are competing to build small cell enclosures, hoping to attract multiple carriers. They want to understand the size of the potential market for small cells outside of the major metropolitan areas. It’s not an easy question to answer.
The cellular carriers are building small cell sites in larger markets because they have exhausted the capabilities of the traditional large cell sites. The cellular companies have pushed bigger data plans and convinced customers that it’s okay to watch video on cellphones, and now they find that they are running out of bandwidth capacity. The only two immediate fixes are to build additional cell sites (thus, the small cells) or else add more spectrum. They eventually will layer on full 5G capability that will stretch spectrum a lot farther.
There are varying estimates for the eventual market for small cell sites. For example, the CTIA, the lobbying group for the wireless industry, estimates that small cells will grow from 86,000 in 2018 to 800,000 by 2026. The Wall Street analyst firm Cowan estimates 275,000 small cells by the end of 2023.
The big companies that are in the cellular backhaul business are asking the same questions as my clients. Crown Castle is embracing the small cell opportunity and sees it as a big area of future growth. Its competitor American Tower is more cautious and only chases small cell opportunities that have high margins. They caution that the profit opportunity for small cells is a lot less than at big towers. Other companies like Zayo and CenturyLink are pursuing small cells where it makes sense, but neither has yet made this a major part of their growth strategy – they are instead hoping to monetize the opportunity by adding small cells where they already own fiber.
The question that most of my clients want to understand is if the small cell building craze that has hit major metropolitan areas will ever make it out to smaller cities. In general, the big cellular carriers report that the amount of data used on their cell sites is doubling every two years. That’s a huge growth rate that can’t be sustained for very long on any network. But it’s likely that this rate of growth is not the same everywhere, and there are likely many smaller markets where cell sites are still underutilized.
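As a sanity check on that growth claim, here is a quick back-of-envelope sketch in Python (my own illustration, not a figure from the carriers): traffic that doubles every two years is growing roughly 41% per year, and the compounding adds up fast.

```python
def traffic_multiplier(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor for traffic that doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Doubling every two years compounds quickly:
for y in (2, 4, 6):
    print(f"After {y} years: {traffic_multiplier(y):.1f}x today's traffic")
# After six years a cell site must carry 8x today's load.
```

At that pace, even a cell site running at a comfortable quarter of its capacity today would be saturated within about four years – which is why the growth rate matters more than the current utilization.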
Metropolitan cell sites were already using a lot of broadband even before customers started using more data. We know this because the cellular carriers have been buying and using robust data backhaul to urban sites of a gigabit or more in capacity. One good way to judge the potential for small cell sites is to look at the broadband used on existing tall tower sites. If a big tower site is using only a few hundred Mbps of bandwidth, then the cell site is not overloaded and still has room to accommodate broadband growth.
Everybody also wants to understand the revenue potential. The analyst firm Cowan estimates that the revenue opportunity per small cell site will average between $500 and $1,000 per site per month. That seems like a high price outside of metropolitan areas, where fiber is really expensive. I’ve already seen the big cellular carriers pushing for much lower transport rates for the big cell sites, and in smaller markets carriers want to pay less than $1,000 per big tower. It probably takes 5 – 7 small cells to fully replace a large tower, and it’s hard to envision the cellular carriers greatly expanding their backhaul bill unless they have no option.
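To see why the carriers would balk, here is the arithmetic implied by those figures (a rough sketch combining the Cowan per-site rates with the 5 – 7 replacement ratio; these are estimates, not quoted lease prices):

```python
def tower_replacement_cost(small_cells=(5, 7), monthly_rate=(500, 1000)):
    """Monthly backhaul cost range for replacing one large tower with
    5 - 7 small cells at $500 - $1,000 per site per month."""
    low = small_cells[0] * monthly_rate[0]
    high = small_cells[1] * monthly_rate[1]
    return low, high

low, high = tower_replacement_cost()
print(f"${low:,} - ${high:,} per month, vs. under $1,000 for one big tower")
```

The range works out to $2,500 – $7,000 per month – several times the cost of the single big-tower lease being replaced, which is the heart of the carriers’ pricing resistance.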
It’s also becoming clear that both Verizon and AT&T have a strategy of building their own fiber anyplace where the backhaul costs are too high. We’ve already seen each carrier build some fiber in smaller markets in the last few years to avoid high transport cost situations. If both companies continue to be willing to overbuild to avoid transport costs, they have great leverage for negotiating lower, more reasonable transport costs.
As usual, I put pen to paper. If the CTIA is right and there will be 800,000 small cell sites within six years, that would mean a new backhaul cost of almost $5 billion annually for the cellular companies at $500 per site per month. While this is a profitable industry, the carriers are not going to absorb that kind of cost increase unless they have no option. If the 800,000 figure is a good estimate, I predict that within that same 6-year period the cellular carriers will build fiber to a significant percentage of the new sites.
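For anybody who wants to check my pen-and-paper math, the calculation is simply sites times the monthly rate times twelve months:

```python
sites = 800_000        # CTIA's small cell estimate for 2026
monthly_rate = 500     # low end of the Cowan per-site estimate, in dollars
annual_backhaul = sites * monthly_rate * 12
print(f"${annual_backhaul / 1e9:.1f} billion per year")  # -> $4.8 billion per year
```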
Perhaps the most important factor about the small cell business is that it’s local. I have one client in a town of 7,000 that recently saw several small cell sites added. I have clients in much larger communities where the carriers are not currently looking at small cell sites.
The bottom line for me is that anybody that owns fiber probably ought to provide backhaul for small cells on existing fiber routes. I’d be a lot more cautious about building new fiber for small cell sites. If that new fiber doesn’t drive other good revenue opportunities, then it’s probably a much riskier investment than building fiber to the big tower cell sites. It’s also worth understanding the kind of small cell site being constructed. Many small cell sites will continue to be used strictly for cellular service, while others might also support 5G local loops. Every last-mile fiber provider should be leery about providing access to a broadband competitor.
One of the more unusual things ordered by the current FCC was setting a low cap on the local fees that a City can charge to review an application for placing a small cell site. The FCC capped the application fee at $500 for a request covering up to five small cell sites, plus $100 per site after that. The FCC also set a cap of $270 for the annual fee to use the rights-of-way for each small cell site.
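Those caps translate into a simple fee schedule. The sketch below is my own reading of the caps as described above – a batched application costs $500 for the first five sites plus $100 for each additional site, and each site carries a $270 annual rights-of-way fee; the actual FCC order has more nuance than this:

```python
def capped_fees(num_sites: int) -> dict:
    """FCC-capped fees for a batched small cell application, as described above."""
    application = 500 + max(0, num_sites - 5) * 100  # $500 covers the first 5 sites
    annual_row = 270 * num_sites                     # $270/year per site in the ROW
    return {"application_fee": application, "annual_row_fee": annual_row}

print(capped_fees(12))  # -> {'application_fee': 1200, 'annual_row_fee': 3240}
```

So even a dozen-site application nets a city only $1,200 up front – the scale of the numbers a city must weigh against its actual review costs.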
Cities have the option to charge more and can bill a ‘reasonable approximation’ of actual costs, but a City can expect a legal fight from wireless carriers over fees that are much higher than the FCC caps.
It’s worth looking back at the history of the issue. Wireless carriers complained to the FCC that they were being charged exorbitant fees to put equipment on utility poles in the public rights-of-way. The wireless carriers cited examples of having to pay north of $10,000 per small cell site. In most cases, fees have been far smaller than that, but citing the worst examples gave cover to the FCC for capping fees.
However, some of the examples of high fees cited by the carriers were for installations that would not be considered small cells. I’ve seen application requests for hanging devices the size of a refrigerator on poles and for placing large cabinets on the sidewalk under a pole. The FCC acknowledged this in its order and set a size limit on what constitutes a small cell: a device occupying something less than 28 cubic feet.
It’s worth noting that much of the FCC’s small cell order is under appeal. The most controversial issues being challenged are aspects of the order that stripped cities of the ability to set local rules on what can and cannot be hung on poles. The FCC basically said that cellular carriers are free to do what they want anywhere in the public rights-of-way, and cities are arguing that the order violates the long-standing precedent that rights-of-way issues should be decided locally.
Communities all over the country are upset with the idea that they have to allow a small cell site any place the carriers want to put one. There are also active citizens’ groups protesting the implementation of millimeter wave cell sites due to public health concerns. A number of prominent radio scientists from around the world have warned of the potential public health consequences of prolonged exposure to millimeter wave spectrum – similar to the spectrum used in airport scanners, but broadcast continuously from poles in front of homes. There is also a lot of concern that carriers that hang millimeter wave transmitters are going to want aggressive tree trimming to maintain lines-of-sight to homes. Finally, there are concerns about the wild proliferation of devices if multiple wireless providers install devices on the same street.
The cap on local fees has already been implemented, and cities are now obligated to charge the low rates unless they undertake the effort (and the likely legal fight) of setting higher fees. The setting of low fees is the most puzzling aspect of the FCC order. It seems that the FCC has accepted the wireless carriers’ claim that high fees would kill the deployment of 5G small cell sites everywhere.
I live in a city that is probably pretty typical and has an application process and inspectors for a huge range of activities – building inspections, restaurant inspections, electrical and gas installation inspections, and inspections of anything that disturbs a city street surface or is hung in the public rights-of-way. The city takes a strong position in assuring that the public rights-of-way are maintained in a way that provides the best long-term opportunity for the many uses of the rights-of-way. It doesn’t let any utility or entity take steps that make it harder for the next user to gain the needed access.
The $100 fee is to compensate the city for processing the application for access, surveying the site of the requested access, and then inspecting that the wireless carrier really did what it promised and didn’t create unsafe conditions or physical hindrances in the right-of-way. It’s hard to think that $100 will compensate any city for the effort required.

It will be interesting to see how many cities acquiesce to the low FCC rates instead of fighting to implement fair rates. Cities know that fights with carriers can be costly, and they may not be willing to tackle the issue. But they also need to realize that the wireless carriers could pepper their rights-of-way with devices that are likely to hang in place for decades. If they don’t tackle the issue up front, they will have no latitude later to rectify small cell sites that were hung incorrectly or unsafely. I’ve attended hundreds of city council meetings and have always been amazed at the huge number of different issues that local politicians have to deal with. This is just one more issue added to that long list, and it will be understandable if many cities acquiesce to the low fees.
Governor Jerry Brown of California recently vetoed a bill, SB 649, that would have given wireless carriers cheap and easy access to poles. He said the bill was too much in the favor of the wireless companies and that a more balanced solution is needed.
This bill highlights the legislative efforts of the cellular industry and the big telcos working to deploy 5G networks, who want cheap and fast access to poles. There were similar pushes in many state legislatures this past year, including in Texas, Florida and Washington. I think we can expect this to appear in many more state legislatures next year. This is obviously a big priority for the carriers, who reportedly spent tens of millions of dollars lobbying for this in the recent legislative sessions.
It’s not hard to understand why the carriers want a legislative solution, because the alternative is the regulatory path. This is a complicated issue, and the carriers know that if they try to get this through state regulatory commissions it will take a long time and that regulators are likely to produce a balanced solution that the carriers don’t want.
There is one regulatory push on the issue, and the FCC is considering it. The FCC voted in May to begin an investigation of the issues involved. One of the things it is examining is the regulatory impediments at the state and local levels that affect the issue. But the carriers know that the FCC path is a slow one. First, any FCC decision is likely to be challenged in court – a tactic that the carriers themselves often use to slow down the regulatory process. But there is also a big jurisdictional question, because today the states have the authority to override FCC rules concerning pole issues.
The issue is important because it’s at the heart of the hottest area of telecom growth: the deployment of mini-cell sites and the upcoming deployment of the various kinds of 5G. Not only do the carriers need to deploy millions of such connections to implement the networks they are promising to stockholders, but they also will have to build a lot of new fiber to support the new wireless deployments.
It’s easy to sympathize with the carriers. I’ve heard the horror stories of it taking two years to get a wireless attachment approved in some cities, which is an obvious impediment to any sensible deployment plan. But as is typical with these carriers, rather than asking for sensible rule changes that everybody can agree on, they are promoting plans that are heavily lopsided in their favor. They want to deploy wireless devices using a method they call one-touch – which they interpret to mean installing devices on poles and telling the pole owner after it’s done. They also want these connections dirt cheap. And they don’t want to have to be concerned with the safety issues involved in adding boxes and live electric connections into the mix of wires on existing poles.
The issue is interesting from the perspective of small CLECs and fiber overbuilders because small carriers have been yelling for years about the problems associated with getting access to poles – and nobody has been listening. In fact, one of the big proponents of the legislative process is AT&T, which is still fighting Google and others about getting access to AT&T poles. It’s not surprising to see that the proposed new laws favor wireless deployments without necessarily making it any easier for fiber overbuilders.
Since the carriers are throwing a lot of money at this it certainly seems likely that they will win this issue in some states. There are a number of states where the lobbying money of the big carriers has always gotten the carriers what they wanted. But there are plenty of states where this won’t pass, and so we are likely going to end up with a hodgepodge of rules, state by state, on the issue.
I’m not even sure where I stand on the issue. As a consumer I want to see advanced wireless technologies deployed. But as a homeowner I don’t necessarily want to see an ugly proliferation of big boxes on poles everywhere. And I certainly don’t want to see 120-foot poles deployed in my neighborhood and the trees decimated to accommodate line-of-sight wireless connections to homes. And as somebody who mostly works for smaller carriers, I’m naturally biased against anything that benefits the big carriers over everybody else. I don’t know if there is a better indication of how complicated this is than that somebody with my knowledge has mixed feelings about the issue.
The industry is full of hype right now about the impending roll-out of 5G cellular. This is largely driven by the equipment vendors who want to stir up excitement among their stockholders. But not everybody in the industry thinks that there will be a big rush to implement 5G. For example, a group called RAN Research issued a report last year that predicted a slow 5G implementation. They think that 4G will be the dominant wireless technology until at least 2030 and maybe longer.
They cite a number of reasons for this belief. First, 4G isn’t even fully developed yet and the standards and implementation coalition 3GPP plans to continue to develop 4G until at least 2020. There are almost no 4G deployments in the US that fully meet the 4G standards, and RAN Research expects the wireless carriers to continue to make incremental upgrades, as they have always done, to improve cellular along the 4G path.
They also point out that 5G is not intended as a forklift upgrade to 4G, but is instead intended to coexist alongside it. This will allow a comfortable path for the carriers to implement 5G first in the places that most need it, without rushing to upgrade places that don’t. This doesn’t mean that the cellular carriers won’t be claiming 5G deployments sometime in the next few years, much in the way that they started using the name 4G LTE for minor improvements in 3G wireless. It took almost five years after the first marketing rollout of 4G to get to what is now considered 3.5G. We are just now finally seeing 4G that comes close to meeting the full standard.
But the main hurdle that RAN Research sees with a rapid 5G implementation is the cost. Any wireless technology requires a widespread and rapid deployment in order to achieve economy of scale savings. They predict that the cost of producing 5G-capable handsets is going to be a huge impediment to implementation. Very few people are going to be willing to pay a lot more for a 5G handset unless they can see an immediate benefit. And they think that is going to be the big industry hurdle to overcome.
Implementing 5G is going to require a significant expenditure on small, dense cell sites in order to realize the promised quality improvements. It turns out that implementing small cell sites is a lot harder and a lot more expensive than the cellular companies had hoped. It also turns out that the technology will only bring major advantages to the areas with the densest concentrations of customers. That means big city business districts, stadiums, convention centers and hotel districts – but not many other places.
That’s the other side of the economy of scale implementation issue. If 5G is only initially implemented in these dense customer sites, then the vast majority of people will see zero benefit from 5G, since they don’t go to these densely packed areas very often. And so there are going to be two economy of scale issues to overcome – making enough 5G equipment to keep the vendors solvent while also selling enough more-expensive phones to use the new 5G cell sites. And all of this will happen as 5G is rolled out in dribs and drabs, as happened with 4G.
The vendors are touting that software defined networking will lower the cost to implement 5G upgrades. That is likely to become true with the electronics after they are first implemented. It will be much easier to make the tiny incremental 5G improvements to cell sites after they have first been upgraded to 5G capability. But RAN Research thinks it’s that initial deployment that is going to be the hurdle. The wireless carriers are unlikely to rush to implement 5G in suburban and rural America until they see overwhelming demand for it – enough demand that justifies upgrading cell sites and deploying small cell sites.
There are a few trends that are going to affect the 5G deployment. The first is the IoT. The cellular industry is banking on cellular becoming the default way to communicate with IoT devices. Certainly that will be the way to communicate with mobile things like smart cars, but there will be a huge industry struggle against instead using WiFi, including the much-faster indoor millimeter wave radios, for IoT. My first guess is that most IoT users are going to prefer to dump IoT traffic into their landline data pipe rather than buy separate cellular data plans. For now, residential IoT is skewing towards WiFi and towards smart devices like the Amazon Echo that provide a voice interface for the IoT.
Another trend that could help 5G would be some kind of government intervention to make it cheaper and easier to implement small cell sites. There are rule changes being considered at the FCC and in several state legislatures to find ways to speed up implementation of small wireless transmitters. But we know from experience that there is a long way to go after a regulatory rule change until we see change in the real world. It’s been twenty years now since the Telecommunications Act of 1996 required that pole owners make their poles available to fiber overbuilders – and yet the resistance of pole owners is still one of the biggest hurdles to fiber deployment. Changing the rules always sounds like a great idea, but it’s a lot harder to change the mindset and behavior of the electric companies that own most of the poles – the same poles that are going to be needed for 5G deployment.
I think RAN Research’s argument about achieving 5G economy of scale is convincing. Vendor excitement and hype aside, they estimated that it would cost $1,800 today to build a 5G capable handset, and the only way to get that price down would be to make hundreds of millions of 5G capable handsets. And getting enough 5G cell sites built to drive that demand is going to be a significant hurdle in the US.