The Impending Cellular Data Crisis

There is one industry statistic that isn't getting much press – the fact that cellular data usage is more than doubling every two years. You don't have to plot that growth rate very many years into the future to realize that existing cellular networks will be inadequate to handle the demand in just a few years. What's even worse for the cellular industry is that this growth is a nationwide average. Many of my clients tell me there isn't nearly that much growth at rural cellular towers – meaning growth at some urban and suburban towers is likely even faster.
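
Here's a quick back-of-the-envelope sketch of what that growth rate means. This is a minimal Python projection, assuming only the two-year doubling period cited above; the starting volume doesn't matter since the output is a multiplier:

```python
# Project cellular demand growth, assuming traffic doubles every two years.
DOUBLING_PERIOD_YEARS = 2

def traffic_multiplier(years: float) -> float:
    """How many times today's traffic we'd see after `years` years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (2, 4, 6, 8, 10):
    print(f"Year {years}: {traffic_multiplier(years):.0f}x today's traffic")
# Year 2: 2x, Year 4: 4x ... Year 10: 32x. A network sized for today
# is hopelessly undersized within a handful of years.
```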

Much of this growth is a self-inflicted wound by the cellular industry. Carriers have raised monthly data allowances and are often bundling free video with cellular service, thus driving up usage. The public is responding to these changes by using the extra bandwidth made available to them.

There are a few obvious choke points that will be exposed by this kind of growth. Current cellphone technology limits the number of simultaneous connections that can be made at any given tower. As customers watch more video they eat up slots on the cell tower that could otherwise have been used to process numerous short calls and text messages. The other big choke point is going to be the broadband backhaul feeding each cell site. When usage grows this fast it's going to get increasingly expensive to buy leased backbone bandwidth – which explains why Verizon and AT&T are furiously building fiber to cell sites to avoid huge increases in backhaul costs.
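
To put a rough number on the backhaul problem, here's a sketch of busy-hour demand at a single tower. Every input is an assumption chosen for illustration, not carrier data:

```python
# Rough busy-hour backhaul estimate for one cell site (illustrative only).
concurrent_video_streams = 200   # assumed simultaneous video viewers
mbps_per_hd_stream = 5           # approximate bitrate of one HD stream
other_traffic_mbps = 300         # assumed voice, text and web overhead

backhaul_mbps = concurrent_video_streams * mbps_per_hd_stream + other_traffic_mbps
print(f"Busy-hour backhaul needed: ~{backhaul_mbps} Mbps")  # ~1,300 Mbps

# With usage doubling every two years, that 1.3 Gbps becomes roughly
# 5.2 Gbps in four years; painful on leased transport, cheap on owned fiber.
```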

5G will fix some, but not all, of these issues. The growth is so explosive that cellular companies need to use every technique possible to make cell towers more efficient. Probably the best fix is to use more spectrum. Adding a new band of spectrum to a cell site immediately adds capacity. However, this can't happen overnight. New spectrum is only useful once customers can use it, and it takes a number of years to modify cell sites and cellphones to work on a new band. The need to meet growing demand is the primary reason the CTIA recently told the FCC that the industry needs an eye-popping 400 MHz of new mid-range spectrum for cellular use. The industry painted that as being needed for 5G, but it's needed now for 4G LTE.

Another fix for cell sites is to use existing frequencies more efficiently. The most promising way to do this is with MIMO antenna arrays – a technology that uses multiple antennas at the tower and in the handset to send parallel data streams over the same spectrum, combining them into a larger data pipe. MIMO can make it easier to respond to a request from a large bandwidth user – but it doesn't relieve the overall pressure on a cell tower. If anything, it might do the exact opposite and let cell towers prioritize those who want to watch video over smaller users, who might then be blocked from making voice calls or sending text messages. MIMO is also not an immediate fix and needs to work through the same cycle of getting the technology into cellphones.
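
The arithmetic behind the bigger pipe is worth a quick look. This is an idealized Shannon-capacity sketch, with the 20 MHz carrier and the roughly 20 dB signal-to-noise ratio as assumptions; real-world MIMO gains are smaller:

```python
import math

# Idealized capacity of N parallel MIMO streams sharing the SAME spectrum.
BANDWIDTH_MHZ = 20    # one LTE carrier (assumed)
SNR_LINEAR = 100      # about 20 dB (assumed)

def capacity_mbps(streams: int) -> float:
    """Shannon capacity, in Mbps, with `streams` spatial streams."""
    return streams * BANDWIDTH_MHZ * math.log2(1 + SNR_LINEAR)

for n in (1, 2, 4):
    print(f"{n}x{n} MIMO: ~{capacity_mbps(n):.0f} Mbps")
# 1 stream: ~133 Mbps, 2: ~266 Mbps, 4: ~533 Mbps. The gain goes to the
# user holding the streams; the tower's total spectrum is unchanged.
```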

The last strategy is what the industry calls densification – adding more cell sites. This is the driving force behind placing small cell sites on poles in areas with big cellular demand. However, densification might create as many problems as it solves. Most of the current frequencies used for cellular service travel a decent distance, and placing cell sites too close together creates a lot of interference and noise between neighboring sites. While a new cell site adds local capacity, it also decreases the efficiency of all nearby cell sites using traditional spectrum – the overall improvement from densification is going to be a lot less than might be expected. The worst thing about this is that interference is hard to predict and is very much a local issue. This is the primary reason the cellular companies are interested in millimeter wave spectrum for cellular – the spectrum travels a short distance and won't interfere as much between cell sites placed close together.
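
The interference problem can be sketched with simple propagation math. This uses the standard free-space path loss formula with illustrative distances (real-world propagation is messier), and shows how the signal-to-interference ratio collapses as same-channel sites move closer together:

```python
import math

def rx_power_dbm(tx_dbm: float, d_km: float, f_mhz: float) -> float:
    """Received power using free-space path loss:
    FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.45"""
    return tx_dbm - (20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.45)

F_MHZ = 700                                 # a long-reach cellular band
signal = rx_power_dbm(46, 0.5, F_MHZ)       # serving site 0.5 km away
for neighbor_km in (5, 2, 1):               # neighbor on the same channel
    interference = rx_power_dbm(46, neighbor_km, F_MHZ)
    print(f"Neighbor at {neighbor_km} km: S/I = {signal - interference:.0f} dB")
# 5 km: 20 dB, 2 km: 12 dB, 1 km: 6 dB. Every halving of site spacing
# costs 6 dB, eating into the capacity that densification was meant to add.
```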

5G will fix some of these issues. The ability of 5G to do network slicing means that a cell site can provide just enough bandwidth for every user – a tiny slice for a text message or IoT signal and a big pipe for a video stream. 5G will vastly expand the number of simultaneous users that can share a single cell site.
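
A toy example shows the arithmetic behind slicing. This is not the 3GPP mechanism, just an illustration of right-sizing connections, and the per-service allocations are assumptions:

```python
# Toy illustration of slicing: give each connection only what it needs.
SLICE_PROFILES_MBPS = {   # assumed per-connection allocations
    "iot_sensor": 0.01,
    "text": 0.1,
    "voice": 0.1,
    "video_stream": 5.0,
}

def users_supported(cell_capacity_mbps: float, service: str) -> int:
    return int(cell_capacity_mbps / SLICE_PROFILES_MBPS[service])

for service in SLICE_PROFILES_MBPS:
    print(f"{service}: {users_supported(1000, service):,} on a 1 Gbps site")
# iot_sensor: 100,000, text/voice: 10,000 each, video_stream: 200.
# Right-sizing is what lets one site juggle thousands of tiny connections.
```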

However, 5G doesn't provide any advantage over 4G in terms of the total amount of backhaul bandwidth needed to feed a cell site. That means a 5G cell site will be equally overwhelmed if people demand more bandwidth than the site has to offer.

The cellular industry has a lot of problems to solve in a relatively short period of time. I expect that in the middle of the much-touted 5G roll-out we are going to start seeing some spectacular failures in cellular networks at peak times. I feel sympathy for cellular engineers, because it's nearly impossible to keep a network ready to handle data usage that doubles every two years. Even if engineers figure out strategies to handle five or ten times today's usage, within a few years demand will catch up to those fixes.
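
That last claim is easy to quantify. If demand doubles every two years, the useful life of any capacity upgrade is just a logarithm:

```python
import math

# How long a capacity upgrade lasts when demand doubles every two years:
# solve 2^(t / 2) = headroom for t.
def years_until_full(headroom: float, doubling_years: float = 2) -> float:
    return doubling_years * math.log2(headroom)

for headroom in (2, 5, 10):
    print(f"A {headroom}x upgrade lasts ~{years_until_full(headroom):.1f} years")
# 2x lasts 2 years, 5x about 4.6 years, 10x only about 6.6 years.
```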

I've never believed that cellular broadband can be a substitute for landline broadband. Every time somebody at the FCC or a politician declares that the future is wireless I roll my eyes, because anybody who understands networks and the physics of spectrum can easily demonstrate that there are major limitations on the total bandwidth capacity at a given cell site, along with a limit on how densely cell sites can be packed in an area. Cellular networks carry only about 5% of the total broadband traffic in the country, and it's ludicrous to think they could be expanded to carry most of it.

The Slow Deployment of 5G

Somebody asked me a few days ago why I write so much about 5G. My response is that I am intrigued by the 5G hype. The major players in the industry are devoting big dollars to promoting a technology that is still mostly vaporware. The most interesting thing about 5G is how politicians, regulators and the public have bought into the hype. I've never seen anything like it. I can remember other times when the world was abuzz over a new technology, but that was usually a reaction to an actual product you could buy, like the first laptop computers, the first iPhone and the first iPod.

Anybody who understands our industry knows that it takes a number of years to roll out any major new technology, particularly a wireless technology, since wireless behaves differently in the field than in the lab. We're only a year past the release of the 5G standards, and it's unrealistic to think those standards could be translated into operational hardware and software in such a short time. You only have to look back at the history of 4G, which started just as slowly and which finally produced the first fully-compliant 4G cell site late last year. It's going to take just as long until we see a fully functional 5G cell site. What we will see, over time, is the incremental introduction of some aspects of 5G as they get translated from lab to field. That rollout is further complicated for cellular use by the time needed to get 5G-ready handsets into peoples' hands.

This blog was prompted by a Verizon announcement that 5G mobile services will be coming to 30 cities later this year. Of course, the announcement was short on details, because those details would probably be embarrassing for Verizon. I expect the company will introduce a few aspects of 5G into the cell sites in the business districts of major cities and claim that as a 5G roll-out.

What does a roll-out this year mean for cellular customers? There are not yet any 5G-capable cellphones. Both AT&T and Verizon have been working with Samsung to introduce a 5G version of the S10 phone later this year. Verizon is also reported to be working with Lenovo on a 5G modular upgrade later this year. I'm guessing these phones will come with a premium price tag for the early adopters willing to pay for 5G bragging rights. These phones will only get 5G from the handful of cell sites with 5G gear – and then only for a tiny subset of the 5G specifications. I remember when one of my friends bought one of the first 4G phones and crowed about how well it worked in downtown DC. At the time I told him his great performance was because he was probably the only guy using 4G – and sure enough, his performance dropped as others joined the new technology.

On the same day that I saw the Verizon announcement I also saw a prediction by Cisco that only 3% of cellular connections will occur over a 5G network by the end of 2022. This might be the best statistic I've seen for popping the 5G hype. Even for folks buying the early 5G phones, there will be a dearth of 5G-capable cell sites around the country for a number of years. Anybody who understands the lifecycle of cellular upgrades will agree with the Cisco timeline. It takes years to work through the cycle of upgrading cell sites, upgrading handsets and then getting those handsets into the public's hands.

The same is true for the other technologies that are also being called 5G. Verizon made a huge splash just a few months ago about introducing 5G broadband using millimeter wave spectrum in four cities. Even at the time of that announcement it was clear that those radios were not using the 5G standard, and Verizon quietly announced recently that it was ceasing those deployments while it waits for actual 5G technology. Those deployments were really a beta test of millimeter wave radios, not the start of a rapid nationwide deployment of 5G broadband from poles.

AT&T had an even more ludicrous announcement at the end of 2018, announcing 5G broadband that involved deploying WiFi hotspots supposedly fed by 5G. However, this was a true phantom product with no pricing that nobody could order. And since no AT&T cell sites have been upgraded to 5G, one had to wonder how this involved any 5G technology at all. It's clear this was a roll-out by press release only, done so the company could claim bragging rights for being the first to have 5G.

The final announcement I saw on that same day was from T-Mobile, saying it would begin deploying early 5G in cell sites in 2020. But the real news is that the company isn't planning to charge more for 5G speeds or features.

I come back to my original question of why I write about 5G so often. A lot of my clients ask me if they should be worried about 5G, and I don't have an answer for them. I can see that actual 5G technology is going to take a lot longer to come to market than the big carriers would have you believe. But I look at T-Mobile's announcement on price and I have to wonder what the cellular companies will really do once 5G works. Will AT&T and Verizon spend billions to put 5G small cells in residential neighborhoods if it doesn't drive any new cellular revenue? I have to admit that I'm skeptical – we're going to have to watch what the carriers do rather than listen to what they say.

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures with its fiber network. The company is giving customers two months of free service and sending them back to the incumbent ISPs in the city. Google Fiber used a construction technique called micro-trenching, cutting a tiny slit in the road, about an inch wide and a few inches deep, to carry the fiber. Only a year after construction the fiber is popping out of the micro-trenches all over the city.

Everybody I've talked to is guessing that it's a simple case of ice heaving. While a micro-trench is sealed, it's likely that small amounts of moisture seep in and freeze when it gets cold. The first freeze would create tiny cracks, and with each subsequent freeze the cracks would get a little larger until the trench finally fills with water, fully freezes and ejects the fill material. The only way to stop this would be a permanent seal that never lets in moisture. That sounds like a tall task in a city like Louisville, which might freeze and thaw practically every night during the winter.

Nobody other than AT&T and Charter can be happy about this. The reason Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block Google Fiber from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google's favor, while the Charter suit is still in court. Perhaps Google Fiber should have just waited out the lawsuits – but there was business pressure to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson is not to launch a new network using an untested construction technique. In this case, the micro-trenches didn't just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix would be to rebuild the network from scratch, which makes no financial sense.

Certainly the whole industry is now going to be extremely leery of micro-trenching, but there is a larger lesson to be learned here. For example, I've heard from several small ISPs who are ready to leap into the 5G game and build networks using millimeter wave radios installed on poles. That is every bit as new and untested as micro-trenching was. I'm not predicting that anybody pursuing that business plan will fail – but I can assuredly promise that they will run into unanticipated problems.

Over my career, I can't think of a single example where an ISP that took a chance on a cutting-edge technology didn't have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. For example, I can remember half a dozen companies that tried to deploy broadband networks using LMDS spectrum. In one case the radios literally never worked and the venture lost its $2 million investment. In several others the radios had glitches that caused major customer outages and were largely a market disaster.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generations of other technologies including DSL, WiFi mesh networks, PON fiber-to-the-home and IPTV. The companies that pioneered these technologies had costly, and sometimes business-ending, problems. So perhaps the lesson is that pioneers pay a price. I'm sure this failure will result in changing or abandoning the micro-trenching technique. Perhaps we'll learn not to use micro-trenches in certain climates. Or perhaps somebody will find a way to seal micro-trenches against moisture. But none of those future solutions will make up for Google Fiber's spectacular failure.

The real victims of this situation are the households in Louisville that had switched to Google Fiber – and everybody else in the city. Because of Google Fiber's lower prices, both Charter and AT&T lowered prices everywhere in the market. You can bet it's not going to take long to get the market back to full prices. Customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no real incentive to give them a low-price deal. Taken as a whole, households in the city are going to be spending $10 or $20 more per month for broadband – a significant penalty on the local economy.

AT&T’s 5G Strategy

AT&T recently described its long-term 5G strategy using what it calls the three pillars of 5G – the three areas where the company is putting its 5G focus. The first pillar is a concentration on 5G cellular, with the goal of launching a 5G-based cellular service in some cities in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. The admission that there won't be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company's 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to launch two phones later this year with some 5G capability. As always with a new generation of wireless technology, the bottleneck will be handsets. The cellphone makers can't just build generic 5G phones – they have to work with the carriers to support the specific subset of 5G features each carrier releases. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will become obsolete quickly. A 5G phone sold in late 2019 probably won't include all of the 5G features on the market by late 2020 – and this is likely to be true for the next three or four years as the carriers roll out incremental 5G improvements. It's also a gamble for customers, because anybody who buys an early 5G cellphone gets early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and wait a few years until the 5G dust settles.

AT&T's second pillar is fixed wireless. This one is a head-scratcher, because the company is talking about the fixed cellular product it has already been selling for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. It is not the same as Verizon's product that delivers hundreds of megabits per second, but instead delivers speeds up to 50 Mbps depending upon how far a customer lives from a cell tower – with reports that most households are getting 15 Mbps at best. This is the product AT&T is mostly using to satisfy its CAF II requirements in rural America. None of the engineers I've talked to think that 5G is going to materially improve this product.

The final pillar of AT&T's strategy is edge computing. What AT&T means by this is putting fast processors at customer sites where there is a need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud, and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn't require 5G, and AT&T has already been deploying edge routers. However, 5G will enhance the capability at customer sites that need to connect a huge number of devices simultaneously – it can make it easier to connect a hospital full of IoT devices or 50,000 cellphones in a stadium. The bottom line is that the migration to edge computing is not a 5G issue and applies equally to AT&T's fiber customers.
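
A simple example shows why shipping everything to the cloud breaks down. All the numbers here are assumptions for illustration:

```python
# Cloud vs. edge for a latency-sensitive, high-bandwidth application.
payload_gb = 1.0        # assumed data batch to process each cycle
uplink_mbps = 100       # assumed backhaul share for this customer
cloud_rtt_ms = 40       # assumed round trip to a distant data center
edge_rtt_ms = 2         # assumed round trip to an on-site processor

transfer_seconds = payload_gb * 8_000 / uplink_mbps  # GB -> megabits / Mbps
print(f"Cloud: {transfer_seconds:.0f} s transfer + {cloud_rtt_ms} ms per exchange")
print(f"Edge:  no transfer + {edge_rtt_ms} ms per exchange")
# 80 seconds just to move each batch upstream: the processing has to
# happen at the edge no matter which G the radio network is running.
```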

There is really nothing new in the three-pillar announcement, and AT&T has been talking about all three applications for some time – but the announcement does highlight the company's focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn't repeated that recently, it fits his narrative of seeing millimeter wave deployments in a three-to-five-year time frame. The company recently released a new policy paper on its AirGig product, saying the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots signals along power lines and somehow uses the power lines to maintain the focus of the signal. Perhaps the company sees a future path for using AirGig as the backhaul for 5G fixed wireless deployments.

ISPs Are Violating the Old Net Neutrality Rules

It's been just over a year since the FCC repealed net neutrality. The FCC's decision is being appealed, and oral arguments are underway as I write this blog. One would assume that until the appeal is finished the big ISPs will be on their best behavior. Even so, the press has covered a number of ISP actions during the last year that would have violated net neutrality if the old rules were still in place.

It’s not surprising that the cellular carriers were the first ones to violate the old net neutrality rules. This is the most competitive part of the industry and the cellular carriers are not going to miss any opportunity to gain a marketing edge.

AT&T is openly advertising that cellular customers can stream the company's DirecTV Now product without it counting against monthly data caps. Meanwhile, all of the competing video services like Sling TV, PlayStation Vue, YouTube TV, Netflix and Amazon Prime count against AT&T data caps – and video can quickly kill a monthly download allotment. AT&T's behavior is almost a textbook example of why net neutrality rules were put into place – to stop ISPs from putting competitors' products at an automatic disadvantage. AT&T is the biggest cellular provider in the country, and this creates a huge advantage for DirecTV Now. All of the major cellular carriers allow some video to not count against the monthly data cap, but AT&T is the only one pushing its own video product.

In November a large study of 100,000 cellphone users by Northeastern University and the University of Massachusetts showed that Sprint was throttling Skype. This is not something that the carrier announced, but it’s a clear case of pushing web traffic to the ‘Internet slow lane’. We can only speculate why Sprint would do this, but regardless of their motivation this is clearly a violation of net neutrality.

The same study showed numerous incidents where all of the major cellular carriers throttled video services at times. YouTube was the number one target of throttling, followed by Netflix, Amazon Prime, and the NBC Sports app. This throttling wasn't as widespread as Sprint's throttling of Skype, but the carriers must have algorithms in their networks that throttle specific video traffic when cell sites get busy. In contrast to the big carriers, the smaller independent cellular carrier C Spire showed almost no instances of differentiation among video streams.

Practices that might violate net neutrality are not limited to the cellular carriers. For example, Verizon FiOS recently began giving new broadband customers free Netflix for a year. AT&T also started giving free HBO to new customers last year. This practice is more subtle than blocking or throttling content. One of the purposes of net neutrality was to keep ISPs from favoring some web traffic over the rest. By giving away free video services, the landline broadband companies are promoting specific web services over their competitors.

This doesn’t sound harmful, but the discussions in the net neutrality order warned about a future where the biggest ISPs would partner with a handful of big web services like Facebook or Netflix to the detriment of all smaller and start-up web services. A new video service will have a much harder time gaining customers if the biggest ISPs are giving away their competitors for free.

There are probably more bad practices going on that we don’t know about. We wouldn’t have known about the cellular throttling of services without the big study. A lot of discrimination can be done through the network routing practices of the ISPs, which are hard to prove. For example, I’ve been seeing a growing number of complaints from consumers recently who are having trouble with streaming video services. If you recall, net neutrality first gained traction when it became known that the big ISPs like Comcast were blatantly interfering with Netflix streaming. There is nothing today to stop the big ISPs from implementing network practices that degrade certain kinds of traffic. There is also nothing stopping them from demanding payments from web services like Netflix so that their product is delivered cleanly.

Interestingly, most of the big ISPs made a public pledge not to violate the spirit of net neutrality even if the rules were abolished. That seems to be a hollow promise meant to soothe a public worried about the end of net neutrality. The FCC implemented net neutrality to protect the open Internet. The biggest ISPs have virtual monopolies in most markets, and public opinion is rarely going to change an ISP's behavior if the ISP decides that the monetary gain is worth the public unhappiness. Broadband customers don't have a lot of options to change providers, and cable broadband is becoming a near-monopoly in urban areas. There is no way for a consumer to avoid the bad practices of the cellular companies if they all engage in the same bad practices.

There is at least some chance that the courts will overturn the FCC repeal of net neutrality, though that seems unlikely to me. If the ISPs win in court and start blocking traffic and discriminating against web traffic, it seems likely that some future FCC or Congress will reinstitute net neutrality and start the fight all over again. Regardless of the court's decision, I think we are a long way from hearing the last of net neutrality.

What’s the Future for Big Towers?

Late last year AT&T announced that it had contracted for the construction of hundreds of new big cellular towers through Tillman Infrastructure. AT&T and Verizon jointly struck a deal with Tillman in 2017, and by late last year some of the new towers came online. This doesn't sound like big news, because towers are built every year – but these new towers were built to directly compete with and replace existing big towers. AT&T's announcement was a warning to existing tower owners: lower your prices or we'll bypass you.

You can't blame AT&T and Verizon for this, because they pay some of the highest prices in telecom to hang radios on big towers and to bring bandwidth to them. To a large degree this is a problem of their own making, and the history of big towers is a great example of economics gone awry.

When the two companies first got into the cellular business they mostly built their own towers. There were some tall towers in existence – some supporting public safety radio networks and many more that were part of the AT&T, MCI, and Verizon microwave backbone networks. You might remember the towers with the big horn antennas. When AT&T Long Lines started to replace microwave backhaul with fiber in the 1980s it sold the whole tower network to a newly formed company, American Tower. American Tower went on to remove the big horn antennas and leased space on these towers back to AT&T and Verizon for cellular use.

Within a few years, both big cellular carriers agreed to lease towers almost everywhere from American Tower and a few other big tower companies. At the time, both AT&T and Verizon were throwing off huge cash from the rapidly growing cellular business, and both decided to avoid the capital cost of building towers, letting others invest in the key infrastructure component of cellular networks. Both carriers made similar choices about letting others construct the fiber needed to connect their cell sites. That decision to avoid capital costs turns out to have been a giant mistake in the long run.

Today, cellular companies are feeling huge pressure from competition as the prices of cellular plans have tumbled. Had the big carriers decided years ago to own their key infrastructure – towers and fiber – they would have minimal costs for operating these assets today. Instead, they are paying ever-escalating prices for tower space and fiber transport.

AT&T is now demanding big reductions in tower rental prices. Building the new towers is an obvious signal that the company is willing to bypass anybody who won't cut prices. A few hundred new towers is barely a blip in the tower market, but the AT&T message is clear. Last year Verizon used the same tactic to pressure fiber providers to lower transport costs – at the risk of Verizon building fiber to its towers and bypassing existing fiber.

All of this is happening at a time when we're also seeing the proliferation of small cell sites. When I look at the architecture of cellular networks, a significant number of tall towers could be replaced with networks of small cell sites. The cellular network today is really two separate networks. There is the network built to carry cellular traffic along major highways – you see these towers every few exits along every interstate. Those towers are not likely to go away; in fact, the tall towers are needed to provide coverage across long stretches of highway.

But there are a lot of cellular towers that have been built to serve where people live and work. There has been a long-standing unease in many communities about having the big towers in somebody’s back yard. Over time the cellular companies can make many of these towers obsolete as the smaller cell sites take over. (Of course, there is also now unease about having a lot of smaller towers in neighborhoods).

The big tower companies understand this transition. American Tower is leading the way in acquiring pole rights and is building electronics vaults along city streets to support small cell sites for 5G. Like other parts of the telecom market, the cell tower segment is facing big changes. Just five years ago the big cellular carriers, the tower companies, and the fiber transport companies were all making big money from the cellular market. Today, all are feeling the pinch from the advent of cellular price competition. It's going to be interesting to see if AT&T and Verizon make the same choice all over again and lease small cell sites rather than build their own.

Broadening the USF Funding Base

The funding mechanism that pays for the Universal Service Fund is broken. The USF is funded from fees added to landline telephones, cellphones and the large business data connections that are still billed using telco special access products (T1s and larger circuits). The USF fee has now climbed to an exorbitant 20% of the portion of those services that is deemed to be Interstate by the FCC. This equates to a monthly fee of a dollar or more for every landline phone and cellphone (the amount charged varies by carrier).
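
Here's how that line item lands on a bill. The 20% contribution factor comes from the paragraph above; the plan price and the interstate share are assumptions, which is exactly why the fee varies by carrier:

```python
# Sketch of the USF line item on a phone bill (illustrative inputs).
monthly_plan = 30.00       # assumed price of the phone service
interstate_share = 0.25    # assumed portion deemed Interstate by the FCC
usf_factor = 0.20          # the current contribution factor

usf_fee = monthly_plan * interstate_share * usf_factor
print(f"USF fee: ${usf_fee:.2f} per month")   # $1.50 on these assumptions
```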

The funding mechanism made sense when it was originally created. The fee at that time was assessed on landlines and was used to build and strengthen landline service in rural America. When the USF fee was introduced, the penetration rate of landlines in urban America was over 98%, and the reasoning was that those with phone service ought to pay a small fee to help bring phone service to rural America. The concept behind universal service is that everybody in the country is better off when we're all connected to the communications network.

However, over time the use of the Universal Service Fund has changed drastically, and this money is now the primary mechanism the FCC uses to pay for the expansion of rural broadband. This pot of money funded the original CAF II programs for the big telcos and the A-CAM program for the smaller ones. It's also the source of the Mobility Fund, which is used to expand rural cellular coverage.

Remember the BDAC? That's the Broadband Deployment Advisory Committee created by Chairman Ajit Pai when he first took the reins at the FCC. The BDAC was split into numerous subcommittees that looked at specific topics. Each subcommittee issued a report of recommendations on its topic, and since then little has been heard from them. But the BDAC subcommittees are still meeting and churning out recommendations.

The BDAC subcommittee tasked with creating a State Model Code has suggested broadening the funding base for the USF. This is the one subcommittee that is not making recommendations to the FCC but is instead suggesting ideas that states ought to consider. The subcommittee has suggested that states establish a fee similar to the federal USF fee and use it to expand broadband in each state. Many states have already done something similar and have created state universal service funds.

The recommendation further suggests that states tax anybody that benefits from broadband. This would include not just ISPs and their customers, but also the big users of the web like Netflix, Google, Amazon and Facebook. The reasoning is that those who benefit from broadband ought to help pay to expand broadband to everybody. The recommended language has been modified a few times because the original wording was so broad that almost everybody in the country would have been subject to the tax, and we've learned over the years that taxation language needs to be precise.

This is not the first time this idea has been floated. Many have suggested to the FCC in the past that USF funding should be expanded to include broadband customers. Just as telephone customers were charged to fund the expansion of the telephone network, it makes sense to tax broadband customers to expand broadband. But this idea has always been shot down, because early in the life of the Internet the politicians in DC latched onto the idea of not taxing it. That made sense at the time, when the fledgling ISP industry needed protection – but the concept is now quaintly obsolete, since Internet-related companies are collectively perhaps the world's biggest industry and hardly need shielding from taxation.

AT&T is a member of this BDAC subcommittee and strongly supports the idea. However, AT&T's motivation is suspect, since it might be the biggest recipient of state USF funds. We saw AT&T lobbyists hijack the state broadband grant program in California and grab all of the money that would have been used to build real rural broadband in the state. The big carriers have an outsized influence in statehouses due to decades of lobbying, so there is a concern that they support this idea for their own gain rather than to spread broadband. We just saw AT&T lobbyists at the federal level sneak in language that makes it hard for the e-Connectivity grants to be used to compete with them.

But no matter how tainted the motivations of those on the BDAC subcommittee, this is an idea with merit. It's hard to find politicians anywhere who don't think we should close the broadband gap, and it's clear that it's going to take some government support to make that happen. Currently there are a number of state broadband grant programs, but they generally rely on annual allocations from the legislature – something that is always used as a bargaining chip against other legislative priorities. None of these grant programs have allocated enough money to make a real dent in the broadband shortfalls in their states. If states are going to help solve the broadband gap they need to come up with a lot more money.

Setting up state USF funds with a broad funding base is one way to help close the rural broadband divide. This needs to be done in such a way that the money is used to build the fiber infrastructure needed to guarantee broadband for the rest of the century – such funds will be worthless if the money is instead siphoned into the pockets of the big telcos. It makes sense to assess the fees on a wider base, and I can't see any reasonable objection to charging broadband customers as well as big broadband-reliant companies like Netflix, Google, Amazon and Facebook. The first state to try this will get a fight from those companies, but hopefully the idea of equity will win, since it's traffic from these companies that is driving the need for better broadband infrastructure.

The End of Satellite TV?

DirecTV launched its most recent satellite in May of 2015. The company has launched 16 satellites in its history and, with twelve remaining in service, is the largest commercial satellite company in the world. AT&T, the owner of DirecTV, announced at the end of last year that there will be no future satellite launches. Satellites don't last forever, and that announcement marks the beginning of the death of DirecTV. The satellites launched before 2000 are now defunct, and the satellites launched after that will start going dark over time.

AT&T is instead going to concentrate on video service delivered over the web. The company is now pushing customers to subscribe to DirecTV Now or WatchTV rather than the satellite service. We've already seen evidence of this shift: DirecTV was down to 19.6 million customers, having lost a net of 883,000 customers in the first three quarters of 2018. The other satellite company, Dish Networks, lost 744,000 customers in the same nine-month period.

DirecTV is still the second largest cable provider, now 2.5 million customers smaller than Comcast but 3 million customers larger than Charter. It can lose a few million customers per year and still remain a major cable provider for a long time.

In much of rural America the two satellite companies are the only TV option for millions of households. Homes without good broadband don't have the option of going online for video. I was at a meeting with rural folks last week who described their painful attempts to watch even a single SD-quality Netflix stream.

For many years the satellite providers competed on price and were able to keep prices low since they didn't have to maintain a landline network and the associated technician fleet. However, both providers look to have abandoned that philosophy. DirecTV just announced rate increases ranging from $3 to $8 per month for various packages, and also raised the price for regional sports networks by $1. Dish just announced rate increases averaging $6 per month for its packages. These are the two largest rate increases in the history of these companies and will shrink the gap between satellite and terrestrial cable prices.

These increases will also make it easier for rural cable providers to compete, since many of them have tried to keep rates within a reasonable range of the satellite providers.

In the long run, losing the satellite option will create even more change in a fast-changing industry. For years the satellite companies have been the biggest competitors of the big cable companies – and they don't just serve rural America. I recently did a survey in a community of 20,000 where almost half of the households use satellite TV. As the satellite companies shed subscribers, some of those households will revert to traditional cable providers, and the recent price increases ought to accelerate that shift.

Nobody has a crystal ball for the cable industry. Just a year ago there seemed to be an industry-wide consensus that we were going to see a rapid acceleration of cord cutting. While cord cutting gets a lot of headlines, it hasn't yet reached the magnitude of change we saw when households dropped telephone landlines. Surprisingly, even after nearly a decade of landline losses, around 40% of homes still have a landline. Will we see the same pattern with traditional cable TV, or will the providers push customers online?

Recently I've seen a spate of articles pointing out that buying online programming is becoming as expensive as sticking with a cable company, and if that becomes the public perception we might see a slowdown in the pace of cord cutting. It's possible that traditional cable will be around for a long time. The satellite companies lost money for many years, mostly due to low prices. It's possible that after a few more big rate increases these companies might become profitable and reconsider their future.

Windstream Turns Focus to Wireless

Windstream CEO Tony Thomas recently told investors that the company plans to stress wireless technology over copper going into the future. The company has been using point-to-point wireless to serve large businesses for several years. More recently it has been using fixed point-to-multipoint wireless technology to satisfy some of its CAF II build-out requirements.

Thomas says the fixed wireless technology blows away what can be provided over old copper plant with DSL. In places with flat and open terrain like Iowa and Nebraska, the company is seeing rural residential broadband speeds as fast as 100 Mbps with wireless – far faster than can be achieved with DSL.

Thomas also said the company is interested in fixed 5G deployments, similar to what Verizon is now starting to deploy – putting 5G transmitters on poles to serve nearby homes. He says the company is interested in the technology in places where it is 'fiber rich'. While Windstream serves a lot of extremely rural locations, it also serves a significant number of towns and small cities in its incumbent service areas that might be good candidates for 5G.

The emphasis on wireless deployments puts Windstream on the same trajectory as AT&T, which has made clear to the FCC numerous times that it would like to tear down rural copper wherever it can and serve customers with wireless. AT&T's approach differs in that it will use its licensed cellular spectrum and 4G LTE in rural markets, while Windstream would use unlicensed spectrum like the various WISPs.

This leads me to wonder if Windstream will join the list of big telcos that largely ignore their existing copper plant going forward. Verizon has done its best to sell rural copper to Frontier and seems to be largely ignoring its remaining copper plant – it's the only big telco that didn't even bother to chase the CAF II money that could have been used to upgrade rural copper.

The new CenturyLink CEO made it clear that the company has no desire to make additional investments that earn 'infrastructure returns', meaning investments in last-mile networks, both copper and fiber. Frontier hasn't said that it is giving up on copper, but the company is clearly cash-stressed and is widely reported to be skipping needed upgrades and repairs to rural copper networks.

The transition from copper to wireless is always scary for a rural area. It's great that Windstream can now deliver speeds up to 100 Mbps to some customers. However, the reality of wireless networks is that there are always some customers out of reach of the transmitters. These customers may face physical impediments, such as being in a valley or behind a hill and out of line-of-sight of a tower. Or they might simply live too far from a tower, since every wireless technology only works for some fixed distance, depending upon the specific spectrum being used.
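
The distance limits fall straight out of propagation math. Here's a sketch using the standard free-space path loss formula; the 120 dB link budget is an assumed round number, not a vendor specification, and real-world terrain and foliage shrink these distances further:

```python
import math

# Maximum line-of-sight reach for a given link budget, by frequency.
MAX_PATH_LOSS_DB = 120   # assumed link budget

def max_reach_km(f_mhz: float) -> float:
    # Invert FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.45
    return 10 ** ((MAX_PATH_LOSS_DB - 32.45 - 20 * math.log10(f_mhz)) / 20)

for f_mhz in (700, 3_500, 28_000):   # low-band, mid-band, millimeter wave
    print(f"{f_mhz} MHz: ~{max_reach_km(f_mhz):.1f} km reach")
# ~34 km at 700 MHz, ~7 km at 3.5 GHz, under 1 km at 28 GHz. Higher
# frequencies carry more data but run out of link budget much sooner.
```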

It makes no sense for a rural telco to operate two networks, so one has to wonder what happens to the customers who can't get the wireless service on the day the copper network gets torn down. This has certainly been one of the FCC's concerns when considering AT&T's requests to tear down copper. The current FCC has relaxed the hurdles for tearing down copper, so this situation is bound to arise. In the past the telcos had carrier-of-last-resort obligations for anybody living in the service area. Will they be required to somehow get a wireless signal to the customers who fall between the cracks? I doubt that anybody will force them to do so. It's not far-fetched to imagine customers living within a regulated telco's service area who can't get telephone or broadband service from the telco.

Customers in these areas also have to be concerned about the future. We have wide experience that current wireless technologies don't last very long – electronics wear out and become functionally obsolete within seven years. Will Windstream and the other telcos chasing the wireless path dedicate enough capital to constantly replace the electronics? We'll have to wait for that answer – but experience says they will cut corners to save money.

I also have to wonder what happens in the many parts of the Windstream service area that are too hilly or too wooded for wireless technology. As the company becomes wireless-oriented, will it ignore the parts of its territory stuck with copper? I recently visited some heavily wooded rural counties where local Windstream staff said that the copper upgrades they've already seen (which did not seem to make much difference) were the last upgrades they might ever get. If Windstream joins the list of big telcos that ignore rural copper, then these networks will die a natural death from neglect. The copper networks of all the big telcos are already old, and it won't take much neglect to push them into a final death spiral.

Small Fiber Builders Making an Impact

The research firm RVA, LLC conducted a study for the Fiber Broadband Association looking at the number of homes and businesses that are now passed and/or served with fiber. The numbers show that smaller fiber providers are collectively having a big impact on the industry.

RVA found that as of September 2018 there were 18.4 million homes served with fiber, up from 15 million a year earlier. To put that into perspective, at the end of 2017 there were just over 126 million US households, meaning that fiber has now made it into more than 14% of US homes. What's most impressive about that finding is that 2.7% of US homes got fiber in that one-year period. The number of fiber households had been creeping up slowly over the decade, but the pace of deployment is accelerating.

RVA also looked at passings and says that 39.2 million homes, or 31%, are now passed with fiber. Comparing the 18.4 million fiber customers to the 39.2 million passings shows a fiber penetration rate of 47%. RVA also says that 1.6 million homes are passed by two fiber providers – no doubt in markets like Kansas City, Austin and the Research Triangle in North Carolina where Google and the incumbents both built fiber. RVA shows that, when homes with no broadband at all are excluded, fiber networks are achieving a 60% penetration rate.
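
The arithmetic behind those percentages is easy to check from the figures quoted in this post:

```python
# Reproducing the RVA percentages from the numbers above.
fiber_homes_m = 18.4         # homes served with fiber, Sept 2018 (millions)
fiber_homes_prior_m = 15.0   # a year earlier
us_households_m = 126.0      # US households at the end of 2017
fiber_passings_m = 39.2      # homes passed with fiber

print(f"Homes on fiber: {fiber_homes_m / us_households_m:.1%}")      # 14.6%
added = fiber_homes_m - fiber_homes_prior_m
print(f"Added in one year: {added / us_households_m:.1%}")           # 2.7%
print(f"Take rate of passings: {fiber_homes_m / fiber_passings_m:.1%}")  # 46.9%
```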

RVA counts over 1,000 smaller fiber providers in the country. It quantifies the overall market share of these providers as follows: smaller telcos (10.3%), fiber overbuilders (6.4%), cable companies (5.5%), municipalities (3.7%), real estate development integrators (1.1%) and electric cooperatives (0.5%).

In 2018 the small providers built 29% of the new homes passed, with the rest built by four Tier 1 providers. RVA didn't identify those big providers, but clearly the biggest fiber builder right now is AT&T. The company has built fiber to over 10 million passings in the past four years and says it will reach about 14 million passings by mid-2019. A lot of the AT&T fiber passings come from an aggressive plan to build to MDUs (apartment and condominium complexes). However, the company is also making fiber available to homes within close range of its numerous neighborhood fiber POPs near existing larger AT&T fiber customers.

The other big fiber builder right now is Altice. The company announced a little over a year ago that it plans to build fiber across its footprint from the Cablevision and Suddenlink acquisitions – nearly 8 million passings. Altice seems to be fulfilling that promise, with a flurry of press releases in 2018 touting active fiber deployments. The company is currently trying to sell off some of its European fiber networks to lighten its debt load and presumably raise the cash needed to complete the US fiber build.

Most other large providers have more modest fiber plans. The CenturyLink fiber expansion that was hot news just two years ago is likely now dead. Verizon is putting its effort into fixed 5G wireless. The big cable companies all build fiber in new subdivisions but have committed to DOCSIS 3.1 on their existing cable networks.

Looking forward a few years, most new fiber is likely to come from the smaller providers. AT&T hasn't announced any plans past its 2019 schedule and by then will have effectively passed all of the low-hanging fruit within range of its existing fiber network. Altice says it will take until at least 2022 to finish its fiber construction. There are no other big companies with announced plans to build fiber.

All of this is good news for the US households lucky enough to get fiber. It's long been industry wisdom that gigabit applications won't be developed until there are enough fiber households to make them economically viable. While most customers on fiber probably subscribe to speeds of less than a gigabit, there ought to finally be enough gigabit-capable households nationwide to create a gigabit market.