5G For Rural America?

FCC Chairman Ajit Pai recently addressed the NTCA-The Rural Broadband Association membership and said that he saw a bright future for 5G in rural America. He sees 5G as a fixed-wireless deployment that fits in well with the fiber deployment already made by NTCA members.

The members of NTCA are rural telcos and many of these companies have upgraded their networks to fiber-to-the-home. Some of these telcos tackled building fiber a decade or more ago and many more are building fiber today using money from the ACAM program – part of the Universal Service Fund.

Chairman Pai was talking to companies that largely have been able to deploy fiber, and since Pai is basically the national spokesman for 5G it makes sense that he would try to make a connection between 5G and rural fiber. However, I’ve thought through every business model for marrying 5G and rural fiber and none of them make sense to me.

Consider the use of millimeter wave spectrum in rural America. I can’t picture a viable business case for deploying millimeter wave spectrum where a telco has already deployed fiber drops to every home. No telco would spend money to create wireless drops where they have already paid for fiber drops. One of the biggest benefits from building fiber is that it simplifies operations for a telco – mixing two technologies across the same geographic footprint would add unneeded operational complications that nobody would tackle on purpose.

The other business plan I’ve heard suggested is to sell wholesale 5G connections to other carriers as a new source of income. I also can’t imagine that happening. Rural telcos are going to fight hard to keep out any competitor that wants to use 5G to compete for their existing broadband customers. I can’t imagine a rural telco agreeing to provide fiber connections to 5G transmitters that would sit outside homes they already serve – a telco that lets in a 5G competitor would be committing economic suicide. Rural business plans are precarious by definition, and most rural markets don’t generate enough profit to justify two competitors.

What about using 5G in a competitive venture where a rural telco is building fiber outside of their territory? There may come a day when wireless loops have a lower lifecycle cost than fiber loops. But for now, it’s hard to think that a wireless 5G connection with electronics that need to be replaced at least once a decade can really compete over the long haul with a fiber drop that might last 50 or 75 years. If that math flips we’ll all be building wireless drops – but that’s not going to happen soon. It’s probably going to take tens of millions of installations of millimeter wave drops before telcos trust 5G as a substitute for fiber.
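To make that lifecycle comparison concrete, here is a minimal sketch of the math. Every dollar figure and replacement interval below is a hypothetical assumption chosen only to illustrate the comparison, not actual deployment cost data:

```python
# Hypothetical lifecycle-cost comparison of a fiber drop vs. a
# millimeter wave wireless drop. All figures below are illustrative
# assumptions, not actual industry data.

HORIZON_YEARS = 50

def lifecycle_cost(install_cost, electronics_cost, replacement_interval):
    """Total cost over the horizon: one-time install (which includes
    the first set of electronics) plus periodic replacements."""
    replacements = (HORIZON_YEARS - 1) // replacement_interval
    return install_cost + electronics_cost * replacements

# Fiber drop: costly to build, but the glass can last 50+ years and
# only the ONT electronics need periodic replacement (~every 12 years).
fiber = lifecycle_cost(install_cost=1200, electronics_cost=150,
                       replacement_interval=12)

# Wireless drop: cheaper install, but the radio needs replacement
# at least once a decade.
wireless = lifecycle_cost(install_cost=700, electronics_cost=400,
                          replacement_interval=10)

print(f"fiber drop, 50-year cost:    ${fiber:,}")
print(f"wireless drop, 50-year cost: ${wireless:,}")
```

Under these made-up numbers the fiber drop still wins over a 50-year horizon; the comparison only flips if wireless install costs fall far enough below fiber to outweigh the recurring electronics replacements.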

Chairman Pai also mentioned mid-range spectrum in his speech, specifically the upcoming auction for 3.5 GHz spectrum. How might mid-range spectrum create a rural 5G play that works with existing fiber? It might be a moot question since few rural telcos are going to have access to licensed spectrum.

But assuming that telcos could find mid-range licensed spectrum, how would they benefit from their existing fiber? As with millimeter wave spectrum, a telco is not going to deploy this technology to cover the same areas where they already have fiber connections to homes. The future use of mid-range spectrum will be the same as it is today – to provide wireless broadband to customers that don’t live close to fiber. The radios will be placed on towers, the taller the better. These towers will then make connections to homes using dishes that can communicate with the tower.

Many of the telcos in the NTCA are already deploying this fixed wireless technology today outside of their fiber footprint. This technology benefits from having towers fed by fiber, but this is rarely the same fiber that a telco is using to serve customers. In most cases this business plan requires extending fiber outside of the existing service footprint – and Chairman Pai said specifically that he saw an advantage for 5G from existing fiber.

Further, it’s a stretch to label mid-range spectrum point-to-multipoint radio systems as 5G. From what numerous engineers have told me, 5G is not going to make big improvements over the way that fixed wireless operates today. 5G will add flexibility for the operator to fine-tune the wireless connection to any given customer, but the 5G technology won’t inherently increase the speed of the wireless broadband connection.

I just can’t find any business plan that is going to deliver 5G in rural America that takes advantage of the fiber that the small telcos have already built. I would love to hear from readers who might see a possibility that I have missed. I’ve thought about this a lot and I struggle to find the benefits for 5G in rural markets that Chairman Pai has in mind. 5G clearly needs a fiber-rich environment – but companies who have already built rural fiber-to-the-home are not going to embrace a second overlay technology or openly allow competitors onto their networks.

New Net Neutrality Legislation

On February 7, as hearings were being held on net neutrality, Congressional Republicans said they were going to offer up three different versions of a bill intended to reinstate net neutrality principles. The newest bill, the Open Internet Act of 2019, was introduced by Rep. Bob Latta of Ohio. They also offered up bills previously introduced by Rep. Greg Walden of Oregon and Sen. John Thune of South Dakota.

All three bills would reestablish rules against ISPs blocking web traffic, throttling customers, or implementing paid prioritization – the practice of creating fast lanes that give some web traffic priority over other traffic. Hanging over all of these bills is a court review of a challenge to the FCC’s right to kill net neutrality – a successful challenge would reinstate the original FCC net neutrality rules. There are also a number of states poised to introduce their own net neutrality rules should the court challenge fail.

The court case and the threat of state net neutrality rules are prodding Congress to enact net neutrality legislation. Legislation has always been the preferred solution for imposing any major changes in regulation. When there’s no legislation, then rules like net neutrality are subject to being changed every time there is a new FCC or a new administration. Nobody in the country benefits – not ISPs and not citizens – when policies like net neutrality change every time there is a new administration.

These three bills were clearly influenced by the big ISPs. They include nearly identical talking points to those being promoted by NCTA, the lobbying arm of the largest ISPs, headed by ex-FCC Commissioner Michael Powell. There are two primary differences between these bills and the original net neutrality rules that were established by the last FCC.

The first is a provision that the legislation would allow the ISPs to stray from the net neutrality principles if there is a ‘public benefit’ from doing so. That would allow ISPs to adopt any web practice they want as long as they can concoct a story about how the practice creates a public benefit. Since there are winners and losers from almost any network practice of ISPs, it wouldn’t be hard to identify those that benefit from a given practice. From a regulatory perspective, this is as close as we can come to a joke. If a regulated entity gets to decide when a regulation applies, then it’s not really a regulation.

The other big difference between the proposed legislation and the original net neutrality order is the lack of what is called a ‘general conduct standard’. The original net neutrality order recognized that the Internet is rapidly evolving and that any specific rules governing Internet behavior would be obsolete almost as soon as they were enacted. ISPs and the other big players on the web are able to design ways around almost any imaginable legislative rules.

The original net neutrality order took the tactic of establishing the three basic net neutrality principles but didn’t provide any specific direction on how the FCC was supposed to enforce them. The concept of the general conduct standard is that the FCC will look at each bad practice of an ISP to see if it violates the net neutrality principles. Any FCC ruling would thus be somewhat narrow, except that a ruling against a specific ISP practice would generally apply to others doing the same thing.

The original net neutrality order envisioned a cycle where the FCC rules against bad practices and the ISPs then try to find another way to get what they want – so there would be a continuous cycle of ISPs introducing questionable behavior with the FCC deciding each time if the new practice violates the intent of the net neutrality principles. This was a really clever solution for trying to regulate an industry that changes as quickly as the ISP and web world.

The proposed legislation does away with the general conduct standard. That means that the FCC would not have the ability to judge specific ISP behavior as meeting or not meeting the net neutrality standards. This would take all of the teeth out of net neutrality rules since the FCC would have little authority to ban specific bad practices. This was summarized most succinctly by former FCC Chairman Tom Wheeler who testified in the recent Congressional hearings that if Congress established net neutrality rules it ought to allow for “a referee on the field with the ability to throw the flag for unjust and unreasonable activity.”

The bottom line is that the proposed legislation would reintroduce the basic tenets of net neutrality but would give the FCC almost no authority to enforce the rules. It’s impossible to imagine these bills being passed by a divided Congress, so we’re back to waiting on the Courts or perhaps on states trying to regulate net neutrality on their own – meaning a long-term muddled period of regulatory uncertainty.

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures with their fiber network. They are giving customers two months of free service and sending them back to the incumbent ISPs in the city. The company used a construction technique called micro-trenching, where they cut a tiny slit in the road, one inch wide and a few inches deep, to carry the fiber. Only a year after construction the fiber is popping out of the micro-trenches all over the city.

Everybody I’ve talked to is guessing that it’s a simple case of ice heaving. While a micro-trench is sealed, it’s likely that small amounts of moisture seep into the sealed micro-trench and freeze when it gets cold. The first freeze would create tiny cracks, and with each subsequent freeze the cracks would get a little larger until the trench finally fills up with water, fully freezes and ejects the fill material. The only way to stop this would be to find a permanent seal that never lets in moisture. That sounds like a tall task in a city like Louisville that might freeze and thaw practically every night during the winter.

Nobody other than AT&T or Charter can be happy about this. The reason that Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block Google Fiber from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google’s favor, while the Charter suit is still in court. Perhaps Google Fiber should have just waited out the lawsuits – but the business pressure was there to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson learned is not to launch a new network using an untried and untested construction technique. In this case, the micro-trenches didn’t just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix for the problem would be to build the network again from scratch, which makes no financial sense.

Certainly, the whole industry is now going to be extremely leery about micro-trenching, but there is a larger lesson to be learned from this. For example, I’ve heard from several small ISPs who are ready to leap into the 5G game and build networks using millimeter wave radios installed on poles. That is every bit as new and untested a technology as micro-trenching was. I’m not predicting that anybody pursuing that business plan will fail – but I can assuredly promise that they will run into unanticipated problems.

Over my career, I can’t think of a single example where an ISP that took a chance on a cutting-edge technology didn’t have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. For example, I can remember half a dozen companies that tried to deploy broadband networks using the LMDS spectrum. I remember one case where the radios literally never worked and the venture lost their $2 million investment. I remember several others where the radios had glitches that caused major customer outages and were largely a market disaster.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generation of other technologies including the first generation of DSL, WiFi mesh networks, PON fiber-to-the-home and IPTV. The companies that were the first pioneers deploying these technologies had costly and sometimes deadly problems. So perhaps the lesson learned is that pioneers pay a price. I’m sure that this failure of micro-trenching will result in changing or abandoning the technique. Perhaps we’ll learn to not use micro-trenches in certain climates. Or perhaps they’ll find a way to seal the micro-trenches against humidity. But none of those future solutions will make up for Google Fiber’s spectacular failure.

The real victims of this situation are the households in Louisville who had changed to Google Fiber – and everybody else in the City. Because of Google Fiber’s lower prices, both Charter and AT&T lowered prices everywhere in the city. You can bet it’s not going to take long to get the market back to full prices. Any customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no real incentive to give them a low-price deal. As a whole, every household in the City is going to be spending $10 or $20 more per month for broadband – which is a significant penalty on the local economy.
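For a rough sense of the scale of that penalty, here is a back-of-the-envelope calculation. The household count is a purely hypothetical assumption; the monthly figure is the midpoint of the $10 to $20 estimate above:

```python
# Rough scale of the citywide price impact described above. The
# household count is a hypothetical assumption, and the monthly
# figure is the midpoint of the $10-$20 estimate.
households = 250_000
extra_per_month = 15

annual_extra_spend = households * extra_per_month * 12
print(f"extra broadband spend per year: ${annual_extra_spend:,}")
```

Even with modest assumptions, tens of millions of dollars per year would flow out of household budgets and into the incumbents' pockets.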

AT&T’s 5G Strategy

AT&T recently described their long-term 5G strategy using what they call the 3 pillars of 5G – the three areas where the company is putting their 5G focus. The first pillar is a concentration on 5G cellular, and the company’s goal is to launch a 5G-based cellular service, with some cities coming on board in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. This admission that there won’t be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to hopefully launch two phones later this year that have some 5G capability. As always with a new generation of wireless technology, the bottleneck will be in handsets. The cell phone makers can’t just make generic 5G phones – they have to work with the carriers to be ready to support the specific subset of 5G features that are released. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first-generation 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will become quickly obsolete. A 5G phone sold in late 2019 probably won’t include all of the 5G features that will be on the market by late 2020 – and this is likely to be true for the next 3 or 4 years as the carriers roll out incremental 5G improvements. It’s also a gamble for customers because anybody that buys an early 5G cellphone will have early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and will wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher because they are talking about the fixed cellular product they’ve already been using for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. This is not the same as Verizon’s product that delivers hundreds of megabits per second but is instead a product that delivers speeds up to 50 Mbps depending upon how far a customer lives from a cell tower – with reports that most households are getting 15 Mbps at best. This is the product that AT&T is mostly using to satisfy its CAF II requirements in rural America. All of the engineers I’ve talked to don’t think that 5G is going to materially improve this product.

The final pillar of AT&T’s strategy is edge computing. What AT&T means by this is to put fast processors at customer sites when there is the need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G and AT&T has already been deploying edge routers. However, 5G will enhance this ability at customer sites that need to connect a huge number of devices simultaneously. 5G can make it easier to connect to a huge number of IoT devices in a hospital or to 50,000 cell phones in a stadium. The bottom line is that the migration to more edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement, and AT&T has been talking about all three applications for some time – but the announcement does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.

Forecasting the Future of Video

I recently saw several interesting forecasts about the cable industry. The research firm SNL Kagan predicts that broadband-only homes in the US – those that don’t subscribe to traditional linear cable TV – will increase from 23.3 million in 2018 to 40.8 million by 2023. In another forecast Parks Associates predicts that the number of worldwide OTT subscribers – households that subscribe to at least one online video service – will grow to 310 million by 2024.

These kinds of forecasts have always intrigued me. There is probably nobody in the industry who doubts that cord cutting will keep growing or that the market for services like Netflix will keep expanding. What I find most interesting about these total-market forecasts is the specificity of the predictions, such as when Kagan predicts 40.8 million broadband-only homes. I suspect that if we dug deeper into what Kagan says, we’d find that they have predicted a range of possible future outcomes and were not that specific. But I also understand that sometimes putting a number on things is the best way to make a point in a press release.

What I’ve always found interesting about future predictions is how hard it is to predict where a whole industry is going. If I look back ten years I could find a dozen experts predicting the death of traditional landline telephones, and yet not one of them would have believed that landline penetration rates would still be around 40% in 2019. I imagine every one of them would have bet against that possibility. It’s easy to understand the trajectory of an industry, but it’s another thing to predict specifically where an industry will land in the future. It wasn’t hard ten years ago to predict the trajectory of the landline business, but it was nearly impossible to know how many landlines would still be around after ten years.

That doesn’t mean that somebody doesn’t have to try to make these predictions. There are huge dollars riding on the future of every telecom industry segment. Companies that invest in these industries want outside opinions on the direction of an industry. If I was developing a new OTT product like Apple is doing, I’d want some feel for the potential of my new investment. I’d want to gather as many different predictions about the future of the OTT market as possible. The above two predictions were announced publicly, but corporations regularly pay for private market assessments that never see the light of day.

To show how hard it is to make such predictions, I want to look a little more closely at the Kagan prediction. They are predicting that in five years there will be 17.5 million more homes that buy broadband and don’t buy a traditional TV product. There are a number of factors and trends that feed into that number:

  • It looks like first-time households of millennials and generation Z don’t subscribe to cable TV at nearly the same levels as their parents. Some portion of the increase in broadband-only homes will come from these new households.
  • While final numbers are still not in for 2018, it appears that around 2 million homes cut the cord last year and dropped cable TV. Will the future pace of cord cutting be faster, slower, or stay the same? Obviously, predicting the future of cord cutting is a huge piece of the prediction.
  • It’s becoming a lot more complicated for a household to replace traditional cable. It looks like every major owner of content wants to put their unique content into a separate OTT service like CBS All Access did with the Star Trek franchise. The cost of subscribing to multiple OTT services is already getting expensive and is likely to get even costlier over time. Surveys have shown that households cut the cord to save money, so how will cord cutting be impacted if there are no savings from cutting the cord?
  • The big cable companies are creating new video products aimed at keeping subscribers. For instance, Comcast is bundling in Netflix and other OTT products and is also rolling out smaller and cheaper bundles of traditional programming. They are also allowing customers to view the content on any device, so buying a small bundle from Comcast doesn’t feel much different to the consumer than buying Sling TV. What impact will these countermeasures from the cable companies have on cord cutting?
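One way to sanity-check a forecast like Kagan’s is to back out the annual growth rate it implies. This short sketch derives the compound annual growth rate from the two published endpoints:

```python
# Back out the compound annual growth rate implied by the Kagan
# forecast: 23.3M broadband-only homes in 2018 growing to 40.8M in 2023.
start, end, years = 23.3, 40.8, 5

cagr = (end / start) ** (1 / years) - 1
print(f"implied annual growth: {cagr:.1%}")   # roughly 11.9% per year

# The smooth year-by-year path that growth rate implies (in millions):
homes = start
for year in range(2019, 2024):
    homes *= 1 + cagr
    print(year, round(homes, 1))
```

In other words, the forecast assumes the pool of broadband-only homes grows by roughly 12% every year for five straight years – a useful single number to weigh against the cord-cutting trends listed above.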

I’m sure there are other factors that go into predicting the number of future homes without traditional cable TV – these are just a few that popped into my mind. I know that companies like Kagan and Parks have detailed current statistics on the industry that are not available to most of us. But statistics only take you so far, and anybody looking out past the end of 2019 is entering crystal ball territory. Five years is forever in a market that is as dynamic as cable TV and OTT content.

We also know from past experience that there will be big changes in these industries that will change the paradigm. For example, the content owners might all decide that there is no profit in the OTT market and could kill their own OTT products and cause an OTT market contraction. Or a new entrant like Apple might become a major new competitor for Netflix and the demand for OTT services might explode even faster than expected. I don’t know how any prediction can anticipate big market events that might disrupt the whole industry.

Understand that I am not busting on these two predictions – I don’t know enough to have the slightest idea if these predictions are good or bad. These companies are paid to make their best guess and I’m glad that there are firms that do that. For example, Cisco has been making predictions annually for many years about the trajectory of broadband usage and that information is a valuable piece of the puzzle for a network engineer designing a new network. However, predicting how all of the different trends will affect video subscriptions over five years sounds like an unsolvable puzzle. Maybe if I’m still writing this blog five years from now I can check to see how these predictions fared. One thing I know is that I’m not ready to take any five-year forecast of the cable industry to the bank.

The Status of the CAF II Deployments

The Benton Foundation noted last month that both CenturyLink and Frontier have not met all of their milestones for deployment of CAF II. This funding from the FCC is supposed to be used to improve rural broadband to speeds of at least 10/1 Mbps. As of the end of 2018, the CAF II recipients were to have completed upgrades to at least 60% of the customers in each state covered by the funding.

CenturyLink took funding to improve broadband in 33 states covering over 1 million homes and businesses. CenturyLink claims to have met the 60% milestone in twenty-three states but didn’t make the goal in eleven states: Colorado, Idaho, Kansas, Michigan, Minnesota, Missouri, Montana, Ohio, Oregon, Washington, and Wisconsin.

Frontier received CAF II funding to improve broadband to over 774,000 locations in 29 states. Frontier says they’ve met the milestone in 27 states but haven’t reached the 60% deployment milestone in Nebraska and New Mexico. There were a number of other large telcos that took CAF II funding, like AT&T, Windstream, and Consolidated, and I have to assume that they’ve reported meeting the 60% milestone.

Back in 2014 when it looked like the CAF II program might be awarded by reverse auction, we helped a number of clients take a look at the CAF II service areas. In many cases, these are large rural areas that cover 50% or more of most of the rural counties in the country. Most of my clients were interested in the CAF II money as a funding mechanism to help pay for rural fiber, but all of the big telcos other than AT&T announced originally that they planned to upgrade existing DSL. AT&T announced a strategy early on to use fixed cellular wireless to satisfy their CAF II requirements. Since then a few big telcos like Frontier and Windstream have said that they are also using fixed wireless to meet their obligations.

To us, the announcement that the telcos were going to upgrade DSL set off red flag alarms. In a lot of rural counties there are only a small number of towns, and those towns are the only places where the big telcos have DSLAMs (the DSL hub). Rural telephone exchanges tend to be large and the vast majority of rural customers have always been far out of range of DSL that originates in the small towns. One only has to go a few miles – barely outside the towns – to see DSL speeds fall off to nothing.

The only way to make DSL work in the CAF II areas would be to build fiber to rural locations and establish new DSL hub sites. As any independent telco that deployed DSL the right way can tell you, this is expensive because it takes a lot of rural DSLAMs to get within range of every customer. By electing DSL upgrades, the big telcos like CenturyLink and Frontier had essentially agreed to build a dozen or more fiber DSLAMs in each of the rural counties covered by CAF II. My back-of-the-envelope math showed that was going to cost a lot more than what the companies were receiving from the CAF fund. Since I knew these telcos didn’t want to spend their own money in rural America, I predicted execution failures for many of the planned DSL deployments.
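Here is a sketch of the kind of back-of-the-envelope math involved. Every figure below is a hypothetical assumption for illustration – not actual CAF II support levels or construction costs:

```python
# A sketch of back-of-the-envelope CAF II math. Every figure is a
# hypothetical assumption for illustration, not actual CAF II support
# levels or construction costs.

caf_support_per_location = 1_000   # assumed total CAF II $ per location
locations_per_county = 3_000       # assumed CAF II locations in a county

dslams_per_county = 12             # new fiber-fed DSL hubs needed
fiber_miles_per_dslam = 10         # assumed route miles of fiber per hub
cost_per_fiber_mile = 30_000       # assumed rural construction $ per mile
dslam_electronics = 50_000         # assumed cabinet + electronics per hub

build_cost = dslams_per_county * (
    fiber_miles_per_dslam * cost_per_fiber_mile + dslam_electronics)
funding = caf_support_per_location * locations_per_county

print(f"county build cost: ${build_cost:,}")
print(f"CAF II funding:    ${funding:,}")
print(f"shortfall:         ${build_cost - funding:,}")
```

Under assumptions like these, a full DSL upgrade in a county costs well more than the CAF II support covering it – which is exactly the gap that invites skipping the hardest-to-reach customers.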

I believe the big telcos are now facing a huge dilemma. They’ve reached 60% of customers in many places (but not all). However, it is going to cost two to three times more per home to reach the remaining 40% of homes. The remaining customers are the ones on extremely long copper loops, and DSL is an expensive technology to use for reaching these last customers. A DSLAM built to serve the customers at the ends of these loops might only serve a few customers – and it’s hard to justify the cost of the fiber and electronics needed to reach them.

I’ve believed from the beginning that the big telcos building DSL for the CAF II program would take the approach of covering the low hanging fruit – those customers that can be reached by the deployment of a few DSLAMs in a given rural area. If that’s true, then the big telcos aren’t going to spend the money to reach the most remote customers, meaning a huge number of CAF II customers are going to see zero improvements in broadband. The telcos mostly met their 60% targets by serving the low-hanging fruit. They are going to have a huge challenge meeting the next milestones of 80% and 100%.

Probably because I write this blog, I hear from folks at all levels of the industry about rural broadband. I’ve heard a lot of stories from technicians telling me that some of the big telcos have only tackled the low-hanging fruit in the CAF builds. I’ve heard from others that some telcos aren’t spending more than a fraction of the CAF II money they got from the FCC and are pocketing much of it. I’ve heard from rural customers who supposedly already got a CAF II upgrade and aren’t seeing speeds improved to the 10/1 threshold.

The CAF II program will be finished soon and I’m already wondering how the telcos are going to report the results to the FCC if they took shortcuts and didn’t make all of the CAF II upgrades. Will they say they’ve covered everybody when some homes saw no improvement? Will they claim 10/1 Mbps speeds when many households were upgraded to something slower? If they come clean, how will the FCC react? Will the FCC try to find the truth or sweep it under the rug?

Telecom R&D

In January AT&T announced the creation of the WarnerMedia Innovation Lab, which is a research group that will try to combine AT&T technology advances and the company’s huge new media content. The lab, based in New York City, will consider how 5G, the Internet of Things, artificial intelligence, machine learning and virtual reality can work to create new viewer entertainment experiences.

This is an example of a highly directed R&D effort aimed at specific results – in this case the lab will be working on next-generation technologies for entertainment. This contrasts with labs that engage in basic research, which allows scientists to explore scientific theories. The closest we’ve ever come to basic research from a commercial company was Bell Labs, operated by the old Ma Bell monopoly.

Bell Labs was partially funded by the government and also got research funds from ratepayers of the nationwide monopoly telco. Bell Labs research was cutting edge and resulted in breakthroughs like the transistor, the charge-coupled device, Unix, fiber optics, lasers, data networking and the discovery of the cosmic microwave background radiation that confirmed the big bang theory. The Lab created over 33,000 patents and its scientists won eight Nobel Prizes. I was lucky enough to tour Bell Labs in the 80s, and I was a bit sad today when I had to look on the Internet to see if it still exists; it does – it’s now called Nokia Bell Labs and operates at a much smaller scale than the original lab.

Another successor to Bell Labs is AT&T Labs, the research division of AT&T. The lab engages in a lot of directed research, but also in basic research – it is investigating topics such as the physics of optical transmission and the physics of computing. Since its creation in 1996 AT&T Labs has been issued over 2,000 US patents. Its directed research concentrates on the technical challenges of operating large networks and working with huge datasets. The Lab was the first to transmit 100 gigabits per second over fiber.

Verizon has also been doing directed research since NYNEX was spun off in the divestiture of the Bell system. Rather than operate one big public laboratory, the company has research groups engaged in topics of specific interest to the company. Recently the company chose a more public profile and announced the creation of 5G Labs in various locations. The Manhattan lab will focus on media and finance tech; the Los Angeles lab will work with augmented reality (AR) and holograms; the Washington DC lab will work on public safety, first responders, cybersecurity, and hospitality tech; the Palo Alto lab will look at emerging technologies, education, and big data; and the Waltham, Massachusetts lab will focus on robotics, healthcare, and real-time enterprise services.

Our industry has other labs engaged in directed research. The best known is CableLabs, the research lab outside Denver that was founded in 1988 and is jointly funded by the world’s major cable companies. This lab is largely responsible for the cable industry’s success in broadband, since it created the successive generations of the DOCSIS technology used to operate hybrid fiber-coaxial networks. CableLabs also explores other areas of wireless and wired communications.

While Comcast relies on CableLabs for its underlying technology, the company has also created Comcast Labs. This lab is highly focused on the customer experience and developed Comcast’s X1 set-top box and the integrated smart home product sold by Comcast. Comcast Labs doesn’t only develop consumer devices – it is also involved in software innovation efforts like OpenStack and GitHub development. The lab most recently announced a breakthrough that allows cable networks to deliver data speeds up to 10 Gbps.

Shrinking Competition for Transport

Bloomberg reported that CenturyLink and Alphabet are interested in buying Zayo. Zayo has been anticipated as the next fiber acquisition target ever since the Level 3 merger with CenturyLink, because it is the largest remaining independent owner of fiber.

As you might expect, the biggest owners of fiber are the big telcos and cable companies. Consider the miles of fiber owned by the ten biggest fiber owners – note that these figures are from the end of 2017, and a few of these companies, like Verizon, have been building a lot of fiber since then.

AT&T – 1,100,000 miles
Verizon – 520,000 miles
CenturyLink / Level 3 – 450,000 miles
Charter – 233,000 miles
Windstream – 147,000 miles
Comcast – 145,000 miles
Frontier – 140,000 miles
Zayo – 113,000 miles
Cogent – 57,000 miles
Consolidated – 36,000 miles
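The concentration in this list is easy to quantify. Here’s a quick illustrative sketch (the figures are transcribed from the year-end 2017 list above, so treat the shares as approximate):

```python
# Year-end 2017 route miles for the ten biggest fiber owners,
# transcribed from the list above.
fiber_miles = {
    "AT&T": 1_100_000,
    "Verizon": 520_000,
    "CenturyLink / Level 3": 450_000,
    "Charter": 233_000,
    "Windstream": 147_000,
    "Comcast": 145_000,
    "Frontier": 140_000,
    "Zayo": 113_000,
    "Cogent": 57_000,
    "Consolidated": 36_000,
}

# Each company's share of the total owned by these ten companies.
total = sum(fiber_miles.values())
shares = {name: 100 * miles / total for name, miles in fiber_miles.items()}

for name, miles in fiber_miles.items():
    print(f"{name}: {miles:,} miles ({shares[name]:.1f}% of the top-ten total)")
```

By this rough math AT&T alone holds over a third of the fiber on the list, while Zayo – the acquisition target – holds under 4%, which is why who buys it matters more than its raw mileage suggests.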

You might wonder why this matters. First, Zayo is the largest company on the list whose only business is to sell transport – all of Zayo’s fiber is revenue producing. While the companies above it on the list have a lot more fiber, much of that fiber is last-mile fiber in neighborhoods, where there is not much opportunity to sell access to others. The biggest independent fiber owner used to be Level 3, with 200,000 miles of revenue-producing fiber before its merger with CenturyLink.

The numbers on this chart don’t tell the whole story. Companies like Zayo also swap fiber with other networks. They may trade a pair of fibers on a route they own for a route elsewhere that they want to reach. These swapping arrangements mean the transport providers like Zayo, Cogent and Level 3 control a lot more fiber than is indicated by these numbers.

It matters because as soon as you get outside of the metropolitan areas there are not many options for fiber transport. A few years ago I helped a city look for fiber transport, and the three reasonably priced options it found were CenturyLink, Level 3 and Zayo. If CenturyLink buys Zayo it will have purchased both of its competitors in this region and will have effectively eliminated fiber transport competition for this community. Without that competition it’s inevitable that transport prices will rise.

I think back to the early days of competition after the Telecommunications Act of 1996. I remember working with clients in the 1990s looking for fiber transport, and there were many cases where only one provider was willing to sell transport to a community. If the sole provider was the local telco or cable company, the cost of transport was often four or five times higher than prices in nearby communities with more choices. When I worked with rural providers in the early 2000s, one of the first questions I always asked was about the availability of transport – because a lack of transport sometimes killed business plans.

Since then there has been a lot of rural fiber built by companies like statewide fiber networks and others who saw a market for rural transport. Much of the rural construction was egged on by the need to get to cellular towers.

My fear is that we’ll slide back to the bad-old-days when rural fiber was a roadblock for providing broadband. I don’t so much fear for the most rural places because those fiber networks are owned by smaller companies and they aren’t going away. I fear more for places like county seats. I worked with a city in Pennsylvania a few years ago where there was a decent number of competitors for transport – Verizon, Zayo, Level 3 and XO. Since then Verizon bought XO and CenturyLink might own the other two. That city is not going to lose transport options, but the reduction from four providers to two giant ones almost surely means higher transport costs over time.

I am intrigued that Alphabet (the parent of Google Fiber) would look at buying an extensive fiber network like Zayo. Google is one of the biggest users of bandwidth in the country due to the web traffic to Google and YouTube. Their desire for fiber might be as simple as wanting to control the fiber supply chain they use. If so, that’s almost as disconcerting as CenturyLink buying Zayo if Google wouldn’t remain as a fierce transport competitor.

ISPs Are Violating the Old Net Neutrality Rules

It’s been just over a year since the FCC repealed net neutrality. The FCC’s decision is being appealed, and oral arguments are underway as I write this blog. One would assume that until that appeal is finished the big ISPs will be on their best behavior. Even so, the press has covered a number of ISP actions during the last year that would have violated net neutrality if the old rules were still in place.

It’s not surprising that the cellular carriers were the first ones to violate the old net neutrality rules. This is the most competitive part of the industry and the cellular carriers are not going to miss any opportunity to gain a marketing edge.

AT&T is openly advertising that cellular customers can stream the company’s DirecTV Now product without it counting against monthly data caps. Meanwhile, all of the competing video services like Sling TV, PlayStation Vue, YouTube TV, Netflix and Amazon Prime count against AT&T data caps – and video can quickly exhaust a monthly data allotment. AT&T’s behavior is almost a textbook example of why net neutrality rules were put into place – to stop ISPs from putting competitors’ products at an automatic disadvantage. AT&T is the biggest cellular provider in the country, and this creates a huge advantage for DirecTV Now. All of the major cellular carriers do something similar in exempting some video from the monthly data cap, but AT&T is the only one pushing its own video product.

In November a large study of 100,000 cellphone users by Northeastern University and the University of Massachusetts showed that Sprint was throttling Skype. This is not something that the carrier announced, but it’s a clear case of pushing web traffic to the ‘Internet slow lane’. We can only speculate why Sprint would do this, but regardless of their motivation this is clearly a violation of net neutrality.

This same study showed numerous incidents where all of the major cellular carriers throttled video services at times. YouTube was the number one target of throttling, followed by Netflix, Amazon Prime, and the NBC Sports app. This throttling wasn’t as widespread as Sprint’s throttling of Skype, but the carriers must have algorithms in their networks that throttle specific video traffic when cell sites get busy. In contrast to the big carriers, the smaller independent cellular carrier C Spire showed almost no instances of differentiation among video streams.

Practices that might violate net neutrality were not limited to cellular carriers. For example, Verizon FiOS recently began giving new broadband customers free Netflix for a year, and AT&T started giving free HBO to new customers last year. This practice is more subtle than the cellular carriers’ blocking or throttling of content. One of the purposes of net neutrality was to stop ISPs from discriminating among web traffic, and by giving away free video services the landline broadband companies are promoting specific web services over their competitors.

This doesn’t sound harmful, but the discussion in the net neutrality order warned about a future where the biggest ISPs would partner with a handful of big web services like Facebook or Netflix to the detriment of smaller and start-up web services. A new video service will have a much harder time gaining customers if the biggest ISPs are giving away its competitors for free.

There are probably more bad practices going on that we don’t know about – we wouldn’t have known about the cellular throttling without the big study. A lot of discrimination can be done through an ISP’s network routing practices, which are hard to prove. For example, I’ve been seeing a growing number of complaints recently from consumers who are having trouble with streaming video services. If you recall, net neutrality first gained traction when it became known that big ISPs like Comcast were blatantly interfering with Netflix streaming. There is nothing today to stop the big ISPs from implementing network practices that degrade certain kinds of traffic. There is also nothing stopping them from demanding payments from web services like Netflix so that their product is delivered cleanly.

Interestingly, most of the big ISPs made a public pledge not to violate the spirit of net neutrality even if the rules were abolished. That seems to have been a hollow promise meant to soothe a public worried about the end of net neutrality. The FCC implemented net neutrality to protect the open Internet. The biggest ISPs have virtual monopolies in most markets, and public opinion is rarely going to change an ISP’s behavior if the ISP decides that the monetary gain is worth the public unhappiness. Broadband customers don’t have a lot of options to change providers, and cable broadband is becoming a near-monopoly in urban areas. There is no way for a consumer to avoid the bad practices of the cellular companies if they all engage in the same bad practices.

There is at least some chance that the courts will overturn the FCC repeal of net neutrality, though that seems unlikely to me. If the ISPs win in court and start blocking and discriminating against web traffic, it seems likely that some future FCC or Congress will reinstitute net neutrality and start the fight all over again. Regardless of the court’s decision, I think we are a long way from hearing the last of net neutrality.

Why Big ISPs Screw Up

I was recently joking with a colleague about some of the really dumb things that some of the big ISPs do – those things that get negative press or that make customers permanently dislike them. But after I thought about it a bit, it struck me that bad behavior by the big companies is almost inevitable – it’s a challenge for a big company not to behave badly. I can think of a number of reasons for the poor decisions that big ISPs seem to repeatedly make.

Good Intentions but Bad Policies. Some of the ugliest stories in the press from our industry have come from Comcast customer service. Customers have recorded customer service representatives saying some of the most awful things. Comcast executives have often been quoted as saying that they want to do a better job of customer service and the company has thrown big bucks at the issue over the last decade to try to improve.

But Comcast has corporate policies that undo all of those good intentions. Some of the most memorable press stories came from customer service reps who are compensated for stopping customers from disconnecting service or for upselling additional services. Win-back programs and upselling are good for the Comcast bottom line, but they tempt poorly paid customer service reps into saying anything to stop a customer from disconnecting, or into sneaking unwanted products onto a customer’s bill. The bottom line is that policies promoting good behavior go out the window when employees are compensated for bad behavior.

Decentralized Management. I remember reading last year about the big push at Verizon to bring all of its fiber assets under one regime. The company built fiber over the years under a lot of different business units, and there has been no centralized fiber inventory. This has to have cost Verizon a fortune over the years in lost revenue opportunities on fiber that already exists. An outsider like me looks at this and wonders why something this common-sense wasn’t done fifteen years ago. Unfortunately, poor communication inside the company is a natural consequence of operating different business units, each in its own silo. The FiOS folks never knew what the enterprise or cellular folks were doing, and so the company frittered away the huge synergies that could have been gained by making all fiber available to all business units. We’ve seen attempts at the big ISPs to make the kind of consolidation Verizon is doing, but if they aren’t careful, in time they’ll slip back into the old bad practices.

No Emphasis on Being Good Corporate Citizens. I worked at Southwestern Bell pre-divestiture. There were some negative sides to being a giant monopoly, but the company also put a lot of effort into instilling the message internally that the company had a nationwide mandate to do a good job. The company constantly extolled its accomplishments to employees and effectively indoctrinated them into being good citizens. I happened to sit close to the person who took ‘executive’ complaints – complaints from customers that had escalated to upper management. The company made a legitimate effort to deal with every problem that made it that high in the company. Employees were rewarded for loyalty and good behavior with lifetime jobs – phone company people were jokingly said to have bell-shaped heads.

Big ISPs no longer promise jobs for life and working at a big ISP today is just a job. I know a mountain of people who currently work for the big ISPs and none of them have that same esprit de corps that was normal at Ma Bell.

Quarterly Profit-Driven. A lot of the problems I see from the big ISPs come from the modern emphasis on quarterly earnings. This emphasis permeates down into the ranks of management at an ISP. For example, a department head might decide not to make a major repair or upgrade if it would cause a blip in the department’s budget. The constant drive for quarterly earnings improvements drives ISPs to lay off needed technicians to meet an earnings goal. It drives companies to raise rates even when their costs haven’t increased. It makes companies chase shiny new ideas like 5G even when the technology is half-baked and premature. Unfortunately, Wall Street matters more than both employees and customers – and it shows.