San Jose Tackles the Digital Divide

As a country we have done well, with about 85% of households in most areas now buying some form of broadband connection. But that still means that 15% of homes don’t have broadband. Certainly some homes don’t want broadband, but it’s clear that a significant percentage of those without it can’t afford it.

Affordability is going to become more of an issue now that the big ISPs have adopted a strategy of raising rates every year. I don’t think there’s much doubt that the cost of broadband is going to climb faster than the overall rate of inflation. We recently saw Charter raise the rate of bundled broadband by $5 per month. Wall Street is crediting the higher earnings of several big cable companies to the trend of cutting back on their willingness to offer special prices for term contracts – I think the cable companies are finally acknowledging that they have won the war against DSL.

San Jose is no different than any other big city in that it has a large number of homes without broadband. The city recently estimated that there are 95,000 residents of the city without a home broadband connection. The city just announced a plan to begin solving the digital divide and pledged $24 million to kick off the effort. They claim this is the biggest effort undertaken by a major city to close the digital divide.

The digital divide became apparent soon after the introduction of DSL and cable modems in the late 1990s. Even then there were households locked out of the new technology due to the cost of buying broadband service. The digital divide gets more acute every year as more and more of our daily lives migrate online. It’s now hard to imagine a student having an even chance in school without access to broadband. Anybody with broadband only has to stop and imagine for a second what it would be like to lose broadband access – and then realize that there are huge numbers of homes missing out on the basic benefits that those with broadband take for granted.

The San Jose plan is light on detail at this early stage, but it’s clear that the city will be looking for infrastructure plans to extend broadband rather than subsidizing service from incumbent ISPs. Consider the mayor’s stated vision for broadband:

“Ensure all residents, businesses, and organizations can participate in and benefit from the prosperity and culture of innovation in Silicon Valley . . . Broaden access to basic digital infrastructure to all residents, especially our youth, through enabling free or low cost, high-speed, 1 gigabit or faster broadband service in several low-income communities, and increasing access to hardware, including tablets and laptops, for low-income students.”

The city won’t be tackling the issue alone and is hoping for involvement from business and charitable organizations in the city. For example, the city is already working with the Knight Foundation, which has been addressing this issue for years. The city is interested in technologies like Facebook’s Terragraph wireless technology, which plans to use 60 GHz spectrum to create fast outdoor wireless broadband.

The city recognizes that there are no instant fixes and acknowledges that it might take a decade to bring fast, affordable broadband to everybody in the city. I’m sure that $24 million is also just a down payment towards a permanent broadband solution. But this plan puts the city ahead of every other major metropolitan area in its willingness to tackle the problem head-on.

There has been a cry for solving the digital divide for twenty years. Some communities have found solutions that help, like the charitable effort by E2D in Charlotte, NC that is bringing laptops and wireless broadband to large numbers of homeless and low-income school students. But no city has directly tackled the problem before with a pledge of serious taxpayer funds to help find a solution. It’s been obvious from the beginning of the digital divide discussions that it was going to take money and broadband infrastructure to solve the problem. I’m sure that many other cities will be watching San Jose because the broadband gap is becoming a significant contributor to creating an underclass that has less access to education, healthcare and the chance for good paying jobs. I’m willing to make a bet that the long-term economic benefits from solving the digital divide in San Jose will be far greater than the money they are putting into the effort.

Excess Dark Fiber

A few weeks ago I wrote about a recommendation from one of the BDAC subcommittees to expand the base for the fees collected to fund the Universal Service Fund. BDAC is the acronym for the Broadband Deployment Advisory Committee created by FCC Chairman Ajit Pai to advise on ideas to promote better broadband.

That BDAC subcommittee is the one tasked with developing Model State Codes – ideas for states to consider in legislation. The subcommittee came up with another real doozy of an idea. In its latest draft report to the FCC, in Article 4 – Rights to Access to Existing Network Support Infrastructure, the group suggests that broadband could be expanded more affordably if excess fiber built by municipalities was made available to commercial providers at cheap prices.

The BDAC subcommittee suggests that any excess municipal fiber that is not in a 50-year fiber plan must be made available for lease to other carriers. The group also oddly proposes that this would apply to municipal buildings, I guess to save carriers from having to build huts. I can think of a hundred reasons why forcing government buildings to house carriers is an extremely dumb idea, but let’s look closer at the fiber idea.

The BDAC suggestion clearly comes from the big ISPs, who would love to get their hands on municipal fiber for a bargain price. The way I know the idea comes from the big ISPs is that they are suggesting it would apply only to municipal fiber. If the group had been looking for ways to improve broadband deployment they would have expanded this idea to include all excess dark fiber, regardless of the owner.

I always hear that one of the reasons we don’t have more fiber-to-the-home is that there is not enough fiber already in our communities. I don’t think that’s true. If I look at my city of Asheville, NC, I would bet there is already fiber within a quarter mile of everybody in the City. The City might own fiber to connect schools or other government buildings. There is probably some fiber that supports public-safety networks and traffic lights. The incumbent cable company and telco deploy fiber to get to neighborhood nodes. There is fiber built to reach large businesses. There’s fiber built to get to cellular towers. There is certainly fiber built to places like our large regional hospital complex, the universities, and various federal government office buildings. There is fiber owned by the electric company, and perhaps also by the gas and water companies. And as a regional hub at the nexus of a few major highways, there is likely long-haul fiber passing through here on the way to somewhere else, plus NCDOT fiber used for more local purposes.

I’m positive that if all of this fiber was mapped, Asheville would look like a fiber-rich City – as would many places. Even rural counties often have a surprising amount of existing fiber that satisfies these same kinds of purposes. Yet most existing fiber was built to satisfy a single purpose and isn’t available for all of the other ways that fiber could benefit a community. Asheville might be fiber rich, but that fiber is off-limits to somebody interested in building fiber-to-the-home.

That’s the implied justification for the BDAC suggestion – that excess fiber shouldn’t sit idle if it could help bring better broadband. That’s also the basis for my suggestion of expanding this concept to all fiber, not just government fiber. If AT&T builds a 24-fiber cable to a cell tower and will never use more than a few strands, then why shouldn’t they be required to sell the excess fiber capacity for cheap if it benefits the community?

The idea of forcing big ISPs to make fiber available is not a new one. In the Telecommunications Act of 1996, Congress required the big telcos to unbundle their excess dark fiber and make it available to anybody. However, the telcos actively resisted that order and began immediately to petition the FCC to soften the requirement, and as a consequence, very little dark fiber has ever been provided to others. I helped a few dozen companies try to get access to telco dark fiber and only a few succeeded. However, Congress was on the right track by recognizing that idle dark fiber is a valuable asset that could benefit the larger community.

I wrote a blog a few weeks back about how Romania has the best broadband in Europe based upon hundreds of small ISPs that have each built fiber just in their immediate neighborhood. I think that if all of the excess fiber capacity in a city was made available, it would unleash all sorts of creative entrepreneurs to do similar things. I know I would consider building a fiber network in my own neighborhood if there was a way for me to backhaul to a larger partner ISP.

However, the BDAC suggestion is not quite as altruistic as it might sound – the BDAC subcommittee is not worried that the public is missing out on the benefits of excess dark fiber. Remember that the big ISPs largely control the BDAC committees, and I think this suggestion comes from AT&T and Comcast, which want to punish any city with the audacity to build fiber to compete with them. This requirement would allow the big ISPs to take advantage of those competitive networks to effectively squash municipal competition.

But we shouldn’t let the vindictive nature of the suggestion erase the larger concept. I’ve rarely gotten a chance in our industry to say, “What’s good for the goose is good for the gander” – but this is that opportunity. The BDAC has correctly identified the fact that broadband deployment would be easier everywhere if we could unleash the capacity of unused dark fiber. The BDAC subcommittee just didn’t take the idea to its natural conclusion by applying it to all existing fiber. I’m certain that if a state embraced applying this concept to all fiber we’d see the big ISPs screaming about confiscation of capital – which is exactly what it is.

5G For Rural America?

FCC Chairman Ajit Pai recently addressed the NTCA-The Rural Broadband Association membership and said that he saw a bright future for 5G in rural America. He sees 5G as a fixed-wireless deployment that fits in well with the fiber deployment already made by NTCA members.

The members of NTCA are rural telcos and many of these companies have upgraded their networks to fiber-to-the-home. Some of these telcos tackled building fiber a decade or more ago and many more are building fiber today using money from the ACAM program – part of the Universal Service Fund.

Chairman Pai was talking to companies that largely have been able to deploy fiber, and since Pai is basically the national spokesman for 5G it makes sense that he would try to make a connection between 5G and rural fiber. However, I’ve thought through every business model for marrying 5G and rural fiber and none of them make sense to me.

Consider the use of millimeter wave spectrum in rural America. I can’t picture a viable business case for deploying millimeter wave spectrum where a telco has already deployed fiber drops to every home. No telco would spend money to create wireless drops where they have already paid for fiber drops. One of the biggest benefits from building fiber is that it simplifies operations for a telco – mixing two technologies across the same geographic footprint would add unneeded operational complications that nobody would tackle on purpose.

The other business plan I’ve heard suggested is to sell wholesale 5G connections to other carriers as a new source of income. I also can’t imagine that happening. Rural telcos are going to fight hard to keep out any competitor that wants to use 5G to compete for their existing broadband customers. I can’t imagine a rural telco agreeing to provide fiber connections to 5G transmitters that would sit outside homes and compete with their own broadband product – a telco that lets in a 5G competitor would be committing economic suicide. Rural business plans are precarious by definition, and most rural markets don’t generate enough profits to justify two competitors.

What about using 5G in a competitive venture where a rural telco is building fiber outside of their territory? There may come a day when wireless loops have a lower lifecycle cost than fiber loops. But for now, it’s hard to think that a wireless 5G connection with electronics that need to be replaced at least once a decade can really compete over the long-haul with a fiber drop that might last 50 or 75 years. If that math flips we’ll all be building wireless drops – but that’s not going to happen soon. It’s probably going to take tens of millions of installations of millimeter wave drops until telcos trust 5G as a substitute for fiber.
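
To make the lifecycle math concrete, here is a minimal back-of-the-envelope sketch in Python. Every dollar figure and lifespan below is a hypothetical placeholder rather than real carrier cost data – the point is only to show how the electronics replacement cycle drives the long-term comparison.

```python
# Back-of-the-envelope lifecycle comparison for a single customer drop.
# All numbers are hypothetical placeholders for illustration only.

def lifecycle_cost(install_cost, electronics_cost, electronics_life, horizon):
    """Total cost of one drop over the planning horizon (in years)."""
    # Electronics units purchased over the horizon: the initial unit plus a
    # replacement every `electronics_life` years.
    units_purchased = horizon // electronics_life
    return install_cost + electronics_cost * units_purchased

HORIZON = 50  # years - roughly the working life assumed above for a fiber drop

# Hypothetical per-home costs: fiber is more expensive to install but its
# electronics are cheap; a millimeter wave drop is cheaper to hang but the
# radio gear costs more and is assumed to turn over on the same ten-year cycle.
fiber = lifecycle_cost(install_cost=700, electronics_cost=150,
                       electronics_life=10, horizon=HORIZON)
wireless = lifecycle_cost(install_cost=300, electronics_cost=400,
                          electronics_life=10, horizon=HORIZON)

print(f"Fiber drop over {HORIZON} years:    ${fiber:,}")
print(f"Wireless drop over {HORIZON} years: ${wireless:,}")
```

Under assumptions like these the cheaper wireless installation still ends up costing more over a 50-year horizon once the recurring electronics replacements are counted – the math only flips if wireless installation and replacement costs fall well below fiber’s.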

Chairman Pai also mentioned mid-range spectrum in his speech, specifically the upcoming auction for 3.5 GHz spectrum. How might mid-range spectrum create a rural 5G play that works with existing fiber? It might be a moot question since few rural telcos are going to have access to licensed spectrum.

But assuming that telcos could find mid-range licensed spectrum, how would they benefit from their existing fiber? As with millimeter wave spectrum, a telco is not going to deploy this technology to cover the same areas where they already have fiber connections to homes. The future use of mid-range spectrum will be the same as it is today – to provide wireless broadband to customers that don’t live close to fiber. The radios will be placed on towers, the taller the better. These towers then make connections to homes equipped with dishes that communicate with the tower.

Many of the telcos in the NTCA are already deploying this fixed wireless technology today outside of their fiber footprint. This technology benefits from having towers fed by fiber, but this is rarely the same fiber that a telco is using to serve customers. In most cases this business plan requires extending fiber outside of the existing service footprint – and Chairman Pai said specifically that he saw an advantage for 5G from existing fiber.

Further, it’s a stretch to label mid-range spectrum point-to-multipoint radio systems as 5G. From what numerous engineers have told me, 5G is not going to make big improvements over the way that fixed wireless operates today. 5G will add flexibility for the operator to fine-tune the wireless connection to any given customer, but the 5G technology won’t inherently increase the speed of the wireless broadband connection.

I just can’t find any business plan that is going to deliver 5G in rural America that takes advantage of the fiber that the small telcos have already built. I would love to hear from readers who might see a possibility that I have missed. I’ve thought about this a lot and I struggle to find the benefits for 5G in rural markets that Chairman Pai has in mind. 5G clearly needs a fiber-rich environment – but companies who have already built rural fiber-to-the-home are not going to embrace a second overlay technology or openly allow competitors onto their networks.

New Net Neutrality Legislation

On February 7, as hearings were being held on net neutrality, Congressional Republicans said they were going to offer up three different versions of a bill intended to reinstate net neutrality principles. The newest bill, the Open Internet Act of 2019, was introduced by Rep. Bob Latta of Ohio. They also offered up bills previously introduced by Rep. Greg Walden of Oregon and Sen. John Thune of South Dakota.

All three bills would reestablish rules against ISPs blocking web traffic, throttling customers, or implementing paid prioritization, which has been described as creating fast lanes that give some web traffic priority over other traffic. Hanging over all of these bills is a court challenge of the FCC’s right to kill net neutrality – a successful challenge would reinstate the original FCC net neutrality rules. There are also a number of states poised to introduce their own net neutrality rules should the court challenge fail.

The court case and the threat of state net neutrality rules are prodding Congress to enact net neutrality legislation. Legislation has always been the preferred solution for imposing any major changes in regulation. When there’s no legislation, then rules like net neutrality are subject to being changed every time there is a new FCC or a new administration. Nobody in the country benefits – not ISPs and not citizens – when policies like net neutrality change every time there is a new administration.

These three bills were clearly influenced by the big ISPs. They include nearly identical talking points to those being promoted by NCTA, the lobbying arm of the largest ISPs, headed by ex-FCC Commissioner Michael Powell. There are two primary differences between these bills and the original net neutrality rules that were established by the last FCC.

The first is a provision that would allow the ISPs to stray from the net neutrality principles if there is a ‘public benefit’ from doing so. That would allow ISPs to adopt any web practice they want as long as they can concoct a story about how the practice creates a public benefit. Since there are winners and losers from almost any network practice of ISPs, it wouldn’t be hard to identify those that benefit from a given practice. From a regulatory perspective, this is as close as we can come to a joke. If a regulated entity gets to decide when a regulation applies, then it’s not really a regulation.

The other big difference between the proposed legislation and the original net neutrality order is the lack of what is called a ‘general conduct standard’. The original net neutrality order understood that the Internet is rapidly evolving and that any specific rules governing Internet behavior would be obsolete almost as soon as they were enacted. ISPs and the other big players on the web are able to design ways around almost any imaginable legislative rules.

The original net neutrality order took the tactic of establishing the three basic net neutrality principles but didn’t provide any specific direction on how the FCC was supposed to enforce them. The concept of the general conduct standard is that the FCC will look at each bad practice of an ISP to see if it violates the net neutrality principles. Any FCC ruling would thus be somewhat narrow, except that a ruling against a specific ISP practice would generally apply to others doing the same thing.

The original net neutrality order envisioned a cycle where the FCC rules against bad practices and the ISPs then try to find another way to get what they want – so there would be a continuous cycle of ISPs introducing questionable behavior with the FCC deciding each time if the new practice violates the intent of the net neutrality principles. This was a really clever solution for trying to regulate an industry that changes as quickly as the ISP and web world.

The proposed legislation does away with the general conduct standard. That means that the FCC would not have the ability to judge specific ISP behavior as meeting or not meeting the net neutrality standards. This would take all of the teeth out of net neutrality rules since the FCC would have little authority to ban specific bad practices. This was summarized most succinctly by former FCC Chairman Tom Wheeler who testified in the recent Congressional hearings that if Congress established net neutrality rules it ought to allow for “a referee on the field with the ability to throw the flag for unjust and unreasonable activity.”

The bottom line is that the proposed legislation would reintroduce the basic tenets of net neutrality but would give the FCC almost no authority to enforce the rules. It’s impossible to imagine these bills being passed by a divided Congress, so we’re back to waiting on the Courts or perhaps on states trying to regulate net neutrality on their own – meaning a long-term muddled period of regulatory uncertainty.

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures with their fiber network. They are giving customers two months of free service and sending them back to the incumbent ISPs in the city. The company used a construction technique called micro-trenching, where they cut a tiny slit in the road, one inch wide and a few inches deep, to carry the fiber. Only a year after construction the fiber is popping out of the micro-trenches all over the city.

Everybody I’ve talked to is guessing that it’s a simple case of ice heaving. While a micro-trench is sealed, it’s likely that small amounts of moisture seep into the sealed micro-trench and freeze when it gets cold. The first freeze would create tiny cracks, and with each subsequent freeze the cracks would get a little larger until the trench finally fills with water, freezes solid and ejects the fill material. The only way to stop this would be to find a permanent seal that never lets in moisture. That sounds like a tall task in a city like Louisville that might freeze and thaw practically every night during the winter.

Nobody other than AT&T or Charter can be happy about this. The reason that Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block Google Fiber from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google’s favor, while the Charter suit is still in court. Perhaps Google Fiber should have just waited out the lawsuits – but the business pressure was there to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson learned is not to launch a new network using an untried and untested construction technique. In this case, the micro-trenches didn’t just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix for the problem would be to build the network again from scratch, which makes no financial sense.

Certainly, the whole industry is now going to be extremely leery about micro-trenching, but there is a larger lesson to be learned from this. For example, I’ve heard from several small ISPs who are ready to leap into the 5G game and build networks using millimeter wave radios installed on poles. That is every bit as new and untested a technology as micro-trenching was. I’m not predicting that anybody pursuing that business plan will fail – but I can assuredly promise that they will run into unanticipated problems.

Over my career, I can’t think of a single example where an ISP that took a chance on a cutting-edge technology didn’t have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. For example, I can remember half a dozen companies that tried to deploy broadband networks using the LMDS spectrum. I remember one case where the radios literally never worked and the venture lost their $2 million investment. I remember several others where the radios had glitches that caused major customer outages and were largely a market disaster.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generation of other technologies including the first generation of DSL, WiFi mesh networks, PON fiber-to-the-home and IPTV. The companies that were the first pioneers deploying these technologies had costly and sometimes deadly problems. So perhaps the lesson learned is that pioneers pay a price. I’m sure that this failure of micro-trenching will result in changing or abandoning the technique. Perhaps we’ll learn to not use micro-trenches in certain climates. Or perhaps they’ll find a way to seal the micro-trenches against humidity. But none of those future solutions will make up for Google Fiber’s spectacular failure.

The real victims of this situation are the households in Louisville who had changed to Google Fiber – and everybody else in the City. Because of Google Fiber’s lower prices, both Charter and AT&T lowered prices everywhere in the city. You can bet it’s not going to take long to get the market back to full prices. Any customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no real incentive to give them a low-price deal. As a whole, every household in the City is going to be spending $10 or $20 more per month for broadband – which is a significant penalty on the local economy.

AT&T’s 5G Strategy

AT&T recently described their long-term 5G strategy using what they call the 3 pillars of 5G – the three areas where the company is putting their 5G focus. The first pillar is a concentration on 5G cellular, and the company’s goal is to launch a 5G-based cellular service, with some cities coming on board in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. This admission that there won’t be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to hopefully launch two phones later this year that have some 5G capability. As always with a new generation of wireless technology, the bottleneck will be in handsets. The cell phone makers can’t just make generic 5G phones – they have to work with the carriers to be ready to support the specific subset of 5G features that are released. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first-generation 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will become quickly obsolete. A 5G phone sold in late 2019 probably won’t include all of the 5G features that will be on the market by late 2020 – and this is likely to be true for the next 3 or 4 years as the carriers roll out incremental 5G improvements. It’s also a gamble for customers because anybody that buys an early 5G cellphone will have early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and will wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher because they are talking about the fixed cellular product they’ve already been selling for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. This is not the same as Verizon’s product that delivers hundreds of megabits per second, but is instead a product that delivers speeds up to 50 Mbps depending upon how far a customer lives from a cell tower – with reports that most households are getting 15 Mbps at best. This is the product that AT&T is mostly using to satisfy its CAF II requirements in rural America. None of the engineers I’ve talked to think that 5G is going to materially improve this product.

The final pillar of AT&T’s strategy is edge computing. What AT&T means by this is to put fast processors at customer sites when there is the need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G and AT&T has already been deploying edge routers. However, 5G will enhance this ability at customer sites that need to connect a huge number of devices simultaneously. 5G can make it easier to connect to a huge number of IoT devices in a hospital or to 50,000 cell phones in a stadium. The bottom line is that the migration to more edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement and AT&T has been talking about all three applications for some time – but the announcement does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into the 5G fixed wireless local loop using millimeter wave spectrum – however, he said such a product offering is probably three to five years into the future. He envisions the product as an enhancement to AT&T’s fiber products, not necessarily a replacement. He emphasized that AT&T is happy with the current fiber deployments. He provided some new statistics on a recent earnings call and said the company is seeing customer penetration rates between 33% and 40% within 18 months of new fiber deployment and penetration around 50% after three years. Those are impressive statistics because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.

Forecasting the Future of Video

I recently saw several interesting forecasts about the cable industry. The research firm SNL Kagan predicts that broadband-only homes in the US – those that don’t subscribe to traditional linear cable TV – will increase from 23.3 million in 2018 to 40.8 million by 2023. In another forecast Parks Associates predicts that the number of worldwide OTT subscribers – households that subscribe to at least one online video service – will grow to 310 million by 2024.

These kinds of forecasts have always intrigued me. I doubt that there is anybody in the industry who doesn’t think that cord cutting will keep growing or that the market for services like Netflix will keep growing. What I find most interesting about these total-market forecasts is the specificity of the predictions, such as when Kagan predicts 40.8 million broadband-only homes. I suspect if we dig deeper into what Kagan says we’d find that they have probably predicted a range of possible future outcomes and were not that specific. But I also understand that sometimes putting a number on things is the best way to make a point in a press release.

What I’ve always found interesting about future predictions is how hard it is to predict where a whole industry is going. If I look back ten years I could find a dozen experts predicting the death of traditional landline telephones, and yet not one of them would have believed that by 2019 landline penetration rates would still be around 40%. I imagine every one of them would have bet against that possibility. It’s easy to understand the trajectory of an industry, but it’s another thing to predict specifically where an industry will land in the future. It wasn’t hard ten years ago to predict the trajectory of the landline business, but it was nearly impossible to know how many landlines would still be around after ten years.

That doesn’t mean that nobody should try to make these predictions. There are huge dollars riding on the future of every telecom industry segment. Companies that invest in these industries want outside opinions on the direction of an industry. If I were developing a new OTT product like Apple is doing, I’d want some feel for the potential of my new investment. I’d want to gather as many different predictions about the future of the OTT market as possible. The above two predictions were announced publicly, but corporations regularly pay for private market assessments that never see the light of day.

To show how hard it is to make such predictions, I want to look a little more closely at the Kagan prediction. They are predicting that in five years there will be 17.5 million more homes that buy broadband but don’t buy a traditional TV product. There are a number of factors and trends that would feed into that number:

  • It looks like new households formed by millennials and Generation Z don’t subscribe to cable TV at nearly the same rate as their parents. Some portion of the increase in broadband-only homes will come from these new households.
  • While final numbers are still not in for 2018, it appears that around 2 million homes cut the cord last year and dropped cable TV. Is the future pace of cord cutting going to be faster, slower or stay the same? Obviously, predicting the future of cord cutting is a huge piece of the prediction.
  • It’s becoming a lot more complicated for a household to replace traditional cable. It looks like every major owner of content wants to put their unique content into a separate OTT service like CBS All Access did with the Star Trek franchise. The cost of subscribing to multiple OTT services is already getting expensive and is likely to get even costlier over time. Surveys have shown that households cut the cord to save money, so how will cord cutting be impacted if there are no savings from cutting the cord?
  • The big cable companies are creating new video products aimed at keeping subscribers. For instance, Comcast is bundling in Netflix and other OTT products and is also rolling out smaller and cheaper bundles of traditional programming. They are also allowing customers to view the content on any device, so buying a small bundle from Comcast doesn’t feel much different to the consumer than buying Sling TV. What impact will these countermeasures from the cable companies have on cord cutting?

I’m sure there are other factors that go into predicting the number of future homes without traditional cable TV – these are just a few that popped into my mind. I know that companies like Kagan and Parks have detailed current statistics on the industry that are not available to most of us. But statistics only take you so far, and anybody looking out past the end of 2019 is entering crystal ball territory. Five years is forever in a market as dynamic as cable TV and OTT content.

We also know from past experience that there will be big changes in these industries that will change the paradigm. For example, the content owners might all decide that there is no profit in the OTT market and could kill their own OTT products and cause an OTT market contraction. Or a new entrant like Apple might become a major new competitor for Netflix and the demand for OTT services might explode even faster than expected. I don’t know how any prediction can anticipate big market events that might disrupt the whole industry.

Understand that I am not busting on these two predictions – I don’t know enough to have the slightest idea if these predictions are good or bad. These companies are paid to make their best guess and I’m glad that there are firms that do that. For example, Cisco has been making annual predictions for many years about the trajectory of broadband usage, and that information is a valuable piece of the puzzle for a network engineer designing a new network. However, predicting how all of the different trends will affect video subscriptions over five years sounds like an unsolvable puzzle. Maybe if I’m still writing this blog five years from now I can check to see how these predictions fared. One thing I know is that I’m not ready to take any five-year forecast of the cable industry to the bank.

The Status of the CAF II Deployments

The Benton Foundation noted last month that both CenturyLink and Frontier have not met all of their milestones for deployment of CAF II. This funding from the FCC is supposed to be used to improve rural broadband to speeds of at least 10/1 Mbps. As of the end of 2018, the CAF II recipients were to have completed upgrades to at least 60% of the customers in each state covered by the funding.

CenturyLink took funding to improve broadband in 33 states covering over 1 million homes and businesses. CenturyLink claims to have met the 60% milestone in twenty-three states but didn’t make the goal in eleven states: Colorado, Idaho, Kansas, Michigan, Minnesota, Missouri, Montana, Ohio, Oregon, Washington, and Wisconsin.

Frontier received CAF II funding to improve broadband to over 774,000 locations in 29 states. Frontier says they’ve met the milestone in 27 states but haven’t reached the 60% deployment milestone in Nebraska and New Mexico. There were a number of other large telcos that took CAF II funding, like AT&T, Windstream, and Consolidated, and I have to assume that they’ve reported meeting the 60% milestone.

Back in 2014, when it looked like the CAF II program might be awarded by reverse auction, we helped a number of clients take a look at the CAF II service areas. In many cases, these are large rural areas that cover 50% or more of most of the rural counties in the country. Most of my clients were interested in the CAF II money as a funding mechanism to help pay for rural fiber, but all of the big telcos other than AT&T originally announced that they planned to upgrade existing DSL. AT&T announced a strategy early on to use fixed cellular wireless to satisfy their CAF II requirements. Since then a few big telcos like Frontier and Windstream have said that they are also using fixed wireless to meet their obligations.

To us, the announcement that the telcos were going to upgrade DSL raised red flags. In a lot of rural counties there are only a small number of towns, and those towns are the only places where the big telcos have DSLAMs (the DSL hubs). Rural telephone exchanges tend to be large, and the vast majority of rural customers have always been far out of range of DSL that originates in the small towns. One only has to go a few miles – barely outside the towns – to see DSL speeds fall off to nothing.

The only way to make DSL work in the CAF II areas would be to build fiber to rural locations and establish new DSL hub sites. As any independent telco that deployed DSL the right way can tell you, this is expensive because it takes a lot of rural DSLAMs to get within range of every customer. By electing DSL upgrades, the big telcos like CenturyLink and Frontier had essentially agreed to build a dozen or more fiber-fed DSLAMs in each of the rural counties covered by CAF II. My back-of-the-envelope math showed that was going to cost a lot more than what the companies were receiving from the CAF fund. Since I knew these telcos didn’t want to spend their own money in rural America, I predicted execution failures for many of the planned DSL deployments.
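
For readers who want to see the shape of that back-of-the-envelope math, here is a minimal sketch. Every figure in it is a hypothetical placeholder – not an actual CAF II award or construction cost – but it shows why a dozen or more fiber-fed DSLAM sites per county can quickly outrun the available support.

```python
# Rough sketch of the per-county math for a DSL-based CAF II upgrade.
# All values are hypothetical placeholders, not real cost or funding data.

DSLAM_SITES_PER_COUNTY = 12   # "a dozen or more" new rural DSLAM sites
FIBER_MILES_PER_SITE = 8      # hypothetical fiber route miles to reach each site
COST_PER_FIBER_MILE = 30_000  # hypothetical rural construction cost per mile
ELECTRONICS_PER_SITE = 25_000 # hypothetical cabinet plus DSLAM electronics

build_cost = DSLAM_SITES_PER_COUNTY * (
    FIBER_MILES_PER_SITE * COST_PER_FIBER_MILE + ELECTRONICS_PER_SITE
)

CAF_II_SUPPORT = 2_000_000    # hypothetical total CAF II support for one county

print(f"Estimated build cost for the county: ${build_cost:,}")
print(f"Hypothetical CAF II support:         ${CAF_II_SUPPORT:,}")
print(f"Shortfall to be funded by the telco: ${build_cost - CAF_II_SUPPORT:,}")
```

The specific numbers matter less than the structure of the calculation: the cost scales with the number of remote DSLAM sites and the fiber miles needed to reach them.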

I believe the big telcos are now facing a huge dilemma. They’ve reached 60% of customers in many places (but not all). However, it is going to cost two to three times more per home to reach the remaining 40% of homes. The remaining customers are the ones on extremely long copper loops, and DSL is an expensive technology to use for reaching these last customers. A DSLAM built to serve the customers at the ends of these loops might only serve a few customers – and it’s hard to justify the cost of the fiber and electronics needed to reach them.

I’ve believed from the beginning that the big telcos building DSL for the CAF II program would take the approach of covering the low-hanging fruit – those customers that can be reached by the deployment of a few DSLAMs in a given rural area. If that’s true, then the big telcos aren’t going to spend the money to reach the most remote customers, meaning a huge number of CAF II customers are going to see zero improvements in broadband. The telcos mostly met their 60% targets by serving the low-hanging fruit. They are going to have a huge challenge meeting the next milestones of 80% and 100%.

Probably because I write this blog, I hear from folks at all levels of the industry about rural broadband. I’ve heard a lot of stories from technicians telling me that some of the big telcos have only tackled the low-hanging fruit in the CAF builds. I’ve heard from others that some telcos aren’t spending more than a fraction of the CAF II money they got from the FCC and are pocketing much of it. I’ve heard from rural customers who supposedly already got a CAF II upgrade and aren’t seeing speeds improved to the 10/1 threshold.

The CAF II program will be finished soon and I’m already wondering how the telcos are going to report the results to the FCC if they took shortcuts and didn’t make all of the CAF II upgrades. Will they say they’ve covered everybody when some homes saw no improvement? Will they claim 10/1 Mbps speeds when many households were upgraded to something slower? If they come clean, how will the FCC react? Will the FCC try to find the truth or sweep it under the rug?

Telecom R&D

In January AT&T announced the creation of the WarnerMedia Innovation Lab, a research group that will try to combine AT&T’s technology advances with the company’s huge new library of media content. The lab, based in New York City, will consider how 5G, the Internet of Things, artificial intelligence, machine learning and virtual reality can work to create new viewer entertainment experiences.

This is an example of a highly directed R&D effort intended to create specific results – in this case the lab will be working on next-generation technologies for entertainment. This contrasts with labs that engage in basic research, which allows scientists to explore scientific theories. The closest we’ve ever come to basic research from a commercial company was Bell Labs, which was operated by the old Ma Bell monopoly.

Bell Labs was partially funded by the government and also got research funds from ratepayers of the nationwide monopoly telco. Bell Labs research was cutting edge and resulted in breakthroughs like the transistor, the charge-coupled device, Unix, fiber optics, lasers, data networking and the discovery of the cosmic microwave background radiation that supported the big bang theory. The Lab produced over 33,000 patents and its scientists won eight Nobel Prizes. I was lucky enough to have a tour of Bell Labs in the 80s and I was a bit sad today when I had to look on the Internet to see if it still exists; it does and is now called Nokia Bell Labs and operates at a much smaller scale than the original lab.

Another successor to Bell Labs is AT&T Labs, the research division of AT&T. The lab engages in a lot of directed research, but also in basic research. AT&T Labs is investigating topics such as the physics of optical transmission and the physics of computing. Since its creation in 1996 AT&T Labs has been issued over 2,000 US patents. The lab’s directed research concentrates on technologies involved in the technical challenges of large networks and of working with huge datasets. The Lab was the first to be able to transmit 100 gigabits per second over fiber.

Verizon has also been doing directed research since its predecessor NYNEX was spun off at the divestiture of the Bell System. Rather than operate one big public laboratory, the company has research groups engaged in topics of specific interest to the company. Recently the company chose a more public profile and announced the creation of its 5G Labs in various locations. The Manhattan 5G Lab will focus on media and finance tech; the Los Angeles lab will work with augmented reality (AR) and holograms; the Washington DC lab will work on public safety, first responders, cybersecurity, and hospitality tech; the Palo Alto lab will look at emerging technologies, education, and big data; and its Waltham, Massachusetts, lab will focus on robotics, healthcare, and real-time enterprise services.

Our industry has other labs engaged in directed research. The best known of these is CableLabs, the research lab outside Denver that was founded in 1988 and is jointly funded by the world’s major cable companies. This lab is largely responsible for the cable industry’s success in broadband, since it created the various generations of DOCSIS technology used to operate hybrid fiber-coaxial networks. CableLabs also explores other areas of wireless and wired communications.

While Comcast relies on CableLabs for its underlying technology, the company has also created Comcast Labs. This lab is highly focused on the customer experience and developed Comcast’s X1 set-top box as well as the integrated smart home product being sold by Comcast. Comcast Labs doesn’t only develop consumer devices; it is also involved in software innovation efforts like OpenStack and GitHub development. The lab most recently announced a breakthrough that allows cable networks to deliver data speeds up to 10 Gbps.