More Crowding in the OTT Market

It seems like I’ve been seeing news almost weekly about new online video providers. This will put even more pressure on cable companies as more people find an online programming option to suit them. It also means a shakeout of the OTT industry is likely, with such a crowded field of competitors all vying for the same pool of cord-cutters.

NewTV. This is an interesting new OTT venture founded by Jeffrey Katzenberg, former chairman of Walt Disney Studios, and headed by Meg Whitman, former CEO of Hewlett Packard Enterprise and also a Disney alum. The company has raised $1 billion and has support from every major Hollywood studio including 21st Century Fox, Disney, NBCUniversal, Sony Pictures Entertainment, and Viacom.

Rather than take on Netflix and other OTT providers directly, the company plans to develop short 10-minute shows aimed exclusively at cellphone users. They plan both free content supported by advertising and a subscription plan similar to the ‘advertising-light’ option offered by Hulu.

AT&T. The company already owns a successful OTT product in HBO Now, which has over 5 million customers. John Stankey, the head of WarnerMedia, says the plan is to create additional bundles of content centered around HBO that bring in other WarnerMedia content and selected external content. He admits that HBO alone does not represent enough content to be a full-scale OTT alternative for customers.

AT&T’s goal is to take advantage of HBO’s current reputation and to position their content in the market as premium and high quality as a way to differentiate themselves from other OTT providers.

Apple. The company has been talking about getting into the content business for a decade, and they have finally pulled the trigger. Apple invested $1 billion this year and now has 24 original series in production as the beginning of a new content platform. Among the new shows is a series about a morning TV show starring Reese Witherspoon and Jennifer Aniston.

The company hired Jamie Erlicht and Zack Van Amburg from Sony Pictures Television to operate the new business and has since hired other experienced television executives. They also are working on other new content and just signed a multiyear deal with Oprah Winfrey. The company has not announced any specific plans for airing and using the new content, but that will be coming soon since the first new series will probably be ready by March of 2019.

T-Mobile. As part of the proposed merger with Sprint, T-Mobile says they plan to launch a new ‘wireless first’ TV platform that will deliver 4K video using its cellular network. In January T-Mobile purchased Layer3, which has been offering a 275-channel HD lineup in a few major markets.

The T-Mobile offering will differ from other OTT products in that the company is shooting for what they call the quad play, bundling video, in-home broadband (delivered using cellular frequencies), mobile broadband and voice. The company says the content will only be made available to T-Mobile customers, and they view it as a way to reduce churn and gain cellular market share.

The Layer3 subsidiary will also continue to pursue partnerships to gain access to customers through fiber networks, such as the arrangement they currently have with the municipal fiber network in Longmont, Colorado.

Disney. Earlier this year the company announced the creation of a direct-to-consumer video service based upon its huge library of popular content. Disney gained the needed technology by purchasing BAMTech, the company that supports Major League Baseball online. Disney is also bolstering its content portfolio through the purchase of 21st Century Fox.

Disney plans to launch an ESPN-based sports bundle in early 2019. They have not announced specific plans on how and when to launch the rest of their content, but they canceled an agreement with Netflix for carrying Disney content.

Verizon’s Case for 5G, Part 4

Ronan Dunne, an EVP and President of Verizon Wireless, recently made Verizon’s case for aggressively pursuing 5G. This last blog in the series looks at Verizon’s claim that they are going to use 5G to offer residential broadband. The company has tested the technology over the last year and announced plans to soon introduce it in a number of cities.

I’ve been reading everything I can about Verizon and I think I’ve finally figured out what they are up to. They have been saying that within a few years they will make fixed 5G broadband available to millions of homes. One of the first cities they will build is Sacramento. It’s clear that in order to offer fast speeds, each 5G transmitter will have to be fiber fed. To cover all neighborhoods in Sacramento would require building a lot of new fiber. Building new fiber is both expensive and time-consuming. And it’s still a head-scratcher how this might work in neighborhoods without poles, where the other utilities are underground.

Last week I read about an announcement by Lee Hicks of Verizon for a new initiative called One Fiber. Like many large telecoms, Verizon has numerous divisions that own fiber assets, like the FiOS group, the wireless group and the old MCI CLEC business group. The new policy will consolidate all of this fiber into a centralized system, making existing and new fiber available to every part of the business. It might be hard for people to believe, but within Verizon each of these groups has managed its own fiber separately. Anybody who has ever worked with the big telcos understands what a colossal undertaking it will be to consolidate this.

Sharing existing fiber and new fiber builds among its various business units is the change that will unleash the potential for 5G deployment. My guess is that Verizon has eyed AT&T’s fiber strategy and is copying the best parts of it. AT&T has quietly been extending its fiber-to-the-premise (FTTP) network by building fiber for short distances around the numerous existing fiber nodes in the AT&T network. A node on AT&T fiber built to reach a cell tower or a school is now also a candidate to serve as a network node for FTTP. Using existing fiber wisely has allowed AT&T to claim they will soon be reaching over 12 million premises with fiber – without having to build a huge amount of new fiber.

Verizon’s One Fiber policy will enable them to emulate AT&T. Where AT&T has elected to build GPON fiber-to-the-premise, Verizon is going to try 5G wireless. They’ll deploy 5G cell sites at their existing fiber nodes where it makes financial sense. Verizon doesn’t have as extensive a fiber network as AT&T, and I’ve seen a few speculations that they might pass as many as 7 million premises with 5G within five years.

Verizon has been making claims that 5G can deliver gigabit speeds out to 3,000 feet. It might be able to do that in ideal conditions, but their technology is proprietary and nobody knows its real capabilities. One thing we know about all wireless technologies is that they are temperamental and vary a lot with local conditions. The whole industry is waiting to see the speeds and distances Verizon will really achieve with the first generation of gear.

The company certainly has some work in front of it to pursue this philosophy. Not all fiber is the same and their existing fiber network probably has fibers of many sizes, ages and conditions using a wide range of electronics. After inventorying and consolidating control over the fiber they will have to upgrade electronics and backbone networks to enable the kind of bandwidth needed for 5G.

The Verizon 5G network is likely to consist of a series of cell sites serving small neighborhood circles – the size of each circle depending upon topography. This means the Verizon network will not likely be ubiquitous in big cities – it will reach out to whatever is in range of 5G cell sites placed on existing Verizon fiber. After the initial deployment, which is likely to take a number of years, the company will have to assess whether building additional fiber makes economic sense. That determination will consider all of the Verizon departments and not just 5G.
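As a rough sanity check on how many homes a fiber-fed 5G node might reach, here’s a back-of-the-envelope sketch in Python. The cell radii, the suburban housing density, and the node-count arithmetic are all my illustrative assumptions, not Verizon figures.

```python
import math

SQ_FT_PER_SQ_MILE = 5280 ** 2

def homes_per_node(radius_ft, homes_per_sq_mile=1500):
    """Homes inside one 5G cell circle (assumed suburban density)."""
    area_sq_mi = math.pi * radius_ft ** 2 / SQ_FT_PER_SQ_MILE
    return area_sq_mi * homes_per_sq_mile

for radius in (1000, 1500, 3000):   # up to the 3,000-foot reach Verizon has claimed
    print(f"{radius} ft radius ≈ {homes_per_node(radius):,.0f} homes passed per node")

# Roughly 170 / 380 / 1,500 homes per node at those radii, so passing 7 million
# premises would imply something like 5,000 to 40,000 fiber-fed cell sites.
```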

I expect the company to follow the same philosophy they did when they built FiOS. They were disciplined and only built in places that met certain cost criteria. This resulted in a network that, even today, brings fiber to one block but not the one next door. FiOS fiber was largely built where Verizon could overlash fiber onto their telephone wires or drag fiber through existing conduits – I expect their 5G expansion to be just as disciplined.

The whole industry is dying to see what Verizon can really deliver with 5G in the wild. Even at 100 Mbps the product would be a competitive alternative to the cable companies. If they can really deliver gigabit speeds to entire neighborhoods then they will have shaken up the industry. But in the end, if they stick to the One Fiber model and only deploy 5G where it’s affordable, they will be bringing a broadband alternative to those who happen to live near their fiber nodes – and that will mean passing millions of homes, but not tens of millions.

Big ISPs Fighting Privacy


One of the quietest regulatory battles is happening at statehouses rather than with regulators. The large ISPs and big Silicon Valley companies have joined forces to kill any legislation that would create Internet privacy rules.

The privacy battle got started in 2016 when the FCC passed new privacy rules that required ISPs to get permission from customers before selling their personal data or browsing history. Those rules would have gone into effect in April of 2017, but Congress intervened to kill them first. In an effort led by Senator Jeff Flake, Congress used the Congressional Review Act – which allows it to overturn recently adopted agency rules – to roll back the FCC’s new rules, and that action also prohibits the agency from introducing new rules that are ‘substantially similar’.

Since that time there have been numerous attempts in state legislatures to provide privacy rights for citizens. According to Michael Gaynor of Motherboard there have been over 70 bills in state legislatures in the last year that attempted to introduce consumer privacy protections – and all have failed.

That’s an amazing statistic considering the public sentiment for putting curbs on ISPs being able to use customer data. A Pew Research poll from earlier this year showed that over two-thirds of people support stronger privacy rules.

The legislative failures have all come due to intense lobbying from ISPs. The big telcos and cable companies have always had a strong presence in statehouses and have contributed to campaign funds for key legislators for years. The lobbying effort has paid off many times in the past, but not always. The lobbying effort on the privacy issue has been particularly effective since big Silicon Valley companies like Google and Facebook have joined forces with the big ISPs.

Those two sets of companies are rarely on the same side on issues, but they all have a vested interest in monetizing customer data. The big web companies like Facebook and Google make most of their money by leveraging customer data. The big ISPs are newer to this business line, but they all have acquired data firms over the last two years to help them compete with Google for advertising dollars.

It’s not talked about a lot, but Silicon Valley firms now spend more money on lobbying in DC than the big ISPs. These companies are newer to lobbying at the state level, but the privacy issue has drawn them into local lobbying in a big way.

The privacy rules passed by the last FCC are similar to those in effect in Europe. Web users there get the choice to opt out of being tracked by online companies and ISPs. Interestingly, a lot of people in Europe elect to make their data available to the web companies. Many people like the personalized advertising and other benefits that come along with the surveillance. It turns out that many people, particularly Millennials, don’t mind being tracked and are not opting out. Apparently, though, that’s not good enough for the big web companies, who want to track everybody online.

There are still ways for consumers who don’t want to be tracked to reduce their web presence. People can use VPNs to bypass their ISP, although there is still a risk of the VPN provider harvesting their data. There are several companies working on creating an encrypted DNS service that hides web searches from ISPs. Numerous people (like me) have dropped services like Facebook that openly track everything done inside the platform. Search engines like DuckDuckGo, which don’t record web searches, are growing in popularity.

Of course, one of the best ways to cut down on surveillance is to change service to a small ISP. Small telcos, WISPs, fiber overbuilders and municipal ISPs don’t track and monetize customer data. Unfortunately, most people don’t have an option other than a big ISP. I always advise my clients, who are all small ISPs, to emphasize that they don’t spy on their customers – it’s a strong selling point to people who care about privacy.

Technology Promises

I was talking to one of my buddies the other day and he asked what happened to the promise made fifteen years ago that we’d be able to walk up to vending machines and buy products without having to use cash or a credit card. The promise that this technology was coming was based upon a widespread technology already in use at the time in Japan. Japan has vending machines for everything and Japanese consumers had WiFi-based HandiPhones that were tied into many vending machines.

However, this technology never made it to the US, and in fact largely disappeared in Japan. Everybody there, and here, converted to smartphones and the technology that used WiFi phones faded away. As with many technologies, the ability to do something like this requires a whole ecosystem of meshing parts – in this case it requires vending machines able to communicate with the customer device, apps on the consumer device able to make purchases, and a banking system ready to accept the payments. We know that smartphones can be made to do this, and in fact there have been several attempts to do so.

But the other two parts of the ecosystem are problems. First, we’ve never equipped vending machines to be able to communicate using cellular spectrum. The holdup is not the technology, but rather the fear of hacking. In today’s world we are leery about installing unmanned edge devices that are linked to the banking system for fear that such devices can become entry points for hackers. This same fear has throttled the introduction of any new financial technology and is why the US was years behind Europe in implementing the credit card readers that accept chips.

The biggest reason we don’t have cellular vending machines is that the US banking system has never gotten behind the idea of micropayments, which means accepting very small transactions – for example, charging a nickel every time somebody reads a news article. Much of the online world is begging for a micropayment system, but the banking fee structure is unfriendly to the idea of processing small payments – even if there will be a lot of them. The security and micropayment issues have largely been responsible for the slow rollout of ApplePay and other smartphone cash payment systems.
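To put numbers on why the fee structure is unfriendly to micropayments, here’s a minimal sketch in Python. The $0.30 plus 2.9% card-processing fee is a commonly quoted retail rate that I’m using purely as an illustrative assumption; it is not a figure from this post.

```python
def card_fee(amount, fixed=0.30, percent=0.029):
    """Assumed card-processing fee: a fixed charge plus a percentage of the sale."""
    return fixed + amount * percent

for amount in (0.05, 1.00, 20.00):
    fee = card_fee(amount)
    print(f"${amount:>5.2f} charge -> ${fee:.3f} fee ({fee / amount:.0%} of the payment)")

# A $0.05 per-article charge carries a fee several times larger than the payment
# itself, which is why per-article micropayments don't work under this fee structure.
```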

This is a perfect example of an unfulfilled technology. One of the most common original claims for the benefits of ubiquitous cellular was a cashless society where we could wave our phone to buy things – but the entrenched old-technology banking system effectively squashed the technology, although people still want it.

I look now at the many promises being made for 5G and I already see technology promises that are not likely to be delivered. I have read hundreds of articles promising that 5G is going to completely transform our world. It’s supposed to provide gigabit cellular service that will make landline connections obsolete. It will enable fleets of autonomous vehicles sitting ready to take us anywhere at a moment’s notice. It will provide the way to communicate with the hordes of sensors around us that will make us safer and our world smarter.

As somebody who understands the current telecom infrastructure I can’t help but be skeptical about most of these claims. 5G technology can be made to fulfill the many promises – but the ecosystem of all of the components needed to make these things happen will create roadblocks to that future. It would take two pages just to list all of the technological hurdles that must be overcome to deliver ubiquitous gigabit cellular service. But perhaps more importantly, as somebody who understands the money side of the telecom industry, I can’t imagine who is going to pay for these promised innovations. I’ve not seen anybody promising gigabit cellular predicting that monthly cellphone rates will double to pay for the new network. In fact, the industry is instead talking about how the long-range outlook for cellular pricing is a continued drop in prices. It’s hard to imagine a motivation for the cellular companies to invest huge dollars for faster speeds for no additional revenue.

This is not to say that 5G won’t be introduced or that it won’t bring improvements to cellular service. But I believe that if we pull out some of the current articles written about 5G a decade from now, we’ll see that most of the promised benefits were never delivered. If I’m still writing a blog I can promise this retrospective!

 

P.S. – I can’t ignore that sometimes the big technology promises do come to pass. Some of you remember the series of AT&T ads that talked about the future. One of my favorites asked the question “Have you ever watched the movie you wanted to, the minute you wanted to?”. This ad was from 1993 and promised a future where content would be at our fingertips. That was an amazing prediction for a time when dial-up was still a new industry. Any engineer at that time would have been skeptical about our ability to deliver large bandwidth to everybody – something that is still a work in progress. Of course, that same ad also promised video phone booths, a concept that is quaint in a world full of smartphones.

Looking Closer at CAF II Broadband

AT&T is making the rounds in rural Kentucky, not too far from where I live, announcing the introduction of the residential wireless broadband product that is the result of the FCC’s CAF II program. Today I’m looking in more detail at that product.

AT&T was required under the CAF II rules to deliver broadband speeds of at least 10 Mbps download and 1 Mbps upload. AT&T says in the Kentucky announcement that they will deliver products with at least that much speed, so it’s possible that customers might see something a little faster. Or the company could cap speeds at 10 Mbps and we’ll have to wait for reports from customers about actual speeds.

AT&T accepted nearly $186 million in FCC funds to bring CAF II broadband capabilities to 84,333 households in the state, or $2,203 per household. They say all of those homes will have the broadband available by the end of 2020 (although there is no penalty if some of the homes don’t get covered – which one would expect since many homes are likely to be too far from a cell tower).

AT&T will be delivering the broadband in Kentucky using LTE broadband from cellphone towers. This is delivered to homes by placing a small antenna box (not a dish) on the exterior of a home. They say that they will be using a different set of frequencies for CAF II broadband than what is used for cellular service, meaning there should be no degradation of normal cellular service.

I saw a news article in Kentucky that says the price will be $50 per month, but that’s a special one-year price offer for customers also willing to sign up for DirecTV. Following are more specific details of the normal product and pricing:

  • Customers can get a price of $60 per month for the first year by signing a 12-month contract. After that year the price increases to $70 per month, which is also the price for customers not willing to agree to a contract.
  • Customers signing a contract see no installation charge, but otherwise there is a $99 one-time fee to connect.
  • There is an early termination charge of $10 for each remaining month of the contract for customers who break the one-year contract.
  • There is a $150 fee for customers who don’t return the antenna box.
  • There is a monthly data cap of 170 Gigabytes of downloaded data. Customers pay $10 for each additional 50 GB of download up to a maximum of $200 per month. AT&T is offering a 340 GB monthly data cap right now for customers who bundle with DirecTV – but that’s a temporary offer until October 1.
  • AT&T also will layer on a monthly $1.99 administrative fee that they pocket.

I think the pricing is far too high considering that the $186 million given to AT&T probably paid for all, or nearly all, of the cost of the upgrades needed to deliver the service. Some of that money probably was used to bolster fiber to rural cell sites, and the funding would have been used to add the new electronics to cell sites. AT&T used free federal money to create a $72 monthly broadband product, and even before considering the data cap this is a product with huge margins since AT&T doesn’t have to recover the cost of the underlying network.

The small data cap is going to generate a lot of additional revenue for AT&T. The monthly data cap of 170 GB is already too small. Comcast just reported in June that the average download for all of their 23 million broadband customers was 151 GB per month. That means there are already a significant number of homes that want to use more than AT&T’s monthly 170 GB cap. We know that monthly home demand for broadband keeps growing and the Comcast average just a year ago was 128 GB per month. With that growth, within a year the average customer will want more than AT&T’s cap.

A few years ago when I was on Comcast they measured my 3-person home as using nearly 700 GB per month. On the AT&T plan my monthly bill would be $180 per month. Within a few years most homes will want to use more data than AT&T’s cap. The FCC really screwed the public when they didn’t insist that carriers taking the funding provide unlimited downloads, or at least a high data cap like 1 terabyte. That stingy data cap gives AT&T permission to print money in rural America.
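To show the arithmetic behind that $180 figure, here’s a minimal sketch in Python using the prices from the bullet list above. The assumption that overage is billed in whole 50 GB blocks, rounded up and capped at $200 per month, is my reading of that list, not an AT&T tariff.

```python
import math

def caf2_monthly_bill(usage_gb, base=70.0, cap_gb=170, block_gb=50,
                      block_price=10.0, overage_max=200.0, admin_fee=1.99):
    """Estimate a monthly bill under the pricing described in the bullet list."""
    over_gb = max(0, usage_gb - cap_gb)
    overage = min(math.ceil(over_gb / block_gb) * block_price, overage_max)
    return base + overage + admin_fee

for gb in (150, 340, 700):
    print(f"{gb} GB -> ${caf2_monthly_bill(gb):.2f}")

# 700 GB comes to $181.99 -- the $180 cited above is the $70 base plus $110 of
# overage, before the $1.99 administrative fee.
```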

The 10 Mbps speed is also a big problem. That speed is already inadequate for most households, who now want to engage in multiple simultaneous streams. I’ve written many times about the huge inefficiencies in home WiFi, and a 10 Mbps connection is just barely adequate for two video streams as long as there are no other broadband uses in the home at the same time. A typical home with kids these days is going to want to simultaneously watch video, do homework, play games, browse the web, download files or work from home. A home with a 10 Mbps connection is not close to equivalent to much faster urban broadband connections. You don’t have to look forward more than a few years to know that a 10 Mbps connection is soon going to feel glacially slow.
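As a rough illustration of how quickly simultaneous uses fill a 10 Mbps connection, here’s a small sketch in Python. The per-activity bitrates are my own ballpark assumptions, not AT&T or FCC figures.

```python
# Ballpark per-activity bitrates in Mbps (illustrative assumptions only)
activities = [
    ("HD video stream #1", 4.0),
    ("HD video stream #2", 4.0),
    ("online gaming", 1.0),
    ("web browsing / homework", 1.5),
    ("work VPN or video call", 2.0),
]

link_mbps = 10.0
running_total = 0.0
for name, mbps in activities:
    running_total += mbps
    status = "fits" if running_total <= link_mbps else "OVER the 10 Mbps link"
    print(f"{name:<25} +{mbps:4.1f} Mbps -> total {running_total:4.1f} Mbps ({status})")

# Two HD streams alone take ~8 Mbps; adding almost any other activity pushes the
# household past what a 10 Mbps connection can deliver.
```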

Finally, cellular data has a higher latency than landline broadband, with latency as high as 100 msec. Customers might have problems at times on this product maintaining video streams, making VoIP calls or staying connected to a school or work server.

I’m sure that a home that has never had broadband is going to welcome this product. But it’s not going to take them long to realize that this is not the same broadband available to most homes. They are also going to realize that it’s possibly the last speed upgrade they are going to see for a long time since AT&T and the FCC want to check off these homes as now having broadband.

Who Will the Big ISPs Blame Now?

For the last few years the biggest ISPs have blamed regulations for reducing the amount of capital they are willing to invest. They specifically blamed Title II regulation of broadband and net neutrality rules as being a disincentive for them to invest in broadband infrastructure.

FCC Chairman Ajit Pai adopted this same narrative and used it as justification to repeal net neutrality. He is still sticking to this story now that Title II regulation has been repealed, and this week he will be telling it to the House Communications Subcommittee. In a prepared statement he claims that the repeal of the net neutrality rules is now paving the way for increased capital investment and better broadband service.

However, the whole narrative is false. There is no evidence that big ISPs held back on broadband investments before Title II authority was repealed and there is no evidence that the repeal has somehow unleashed a wave of new broadband investment. I’m not going to track the numbers in this blog, but the capital budgets of all of the big ISPs have been relatively steady for a number of years.

This particular narrative is just the latest iteration on a theme that the big ISPs have used for decades. The big ISPs have always publicly claimed that regulations were killing them, while privately admitting that they were able to successfully work around most regulations. It’s the nature of regulated industries to push back against regulation and ISPs don’t differ from the many other regulated industries in this regard.

A quick look at each of the major ISPs shows a different story than the one being pushed by Chairman Pai. AT&T is a good example. They were required by an agreement from the purchase of DirecTV to pass 12 million homes and businesses with fiber. For a while it looked like they were shirking that requirement, but somewhere along the line they seem to have embraced it. They have been quietly extending fiber to apartment complexes and also to any homes or businesses located close to any of their many fiber nodes around the country. This expansion started well before the net neutrality repeal. AT&T for now has no plans to deploy 5G and says they don’t see a business case for it yet. The company is quietly walking away from rural copper and is only beefing up rural cellular broadband where the FCC funded it with CAF II money.

Comcast doesn’t seem to have changed strategies for a number of years. They build fiber to shrink node sizes and relieve local network congestion. They made the decision to embrace upgrades to DOCSIS 3.1 well before the net neutrality repeal. The company has entered the cellphone business, but for now resells capacity on other carriers’ cellular networks, and only in their operating footprint. They say they plan to eventually build cellular networks to increase the profitability of the business.

Verizon has been shrinking their landline broadband networks and sold a pile of customers, including many on FiOS fiber, to Frontier. The company announced several years ago, before the repeal of net neutrality, that they were going to build new FiOS fiber in Boston – but it appears that project has largely been put on hold. Verizon might be the only big ISP that claims to have plans to expand residential broadband, saying it will build 5G in a number of markets outside of its traditional footprint. But there is a lot of industry skepticism that this will be much more than larger trials of the new technology rather than a major capital outlay.

CenturyLink recently made it clear that they are walking away from making new broadband investments. Their new CEO said the company will not be making any new capital expenditures that earn only infrastructure levels of returns. That is a 180-degree turnaround from a company that built fiber past 900,000 premises in 2017, and it is the opposite of what Chairman Pai is claiming.

All of the big telcos have largely abandoned DSL and haven’t made new investments in it for years, even though there are faster DSL technologies available. To make matters worse, the telcos are trying to kill the rules from the Telecommunications Act of 1996 that allow competitors to offer faster DSL using telco copper – a move that would kick hundreds of thousands of customers nationwide off of decent broadband and force them back to the more expensive cable monopolies.

I can’t see any evidence from the big ISPs that the repeal of net neutrality made any difference in their capital spending plans. When you look at what these ISPs tell their investors the topic of regulation never arises – which it shouldn’t. The big ISPs have always invested in areas where they could foresee returns and regulation had no real negative impact on those returns. The whole false narrative has been a lobbying effort to get out from under regulation – and with this FCC the lobbying worked.

Now that Title II regulation is dead I wonder what the ISPs will blame for not investing in residential and rural broadband? They can’t point the finger any longer at regulations and I’m sure they will find a new story that sounds good. The only ISP that seems to be telling the truth is CenturyLink, and I suspect that they will soften that narrative since they are telling existing residential customers that they no longer care about them.

Relying on Cellular Broadband (Part II)

One of my recent blogs talked about the reliability of cellular data as a substitute for wireline broadband. Almost immediately I had an example of a wireless outage shoved in my face. I was in Phoenix at an all-day meeting. When I left at about 4:00 I tried my Uber app and it wasn’t working. The app cycled through but would not find a driver. This was inconvenient because I was standing in the 100-degree sun, so I immediately looked for shade. I tried a few more times. Giving up on Uber I tried Lyft and got the same results. Now I’m figuring a data outage, but since Android phones are sometimes squirrelly, to be safe I rebooted my phone.

That didn’t work and I was left standing in hot weather, waiting to get a ride to my hotel, which was 20 miles away. Uber, Lyft and taxis were out of the question. Luckily my voice service was still working, so I called my wife, who ordered an Uber for me. But had she not been available I’m not sure how I would have gotten to my hotel. I’m picturing the huge number of other people this also inconvenienced. How many people landed at an airport and couldn’t get a ride? How many people were driving and suddenly lost access to their mapping software? How many businesspeople were traveling and couldn’t read or respond to email?

When I got back to a landline connection I looked at the AT&T outage website and it was lit up like a Christmas tree. It looked like the east coast was totally out, but almost every other NFL city also showed an outage. Phoenix, which I knew to be out, didn’t even show on the map as having a problem, and it’s possible that the whole nationwide AT&T network had a data outage. A few days later I checked and AT&T had said nothing about the cause of the outage. Their outage website shows a 17-hour outage that day, without specifying the extent or the reason for the outage.

There is obviously something shoddy in the AT&T national network if an event of any kind can knock out the whole nationwide data network for that long. It’s hard to believe that the company would not have redundant backup for every critical system needed to keep the network functioning. There are only a few possible explanations. Possibly some critical component of data routing failed, such as the DNS system that routes Internet traffic for cellphones. The company might also have gone too far with software-defined networking and created some new points of failure that could affect the whole network. Or the company had a major cut on a fiber that feeds the site of one of those key network systems. There is no excuse for any of these possibilities, and a company with nearly 160 million customers ought to have redundancy for every critical component of their wireless network.

I contrast this to the hundreds of companies I know with landline broadband networks. All of my clients worry about total network failure and they work hard to avoid it. Unless they are geographically isolated, most of my clients have redundant routes between their network and the Internet. They generally have redundancy of key routers and switches to keep critical functions operational. Most of my clients have almost no outages that are not caused in the last mile. Local broadband networks are always susceptible to cable cuts in the last mile. But those cuts, by design, only knock out customers who are ‘downstream’ from the cut. It’s becoming extremely rare for my clients to have a total network outage, and if they do they usually take steps to stop it from happening a second time.

The press is in love with wireless right now and there are dozens of articles every month declaring how wireless is our future. Cellphones are going to become blazingly fast and 5G will fill in the gaps where cellular isn’t good enough. I’ve written enough blogs about this that you probably know that I think we are still a number of years away from seeing such wireless technologies.

But this outage makes me wonder whether people will ever fully trust wireless technologies if they are operated by the big ISPs. The big ISPs are cavalier about network outages and they seem to assume that their customers will just accept them. If my ISP clients had a 17-hour outage they would have taken steps after the outage to make amends with customers. They would have explained the cause of the outage and talked about their plans to make sure it didn’t happen again. They likely would have given every customer a day’s credit on their bill for the downtime.

It astounds me that something like this outage could happen. If I was the head of AT&T, heads would have rolled after this was fixed. There is no excuse for a company with a $23 billion annual capital budget to have a network that is vulnerable to a widespread outage. The only reason the company could have such outages is that they don’t place value on redundancy. Until the big ISPs can make their wireless networks as reliable as landline networks I will never consider using them for broadband. I can’t see customers sticking with a 5G network that has a 17-hour outage. Broadband is now critical to many of us and I expect outages to be measured in minutes, not in hours or days.

Shrinking Cellular Backhaul Revenues

There are a few carriers that rely on cellular backhaul as a major part of their revenue stream, but there are many more carriers that provide transport to a handful of cell sites. In all cases this is some of the highest-margin, most lucrative transport sold on the market today, and a business line that every carrier wants to keep. However, there are big changes coming in the cellular market and today I will look at the trends that are going to affect this market over the next decade.

Increasing Bandwidth Demand. The growth in bandwidth demand at many cell sites is explosive, with overall cellular data usage doubling every 18 months. This growth is not the same everywhere, with most of it coming at cell sites serving residential customers rather than at older cell sites built to satisfy highway phone coverage.

The demand growth is being driven by several factors. First, it’s becoming far more prevalent for customers to use cellphones to watch video. Part of that growth in demand comes directly from the big cellular companies which are bundling in access to content as part of the service. But a more important reason for the growth in demand is that the historic reluctance of customers to use cellular data is eroding as the cellular companies push ‘unlimited’ data plans.

Demands for Lower Transport Costs. Cellular service has become a commodity. The industry is no longer adding many new customers since almost everybody has a cellphone. This has led to price wars between cellular providers, and lower average customer prices are driving the cellular companies to look for cost reductions. At least in urban areas they are also starting to lose a significant number of customers to Comcast, with Charter just entering the fray.

Recently I’ve seen cellular companies ask for lower prices as contracts get renewed or else demand greater bandwidth for the prices already in place. This means that fiber owners are not likely to see increases in revenues even as the bandwidth they are delivering grows.

Cellular Carriers Building Fiber. I’ve had several clients tell me recently that Verizon or AT&T is building fiber in their area. While this construction might be to reach a new large customer, the most likely reason these companies are building right now is to eliminate leased transport at cell sites. This is not just happening in urban areas and one of my clients who serves a market of 10,000 homes tells me that Verizon is building fiber to all of the cell sites in the area.

Verizon made headlines last year when they ordered $1 billion in fiber. AT&T is also building furiously. If you believe the claims made by T-Mobile and Sprint as part of the proposed merger – they also will be expanding their own fiber.

I also expect the cellular carriers to make reciprocal deals to swap fiber connections at cell sites where they now own fiber. If Verizon and AT&T each build to 2,000 cell sites they could easily swap transport and both gain access to 4,000 cell sites – that’s a huge nationwide decrease in transport revenues for others.

Growth of Small Cells. Layered on top of all of this is the predicted growth of small cell sites. I don’t think anybody knows how big this market might grow. I’ve seen optimistic predictions that small cell sites will be everywhere and other predictions that the business case for small cell sites might never materialize. Many of my clients are seeing the deployment of a few small cell sites to relieve 4G congestion, but it’s hard to predict in smaller markets if this will ever expand past that.

One thing we can know for sure is that the cellular carriers will not be willing to pay the same prices for connection to small cell sites that they’ve been paying for the big cell tower sites. By definition, a smaller cell site is going to serve a smaller number of customers and the pricing must be reduced accordingly for it to make sense for the cellular providers.

Conclusion. My best guess is that cellular transport will be hit and miss depending upon the specific local situation. There are many carriers who will lose all cell site transport where the cellular carriers decide to build their own fiber. But even where they don’t build fiber I would expect the cellular carriers to bring the threat of physical bypass into price negotiations to drive transport prices far below where they are today.

This is a natural economic consequence of cellular becoming a commodity. As the cellular industry tightens its belt it’s going to demand lower costs from its supply chain. Transport costs are one of the major costs of the cellular industry and the most natural place for them to look to reduce costs. The big cell companies already understand this future which is one of the primary reasons they are furiously building fiber today while they have the cash to do so.

The Growing Dislike of Big ISPs

The annual ratings from the American Consumer Satisfaction Index came out recently, and they show that consumer dislike for the big ISPs is increasing. This survey looks at how consumers feel about a wide range of businesses, and the ISPs have been ranked as some of the most disliked corporations for a number of years.

The survey asks numerous questions and creates a satisfaction scale from 1 to 100. The survey looks at several different categories of telecom companies and has separate rankings for cable TV providers, broadband providers and a new category for streaming video providers.

Among the big ISPs that offer cable TV service, the rank of every provider except AT&T U-Verse sank compared to last year. AT&T was the highest rated company in this group with a rating of 70. At the bottom was Mediacom with a rating of 55, down from 56 a year ago. The two giant cable companies both saw a drop in consumer satisfaction: Charter had a huge drop from 63 down to 58, Comcast dropped from 58 to 57.

The rankings for how consumers feel about their broadband provider were similar. The only big ISP that didn’t drop was Comcast, which stayed at a ranking of 60 for two years running. Every other big ISP dropped. At the top of the list was Verizon FiOS, which dropped from 71 to 70. At the bottom was Mediacom again, which had a big drop from 58 to 53. Charter also had a big drop from 63 to 58. Rounding out the bottom rankings were Frontier (54), Windstream (56) and CenturyLink (58).

Streaming services got significantly higher rankings. Topping this first-time list were Netflix, PlayStation Vue and Twitch with a ranking of 78. At the bottom were Sony Crackle (68), Showtime Anywhere (70) and DirecTV Now (70), all still significantly better than the traditional cable companies.

It must be frustrating for the big ISPs to see their customer satisfaction drop year after year. The rankings of the ISPs are lower than other unpopular industries like airlines, banks, insurance companies and even the Internal Revenue Service.

If there is any upside to the low customer satisfaction rankings it’s that it creates opportunities for competitors. It’s been conventional wisdom for years that a new competitor will get up to 30% of a market just for showing up with an alternative network – assuming they know how to sell and have decent customer service.

The survey doesn’t dig into the reasons for the sinking satisfaction, but it’s easy to speculate on some of them. People are certainly unhappy with traditional cable TV due to the ever-rising prices. High prices are the number one factor cited by consumers who are cutting the cord, and the dropping satisfaction suggests there is another growing pile of future cord cutters.

It’s a little harder to understand the dissatisfaction with broadband. At least in major metropolitan areas the ISPs have continued to unilaterally increase download speeds with only modest rate hikes. One would expect satisfaction with the broadband product to be higher, and my guess is that the low rankings have more to do with the pain involved in ever having to call these big companies. Compared to other businesses we all deal with, the interaction with the cable company / ISP is often the one we dread the most. The other likely cause for dissatisfaction is that ISPs often don’t deliver the speeds they promise. This varies by market, but we’ve seen cities where consumers only get a fraction of the speed they are paying for.

It’s much easier to understand unhappiness with ISPs immediately outside of big cities. Broadband in smaller towns is often still generations behind and is inadequate for what households expect today in terms of download speeds and latency. Anybody who reads this blog will understand the near-hatred for the ISPs in rural areas. The cable companies don’t come to rural America and the big telcos have abandoned maintenance of their copper networks for decades. Rural broadband is either poor or nonexistent, with practically everybody hating the companies that won’t bring them broadband.

 

Predicting Broadband Usage on Networks

One of the hardest jobs these days is being a network engineer who is trying to design networks to accommodate future broadband usage. We’ve known for years that the amount of data used by households has been doubling every three years – but predicting broadband usage is never that simple.

Consider the recent news from OpenSource, a company that monitors usage on wireless networks. They report a significant shift in WiFi usage by cellular customers. Over the last year AT&T and Verizon have introduced ‘unlimited’ cellular plans and T-Mobile has pushed their own unlimited plans harder in response. While the AT&T and Verizon plans are not really unlimited and have caps a little larger than 20 GB per month, the introduction of the plans has changed the mindset of numerous users who no longer automatically seek WiFi networks.

In the last year the percentage of WiFi usage on the Verizon network fell from 54% to 51%; on AT&T from 52% to 49%; and on T-Mobile from 42% to 41%. Those might not sound like major shifts, but for the Verizon network it means the cellular network saw roughly 6% more growth in data volumes in one year than the company would normally have expected. For a network engineer trying to make sure that all parts of the network are robust enough to handle the traffic this is a huge change, and it means that chokepoints in the network will appear a lot sooner than expected. In this case the change to unlimited plans is something that was cooked up by marketing folks, and it’s unlikely that the network engineers knew about it any sooner than anybody else.
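To show where that roughly 6% comes from, here’s a minimal sketch in Python. The simplifying assumption, which is mine and not OpenSource’s, is that total device traffic stayed flat and only the WiFi/cellular split moved.

```python
def extra_cellular_growth(wifi_share_before, wifi_share_after):
    """Extra cellular-network traffic implied by a drop in WiFi offload,
    assuming total device traffic is unchanged."""
    return (1 - wifi_share_after) / (1 - wifi_share_before) - 1

for carrier, before, after in [("Verizon", 0.54, 0.51),
                               ("AT&T", 0.52, 0.49),
                               ("T-Mobile", 0.42, 0.41)]:
    print(f"{carrier}: ~{extra_cellular_growth(before, after):.1%} extra cellular traffic")

# Verizon ~6.5%, AT&T ~6.3%, T-Mobile ~1.7% -- in line with the ~6% cited above.
```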

I’ve seen the same thing happen with fiber networks. I have a client who built one of the first fiber-to-the-home networks and uses BPON, the first generation of PON electronics. The network was delivering broadband speeds of between 25 Mbps and 60 Mbps, with most customers in the range of 40 Mbps.

Last year the company started upgrading nodes to the newer GPON technology, which upped the potential customer speeds on the network to 1 gigabit. The company introduced both a 100 Mbps product and a gigabit product, but very few customers immediately upgraded. The upgrade meant changing the electronics at the customer location, but also involved a big boost in the size of the data pipes between neighborhood nodes and the hub.

The company was shocked to see data usage in the nodes immediately spike upward by between 25% and 40%. After all, they had not arbitrarily increased customer speeds across the board, but had just changed the technology in the background. For the most part customers had no idea they had been upgraded – so the spike can’t be attributed to a change in customer behavior like the one the cellular companies saw after introducing unlimited data plans.

However, I suspect that much of the increased usage still came from changed customer behavior. While customers were not notified that the network had been upgraded, I’m sure that many of them noticed the change. The biggest trend we’ve seen in household broadband demand over the last two years is the desire by households to run multiple big data streams at the same time. Before the upgrades households were likely restricting their usage by not allowing kids to game or do other big-bandwidth activities while the household was video streaming or doing work. After the upgrade they probably found they no longer had to self-monitor and restrict usage.

In addition to this likely change in customer behavior the spikes in traffic also were likely due to correcting bottlenecks in the older fiber network that the company had never recognized or understood. I know that there is a general impression in the industry that fiber networks don’t see the same kind of bottlenecks that we expect in cable networks. In the case of this network, a speed test on any given customer generally showed a connection to the hub at the speeds that customers were purchasing – and so the network engineers assumed that everything was okay. There were a few complaints from customers that their speeds bogged down in the evenings, but such calls were sporadic and not widespread.

The company decided to make the upgrade because the old electronics were no longer supported by the vendor and they also wanted to offer faster speeds to increase revenues. They were shocked to find that the old network had been choking customer usage. This change really shook the engineers at the company, who feared that the broadband growth curve was now going to continue at the faster rate. Luckily, within a few months each node settled back down to the historic growth rate. However, the company found itself instantly with network usage it hadn’t expected for at least another year, making it that much closer to the next upgrade.
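One way to quantify ‘that much closer to the next upgrade’ is to convert the historic doubling-every-three-years curve into an annual rate and ask how many years of normal growth a one-time 25% to 40% spike represents. This is a rough sketch of that arithmetic, not the client’s actual planning model.

```python
import math

# Household usage has historically doubled roughly every three years
annual_growth = 2 ** (1 / 3)   # about 1.26, i.e. ~26% per year

def years_pulled_forward(one_time_jump):
    """Years of 'normal' growth equivalent to a one-time usage jump."""
    return math.log(1 + one_time_jump) / math.log(annual_growth)

for jump in (0.25, 0.30, 0.40):
    print(f"a {jump:.0%} spike ≈ {years_pulled_forward(jump):.1f} years of normal growth")

# A 25%-40% spike is equivalent to roughly 1.0-1.5 years of normal growth, which
# is why the next node upgrade arrives about a year or more sooner than planned.
```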

It’s hard for a local network owner to predict the changes that are going to affect network utilization. For example, they can’t predict that Netflix will start pushing 4K video. They can’t know that the local schools will start giving homework that involves watching a lot of videos at home. Even though we all understand the overall growth curve for broadband usage, it doesn’t grow in a straight line and there are periods of faster and slower growth along the curve. It’s enough to cause network engineers to go gray a little sooner than expected!