5G vs. WiFi

The big cellular carriers envision a future where every smart device is connected to their cellular networks rather than to WiFi. They envision every home having to pay a monthly subscription to keep its connected devices online. They envision every new car and truck coming with a subscription to cellular service.

I notice that the cellular providers talk about generating IoT revenues, but they never say plainly that the real vision is for everybody to buy additional cellular subscriptions. Most IoT applications will be low-bandwidth, yet the carriers have been spreading the false message that 5G is all about faster broadband. I just saw another ludicrous article yesterday predicting that 5G will bring mobile gigabit broadband to rural America – a pure fantasy that is being fed by the public relations machines at Verizon and AT&T.

We aren’t seeing much press about the most important aspect of the new 5G specifications – that each cell site will be able to make up to 100,000 simultaneous connections. This isn’t being done for cellphones. Except in a few overcrowded places, it’s rare these days for a cellular call to fail to connect. Placing a few small cell sites at the busiest places in most cities could solve most cellular bottlenecks without an upgrade to 5G.

Those 100,000 connections give the wireless carriers a tool that can make a connection to every smart TV, smart washer and dryer, home video camera, burglar alarm sensor and every other connected device in a home. The big carriers are launching a direct challenge to WiFi as the wireless technology of choice for connecting our devices.

AT&T and Verizon envision every home carrying a new $10, $20 or $30 subscription to keep all of those devices connected. They also envision becoming the repository of all IoT data – moving ahead of Google and others in the chase to collect the big data that drives advertising revenues. This is something they definitely don’t talk about.

It doesn’t take much of a thought exercise to understand that 5G is not about faster cellular service. Cellular subscribers will gladly take faster cellular broadband, but they probably aren’t willing to pay more for it. T-Mobile is already making that clear by announcing that they won’t charge more for 5G. The carriers are not going to spend tens of billions to implement 5G cellular technology that doesn’t drive the new revenues needed to pay for it. 5G is about IoT, plain and simple.

Today all of our home devices use WiFi. While WiFi is far from perfect, it does an adequate job of connecting to the video camera at the front door, the smart TV, and the sensors in various appliances and devices around the home. WiFi has two major advantages over cellular broadband – it’s already in our homes and connected to our devices, and it doesn’t require an additional monthly subscription.

I think people will resist another forced subscription. HP recently reported that the vast majority of its customers who buy 4G LTE-enabled laptops disable the cellular connection almost as soon as the new computer is out of the box. In this day of cellphones, very few car owners sign up for the OnStar cellular subscription when the free trial expires. I know that I personally would not buy a home device that eventually needed another cellular subscription to function.

The cellular carriers make a valid point in saying that WiFi is already growing inadequate for busy homes. But there are already short-term and long-term fixes on the way. The short-term fix is the upcoming migration to WiFi 6, using the 802.11ax standard. The new WiFi will make better use of MIMO antennas, frequency slicing (OFDMA) and other techniques to allow for prioritization of devices and a more reliable connection to multiple devices.

The ultimate indoor broadband network will be a combination of WiFi and millimeter wave or even higher-frequency spectrum. The higher frequencies could serve the devices that need big bandwidth while keeping other devices on mid-range WiFi spectrum – getting the best from both sets of spectrum. That combination would let gigabit devices and tiny sensors that communicate only sporadically coexist easily and without interference.

This is not the future that AT&T and Verizon want, because it’s a world controlled by consumers who buy the wireless boxes that best suit them. I envision a future indoor-only wireless network that won’t require licensed spectrum or a cellular subscription, since millimeter waves and other higher frequencies won’t pass through walls to the outdoors.

The cellular carriers will have a monopoly on the outdoor sensor market. They will undoubtedly make the connections to smart cars, to smart agriculture, and to outdoor smart city sensors. But I think they will have a huge uphill battle convincing households to pay another monthly subscription for something that can be done better using a few well-placed routers.

OneWeb Launches Broadband Satellites

Earlier this month OneWeb launched six test satellites, the first of an eventual fleet intended to provide broadband. The six satellites went up on a Soyuz rocket from the Guiana Space Center in Kourou, French Guiana.

OneWeb was started by Greg Wyler of Virginia in 2012, originally under the name WorldVu. Since then the company has picked up heavy-hitter investors like Virgin, Airbus, SoftBank and Qualcomm. The company’s plan is to launch an initial constellation of 650 satellites that will blanket the earth, with ultimate deployment of 1,980 satellites. The plan is to deploy thirty of the sixty-five pound satellites with each launch, which means twenty-two successful launches are needed to deploy the first round.
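That launch count is a simple ceiling calculation – trivial, but a one-line check makes the rounding explicit:

```python
# Launches needed for the initial 650-satellite constellation at 30
# satellites per launch; round up because a final partial batch still
# requires its own launch.
import math

print(math.ceil(650 / 30))  # -> 22
```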

Due to the low-earth orbits of the satellites, at about 745 miles above earth, the OneWeb satellites will avoid the huge latency that is inherent in current satellite broadband providers like HughesNet, whose satellites orbit at 22,000 miles above the earth. The OneWeb specifications filed with the FCC talk about latency comparable to cable TV networks, in the 25-30 millisecond range. But where a few high-orbit satellites can see the whole earth, the big fleet of low-orbit satellites is needed just to be able to see everywhere.
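A back-of-the-envelope propagation calculation shows why orbit height dominates latency. The sketch below computes only the physics floor – straight up to the satellite and back at the speed of light – and ignores the processing, queuing and routing delays that add on top:

```python
# Minimum round-trip propagation delay for a satellite directly
# overhead: up to the satellite and back at the speed of light.
# Real-world latency adds processing, queuing and routing delays.
SPEED_OF_LIGHT_MILES_PER_SEC = 186_282

def min_round_trip_ms(orbit_miles: float) -> float:
    """Minimum round trip in milliseconds for a satellite overhead."""
    return 2 * orbit_miles / SPEED_OF_LIGHT_MILES_PER_SEC * 1_000

print(f"LEO at 745 miles:    {min_round_trip_ms(745):.0f} ms")     # ~8 ms
print(f"GEO at 22,000 miles: {min_round_trip_ms(22_000):.0f} ms")  # ~236 ms
```

That roughly 8 ms floor leaves plenty of headroom under the 25-30 millisecond target, while the geostationary floor alone is nearly a quarter of a second before any network overhead is counted.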

The company is already behind schedule. They had originally promised coverage across Alaska by the end of 2019 and are now talking about customer demos sometime in 2020, with live broadband service in 2021. The timeline matters for a satellite company because the spectrum license from the FCC requires them to launch 50% of their satellites within six years and all of them within nine years. Right now, both OneWeb and Elon Musk’s SpaceX have fallen seriously behind the needed deployment pace.

The company’s original goal was to bring low-latency satellite broadband to everybody in Alaska. While they are still talking about bringing broadband to those who don’t have it today, their new business plan is to sell directly to airlines and cruise ship lines and to sell wholesale to ISPs who will then market to the end user.

It will be interesting to see what kinds of speeds will really be delivered. The company talks today about a maximum speed of 500 Mbps. But I compare that number to the claim that 5G cellphones can work at 600 Mbps, as demonstrated last year by Sprint – possible only in a perfect lab setting. The best analog to a satellite network is a wireless transmitter on a tower in a point-to-multipoint network. That transmitter can make a relatively small number of big-bandwidth connections or many more low-bandwidth connections. The economic sweet spot will likely be to offer many connections at 50-100 Mbps rather than fewer connections at a higher speed.
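To make that tradeoff concrete, here is an illustrative capacity split. The per-beam capacity and the oversubscription ratio below are assumed numbers chosen purely for illustration, not OneWeb specifications:

```python
# Illustrative only: how a fixed pool of shared wireless capacity
# trades off between a few fast connections and many slower ones.
# The 5 Gbps per-beam capacity and 10:1 oversubscription ratio are
# assumptions, not OneWeb specifications.
BEAM_CAPACITY_MBPS = 5_000  # assumed usable capacity per beam
OVERSUBSCRIPTION = 10       # assumed ratio; ISPs sell more than raw capacity

for plan_mbps in (500, 100, 50):
    subscribers = BEAM_CAPACITY_MBPS * OVERSUBSCRIPTION // plan_mbps
    print(f"{plan_mbps:>3} Mbps plans -> ~{subscribers:,} subscribers per beam")
```

Whatever the real numbers turn out to be, the shape of the tradeoff is the same: each beam can serve an order of magnitude more subscribers at 50 Mbps than at 500 Mbps.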

It’s an interesting business model. The upfront cost of manufacturing and launching the satellites is high, and it’s likely that a few launches will go awry and destroy satellites. But other than replacing satellites that go bad over time, the maintenance costs are low. The real issue will be the bandwidth that can be delivered. Speeds of 50-100 Mbps will be welcomed in the rural US by those with no better option. But as with all lower-bandwidth technologies, broadband that feels adequate today will feel a lot slower in a decade as household bandwidth demand continues to grow. The best long-term market for the satellite providers will be those places on the planet that are not likely to have a landline alternative – which is why they first targeted rural Alaska.

Assuming that the low-earth satellites deliver as promised, they will become part of the broadband landscape in a few years. It’s going to be interesting to see how they play in the rural US and around the world.

There’s No 5G Race

FCC Chairman Ajit Pai was recently quoted in the Wall Street Journal as saying, “In my view, we’re in the lead with respect to 5G”. Over the last few months I’ve heard this same sentiment expressed in terms of how the US needs to win the 5G race.

This talk is just more hype and propaganda from a wireless industry that is trying to create a false 5G crisis in order to convince politicians that we need to break our regulatory traditions and give the wireless carriers everything they want. After all, what politician wants to be blamed for the US losing the 5G race? This kind of propaganda works – I was just at an industry trade association show and heard three or four people say that the US needs to win the 5G race.

There is no 5G race; there is no 5G war; there is no 5G crisis. Anybody that repeats these phrases is wittingly or unwittingly pushing the lobbying agenda of the big wireless companies. Some clever marketer at one of the cellular carriers invented the imaginary 5G race as a great way to emphasize the importance of 5G.

Stop and think about it for a second. 5G is a telecom technology, not some kind of military secret that some countries will have while others are denied it. 5G technology is being developed by a host of multinational vendors that will sell it to anybody who wants it. It’s not a race when everybody is allowed to win. If China, or Germany, or Finland makes a 5G breakthrough and implements some aspect of 5G first, within a year that same technology will be in the gear available to everybody.

What I really don’t get about this kind of hype and rhetoric is that 5G is basically a new platform for delivering bandwidth. If we are so fired up not to lose the 5G race, then why have we been so complacent about losing the fiber race? The US is far down the list of countries in terms of broadband infrastructure. We’ve not deployed fiber optics nearly as quickly as many other countries, and worse, we still have millions of households with no broadband and many tens of millions more with inadequate broadband. That’s the race we need to win, because we are keeping whole communities out of the new economy, which hurts us all.

I hope my readers don’t think I’m against 5G, because I’m for any technology that improves access to bandwidth. What I’m against is the industry hype that paints 5G as the technology that will save our country – because it will not. Today, more than 95% of the bandwidth we use is carried over wires, and 5G isn’t going to move that needle much. There are clearly some bandwidth needs that only wireless will solve, but households and businesses are going to continue to rely on wires to move big bandwidth.

When I ask wireless engineers about the future, they almost all paint the same picture: over time we will migrate to a mixture of WiFi and millimeter wave spectrum indoors to move around big data. When virtual and augmented reality were first mentioned a few years ago, one of the big promises was telepresence – being able to meet and talk with remote people as if they were sitting with us. That technology hasn’t moved forward because it requires huge bandwidth beyond what today’s WiFi routers can deliver. Indoor 5G using millimeter wave spectrum will finally unleash gigabit applications within the home.

The current hype for 5G has only one purpose. It’s a slick way for the wireless carriers to push the government to take the actions they want. 5G was raised as one of the reasons to kill net neutrality. It’s being touted as a reason to gut most of the rest of existing telecom legislation. 5G is being used as the reason to give away huge blocks of mid-range spectrum exclusively to the big wireless companies. It’s pretty amazing that the government would give so much away for a technology that will roll out slowly over the next decade.

Please think twice before you buy into the 5G hype. It takes about five minutes of thinking to poke a hole in every bit of 5G hype. There is no race for 5G deployment and the US, by definition, can’t be ahead or behind in the so-called race towards 5G. This is just another new broadband technology and the wireless carriers and other entrepreneurs will deploy 5G in the US when it makes economic sense. Instead of giving the wireless companies everything on their wish list, a better strategy by the FCC would be to make sure the country has enough fiber to make 5G work.

The Slow Deployment of 5G

Somebody asked me a few days ago why I write so much about 5G. My response is that I am intrigued by the 5G hype. The major players in the industry have been devoting big dollars to promoting a technology that is still mostly vaporware. The most interesting thing about 5G is how politicians, regulators and the public have bought into the hype – I’ve never seen anything like it. I can remember other times when the world was abuzz over a new technology, but that was usually a reaction to an actual product you could buy, like the first laptop computers, the first iPhone and the first iPod.

Anybody who understands our industry knows that it takes a number of years to roll out any major new technology, particularly a wireless technology, since wireless behaves differently in the field than in the lab. We’re only a year past the release of the 5G standards, and it’s unrealistic to think those standards could be translated into operational hardware and software systems in such a short time. You only have to look back at the history of 4G, which started just as slowly and which finally produced the first fully-compliant 4G cell site late last year. It’s going to take just as long until we see a fully functional 5G cell site. What we will see, over time, is the incremental introduction of some aspects of 5G as they get translated from the lab to the field. That rollout is further complicated for cellular use by the time needed to get 5G-ready handsets into people’s hands.

This blog was prompted by a Verizon announcement that 5G mobile services will be coming to 30 cities later this year. Of course, the announcement was short on details, because those details would probably be embarrassing for Verizon. I expect the company will introduce a few aspects of 5G into the cell sites in the business districts of major cities and claim that as a 5G roll-out.

What does a roll-out this year mean for cellular customers? There are not yet any 5G-capable cellphones. Both AT&T and Verizon have been working with Samsung to introduce a 5G version of its S10 phone later this year. Verizon is also reported to be working with Lenovo on a 5G modular upgrade later this year. I’m guessing these phones will come with a premium price tag for the early adopters willing to pay for 5G bragging rights. These phones will only work as 5G near the handful of cell sites with 5G gear – and then only for a tiny subset of the 5G specifications. I remember when one of my friends bought one of the first 4G phones and crowed about how well it worked in downtown DC. At the time I told him his great performance was because he was probably the only guy using 4G – and sure enough, his performance dropped as others joined the new technology.

On the same day that I saw the Verizon announcement I also saw a prediction by Cisco that only 3% of cellular connections will occur over a 5G network by the end of 2022. This might be the best thing I’ve seen for popping the 5G hype. Even for the folks buying the early 5G phones, there will be a dearth of cell sites around the country that work with 5G for a number of years. Anybody who understands the lifecycle of cellular upgrades agrees with the Cisco timeline – it takes years to work through the cycle of upgrading cell sites, upgrading handsets and then getting those handsets to the public.

The same is true for the other technologies that are also being called 5G. Verizon made a huge splash just a few months ago about introducing 5G broadband using millimeter wave spectrum in four cities. Even at the time of that announcement it was clear that those radios were not using the 5G standard, and Verizon quietly announced recently that it was ceasing those deployments while waiting for actual 5G technology. Those deployments were really a beta test of millimeter wave radios, not the start of a rapid nationwide deployment of 5G broadband from poles.

AT&T made an even more ludicrous announcement at the end of 2018, unveiling a 5G broadband product that involved deploying WiFi hotspots supposedly fed by 5G. However, this was a true phantom product with no pricing that nobody could order. And since no AT&T cell sites had been upgraded to 5G, one had to wonder how it involved any 5G technology at all. It’s clear this was a roll-out by press release, done only so the company could claim bragging rights for being first with 5G.

The final announcement I saw that same day was from T-Mobile, saying it will begin deploying early 5G in cell sites in 2020. But the real news is that T-Mobile isn’t planning to charge more for extra 5G speeds or features.

I come back to my original question about why I write about 5G so often. A lot of my clients ask me if they should be worried about 5G, and I don’t have an answer for them. I can see that actual 5G technology is going to take a lot longer to come to market than the big carriers would have you believe. But I look at T-Mobile’s announcement on price and I have to wonder what the cellular companies will really do once 5G works. Will AT&T and Verizon spend billions to put 5G small cells in residential neighborhoods if it doesn’t drive any new cellular revenues? I have to admit that I’m skeptical – we’re going to have to watch what the carriers do rather than listen to what they say.

Making a Safe Web

Tim Berners-Lee invented the World Wide Web, proposing it in 1989 and going on to implement the first successful communication between an HTTP client and server. He’s always been a proponent of an open Internet and doesn’t like how the web has changed. The biggest profits on the web today come from the sale of customer data.

Berners-Lee has launched a new company, along with cybersecurity expert John Bruce, that proposes to “restore rightful ownership of the data back to every web user”. The start-up, called Inrupt, proposes to develop an alternate web for users who want to protect their data and their identity.

Berners-Lee has been working at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT to develop a software platform that can support his new concept. The platform is called Solid, and its main goal is to decouple web applications from the data they produce.

Today our personal data is stored all over the web. Our ISPs make copies of a lot of our data. Platforms like Google, Facebook, Amazon, and Twitter gather and store data on us. Each of these companies captures a little piece of the picture of who we each are. These companies use our data for their own purposes and then sell it to companies that buy, sort and compile that data to build profiles of all of us. I recently saw the disturbing statistic that up to 1,400 data points are now created daily for the typical user – data gathered from our cellphones, smart devices, and our online web activity.

The Solid platform would change the fundamental structure of data storage. Each person on the Solid platform would create a cache of their own personal data. That data could be stored on personal servers or on servers supplied by companies that are part of the Solid cloud. The data would be encrypted and protected against prying.

Then, companies like Berners-Lee’s Inrupt would develop apps that perform the functions users want without storing any customer data. Take the example of shopping for new health insurance. An insurance company that agrees to be part of the Solid platform would develop an app that analyzes your personal data to determine if you are a good candidate for a policy. The app would run on your server, analyzing your medical records and other relevant personal information. It might report something back to the insurance company, such as a rating of you as a potential customer, but the insurer would never see the personal data.
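As a conceptual sketch of that pattern – the names and fields below are invented for illustration and are not Inrupt’s actual API – the key idea is that the analysis runs where the data lives, and only a coarse result ever leaves the user’s pod:

```python
# Conceptual sketch of the Solid pattern: the app's analysis runs
# against the user's own data store, and only a coarse rating is
# reported back. All names here are invented for illustration; this
# is not Inrupt's actual API.
from dataclasses import dataclass

@dataclass
class PersonalPod:
    """Stands in for a user's encrypted, user-controlled data store."""
    medical_records: dict

def insurance_rating(pod: PersonalPod) -> str:
    """Runs on the user's side; the raw records never leave the pod."""
    risk_flags = sum(1 for v in pod.medical_records.values() if v == "high-risk")
    return "good candidate" if risk_flags == 0 else "needs review"

pod = PersonalPod(medical_records={"blood_pressure": "normal", "smoker": "high-risk"})
# Only this single string is reported back to the insurer:
print(insurance_rating(pod))  # -> needs review
```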

The Solid concept is counting on the proposition that there are a lot of people who don’t want to share their personal data on the open web. Berners-Lee is banking that there are plenty of developers who would design applications for those in the Solid community. Over time the Solid-based apps can provide an alternate web for the privacy-minded, separate and apart from the data-collection web we share today.

Berners-Lee expects that this will first take a foothold among professions that value privacy – coders, lawyers, CPAs, investment advisors and the like. Those professions have a strong desire to keep their clients’ data private, and there is no better way to do that than by having the clients keep their own data. This relieves lawyers, CPAs and other professionals of the ever-growing liability from breaches of client data.

Over time Berners-Lee hopes that all sorts of other platforms will want to cater to a growing base of privacy-minded users. He’s hoping for a web ecosystem of search engines, news feeds, social media platforms, and shopping sites that want to sell software and services to Solid users, with the promise of not gathering personal data. One would think existing privacy-minded platforms like Mozilla Firefox would join this community. I would love to see a Solid-based cellphone operating system. I’d love to use an ISP that is part of this effort.

It’s an interesting concept and one I’ll be watching. I am personally uneasy about the data being gathered on each of us. I don’t like the idea of applying for health insurance, a credit card or a home mortgage and being judged in secret by data that is purchased about me on the web. None of us has any idea of the validity and correctness of such data. And I doubt that anybody wants to be judged by somebody like a mortgage lender using non-financial data like our politics, our web searches, or the places we visit in person as reported by our cellphones. We now live in a surveillance world and Berners-Lee is giving us the hope of escaping that world.

San Jose Tackles the Digital Divide

As a country we have done well – 85% of households in most areas now buy some form of broadband connection. But that still means 15% of homes don’t have broadband. Certainly some homes don’t want broadband, but it’s clear that a significant percentage of those without it can’t afford it.

Affordability is going to become more of an issue now that the big ISPs have adopted a strategy of raising rates every year. I don’t think there’s much doubt that the cost of broadband will climb faster than the overall rate of inflation. We recently saw Charter raise the rate of bundled broadband by $5 per month. Wall Street credits the higher earnings of several big cable companies to their cutting back on special pricing for term contracts – I think the cable companies are finally acknowledging that they have won the war against DSL.

San Jose is no different from any other big city in that it has large numbers of homes without broadband. The city recently estimated that 95,000 of its residents have no home broadband connection. The city just announced a plan to begin solving the digital divide and pledged $24 million to kick off the effort, claiming this is the biggest effort yet by a major city to solve the digital divide.

The digital divide became apparent soon after the introduction of DSL and cable modems in the late 1990s. Even then there were households locked out of the new technology by the cost of broadband service. The digital divide gets more acute every year as more of our daily lives migrate online. It’s now hard to imagine a student having an even chance in school without access to broadband. Anybody with broadband only has to stop and imagine for a second what it would be like to lose it – and then realize that huge numbers of homes are missing out on basic benefits that those with broadband take for granted.

The San Jose plan is light on detail at this early stage, but it’s clear that the city will be looking for infrastructure plans to extend broadband rather than subsidizing service from incumbent ISPs. Consider the mayor’s stated vision for broadband:

“Ensure all residents, businesses, and organizations can participate in and benefit from the prosperity and culture of innovation in Silicon Valley . . . Broaden access to basic digital infrastructure to all residents, especially our youth, through enabling free or low cost, high-speed, 1 gigabit or faster broadband service in several low-income communities, and increasing access to hardware, including tablets and laptops, for low-income students.”

The city won’t be tackling the issue alone and is hoping for involvement from businesses and charitable organizations in the city. For example, the city is already working with the Knight Foundation, which has been addressing this issue for years. The city is also interested in technologies like Facebook’s Terragraph wireless technology, which plans to use 60 GHz spectrum to create fast outdoor wireless broadband.

The city recognizes that there are no instant fixes and that it might take a decade to bring fast, affordable broadband to everybody in the city. I’m sure that $24 million is also just a down payment toward a permanent broadband solution. But this plan puts the city ahead of every other major metropolitan area in its willingness to tackle the problem head-on.

There has been a cry to solve the digital divide for twenty years. Some communities have found solutions that help, like the charitable effort by E2D in Charlotte, NC, which is bringing laptops and wireless broadband to large numbers of homeless and low-income school students. But no city has directly tackled the problem before with a pledge of serious taxpayer funds. It’s been obvious from the beginning of the digital divide discussions that solving the problem was going to take money and broadband infrastructure. I’m sure that many other cities will be watching San Jose, because the broadband gap is becoming a significant contributor to an underclass with less access to education, healthcare and good-paying jobs. I’m willing to bet that the long-term economic benefits from solving the digital divide in San Jose will be far greater than the money the city is putting into the effort.

Google Fiber Leaving Louisville

Most readers have probably heard by now that Google Fiber is leaving Louisville because of failures in its fiber network. The company is giving customers two months of free service and sending them back to the incumbent ISPs in the city. Google Fiber used a construction technique called micro-trenching, cutting a tiny slit in the road, one inch wide and a few inches deep, to carry the fiber. Only a year after construction, the fiber is popping out of the micro-trenches all over the city.

Everybody I’ve talked to guesses that it’s a simple case of ice heaving. While a micro-trench is sealed, it’s likely that small amounts of moisture seep in and freeze when it gets cold. The first freeze would create tiny cracks, and with each subsequent freeze the cracks would get a little larger until the trench finally fills with water, fully freezes and ejects the fill material. The only way to stop this would be a permanent seal that never lets in moisture. That sounds like a tall task in a city like Louisville, which might freeze and thaw practically every night during the winter.

Nobody other than AT&T and Charter can be happy about this. The reason Google Fiber elected to use micro-trenching is that both big ISPs fought tooth and nail to block Google Fiber from putting fiber on the utility poles in the city. The AT&T suit was resolved in Google’s favor, while the Charter one is still in court. Perhaps Google Fiber should have just waited out the lawsuits – but the business pressure was there to get something done. Unfortunately, the big ISPs are being rewarded for their intransigence.

One obvious lesson learned is not to launch a new network using an untried and untested construction technique. In this case, the micro-trenches didn’t just fail, they failed spectacularly, in the worst way imaginable. Google Fiber says the only fix for the problem would be to build the network again from scratch, which makes no financial sense.

Certainly, the whole industry is now going to be extremely leery about micro-trenching, but there is a larger lesson to be learned here. For example, I’ve heard from several small ISPs who are ready to leap into the 5G game and build networks using millimeter wave radios installed on poles. That is every bit as new and untested a technology as micro-trenching was. I’m not predicting that anybody pursuing that business plan will fail – but I can assuredly promise that they will run into unanticipated problems.

Over my career, I can’t think of a single example where an ISP that took a chance on a cutting-edge technology didn’t have big problems – and some of those problems were just as catastrophic as what Google Fiber just ran into. For example, I can remember half a dozen companies that tried to deploy broadband networks using the LMDS spectrum. I remember one case where the radios literally never worked and the venture lost their $2 million investment. I remember several others where the radios had glitches that caused major customer outages and were largely a market disaster.

One thing that I’ve seen over and over is that telecom vendors take shortcuts. When they introduce a new technology they are under extreme pressure to get it to market and drive new revenues. Ideally, a vendor would hold small field trials of new technology for a few years to work out the bugs. But if a vendor finds an ISP willing to take a chance on a beta technology, they are happy to let the customers of that ISP be the real guinea pigs for the technology, and for the ISP to take the hit for the ensuing problems.

I can cite similar stories for the first generation of other technologies, including early DSL, WiFi mesh networks, PON fiber-to-the-home and IPTV. The companies that pioneered these technologies had costly and sometimes fatal problems. So perhaps the lesson is that pioneers pay a price. I’m sure this failure will result in changing or abandoning the micro-trenching technique. Perhaps we’ll learn not to use micro-trenches in certain climates, or perhaps somebody will find a way to seal them against moisture. But none of those future solutions will make up for Google Fiber’s spectacular failure.

The real victims of this situation are the households in Louisville that had switched to Google Fiber – and everybody else in the city. Because of Google Fiber’s lower prices, both Charter and AT&T lowered prices everywhere in the city. You can bet it’s not going to take long to get the market back to full prices. Any customers crawling back to the incumbents from Google Fiber can probably expect to pay full price immediately – there is no real incentive to give them a low-price deal. As a whole, every household in the city is going to be spending $10 or $20 more per month for broadband – a significant penalty on the local economy.

AT&T’s 5G Strategy

AT&T recently described its long-term 5G strategy in terms of what it calls the three pillars of 5G – the three areas where the company is putting its 5G focus. The first pillar is a concentration on 5G cellular, with the goal of launching 5G-based cellular service in some cities in the second half of 2020. This launch will use frequencies in the sub-6 GHz range. The admission that there won’t be any AT&T 5G until at least 2020 contradicts the AT&T marketing folks who are currently trying to paint the company’s 4G LTE as pre-5G.

The biggest problem for the public will be getting a 5G cellphone. AT&T is working with Samsung to launch two phones later this year that have some 5G capability. As always with a new generation of wireless technology, the bottleneck will be in handsets. The cellphone makers can’t just make generic 5G phones – they have to work with the carriers to support the specific subset of 5G features being released. You might recall that the 5G cellular specification contains 13 improvements, and only the first generation of a few of those will be included in the first 5G cell sites. Cellphone manufacturers will also have to wrestle with the fact that each big cellular carrier will introduce a different set of 5G features.

This is a real gamble for cellphone makers because a 5G phone will quickly become obsolete. A 5G phone sold in late 2019 probably won’t include all of the 5G features on the market by late 2020 – and that is likely to stay true for the next three or four years as the carriers roll out incremental 5G improvements. It’s also a gamble for customers, because anybody who buys an early 5G cellphone gets early bragging rights, but those cool benefits can be out of date in six months. I think most people will be like me and wait a few years until the 5G dust settles.

AT&T’s second pillar is fixed wireless. This one is a head-scratcher, because the company is talking about the fixed cellular product it has already been selling for several years – and that product is not 5G. This is the product that delivers broadband to homes using existing low-band cellular frequencies. It is not the same as Verizon’s millimeter wave product that delivers hundreds of megabits per second; instead it delivers speeds up to 50 Mbps depending on how far a customer lives from a cell tower – with reports that most households get 15 Mbps at best. This is the product AT&T is mostly using to satisfy its CAF II requirements in rural America. None of the engineers I’ve talked to think that 5G is going to materially improve it.

The final pillar of AT&T’s strategy is edge computing. What AT&T means by this is putting fast processors at customer sites where there is a need to process low-latency, high-bandwidth data. Like other carriers, AT&T has found that not everything is suited for the cloud, and that trying to send big data to and from the cloud can create a bandwidth bottleneck and add latency. This strategy doesn’t require 5G, and AT&T has already been deploying edge routers. However, 5G will enhance this capability at customer sites that need to connect a huge number of devices simultaneously – it can make it easier to connect to a huge number of IoT devices in a hospital or to 50,000 cellphones in a stadium. The bottom line is that the migration to more edge computing is not a 5G issue and applies equally to AT&T’s fiber customers.

There is really nothing new in the three-pillar announcement, and AT&T has been talking about all three applications for some time – but the announcement does highlight the company’s focus for stockholders.

In what was mostly a dig at Verizon, AT&T’s CEO Randall Stephenson did hold out the possibility of AT&T following Verizon into 5G fixed wireless local loops using millimeter wave spectrum – but he said such a product is probably three to five years in the future. He envisions it as an enhancement to AT&T’s fiber products, not necessarily a replacement, and emphasized that AT&T is happy with its current fiber deployments. On a recent earnings call he provided some new statistics, saying the company is seeing customer penetration rates between 33% and 40% within 18 months of a new fiber deployment and around 50% after three years. Those are impressive numbers because AT&T’s fiber deployments have been largely in urban areas competing with the big cable companies.

A year ago, Stephenson said that getting sufficient backhaul was his number one concern with deploying high-bandwidth wireless. While he hasn’t repeated that recently, it fits in with his narrative of seeing millimeter wave radio deployments in the 3-5 year time frame. The company recently released a new policy paper on its AirGig product that says that the product is still under development and might play well with 5G. AirGig is the mysterious wireless product that shoots wireless signals along power lines and somehow uses the power lines to maintain focus of the signal. Perhaps the company is seeing a future path for using AirGig as the backhaul to 5G fixed wireless deployments.

Forecasting the Future of Video

I recently saw several interesting forecasts about the cable industry. The research firm SNL Kagan predicts that broadband-only homes in the US – those that don’t subscribe to traditional linear cable TV – will increase from 23.3 million in 2018 to 40.8 million by 2023. In another forecast Parks Associates predicts that the number of worldwide OTT subscribers – households that subscribe to at least one online video service – will grow to 310 million by 2024.
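As a quick check on what the Kagan numbers imply, the forecast works out to roughly 12% compound annual growth in broadband-only homes:

```python
# Implied compound annual growth rate behind the Kagan forecast:
# 23.3M broadband-only homes in 2018 growing to 40.8M by 2023.
start, end, years = 23.3, 40.8, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # -> ~11.9% per year
```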

These kinds of forecasts have always intrigued me. I doubt there is anybody in the industry who thinks that cord cutting will stop growing or that the market for services like Netflix will stop growing. What I find most interesting about these total-market forecasts is the specificity of the predictions, such as Kagan’s 40.8 million broadband-only homes. I suspect that if we dug deeper into what Kagan says, we’d find they have predicted a range of possible future outcomes and were not that specific. But I also understand that sometimes putting a number on things is the best way to make a point in a press release.

What I’ve always found interesting about future predictions is how hard it is to predict where a whole industry is going. If I look back ten years I could find a dozen experts predicting the death of traditional landline telephones, and yet not one of them would have believed that by 2019 landline penetration rates would still be around 40%. I imagine every one of them would have bet against that possibility. It’s easy to understand the trajectory of an industry, but it’s another thing entirely to predict where it will land. It wasn’t hard ten years ago to see the trajectory of the landline business, but it was nearly impossible to know how many landlines would still be around after ten years.

That doesn’t mean nobody should try to make these predictions. There are huge dollars riding on the future of every telecom industry segment, and companies that invest in these industries want outside opinions on where an industry is headed. If I were developing a new OTT product, as Apple is doing, I’d want some feel for the potential of my new investment, and I’d gather as many different predictions about the future of the OTT market as possible. The above two predictions were announced publicly, but corporations regularly pay for private market assessments that never see the light of day.

To show how hard it is to make such predictions, I want to look a little more closely at the Kagan forecast. They are predicting that in five years there will be 17.5 million more homes that buy broadband but don’t buy a traditional TV product. There are a number of factors and trends that feed into that number:

  • It looks like first-time households of millennials and Generation Z don’t subscribe to cable TV at nearly the same rate as their parents. Some portion of the increase in broadband-only homes will come from these new households.
  • While final numbers are still not in for 2018, it appears that around 2 million homes cut the cord and dropped cable TV last year. Will the future pace of cord cutting be faster, slower or the same? Obviously, predicting the pace of cord cutting is a huge piece of the overall prediction.
  • It’s becoming a lot more complicated for a household to replace traditional cable. It looks like every major owner of content wants to put its unique content into a separate OTT service, as CBS All Access did with the Star Trek franchise. Subscribing to multiple OTT services is already getting expensive and is likely to get even costlier over time. Surveys have shown that households cut the cord to save money, so how will cord cutting be affected if there are no savings from doing so?
  • The big cable companies are creating new video products aimed at keeping subscribers. For instance, Comcast is bundling in Netflix and other OTT products and is also rolling out smaller, cheaper bundles of traditional programming. They are also letting customers view content on any device, so buying a small bundle from Comcast doesn’t feel much different to the consumer than buying Sling TV. What impact will these countermeasures from the cable companies have on cord cutting?

I’m sure there are other factors that go into predicting the number of future homes without traditional cable TV – these are just a few that popped into my mind. I know that companies like Kagan and Parks have detailed current statistics on the industry that are not available to the rest of us. But statistics only take you so far, and anybody looking out past the end of 2019 is entering crystal ball territory. Five years is forever in a market as dynamic as cable TV and OTT content.

We also know from past experience that there will be big changes in these industries that change the paradigm. For example, the content owners might all decide that there is no profit in the OTT market and kill their own OTT products, causing an OTT market contraction. Or a new entrant like Apple might become a major new competitor for Netflix, and demand for OTT services might explode even faster than expected. I don’t know how any prediction can anticipate big market events that might disrupt the whole industry.

Understand that I am not busting on these two predictions – I don’t know enough to have the slightest idea whether they are good or bad. These companies are paid to make their best guess, and I’m glad that there are firms that do that. For example, Cisco has been making annual predictions for many years about the trajectory of broadband usage, and that information is a valuable piece of the puzzle for a network engineer designing a new network. However, predicting how all of the different trends will affect video subscriptions over five years sounds like an unsolvable puzzle. Maybe if I’m still writing this blog five years from now I can check to see how these predictions fared. One thing I know is that I’m not ready to take any five-year forecast of the cable industry to the bank.

The Status of the CAF II Deployments

The Benton Foundation noted last month that both CenturyLink and Frontier have not met all of their milestones for deployment of CAF II. This funding from the FCC is supposed to be used to improve rural broadband to speeds of at least 10/1 Mbps. As of the end of 2018, the CAF II recipients were to have completed upgrades to at least 60% of the customers in each state covered by the funding.

CenturyLink took funding to improve broadband in 33 states covering over 1 million homes and businesses. CenturyLink claims to have met the 60% milestone in twenty-two states but didn’t make the goal in the other eleven: Colorado, Idaho, Kansas, Michigan, Minnesota, Missouri, Montana, Ohio, Oregon, Washington, and Wisconsin.

Frontier received CAF II funding to improve broadband to over 774,000 locations in 29 states. Frontier says it has met the milestone in 27 states but hasn’t reached the 60% deployment milestone in Nebraska and New Mexico. A number of other large telcos took CAF II funding as well, like AT&T, Windstream, and Consolidated, and I have to assume that they’ve reported meeting the 60% milestone.

Back in 2014, when it looked like the CAF II program might be awarded by reverse auction, we helped a number of clients take a look at the CAF II service areas. In many cases, these are large rural areas that cover 50% or more of most of the rural counties in the country. Most of my clients were interested in the CAF II money as a funding mechanism to help pay for rural fiber, but all of the big telcos other than AT&T originally announced that they planned to upgrade existing DSL. AT&T announced a strategy early on to use fixed cellular wireless to satisfy its CAF II requirements. Since then, a few big telcos like Frontier and Windstream have said that they are also using fixed wireless to meet their obligations.

To us, the announcement that the telcos were going to upgrade DSL set off red flags. In a lot of rural counties there are only a small number of towns, and those towns are the only places where the big telcos have DSLAMs (the DSL hubs). Rural telephone exchanges tend to be large, and the vast majority of rural customers have always been far out of range of DSL that originates in the small towns. One only has to go a few miles – barely outside the towns – to see DSL speeds fall off to nothing.

The only way to make DSL work in the CAF II areas is to build fiber to rural locations and establish new DSL hub sites. As any independent telco that deployed DSL the right way can tell you, this is expensive, because it takes a lot of rural DSLAMs to get within range of every customer. By electing DSL upgrades, big telcos like CenturyLink and Frontier essentially agreed to build a dozen or more fiber-fed DSLAMs in each of the rural counties covered by CAF II. My back-of-the-envelope math showed that was going to cost a lot more than what the companies were receiving from the CAF fund. Since I knew these telcos didn’t want to spend their own money in rural America, I predicted execution failures for many of the planned DSL deployments.

I believe the big telcos are now facing a huge dilemma. They’ve reached 60% of customers in many places (but not all). However, it is going to cost two to three times more per home to reach the remaining 40%. The remaining customers are the ones on extremely long copper loops, and DSL is an expensive technology to use for reaching them. A DSLAM built to serve the customers at the ends of these loops might only serve a few customers – and it’s hard to justify the cost of the fiber and electronics needed to reach them.
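An illustrative calculation shows why that dilemma is so stark. The per-home cost below is a placeholder assumption; only the 60/40 split and the two-to-three-times cost multiplier come from the discussion above:

```python
# Illustrative CAF II dilemma: if the last 40% of homes cost 2.5x per
# home what the first 60% cost, finishing the build costs more than
# everything spent so far. The $1,000/home figure is an assumed
# placeholder, not an actual CAF II cost.
homes = 100_000
base_cost_per_home = 1_000  # assumption for illustration

spent_on_first_60 = 0.60 * homes * base_cost_per_home
needed_for_last_40 = 0.40 * homes * base_cost_per_home * 2.5  # 2.5x per home

print(f"Spent on first 60%:  ${spent_on_first_60:,.0f}")   # $60,000,000
print(f"Needed for last 40%: ${needed_for_last_40:,.0f}")  # $100,000,000
```

Under those assumptions, finishing the last 40% of homes costs roughly 1.7 times everything spent so far – which is exactly why the remaining milestones look so daunting.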

I’ve believed from the beginning that the big telcos building DSL for CAF II would cover the low-hanging fruit – the customers who can be reached by deploying a few DSLAMs in a given rural area. If that’s true, then the big telcos aren’t going to spend the money to reach the most remote customers, meaning a huge number of CAF II customers will see zero improvement in broadband. The telcos mostly met their 60% targets by serving the low-hanging fruit; they are going to have a huge challenge meeting the next milestones of 80% and 100%.

Probably because I write this blog, I hear from folks at all levels of the industry about rural broadband. I’ve heard a lot of stories from technicians telling me that some of the big telcos have only tackled the low-hanging fruit in the CAF builds. I’ve heard from others that some telcos aren’t spending more than a fraction of the CAF II money they got from the FCC and are pocketing much of it. And I’ve heard from rural customers who supposedly already got a CAF II upgrade but aren’t seeing speeds improved to the 10/1 threshold.

The CAF II program will be finished soon and I’m already wondering how the telcos are going to report the results to the FCC if they took shortcuts and didn’t make all of the CAF II upgrades. Will they say they’ve covered everybody when some homes saw no improvement? Will they claim 10/1 Mbps speeds when many households were upgraded to something slower? If they come clean, how will the FCC react? Will the FCC try to find the truth or sweep it under the rug?