AT&T Argues for Broadband Reform

Ed Gillespie, the Senior Executive Vice President of External & Legislative Affairs at AT&T, posted a policy position on the AT&T website that argues for major policy reform to bring broadband to low-income homes and rural areas.

It’s hard for any broadband advocate to disagree with the suggestions Mr. Gillespie makes:

  • He wants Congress to finish funding the new FCC mapping program to identify homes without access to broadband.
  • He supports additional broadband grant funding for programs like the $20 billion RDOF grants.
  • He supports Lifeline reform and says that it should be as easy for low-income homes to apply the Lifeline discount as it is to use a SNAP card to buy food.
  • He thinks funding for the Lifeline program should be increased and that the program should be funded by Congress rather than through the current 26% tax on interstate telephony.

I hope AT&T is serious about these proposals, because having the company lobby for these ideas would help move the needle on digital inclusion. It’s just odd to see these positions from AT&T, since the company has spent a lot of effort and dollars arguing against some of these same policies.

Mr. Gillespie complains that a lot of the current $9.25 monthly Lifeline discounts go to MVNOs and other carriers that have not built networks. That’s an ironic argument for AT&T to make, since the company has done its best to walk away from the Lifeline program. AT&T no longer offers Lifeline in 14 states, including AL, AR, FL, IN, KS, KY, LA, MS, NC, NV, SC, TN, and WI. AT&T still participates in Lifeline in 6 states, but only because those states refuse to let the company exit the program.

Of course, this would not be an AT&T policy paper if the company didn’t pat itself on the back a bit. Mr. Gillespie brags that the ISP networks in the country weathered the big increase in web traffic during the pandemic despite predictions that networks would collapse, and he credits AT&T’s performance to light-touch regulation. The fact is, once it was understood that the new traffic was coming during the daytime, when networks weren’t busy, I don’t know of any network engineer who thought the web would collapse. I also wonder why AT&T would claim to have weathered the pandemic well – I would challenge AT&T to bring forward happy AT&T DSL customers and ask for their testimonials on how the network enabled multiple people to work from home at the same time.

Mr. Gillespie is also calling for an expansion of the concepts used in the RDOF grants. Those grants fund new broadband networks in the rural areas with the worst broadband. Before supporting an expansion of that grant program, I think many of us are withholding judgment on the RDOF reverse auction process. While I think it’s likely that there will be beneficial grants given to those willing to build rural fiber networks, I also fear that a huge amount of these grants will be wasted on satellite broadband or other technologies that don’t bring rural broadband in line with urban broadband. I’m not ready to bless that grant program until we see how the reverse auction allocates money. I also can’t help being suspicious that AT&T’s position in favor of more grants reflects a hope to win billions of new grant dollars.

Interestingly, even though he never says it, the reforms Mr. Gillespie is asking for require new broadband regulation. I’m sure Mr. Gillespie realizes that any bills Congress passes to enact these reforms are unlikely to stop at AT&T’s wish list. We are long overdue for a new telecommunications act that brings broadband regulation in line with today’s reality – the last such law was passed when people were flocking to AOL for dial-up access. New telecom legislation is likely to go beyond what AT&T is calling for: it will probably hand some broadband regulating authority back to the FCC and will likely include some version of net neutrality. It’s ironic to see arguments for a stronger FCC when the FCC walked away from regulating broadband at the urging of AT&T and the other giant ISPs. Perhaps even AT&T knows it went too far with deregulation.

Will Congress Fund Rural Broadband?

Members of Congress seem to be competing to sponsor bills that will fund rural broadband. There are so many competing bills that it’s getting hard to keep track of them all. Hopefully, some effort will be made to consolidate the bills into one coherent broadband funding bill.

The latest bill is the Accessible, Affordable Internet for All Act, introduced in the House of Representatives. This is part of a plan to provide $1.5 trillion of infrastructure funding that would include $100 billion for rural broadband. $80 billion of the funding would be used to directly construct rural broadband. It’s worth looking at the details of this bill since it’s similar to some of the other ideas floating around Congress.

The bill focuses on affordability. In addition to funding broadband construction, the bill would:

  • Require ISPs to offer an affordable service plan to every consumer
  • Provide a $50 monthly discount on internet plans for low-income households, and $75 for those on tribal lands
  • Give a preference to networks that will offer open access, to provide more choice to consumers
  • Direct the FCC to collect data on broadband prices and to make that data widely available to other federal agencies, researchers, and public interest groups
  • Direct the Office of Internet Connectivity and Growth to conduct a biennial study to measure the extent to which cost remains a barrier to broadband adoption
  • Provide over $1 billion to establish two new grant programs: the State Digital Equity Capacity Program, an annual grant program for states to create and implement comprehensive digital equity plans to help close gaps in broadband adoption and digital skills, and the Digital Equity Competitive Grant Program, which will promote digital inclusion projects undertaken by individual organizations and local communities
  • Provide $5 billion for the rapid deployment of home internet service or mobile hotspots for students without a home internet connection

This bill also guarantees the right of local governments, public-private partnerships, and cooperatives to deliver broadband service – which would seemingly override the barriers in place today in the 21 states that block municipal broadband, and in the states that don’t allow electric cooperatives to be ISPs.

This and the other bills have some downsides. The biggest downside is the use of a reverse auction. There are two big problems with reverse auctions that the FCC doesn’t seem to want to acknowledge. First, a reverse auction requires the FCC to predetermine the areas that are eligible for grants – and that means relying on its lousy data. Just this month I was working with three different rural counties where the FCC records show the entire county has good broadband because a wireless ISP over-reported its speeds. In one county, a WISP claimed countywide availability of 300 Mbps broadband. In another county, a WISP claimed countywide coverage of 100 Mbps symmetrical broadband when its closest transmitter was a county and several mountain ranges away. Until these kinds of mapping issues are fixed, any FCC auction is going to leave out a lot of areas that should be eligible for grants. The people living in those areas should not suffer due to poor FCC data collection.

Second, there are not enough shovel-ready projects to chase $80 billion in grant funding. If there is no decent ISP ready to build in a predetermined area, the funding is likely to revert to a satellite provider, as happened when Viasat became one of the largest winners in the CAF II reverse auction. The FCC also recently opened the door to allowing rural DSL into the upcoming RDOF grants – a likely giveaway to the big incumbent telcos.

This particular bill has a lot of focus on affordability, and I am a huge fan of getting broadband to everybody. But policymakers have to understand that this comes at a cost. If a grant recipient is going to offer affordable prices, and even lower prices for low-income households, then the amount of grant funding for a given project has to be higher than what we saw with RDOF. There also has to be some kind of permanent funding in place if ISPs are to provide discounts of $50 to $75 for low-income households – that’s not sustainable out of an ISP revenue stream, as the rough arithmetic below shows.
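This is a back-of-envelope sketch with hypothetical enrollment numbers, since the bill doesn’t specify how many households would claim the discount:

```python
# Rough annual cost of the proposed low-income discounts, per million
# enrolled households (the enrollment figure is hypothetical):
for monthly_discount in (50, 75):
    annual_cost = monthly_discount * 12 * 1_000_000
    print(f"${monthly_discount}/month -> ${annual_cost / 1e9:.1f} billion per year per million households")

# $50/month -> $0.6 billion per year; $75/month -> $0.9 billion per year.
# Multiply by a few million eligible households and it's clear that no ISP
# revenue stream can absorb the discounts without ongoing federal funding.
```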

The idea of creating huge numbers of rural open-access networks is also an interesting one. The big problem with this concept is that there are many places in the country where there are few, or even no, local ISPs. Is it really an open-access network if only one ISP – or none – shows up to compete on a rural network?

Another problem with awarding this much money all at once is that there are not enough good construction companies to build this many rural broadband networks in a hurry. In today’s environment, that kind of construction spending would superheat the market and drive up the cost of construction labor by 30-50%. It would be just as hard to find good engineers and good construction managers in an overheated market – $80 billion is a lot of construction projects.

Don’t take my negative comments to mean I am against massive funding for rural broadband. This much money used wisely could solve a giant portion of the rural broadband problem. But if we do it poorly, a lot of the money might as well be poured into a ditch, and many rural communities with poor broadband won’t get a solution. Congress has the right idea, but I hope they don’t dictate how to disburse the money without first talking to rural industry experts, or this will be another federal program with huge amounts of wasted money.

An End to Data Caps?

All of the major ISPs that were enforcing data caps have lifted those caps in response to the COVID-19 crisis. This includes AT&T, Comcast, Cox, Mediacom, and CenturyLink. All of these companies justified data caps as a network management tool to discourage overuse of the network. That argument no longer holds water if the ISPs can eliminate the caps during a crisis that is taxing networks more than we are ever likely to see again.

These companies eliminated the caps as a result of political pressure and a mass public outcry. The caps were eliminated to make broadband more affordable at a time when millions of people are becoming unemployed. By eliminating the caps during this crisis, these ISPs have publicly admitted that the caps were about making money and not about any issue related to network traffic.

The justification these ISPs gave for data caps was always weak, given that other large ISPs like Charter, Verizon, Altice, Frontier, and Windstream never implemented them. A few ISPs on that list, like Frontier and Windstream, have some of the weakest networks in the country yet never sought to implement data caps as a traffic control measure.

AT&T has been the king of data caps. The company has data caps that kick in at as little as 150 gigabytes of monthly usage on DSL lines. AT&T’s fixed wireless product for rural markets has a data cap that kicks in at 250 GB. Interestingly, customers buying the 300 Mbps product on fiber have a 1-terabyte data cap, while customers buying gigabit broadband on fiber get unlimited usage. That also proves the data caps aren’t about traffic control – the caps are removed for the largest data users. The AT&T caps exist to nudge somebody buying 300 Mbps into upgrading to the faster service. AT&T is also the king of overage charges. For DSL and fixed wireless, the overage charge is $10 for each 50 GB, with a maximum monthly overage of a whopping $200. The monthly dollar cap on the 300 Mbps service is $100.
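To see how those terms translate into a monthly bill, here’s a minimal sketch – my own illustration, assuming AT&T bills overages in whole 50 GB increments:

```python
import math

def att_overage_fee(usage_gb, cap_gb, max_fee, fee_per_block=10, block_gb=50):
    """Monthly overage fee: $10 per 50 GB block over the cap, up to a maximum."""
    overage_gb = max(0, usage_gb - cap_gb)
    blocks = math.ceil(overage_gb / block_gb)
    return min(blocks * fee_per_block, max_fee)

# A DSL household with a 150 GB cap using the 2019 average of 344 GB:
print(att_overage_fee(344, 150, max_fee=200))    # $40 (four 50 GB blocks)

# A 300 Mbps fiber household with a 1 TB cap using 1.5 TB:
print(att_overage_fee(1500, 1000, max_fee=100))  # $100 (ten blocks, capped)
```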

Mediacom had the next lowest data caps, starting at 400 gigabytes. Comcast and Cox have had data caps of 1 TB, and customers report that both companies aggressively enforce the caps. CenturyLink has mostly not billed for data cap overages, but the fact that the company eliminated the caps during this crisis likely means it was billing some customers.

To put these data caps in context, OpenVault says that at the end of 2019 the average household used 344 gigabytes of data per month, up from 275 gigabytes a year earlier. More germane to data caps, OpenVault says that nearly 1% of homes now use 2 terabytes of data per month and 7.7% use over 1 terabyte per month, up from 4% a year earlier. AT&T has likely been cleaning up with data cap charges, while Comcast was just starting to see some real revenue from the caps.

What remains to be seen is whether these ISPs will reintroduce data caps later this year. They can no longer make a straight-faced claim that data caps exist to dissuade overuse of the network. If data caps had that kind of impact on networks, then during the crisis the ISPs should have tightened the cap thresholds to protect the many new households working from home or doing schoolwork remotely. The data caps have always been about money, nothing else.

Unfortunately, we have no recourse other than a loud public outcry if these ISPs renew the data caps. The FCC has completely washed its hands of broadband regulation and killed its authority to do anything about data caps. Most tellingly, when FCC Chairman Ajit Pai released a plea to ISPs to “Keep Americans Connected”, that plea didn’t even mention data caps. Chairman Pai asked ISPs not to disconnect customers for not paying and asked ISPs to provide more public hotspots.

I bet that when this crisis is over the big ISPs will quietly reintroduce data caps. Even before this crisis, almost 9% of homes routinely used more than a terabyte of data, and the caps are a huge moneymaker that the big ISPs are not willingly going to give up. During this crisis, a lot of routine functions have gone virtual, and I expect many of them will stay virtual when the crisis is over. It wouldn’t be surprising to see 20% of homes routinely exceeding a terabyte of usage each month a year from now.

I think these ISPs would be making a huge mistake if they reintroduce data caps a few months, or even a year, from now. Millions of people found themselves unable to work or attend school from home due to poor broadband. In the current environment, a big public outcry against bad ISP behavior has a high chance of bringing Congressional action. Almost nobody except the most partisan politicians would vote against a bill that bans data caps – and the ISPs should be afraid of the other restrictions that might come along with such a bill.

The Fragile Supply Chain

The recent outbreak of the coronavirus reminded us how fragile the telecom supply chain is. As it turns out, the Hubei province of China is where much of the world’s supply of optics and lasers is built – the key components in every device used to communicate over a fiber network. Within days after the reality of the virus became apparent, the stocks of tech companies that rely on lasers took a hit.

The supply chain for electronics manufacturing stretches worldwide. The lasers are made in one place. The chips in devices are made somewhere else. Other electronic components come from a third geographic source. Components like cellphone screens and other non-electric components come from yet a different place. And the raw materials to make all of these devices come from markets all over the world.

The virus scare made the world wake up to the fragility of the supply chain. Without lasers, there would be no fiber-to-the-home devices manufactured. There would be no new servers in data centers. There would be no new small cell sites built or activated. Major industries could be brought to their knees within weeks.

It’s not hard to understand why I say the supply chain is fragile. Consider smartphones. There are probably a dozen components in a smartphone that must be delivered on time to a smartphone factory to keep the manufacturing process going. If any one of those components can’t be delivered, smartphone manufacturing comes to a halt. The manufacturing floor can be crippled by a lack of screens just as much as it can suffer if the chips, antennas, or other key electronic components become unavailable.

It’s impossible to know if the coronavirus will cause any major disruption in the supply chain for fiber optics – but the point is that it could. If it’s not a virus today, disruptions could come from a wide range of natural disasters and manmade problems. I remember a fire a few decades ago that destroyed a fiber optic cable factory and created a major shortfall of cable for a year. Floods, fires, earthquakes, and other disasters can knock out key manufacturing sites.

Manmade disruptions to the supply chain are even easier to imagine. We saw the price of electronics components shoot up over the last year due to tariff battles between the US and China. The supply chain can be quickly cut if the country making devices goes to war, or even undergoes an ugly regime change. It’s also now possible to weaponize the supply chain and threaten to cut off key components when negotiating other issues.

I’m sure that very few Americans realized that the Wuhan region has a near-monopoly on the manufacture of lasers. A worldwide economy rewards the creation of monopolies because components are cheapest when an industry takes the most advantage of the economy of scale. The companies in the Wuhan region can likely manufacture lasers cheaper than anybody else.

From a strategic position, countries like the US should foster their own industries to manufacture vital components. But that’s not easy or practical to achieve. A new US company trying to compete on the world stage by making lasers is likely to be more expensive and unable to compete when the supply chain is humming at normal capacity. It’s hard to picture creating a competitor to the Wuhan region that can manufacture lasers in the quantities, and at a price the market is willing to pay.

In the long run, the world always finds alternate solutions to any permanent changes in the supply chain. For example, if China were ever unable to export lasers, within a few years other countries would pick up the slack. But the fiber industry would be devastated during the lull needed to find a new source of components. Bank of America reported last year that 3,000 major manufacturing companies were already reconsidering their supply chains due to tariff and other concerns. Some of these companies, particularly electronics companies, have been considering bringing production back to the US now that factories can be heavily robotized. I’m sure the coronavirus has accelerated those decisions.


Will Costly Alternatives Slow Cord Cutting?

The primary reason households give for cutting the cord is price. Surveys have shown that most households regularly watch only around a dozen cable channels, and cord cutters still want to see their favorite channels. Not all cord cutters are willing to go cold turkey on the traditional cable networks, so they seek out an online alternative that carries the networks they want to watch.

For the last few years, there have been online alternatives that carry the most popular cable networks for prices between $35 and $45 per month. However, during the last year, the cost of these alternatives has risen significantly. I doubt that the price increases will drive people back to the cable companies, with their hidden fees and settop box charges, but the higher prices might make more households hesitate to make the switch. Following are the current prices of the major online alternatives to traditional cable TV:

Hulu Live TV. This service is owned two-thirds by Disney and one-third by Comcast. They recently announced a price increase, effective December 18, that moves the package from $44.99 to $54.99; customers can also select an ad-free version for $60.99. At the beginning of 2019, the service was priced at $39.99, so the price increased by nearly 38% during the year.

AT&T TV Now (formerly DirecTV Now) raised the price of the service earlier this year from $50 to $65. The company also raised prices significantly for satellite-delivered DirecTV and lost millions of customers between the two services.

YouTube TV, owned by Google, raised prices in May from $40 to $50. Along with the price increase, the service added the Discovery Channel.

Sling TV is owned by Dish Networks. It still has the lowest prices for somebody looking for a true skinny package. It offers two line-ups, called Blue and Orange, that each cost $25 per month, or both for $40 per month. There are also add-on packages for $5 per month each: Kids (Nick channels, Disney Jr.), Lifestyle (VH-1, BET, DIY, Hallmark), Heartland (outdoor channels), and Hollywood (TCM, Sundance, Reelz), along with News, Spanish, and International packages. One big thing missing from Sling TV is local network channels – instead, the company provides an HD antenna with a subscription. Sling TV has spread the most popular channels across packages in such a way that customers can easily spend $50 to $60 monthly to get their favorite channels.

Fubo TV is independent and not associated with another big media company. It offers 179 channels, including local network channels, for $54.99 per month. The network got its start with sports coverage, including an emphasis on soccer.

TVision Home is owned by T-Mobile and was formerly known as Layer3 TV. The company has never tried to make this a low-cost alternative, and it’s the online service that comes closest to mimicking traditional cable TV. The service is only available today in a few major markets. Customers can get an introductory price of $90 per month (which rises to $100 after a year). The company charges $10 per extra TV and also bills taxes that range from 4% to 20% depending upon the market. This is cable TV delivered over broadband.

PlayStation Vue. The service is owned by Sony, which has announced that it will cease service at the end of January 2020 and is no longer taking new customers. The price of the core package is $55 per month, after a $5 increase in July. The service carries more sports channels than most of the other services.

The channels offered by each service differ, so customers need to shop carefully and compare lineups. For example, I’m a sports fan, and Sling TV and Fubo TV don’t carry the Big Ten Network. There are similar gaps throughout the lineups of all of the providers.

All of these alternatives, except perhaps TVision Home, are still less expensive than most traditional cable TV packages. However, it looks like all of these services are going to routinely increase rates to cover rising programming fees. Couple that with the fact that customers dropping cable TV probably lose their bundling discounts, and a lot of households are probably still on the fence about cord cutting.

A Peek at AT&T’s Fixed LTE Broadband

Newspaper articles and customer reviews provide a glimpse into the AT&T wireless LTE product being used to satisfy the original CAF II obligations. This article from the Monroe County Reporter describes AT&T wireless broadband in Monroe County, Georgia. This is a county where AT&T accepted over $2.6 million from the original CAF II program to bring broadband to 1,562 rural households.

Monroe is a rural county southeast of Atlanta with Forsyth as the county seat. As you can see by the county map accompanying this blog, AT&T was required to cover a significant portion of the county (the areas shown in green) with broadband of at least 10/1 Mbps. In much of the US, AT&T elected to use the CAF II money to provide faster broadband from cellular towers using LTE technology.

The customer cited in the article is happy with the AT&T broadband product and is getting 30/20 Mbps service. AT&T says the technology works best when serving customers within 2 miles of a cell tower, though coverage can sometimes extend to 3 miles. Unfortunately, 2 or even 3 miles isn’t very far in rural America, and there are going to be a lot of homes in the CAF II service area that are too far from an AT&T cell tower to get broadband.
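To put those distances in perspective, here’s a quick back-of-envelope calculation (my own illustration, not from the article):

```python
import math

# Area covered by a single cell site at the radii AT&T cites:
for radius_miles in (2, 3):
    area_sq_miles = math.pi * radius_miles ** 2
    print(f"{radius_miles} mile radius -> {area_sq_miles:.0f} square miles")

# A 2-mile radius covers ~13 square miles and a 3-mile radius ~28 square
# miles, while a single rural county can span several hundred square miles.
# Blanketing a CAF II area takes far more towers than rural areas have.
```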

From the AT&T website, the pricing for the LTE broadband is as follows. The standalone data product is $70 per month. Customers can get the product for $50 per month with a 1-year contract if they subscribe to DirecTV or an AT&T cellular plan that includes at least 1 GB of cellular broadband allowance. The LTE data product has a tiny data cap of 215 GB of download per month. Customers that exceed the data cap pay $10 for each additional 50 GB of data, up to a maximum fee of $200 per month.

The average household broadband usage was recently reported by OpenVault as 275 GB per month. A household using that average amount would pay an additional $20 monthly – the 60 GB over the cap gets billed as two 50 GB increments. OpenVault also reported recently that the average cord cutter uses over 520 GB per month; a customer using a cord-cutter level of data would pay an additional $70 per month. The product is only affordably priced if a household doesn’t use much broadband.

The article raises a few questions. First, this customer had to call AT&T to get the service, which apparently was not being advertised in the area. He said it took a while to find somebody at AT&T who knew about the LTE broadband product. The customer also said that the installer for the service came from Bainbridge, Georgia – which is a 3-hour drive south from the AT&T cell site mentioned in the article.

This highlights one of the major problems of rural broadband that doesn’t get talked about enough. The big telcos have all had massive layoffs over the last decade, particularly in the workforces supporting copper and rural networks. Even if one of these big telcos offers a rural broadband product, how good is that product without technician support? As I travel the country, I hear routine stories of rural folks who wait weeks to get broadband problems fixed.

When I heard that AT&T was going to use LTE to satisfy its CAF II requirements, my first thought was that the primary benefit to the company was using the federal funding to beef up its rural cellular networks rather than starting to care about rural broadband customers. In Monroe County, AT&T received almost $1,700 per CAF household, and I wonder if those households will all see the benefits of this upgrade.

I’ve always suspected that AT&T wouldn’t aggressively market the LTE broadband product. If the company were heavily marketing it by now, at the end of the fifth year of the CAF II buildout, there would be rural customers all over the country buying upgraded broadband. However, news about upgraded broadband is sparse for AT&T, and also for CenturyLink and Frontier. I work with numerous rural counties where the local government has never heard of CAF II because the telcos have done so little marketing of improved rural broadband.

The article highlights a major aspect of the plight of rural broadband. We not only need to build new rural broadband infrastructure, we also need to replenish the rural workforce of technicians who take care of broadband networks. The FCC needs to stop giving broadband money to the big telcos and instead distribute it to companies willing to staff up to support rural customers.

Counting Gigabit Households

I ran across a website called the Gigabit Monitor that is tracking the population worldwide that has access to gigabit broadband. The website is sponsored by VIAVI Solutions, a manufacturer of network test equipment.

The website claims that over 68.5 million people in the US have access to gigabit broadband, or 21% of the population. That number gets sketchy when you look at the details. The claimed 68.5 million people include 40.3 million served by fiber, 27.2 million served by cable company HFC networks, 822,000 served by cellular, and 233,000 served by WiFi.

Each of those numbers is highly suspect. For example, the fiber numbers don’t include Verizon FiOS or the FiOS properties sold to Frontier. Technically that’s correct, since most FiOS customers can buy maximum broadband speeds in the range of 800-900 Mbps. But there can’t be 40 million people outside of FiOS who can buy gigabit broadband from other fiber providers. I’m also puzzled by the cellular and WiFi categories and can’t imagine there is anybody who can buy a gigabit product of either type.

VIAVI makes similar odd claims for the rest of the world. For example, they say that China has 61.5 million people that can get gigabit service. But that number includes 12.3 million on cellular and 6.2 million on WiFi.

Finally, the website lists the carriers that it believes offer gigabit speeds. I have numerous clients that own FTTH networks that are not listed – I stopped counting after finding 15 of my clients missing from the list.

It’s clear this website is flawed and doesn’t accurately count gigabit-capable people. However, it raises the question of how to count the number of people who have access to gigabit service. Unfortunately, the only way to do that today is by accepting the claims of ISPs. We’ve already seen with the FCC broadband maps how unreliable ISPs are when reporting broadband capabilities.

As I think about each broadband technology there are challenges in defining gigabit-capable customers. The Verizon situation is a great example. It’s not a gigabit product if an ISP caps broadband speeds at something lower than a gigabit – even if the technology can support a gigabit.

There are challenges in counting gigabit-capable customers on cable company networks as well. The cable companies are smart to market all of their products as ‘up to’ speeds because of the shared nature of their networks. The customers in a given neighborhood node share bandwidth, and speeds can drop when the network gets busy. Can you count a household as gigabit-capable if it can only get gigabit speeds at 4:00 AM but gets something slower during the evening hours?
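As a thought experiment, here’s a toy model of a shared node – the capacity and usage numbers are my own illustrative assumptions, not any operator’s actual figures:

```python
def per_home_mbps(node_capacity_mbps, homes_on_node, share_active):
    """Roughly what each active home gets when node bandwidth is shared."""
    active_homes = max(1, round(homes_on_node * share_active))
    return node_capacity_mbps / active_homes

# A hypothetical 5 Gbps node serving 200 homes:
print(per_home_mbps(5000, 200, 0.02))  # 4:00 AM, ~4 homes active -> 1250 Mbps
print(per_home_mbps(5000, 200, 0.30))  # evening peak, 60 homes active -> ~83 Mbps
```

The same node that easily delivers a gigabit in the middle of the night can only deliver a fraction of that at the evening peak – which is exactly why the marketing says ‘up to’.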

It’s going to get even harder to count gigabit capability when there are reliable cellular networks using millimeter wave spectrum. That spectrum is only going to be able to achieve gigabit speeds outdoors, in direct line-of-sight from a nearby cell site. Can you count a technology as gigabit-capable when the service only works outdoors and drops when walking into a building or walking a few hundred feet away from a cell site?

It’s also hard to know how to count apartment buildings. There are a few technologies being used in the US today that bring gigabit speeds to the front of an apartment building. However, by the time the broadband suffers losses from inside wiring and is diluted by sharing among multiple apartments, nobody gets a true gigabit product. But ISPs routinely count these as gigabit customers.

There is also the issue of how to not double-count households that can get gigabit speeds from multiple ISPs. There are urban markets with fiber providers like Google Fiber, Sonic, US Internet, EPB Chattanooga, and others where customers can buy gigabit broadband on fiber and also from the cable company. There are even a few lucky customers in places like Austin, Texas and the research triangle in North Carolina where some homes have three choices of gigabit networks after the telco (AT&T) also built fiber.

I’m not sure we need to put much energy into accurately counting gigabit-capable customers. I think everybody would agree an 850 to 950 Mbps connection on Verizon FiOS is blazingly fast. Certainly, a customer getting over 800 Mbps from a cable company has tremendous broadband capability. Technically such connections are not gigabit connections, but the difference between a gigabit connection and a near-gigabit connection for a household is so negligible as to not practically matter.

Starlink Making a Space Grab

SpaceNews recently reported that Elon Musk and his low-orbit space venture Starlink have filed with the International Telecommunications Union (ITU) to launch an additional 30,000 broadband satellites on top of the 11,927 already in the planning stages. This looks like a land grab – Musk is hoping to grab valuable orbital satellite paths to keep them away from competitors.

The new requests consist of 20 filings requesting to deploy 1,500 satellites each in 20 different orbital bands around the earth. These filings throw down the gauntlet for other planned satellite providers like OneWeb, with plans for 1,910 satellites; Kuiper (Jeff Bezos), with plans for 3,236 satellites; and Samsung, with plans for 4,600 satellites.

The Starlink announcements are likely aimed at stirring up regulators at the ITU, which is meeting at the end of this month to discuss spectrum regulations. The FCC has taken the lead in developing satellite regulations. Earlier this year the FCC established a rule that an operator must deploy satellites on a timely basis to keep the exclusive right to the spectrum needed to communicate with them. Under the current FCC rules, a given deployment must be 50% complete within six years and fully complete within nine years. In September, Starlink revised its launch plans with the FCC in a way that meets the new guidelines, as follows:

           Satellites   Altitude (km)   50% Completion   100% Completion
Phase 1         1,584             550       March 2024        March 2027
                1,600           1,110
                  400           1,130
                  375           1,275
                  450           1,325
Phase 2         2,493             336         Nov 2024          Nov 2027
                2,478             341
                2,547             346
Total          11,927

This is an incredibly aggressive schedule that would require the company to launch 5,902 satellites by November 2024 – 120 satellites per month beginning in November 2019. To date, the company has launched 62 satellites. The company would then need to step launches up to 166 per month to complete the second half on time.

I’m guessing that Starlink is already starting to play the regulatory game. For example, if it can’t meet the launch dates over the US in that time frame, then some of the constellations might not work in the US. If the company eventually launches all of the satellites it has announced, then not every satellite would need to serve customers everywhere. If the ITU adopts a timeline similar to the US one, it’s likely that other countries won’t award spectrum to every one of the Starlink constellations – Starlink will be happy if each country gives it enough spectrum to be effective there. Starlink’s strategy might be to flood the sky with so many satellites that it can provide service anywhere, as long as at least a few of its constellations are awarded spectrum in each country. There are likely to be countries, like North Korea and perhaps China, that won’t allow any connections with satellite constellations that bypass their web firewalls.

Starlink faces an additional challenge with many of the planned launches. Any satellite orbiting below 340 kilometers (211 miles) is considered very low earth orbit (VLEO), since there is still enough of the earth’s atmosphere at that altitude to cause drag that eventually degrades a satellite’s orbit. Anything deployed at VLEO heights will have a shorter-than-normal life, and the company has not explained how it plans to maintain satellites at those altitudes.

At this early stage of satellite deployment, there is no way to know if Starlink is at all serious about wanting to launch 42,000 satellites. This may just be a strategy to get more favorable regulatory rules. If Starlink is serious about this, you can expect other providers to speed up plans to avoid being locked out of orbital paths. We’re about to see an interesting space race.

The Myth of 5G and Driverless Cars

A colleague sent me an article, published earlier this year in MachineDesign magazine, that predicts driverless cars can’t be realized until we have a ubiquitous 5G network. When looking for the original article on the web, I noticed numerous similar articles, like this one in Forbes, that share the same opinion.

These articles and others like them predict that high-bandwidth, low-latency 5G networks are only a few years away. I’m not quite sure who these folks think will invest the $100 billion or more that would likely be required to build such a wireless network along all of the roads in the country. None of the cellular carriers have such grandiose plans, and if they did, their stockholders would likely replace a management team that suggested such an investment.

It’s easy to understand how this myth got started. When 5G was first discussed, the cellular companies listed self-driving cars as one of the reasons the government should support 5G. However, over time they’ve all dropped this application from their 5G message and it’s no longer a cellular company talking point.

The idea that 5G is needed for self-driving cars is bolstered by the belief that the computing power of a data center is needed to process the massive amounts of data generated by a self-driving car. That very well may be true, and the current versions of self-driving cars are essentially data centers on wheels that contain several fast computers.

The belief that 5G will enable self-driving cars also comes from the promise of low latency, near to that of a direct fiber connection. The folks who wrote these articles envision a massive two-way data transfer happening constantly over 5G for every self-driving car. I can’t imagine they have ever talked to a network engineer about the challenge of maintaining two-way wireless gigabit connections with hundreds of moving cars simultaneously on a freeway at rush hour. It’s hard to envision the small cell and fiber infrastructure needed to handle that without hiccups. I also don’t know if the authors have driven down many rural roads recently to remind themselves of the huge challenge of implementing rural gigabit 5G.
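A quick back-of-envelope calculation shows the scale of the problem – all of the traffic assumptions here are mine, purely for illustration:

```python
# Aggregate demand on a single freeway cell site if every car needed the
# gigabit connection these articles envision (numbers are hypothetical):
cars_in_cell = 200     # rush-hour traffic within one small cell's reach
per_car_mbps = 1000    # a two-way gigabit link per vehicle
total_gbps = cars_in_cell * per_car_mbps / 1000
print(f"Sustained demand: {total_gbps:.0f} Gbps per cell site")  # 200 Gbps

# Every such cell would also need fiber backhaul sized to match, replicated
# every few thousand feet along every freeway in the country.
```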

The talk of using wireless for vehicles also ignores some fundamental issues. Wireless technologies are wonky in the real world. Radio waves do odd things in the wild and every wireless network has dead zones and places where the system mysteriously won’t work the way it’s supposed to. Worse, the dead spots and odd spots move around with changes in temperature, humidity, and precipitation.

Network engineers would also advise that, for a critical task like driving at high speed, every vehicle should have a redundant backup connection – a second wireless connection in case the first one has a problem. Anybody that puts critical tasks on a fiber network invests in such redundancy. Hospitals that use broadband as part of a surgical procedure or factories that do precision manufacturing will have a second fiber connection to be safe. It’s hard to imagine a redundant connection for a moving car, since the only place it can come from is the same nearby cell sites that provide the primary connection.

I don’t know how others feel about this, but I’m not about to trust my life to a self-driving car that needs a connection to an external data center to be safe. I know too much about how broadband networks function to believe that 5G networks will somehow always make perfect connections when even fiber networks don’t.

One of the first things that came to my mind when I read these articles was to wonder what happens when there is a fiber outage on the network supporting the 5G cell sites. Do all of the self-driving cars just stop and wait for a broadband signal? I picture a city during an event like the 19-hour CenturyLink fiber outage a year ago and wonder if we are so stupid as to make our transportation systems reliant on external computing and external networks. I sure hope that we are not that dumb.

Looking Back at Looking Forward

I find it interesting to sometimes look backward a few years to see what predictions were made about the future of the telecom industry. Five years ago I went to an NTCA conference where several speakers made predictions about the industry, particularly as it would impact rural America. It’s interesting to look at what was predicted about today just a few years ago. Some predictions were dead-on and others fizzled. Following are some of the more interesting misses.

Broadband Technologies. There were predictions that by 2020 we’d see upgrades to G.Fast in rural copper networks and to next-generation PON equipment for fiber deployments. Neither happened, for various reasons. US telcos have never embraced G.Fast, although there is widespread adoption in Europe, where copper networks are delivering 300 Mbps speeds to customers. The big telcos in the US are making no investments in rural copper unless the FCC fully funds them. Many smaller telcos have taken advantage of changes in the Universal Service Fund to upgrade from copper to fiber rather than upgrade DSL. Next-generation PON electronics are still waiting for one big ISP to buy enough gear to drive down prices.

Widespread 5G. It’s not hard to understand why this would have been believed in 2014, since the big carriers were already in hype mode even then. One prediction was that as many as 60% of cellphones would be 5G-capable by 2020. There were several predictions that 5G was going to enable autonomous vehicles and that building fiber along highways would be routine by 2020. There was also a prediction that we’d see small cells everywhere, with deployments every 3,000 feet.

The timing of 5G is far behind those predictions. I see where Cisco recently estimated that only 3% of cellphones worldwide would be 5G enabled by 2022. Most experts today believe that the cellular networks will still predominantly rely on 4G LTE even a decade from today. The idea of building a cellular network for autonomous vehicles died – it was always hard to imagine the revenue stream that would have supported that network. We may still get to a dense small cell network someday, but calling for a small cell every 3,000 feet still sounds incredibly aggressive even decades from now.

IoT and LPWAN. There was a prediction that by 2020 we’d have deployed low-bandwidth networks using 900 MHz spectrum to connect huge numbers of outdoor IoT sensors, with a huge revenue opportunity in charging $1 monthly for each sensor. There are still those calling for these networks today, but the idea still hasn’t gained widespread traction.

Widespread Adoption of Augmented and Virtual Reality. Those technologies were on everybody’s future list in 2014. Oculus Rift was the leader in developing virtual reality, and Magic Leap had raised several rounds of funding to develop augmented reality. There is now a sizable gaming market for virtual reality, but the technology has not yet touched the average person or moved beyond gaming. Magic Leap finally started selling a developer headset at the end of last year.

We Should Be Overrun by Now with Robots and Drones. In 2014 there was a prediction of robots everywhere by 2020. New factories are manned today by robots, but robots are still news when they are used in a public-facing function. A few hotels are trying out a robot concierge. There are a few automated fast food restaurants. There are a few hospitals with robots that transport meals and medicines. Robots deliver take-out food in a few city centers and university towns.

Drones are quietly being used for functions like mapping and inspecting storm damage. Flying small drones is now a popular hobby. Amazon keeps experimenting with drone delivery of packages but it’s still in the trial stage. Commercial use of drones is still in its infancy.

Use of Data. My favorite prediction was that by 2020 we’d have software systems that can deliver data at the right place, at the right time, to the right person, on the right device. This harkens back to the old AT&T promise that someday we’d be able to watch any movie we wanted, the minute we wanted. To some degree that old promise came to pass, although it was implemented by somebody other than AT&T.

Some businesses are meeting parts of this prediction today. These are custom platforms that send trouble tickets to technicians, notify employees to connect a new customer, automate ordering of inventory, etc. However, nothing close to that promise has yet made it into our everyday lives. In fact, except for Candy Crush most of us probably still have the same apps on our smartphones we used in 2014. Many of us are still waiting for the digital assistant we were first promised a decade ago.

Got Some Things Right. It’s easy to pick on predictions that never came to pass, and I’ve made plenty of those myself. There were also some great predictions in 2014. One presenter said we’d continue to see explosive growth of residential data usage, growing at 24% per year – a prediction that is still dead-on. There was a prediction that businesses would migrate employees to mobile devices, and it is routine today to see employees in all sorts of businesses operating from a tablet. There was a prediction of explosive growth in machine-to-machine data traffic, and today this is one of the areas of fastest traffic growth.