The Slow Death of Satellite TV?

There have been rumors for years about merging Dish Network and DirecTV to gain as much market synergy as possible for the two sinking businesses. It’s hard to label these companies as failures just yet because the two companies collectively still had 21.8 million customers at the end of 2020 (DirecTV 13.0 million, Dish 8.8 million). That makes the two companies collectively the largest provider of traditional pay TV, ahead of Comcast at 19.8 million and Charter at 16.2 million.

But both companies have been bleeding customers over the last few years. In 2020, DirecTV lost over 3 million customers and Dish Network lost nearly 600,000. Together, the two companies lost 14% of their customers in 2020. This is not unusual in an industry where Comcast lost 1.4 million cable customers during the same year.
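For readers who like to check the math, here’s a quick back-of-the-envelope calculation using the subscriber counts cited above (a minimal sketch; the exact year-end figures vary a bit by source):

```python
# Back-of-the-envelope check of the combined 2020 subscriber losses,
# using the approximate year-end counts cited above.
directv_end, dish_end = 13.0e6, 8.8e6      # subscribers at the end of 2020
directv_lost, dish_lost = 3.0e6, 0.6e6     # subscribers lost during 2020

start_of_2020 = directv_end + dish_end + directv_lost + dish_lost
combined_loss = (directv_lost + dish_lost) / start_of_2020
print(f"Combined 2020 loss: {combined_loss:.0%}")   # roughly 14%
```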

Dish Network CEO Charlie Ergen has been predicting for years that a merger of the two companies is inevitable. The two companies could save money on infrastructure and overhead to prop up the combined business.

There are a number of factors that make a merger complicated. AT&T divested 30% of DirecTV to TPG Capital earlier this year. That deal included the TV services offered by DirecTV, U-verse, and AT&T TV.

Probably the biggest long-term trend that bodes poorly for satellite TV is the federal government’s push to bring better broadband to rural America. Selling TV to customers with poor broadband is still the sweet spot for the two companies. As the number of homes with good broadband rises, the prospects for satellite TV sink.

My firm has been doing community surveys for twenty years and we’ve noticed a big change in satellite TV penetration. A decade ago, I expected to find a 15% market share for satellite TV in almost any town that we surveyed. But in the last few years, people in towns appear to be the ones who have bailed on satellite TV. It’s now rare for us to find more than a few percent of households in towns still buying satellite TV. Households have moved to the web to find video content, with the big losers being satellite TV and landline cable companies.

I notice the same thing when traveling around the country. It used to be that you’d see satellite dishes peppered through every neighborhood, but dishes are becoming a rarity. I know from walking in my neighborhood that only one house still has satellite TV. Just a few years ago there were many more.

Finally, these two companies are both saddled with the ever-increasing programming costs that have plagued the whole industry. Cable customers everywhere have rate fatigue as prices are increased every year to account for higher programming costs. Satellite TV is like the rest of the industry and is pricing itself out of the budget range of the average household.

The two companies are also each saddled with a lot of current debt. Craig Moffett of MoffettNathanson recently estimated that the combined companies might not have a valuation of more than $1 billion – a bad harbinger for a merger.

It’s hard to picture any investor group that would want to back this merger. The whole idea behind a merger is that the combined company is worth more than the individual pieces. But even if the combined satellite companies were able to cut costs with a merger, it seems likely that any savings would quickly get subsumed by continued customer losses.

It’s not unrealistic to think that a decade from now this industry will have disappeared. Maybe the companies can hang on longer even as the number of customers continues to drop – but the math of doing so doesn’t bode well. The end of the satellite TV industry would feel odd to me. I witnessed the meteoric growth of the industry and watched satellite dishes popping up everywhere in the US. Satellite TV could fall into the category of huge tech industries that popped into existence, grew, and then died within our adult lifetimes. I’m betting that we’re not far off from the day when kids will have no idea what a satellite dish is, just as they now stare perplexed at dial telephones.

To 5.5G and Beyond

I recently saw an article in FierceWireless reporting that Huawei thinks we are going to need an intermediate step between 5G and 6G, something like 5.5G. To me, this raises the more immediate question of why we are not talking about the steps between 4G and 5G.

The wireless industry used to tell the truth about cellular technology. You don’t need to take my word for it – search Google for 3.5G and you’ll find mountains of articles from 2010 to 2015 that talked about 3.5G as an important intermediate step between 3G and 4G. It was clearly understood that it would take a decade to implement all of the specifications that defined 4G, and industry experts, manufacturers, and engineers regularly debated the level of 4G implementation. Few people realize that we didn’t have the first fully 4G-compliant cell site until late 2018. Up until then, everything that was called 4G was something a little less than 4G. Interestingly, we debated the difference between 3.1G and 3.2G, but once the industry hit what might be considered 3.5G, the chatter stopped, and the industry leaped to labeling everything as 4G.

That same industry hype that didn’t want to talk about 3.8G has remained intact, and somehow, magically, we leaped to calling the next generation of technology 5G before even one of the new 5G technologies had been implemented in the network. All we’ve done so far is layer new spectrum bands onto 4G and label that as 5G. These new spectrum bands require phones that can receive the new frequencies, which phone manufacturers gleefully label as 5G phones. I’m not convinced that we are even at 4.1G yet, and still the industry has fully endorsed labeling the first baby steps toward 5G as if we have full 5G.

I have to laugh when I see articles already talking about what comes next after 5G. It’s like already picking the best marketing names for the self-driving hovercars that will replace regular self-driving cars. We are only partway down the path of implementing self-driving cars that people are ready to buy and trust. The government wouldn’t let a car manufacturer falsely declare it has a fully self-driving car – but we seem to have no problem allowing cellular companies to proclaim 5G technology that doesn’t yet exist.

Back to the article about 5.5G. Huawei suggests that 5.5G would be 10 times faster than the current 5G specification and have lower latency. Unfortunately for this suggestion, we just suffered through a whole year of Verizon TV ads showing cellphones achieving gigabit-plus speeds. It’s almost as if Huawei hasn’t seen the Verizon commercials and doesn’t know that the US already has 5.5G. I’m thrilled to be the first to report that the US has already won the 5.5G race!

But it’s also somewhat ludicrous to be talking about 5.5G as an intermediate step on the way to 6G. The next generation of wireless technology we’re labeling as 6G will use terahertz spectrum. The wavelengths of those frequencies are so small that a terahertz signal beamed from a cellular tower will dissipate before it hits the ground. Even so, the technology holds a lot of promise for providing extremely high bandwidth for indoor communications. But faster 5G is not an intermediate spot between today’s cellular technology and terahertz-based technology.

Interestingly, there could have been an intermediate step. We still have a long way to go to harness millimeter-wave spectrum in the wild. These frequencies require pure line-of-sight and pass through virtually nothing. I would expect over the next decade or two that lab scientists will find much better ways to propagate and use millimeter-wave spectrum.

But the cellular industry already claims it has solved all of the issues with millimeter-wave spectrum and already claims it as part of today’s 5G solution. It’s going to be anticlimactic when scientists announce breakthroughs in using millimeter-wave spectrum in ways the cellular industry has already been claiming. Using millimeter-wave spectrum to its fullest capability could have been 5.5G. I can’t wait to see what the industry claims instead.

Reporting the Broadband Floor

I want to start by giving a big thanks to Deb Socia for today’s blog. I recently wrote a blog about the upcoming public reporting process for the FCC maps. In that blog, I noted that ISPs are going to be able to continue to report marketing speeds in the new FCC mapping. An ISP that delivers only 3 Mbps download will still be able to report broadband speeds of 25/3 Mbps as long as that is the speed it markets to the public. This practice of allowing marketing speeds that are far faster than actual speeds has resulted in a massive overstatement of broadband availability. This is the number one reason why the FCC badly undercounts the number of homes that can’t get broadband. The FCC literally encourages ISPs to overstate the broadband product being delivered.

In my Twitter feed for this blog, Deb posted a brilliant suggestion: “ISPs need to identify the floor instead of the potential ceiling. Instead of ‘up to’ speeds, how about we say ‘at least’.”

This simple change would force some honesty into FCC reporting. This idea makes sense for many reasons. We have to stop pretending that every home receives the same broadband speed. The speed delivered to customers by many broadband technologies varies with distance. Telco DSL gets noticeably slower the farther the signal has to travel. The fixed wireless broadband delivered by WISPs loses speed with distance from the transmitting tower. The fixed cellular broadband that the big cellular companies are now pushing has the same characteristic – speeds drop quickly with distance from the cellular tower.

It’s a real challenge for an ISP using any of these technologies to pick a representative speed to advertise to customers – but customers want to know a speed number. DSL may be able to deliver 25/3 Mbps to a home that’s within a quarter-mile of a rural DSLAM, but a customer eight miles away might be lucky to see 1 Mbps. A WISP might be able to deliver 100 Mbps download speeds within the first mile from a tower, but the WISP might be willing to sell to a home that’s 10 miles away and deliver 3 Mbps for the same price. The same is true for the fixed cellular data plans recently being pushed by AT&T, Verizon, and T-Mobile. Customers who live close to a cell tower might see 50 Mbps broadband, but customers farther away are going to see a tiny fraction of that number.
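To make the distance problem concrete, here’s a tiny illustrative model of how delivered speed falls off with distance from a DSLAM or tower. The decay rate is a made-up placeholder, not real propagation engineering, but it shows why a single advertised number can’t describe every customer:

```python
# Purely illustrative model of why one advertised speed misleads: delivered
# speed falls off with distance from the DSLAM or tower. The "half-distance"
# decay below is a made-up placeholder, not real engineering data.
def delivered_mbps(peak_mbps: float, miles: float, half_distance_miles: float) -> float:
    """Speed that roughly halves every `half_distance_miles` from the source."""
    return peak_mbps * 0.5 ** (miles / half_distance_miles)

for miles in (0.25, 2, 5, 10):
    print(f"{miles:>5} miles: {delivered_mbps(100, miles, 2.0):6.1f} Mbps")
```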

The ISPs all know the limitations of their technologies, but the FCC has never tried to acknowledge how technologies behave in real markets. The FCC mapping rules treat each of these technologies as if the speed is the same for every customer. Any mapping system that doesn’t recognize the distance issue is mostly going to be a huge fiction.

Deb suggests that ISPs must report the slowest speed they are likely to deliver. I want to be fair to ISPs, so I suggest they report both the minimum “at least” speed and the maximum “up to” speed. Those two numbers will tell the right story to the public because together they provide the range of speeds being delivered in a given Census block. With the FCC’s new portal for customer input, the public could weigh in on the “at least” speeds. If a customer is receiving speeds slower than the “at least” speed, then, after investigation, the ISP would be required to lower that number in its reporting.
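Here’s a minimal sketch of how that dual reporting might work, with a hypothetical record holding the two speeds and a challenge rule that lowers the floor when a verified customer measurement comes in below it. The field names and the adjustment rule are my own illustration, not anything the FCC has proposed:

```python
# Hypothetical sketch of dual speed reporting with a consumer challenge.
# Field names and the adjustment rule are illustrative, not FCC rules.
from dataclasses import dataclass

@dataclass
class SpeedReport:
    at_least_mbps: float   # floor the ISP commits to in the area
    up_to_mbps: float      # marketing ceiling

    def apply_challenge(self, measured_mbps: float) -> None:
        # If a verified customer measurement falls below the floor,
        # the ISP would have to lower its reported "at least" speed.
        if measured_mbps < self.at_least_mbps:
            self.at_least_mbps = measured_mbps

report = SpeedReport(at_least_mbps=10, up_to_mbps=100)
report.apply_challenge(3)       # a verified test shows only 3 Mbps
print(report)                   # the "at least" speed drops to 3 Mbps
```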

This dual reporting will also allow quality ISPs to distinguish themselves from ISPs that cut corners. If a WISP only sells service to customers within 5 or 6 miles of a transmitter, then the difference between its “at least” speeds and its “up to” speeds would be small. But if another WISP is willing to sell a crappy broadband product a dozen miles from the transmitter, there would be a big difference between its two numbers. If this is reported honestly, the public will be able to distinguish between these two WISPs.

This dual reporting of speeds would also highlight the great technologies – a fiber network is going to have a gigabit “at least” and “up to” speed. This dual reporting will end the argument that fixed wireless is a pure substitute for fiber – which it clearly is not. Let the two speeds tell the real story for every ISP in the place of marketing hype.

I’ve been trying for years to find a way to make the FCC broadband maps meaningful. I think this is it. I’ve never asked this before, but everybody should forward this blog to the FCC Commissioners and politicians. This is an idea that can bring some meaningful honesty into the FCC broadband maps.

Controlling Fiber Construction Costs

It’s obvious with all of the grant money coming downhill from the federal government that there is going to be a lot of fiber constructed over the next year or two, and much of it by municipalities or other entities that have not built fiber before. Today’s blog talks about issues that can increase the cost of building fiber – an important topic since cost overruns could be devastating to an entity that is largely funded with grants.

I think everybody knows of cases where funding for infrastructure has gone off the rails, with the final cost of a project being much higher than what was originally funded. I can remember when I last lived near DC and watched the cost of a new Beltway bridge over the Potomac come in at more than twice the original estimate. I can remember instances of big cost overruns for infrastructure like schools and roads. Cost overruns can just as easily happen on fiber projects.

The number one issue facing the whole industry right now is shortages in the supply chain. I have clients seeing relatively long delivery times for fiber and fiber electronics. New entities that have never built fiber are going to go to the end of the line for receiving fiber. To the extent that grant-funded projects come with a mandated completion date, this is going to be an issue for some projects.

But more importantly, labor-related costs for building fiber are going to rise (and have already started to do so). With a huge volume of new projects, there will be a big shortage of consultants, engineers, and construction contractors. As always happens in times of high demand, labor rates are going to rise – and that’s assuming you can even find somebody to work on a small project. One of the hidden facts of the industry is that very few construction companies build 100% with their own staff; most rely heavily on subcontractors. Those subcontractors are going to be bid away from small projects by more lucrative work on big projects. Even ISPs that build with their own crews are going to see staff lured away by higher pay rates. If you estimated the cost of building fiber a few years ago, the labor component of those estimates is now too low. Another issue to consider is that some grants require paying labor at prevailing wages, which means at metropolitan rates. This alone can add 15% or more to the cost of a rural fiber project.

The biggest crunch will be consultants and engineers who work for smaller projects. I’m in this category. There are only a handful of good consultants and engineers and we’re already seeing that we are going to be swamped and fully booked before this year is over. Don’t be surprised if you hear that your preferred vendors are not taking on new business.

The other big gotcha in fiber construction projects is change orders. A change order is any event that gives a construction contractor a chance to charge more than the originally proposed cost of construction. Using the example of the bridge that went over budget – most of the extra cost came through change orders.

There are construction firms that bid low for projects with the expectation that they’ll make a lot more from change orders. You want to interview other communities that used the contractors you are considering. But a lot of change order costs can be laid at the feet of the project owner. It’s not unusual to see a project go out to bid that is not fully engineered and thought through. Changing your mind on almost any aspect of a project can mean extra costs and cost overruns. Here are just a few examples of situations I have seen on projects that added to the costs:

  • After the first neighborhood of a project was built, the client decided that they didn’t like fiber pedestals and wanted everything put into buried handholes. That meant ripping and replacing what had already been built and completely swapping inventory.
  • A contractor ran into a big underground boulder that was incredibly difficult to bore through. This was a city network, and the city would not allow an exception to build shallower only at this boulder and insisted on boring through it – at a huge, unexpected cost.
  • I worked on a project where the original specification was to build past every home and business in the community. Once construction was started the client decided to build fiber to every street, including the ones with no current buildings. That’s a valid decision to make, but it added a lot to construction costs.

I could write a week’s worth of blogs listing situations that added to construction costs. The bottom line for almost all of these issues is that the fiber builder needs to know what it wants before a project starts. There should be at least preliminary engineering that closely estimates the cost of construction before starting. Project owners also need to be flexible if the contractor points out opportunities to save costs. But my observation is that a lot of change orders and cost overruns come from network owners that don’t know what they want before construction starts.

Rural Redundancy

This short article details how a burning tree cut off fiber optic access for six small towns in western Massachusetts: Ashfield, Colrain, Cummington, Heath, Plainfield, and Rowe. I’m not writing about this today because this fiber cut was extraordinary, but because it’s unfortunately very ordinary. There are fiber cuts every day that isolate communities by cutting Internet access.

It’s not hard to understand why this happens in rural America. In much of the country, the fiber backbone lines that support Internet access to rural towns use the same routes that were built years ago to support telephone service. The telephone network was configured as a hub and spoke, and all of the towns in a region had a single fiber line into a central tandem switch that was the historic focal point for regional telephone switching.

Unfortunately, a hub-and-spoke network (which resembles the spokes of a wagon wheel) does not have any redundancy. Each little town or cluster of towns typically had a single path to reach the telephone tandem – and today, to reach the Internet.

The problem is that an outage that historically would have interrupted telephone service now interrupts broadband. This one cut in Massachusetts is a perfect example of how reliant we’ve become on broadband. Many businesses shut down completely without broadband. Businesses take orders and connect with customers in the cloud. Credit card processing happens remotely in the cloud. Businesses are often connected to distant corporate servers that provide everything from software connectivity to voice over IP. A broadband outage cuts off students taking classes from home and adults working from home. An Internet outage cripples most work-from-home people who work for distant employers. A fiber cut in a rural area can also cripple cell service if the cellular carriers use the same fiber routes.

The bad news is that nobody is trying to fix the problem. The existing rural fiber routes are likely owned by the incumbent telephone companies, and they are not interested in spending money to create redundancy. Redundancy in the fiber world means having a second fiber route into an area so that the Internet doesn’t go dead if the primary fiber is cut. One of the easiest ways to picture a redundant solution is a ring of fiber that would be equivalent to the rim of the wagon wheel. This fiber would connect all of the spokes and provide an alternate route for Internet traffic.

To make things worse, the fiber lines reaching into rural America are aging. These were some of the earliest fiber routes built in the US, and fiber built in the 1980s was not functionally as good as modern fiber. Some of these fibers are already starting to die. We’re going to be faced eventually with the scenario of fiber lines like the one referenced in this article dying, and possibly not being replaced. A telco could use a dying fiber line as a reason to finally walk away from obsolete copper DSL in a region and refuse to repair a dying fiber line. That could isolate small communities for months or even a few years until somebody found the funding to replace the fiber route.

There have been regions that have tackled the redundancy issue. I wrote a blog last year about Project Thor in northwest Colorado where communities banded together to create the needed redundant fiber routes. These communities immediately connected critical infrastructure like hospitals to the redundant fiber and over time will move to protect more and more Internet traffic in the communities from routine and crippling fiber cuts.

This is a problem that communities are going to have to solve on their own. This is not made easier by the current fixation of only using grants to build last-mile connectivity and not middle-mile fiber. All of the last mile fiber in the world is useless if a community can’t reach the Internet.

Big Funding for Libraries

(Photo: North Asheville Library)

The $1.9 trillion American Rescue Plan Act (ARPA) includes a lot of interesting pockets of funding that are easy to miss due to the breadth of the Act. The Act quietly allocates significant funding to public libraries, which have been hit hard during the pandemic.

The ARPA first allocates $200 million to the Institute of Museum and Library Services. This is an independent federal agency that provides grant funding for libraries and museums. $178 million of the $200 million will be distributed through the states to libraries. Each state is guaranteed to get at least $2 million, with the rest distributed based upon population. This is by far the largest federal grant ever made directly for libraries.
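As a rough illustration of how that split could work, assuming a $2 million floor for each of the 50 states with the remainder distributed by population share (the actual statutory formula and the list of eligible jurisdictions may differ):

```python
# Rough illustration of the $178M library distribution described above:
# a $2M floor per state, with the remainder split by population share.
# The actual statutory formula and eligible jurisdictions may differ.
TOTAL = 178e6
NUM_STATES = 50                    # ignoring DC and the territories for simplicity
floor_total = 2e6 * NUM_STATES     # $100M in guaranteed minimums
remainder = TOTAL - floor_total    # about $78M distributed by population

pop_share = 0.12                   # e.g., a state with 12% of the US population
allocation = 2e6 + pop_share * remainder
print(f"Example state allocation: ${allocation / 1e6:.1f}M")   # about $11.4M
```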

Libraries are also eligible to apply to the $7.172 billion Emergency Connectivity Fund that the ARPA is funding through the FCC’s E-Rate program. This program can be used to pay for hotspots, modems, routers, laptops, and other devices that can be lent to students and library patrons to provide broadband.

The ARPA also includes $360 billion in funding that will go 60% to states and 40% directly to local governments and tribal governments. Among other things, this funding is aimed at offsetting cuts made during the pandemic to public health, safety, education, and library programs.

There is another $130 billion aimed at offsetting the costs associated with reopening K-12 schools, to be used for hiring staff, reducing class sizes, and addressing student needs. The funds can also be invested in technology support for distance learning, including 20% that can be used to address learning loss during the pandemic. This funding will flow through the Department of Education and be allocated using the Title I formula, which supports schools based upon the level of poverty.

Another $135 million will flow through the National Endowment for the Arts and Humanities to support state and regional arts and humanities agencies. At least 60% of this funding is designated for grants to libraries.

There is also tangential funding that could support libraries. This includes $39 billion for Child Care and Development Block Grants and Stabilization Fund plus $1 billion for Head Start that might involve partnerships with schools and libraries. There is also $9.1 billion to states and $21.9 billion for local programs for afterschool and summer programs to help students catch back up from what was a lost school year for many.

It’s good to see this funding flow to libraries. Many people may not understand the role that libraries play in many communities as the provider of broadband and technology for people who can’t afford home broadband. Libraries have struggled to maintain this role through the pandemic and the restrictions that kept patrons out of library buildings. Libraries in many communities have become the focal point for distributing broadband devices during the pandemic.

One of the lessons that the pandemic has taught us is that we need to connect everybody to broadband. As hard as the pandemic has been on everybody, it’s been particularly hard on those that couldn’t connect during the pandemic. This continues today as many states have established vaccine portals completely online.

Communities everywhere owe a big thanks to librarians for the work they’ve done in the last year to keep our communities connected. When you get a chance, give an elbow bump to your local librarian.

Investing in Rural Broadband

There was a headline in a recent FierceTelecom article that I thought I’d never see – Jefferies analyst says rural broadband market is ripe for investment. In the article, analyst George Notter is quoted talking about how hot rural broadband is as an investment. He cites the large companies that have been making noise about investing in rural broadband.

Of course, that investment relies on getting significant rural grants. We’ve seen the likes of Charter, Frontier, CenturyLink, Windstream, and others win grants in the recent RDOF reverse auction. I have municipal clients who are having serious discussions with other large incumbents about partnering – when these incumbents wouldn’t return a call a year ago. It’s amazing how billions of dollars of federal grants can quickly change the market. Practically every large carrier in the country is looking at the upcoming broadband grants as a one-time opportunity to build broadband networks cheaply.

This is a seismic change for the industry. Dozens of subdivisions with lousy broadband have contacted me over the years wondering how to get the interest of the nearby cable incumbent. We’ve just gone through a decade in which the cable companies have barely expanded their footprints. In many cases the reluctance of a cable company to build only a few miles of fiber to reach a community of several hundred homes has been puzzling – these subdivisions often look like a good business opportunity to me. The first carrier to build broadband in such areas is likely to get 70% to 90% of the households as customers almost immediately.

The analyst mentioned the newly found interest in rural broadband from the cellular carriers. It’s been a mystery for me over the last decade why AT&T, Verizon, and others didn’t take advantage of rural cellular towers to get new broadband customers. There are a lot of places in rural America where cellular broadband has been superior to rural DSL and satellite broadband. It’s odd to finally see these carriers want to build now, at a time when people are hoping for technologies that are faster than cellular broadband. The cellular carriers instead have poisoned the rural market by selling cellular hotspot plans with tiny data caps. I heard numerous stories during the pandemic of families spending $500 to $1,000 per month on a hotspot – with the alternative being throttled to dial-up speeds after hitting the small data caps. These customers are never going back to the cellular carriers if they get a different option.

Some of the sudden expansion by the big companies mystifies me. For example, Charter won $1.2 billion in the RDOF to expand into rural areas. The company is matching this with $3.8 billion of its own money, which means Charter is building rural broadband with a 24% federal grant. I’ve studied some of these same grant areas and I couldn’t see a way to build in these rural communities without grants covering at least 50% of the cost of construction. The RDOF might make sense when Charter is building in areas that are directly adjacent to an existing market. But Charter took grants in counties where it doesn’t have an existing customer. This makes me wonder how much the company is eventually going to like what it has bitten off. I’m betting we won’t see articles talking about rural investment opportunities after a few big ISPs bungle their expansion into rural areas.
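The 24% figure is easy to check:

```python
# Quick check of the federal share in Charter's RDOF build, per the figures above.
rdof_grant = 1.2e9       # RDOF award
charter_match = 3.8e9    # Charter's own capital
federal_share = rdof_grant / (rdof_grant + charter_match)
print(f"Federal share: {federal_share:.0%}")   # 24%
```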

When talking about how rural properties are good investments due to grant money, I always wonder if the companies thinking this way are considering the extra operational costs in rural areas. Truck rolls are a lot longer than in an urban market. There are a lot of miles of cable plant that are subject to being cut. Before the pandemic, 16% of states and 35% of counties had a sustained population decrease. Even with grant funding, many rural communities are sparsely populated and often suffer from low household incomes. It’s hard to see an ISP doing much better than breaking even in many rural communities – something cooperatives and municipalities are willing to undertake, but which is poison for publicly traded corporations.

Unfortunately, I think I know at least some of the reasons why some companies are attracted to the grants. The big telcos have been cutting their workforces and curtailing maintenance for decades. It’s a lot easier to make money in a grant-funded rural market if a carrier already plans to scrimp on needed maintenance expenditures. To me, that’s the subtle message not mentioned in the Jefferies analyst’s opinion – too many big carriers know how to milk grant money to gain a financial advantage. Unfortunately, those kinds of investors are going to do more long-term harm than good in rural America.

Public Reporting of Broadband Speeds

The FCC’s Acting Chairman Jessica Rosenworcel wrote a recent blog that talks about the progress the FCC is making towards revising the FCC mapping system. The blog concentrates on the upcoming consumer portal to provide input into the FCC maps.

It’s good to see progress finally being made on the maps – this has been discussed but not implemented for over two years. And it’s good that the public will have a way to provide input to the FCC database. Hopefully, the FCC will change the rules before the new mapping tools are implemented because the current rules don’t let the public provide any worthwhile input to the mapping data.

The current mapping rules were implemented in Docket FCC 21-20 on January 13 of this year – one of the last acts of outgoing Chairman Ajit Pai. Those rules outline a consumer input process to the mapping that is going to be a lot less impactful than what the public is hoping for.

The new FCC maps will require ISPs to draw ‘polygons’ around the areas where there is existing broadband coverage, or where the ISP can install broadband within 10 days of a consumer request. A consumer can challenge the availability of broadband at their home. If a consumer claims that broadband is not available at an address, the ISP is required to respond. If there is no broadband available at the address, the likely response of the ISP will be to amend the polygon to exclude the challenged address. I guess that consumers who can’t buy broadband from a given ISP can gain some satisfaction from having that ISP fix the maps and set the record straight. But the complaint is unlikely to bring broadband to a home where it is not available.
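Here’s a toy sketch of that challenge flow. A simple rectangle stands in for the ISP’s coverage polygon, and the outcomes are my shorthand for the process described above – the real FCC data formats and ISP response rules are more involved:

```python
# Toy sketch of the availability-challenge flow described above. A rectangle
# stands in for an ISP coverage polygon; the real FCC data formats and the
# ISP's response process are more involved than this.
from dataclasses import dataclass

@dataclass
class CoverageBox:
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def covers(self, lat: float, lon: float) -> bool:
        return (self.min_lat <= lat <= self.max_lat
                and self.min_lon <= lon <= self.max_lon)

def handle_challenge(box: CoverageBox, lat: float, lon: float,
                     service_available: bool) -> str:
    if not box.covers(lat, lon):
        return "address already outside the reported coverage area"
    if service_available:
        return "ISP confirms service is available; map unchanged"
    # The likely outcome under the current rules: redraw the polygon.
    return "ISP amends its polygon to exclude the challenged address"
```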

Unfortunately, the challenge process is not going to help in the much more common situation where a household has dreadfully slow broadband. The ISP might be advertising speeds of ‘up to 25/3 Mbps’ but delivering only a tiny fraction of that speed. This is the normal situation for rural DSL and many fixed wireless connections – speeds customers see are much slower than what ISPs claim on the FCC maps.

Unless the FCC changes the rules established in this Docket, a consumer claiming slow broadband will see no change to the FCC map. The January rules allow ISPs to continue to claim marketing speeds in the new FCC mapping system. A rural ISP can continue to claim ‘up to 25/3 Mbps’ for an area with barely functioning broadband as long as the ISP advertises the faster up-to speed.

The FCC needs to change the rules established in the January Docket or they are going to witness a rural revolt. Consumers that are seeing broadband speeds that are barely faster than dial-up are going to flock to the new FCC reporting portal hoping for some change. Under the current rules, the FCC is going to side with the ISP that advertises speeds faster than it delivers.

The FCC has a real dilemma on how to change the public reporting process. The FCC can’t automatically side with each consumer. Any given consumer that reports slow speeds might be seeing the impact of an old and outdated WiFi router, or have some other issue inside the home that is killing the speed delivered by the ISP. But when multiple homes in a neighborhood report slow speeds, then the ISP is almost certainly delivering slow speeds.

Unfortunately, there is no way to report ‘actual’ speeds on an FCC map. If you’ve ever run a speed test multiple times over a day and night, you know that the broadband speed at your home likely varies significantly during the day. What’s the ‘actual’ broadband speed for a home that sees download speeds vary from 5 Mbps to 15 Mbps at different times of the day?
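A quick illustration of the problem, using invented speed-test samples from a single connection over a day:

```python
# Invented speed-test samples from one connection across a day, to show why
# a single "actual" speed is hard to pin down.
from statistics import median

samples_mbps = [5, 7, 12, 15, 9, 6, 14, 8]
print(f"min {min(samples_mbps)} Mbps, median {median(samples_mbps)} Mbps, "
      f"max {max(samples_mbps)} Mbps")
# Any single number - the floor, the median, or the peak - tells a different story.
```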

The consumer challenge of FCC data was dreamed up to allow the public to tell a broadband story different from what the ISPs have been reporting to the FCC. Unfortunately, it’s not going to work to anybody’s satisfaction. The real culprit in this story is the idea that we can define broadband by speed – that there is a functional difference between a broadband connection that delivers 5 Mbps and one that delivers 15 Mbps. The fact is that both connections are dreadfully slow and should not be considered broadband. But as long as we have grant programs that fund areas with speeds under 10/1 Mbps or 25/3 Mbps, we’ll keep having these dumb processes that pretend we know the actual speed on even a single rural broadband connection. The fact is – we don’t and we can’t.

AT&T Says No to Symmetrical Broadband

Since it seems obvious that the new FCC will take a hard look at the definition of broadband, we can expect big ISPs to start the lobbying effort to persuade the FCC to make any increase in the definition as painless as possible. The large ISPs seem to have abandoned any support for the existing definition of 25/3 Mbps because they know sticking with it gets them laughed out of the room. But many ISPs are worried that a fast definition of broadband will bypass their technologies – any technology that can’t meet a revised definition of broadband will not be eligible for future federal grants, and even more importantly can be overbuilt by federal grant recipients.

AT&T recently took the first shot I’ve seen in the speed definition battle. Joan Marsh, the Executive VP of Federal Regulatory Relations, wrote a recent blog that argues against using symmetrical speeds in the definition of broadband. AT&T is an interesting ISP because the company operates three different technologies. In urban and suburban areas AT&T has built fiber to pass over 14 million homes and businesses and says it is going to pass up to 3 million more over the next year or two. The fiber technology offers at least a symmetrical gigabit product. AT&T is also still a huge provider of DSL, but the company stopped installing new DSL customers in October of last year. AT&T’s rural DSL has speeds far south of the FCC’s 25/3 definition of broadband, although U-verse DSL in larger towns has download speeds as fast as 50 Mbps.

The broadband product that prompted the blog is AT&T’s rural cellular product. This is the company’s replacement for DSL, and AT&T doesn’t want the FCC to declare the product as something less than broadband. AT&T rightfully needs to worry about this product not meeting the FCC definition of broadband – because in a lot of places it is slower than 25/3 Mbps.

Reviews.org looks at over one million cellular data connections per year and calculates the average data speeds for the three big cellular carriers. The report for early 2021 shows the following nationwide average speeds for cellular data – speeds that just barely qualify as broadband under the current 25/3 definition:

  • AT&T – 29.9 Mbps download, 9.4 Mbps upload
  • T-Mobile – 32.7 Mbps download, 12.9 Mbps upload
  • Verizon – 32.2 Mbps download, 10.0 Mbps upload

PC Magazine tests cellular speeds in 26 major cities each summer. In the summer of 2020, they showed the following speeds:

  • AT&T – 103.1 Mbps download, 19.3 Mbps upload
  • T-Mobile – 74.0 Mbps download, 25.8 Mbps upload
  • Verizon – 105.1 Mbps download, 21.6 Mbps upload

Cellular data speeds are faster in cities for several reasons. First, there are more cell sites in cities. The data speed a customer receives on cellular is largely a function of how far the customer is from a cell site, and in cities, most customers are within a mile of the closest cell site. The cellular carriers have also introduced additional bands of spectrum in urban areas that are not being used outside cities. The biggest boost to the AT&T and Verizon urban speeds comes from the deployment of millimeter-wave cellular hotspots in small areas of the downtowns in big cities – a product that doesn’t use traditional cell sites, but which helps to increase the average speeds.

Comparing the urban speeds to the average speeds tells us that rural speeds are even slower than the averages. In rural areas, cellular customers are generally a lot more than one mile from a cell tower, which really reduces speeds. My firm does speed tests, and I’ve never seen a rural fixed cellular broadband product with a download speed greater than 20 Mbps, and many are a lot slower.

The AT&T blog never makes a specific recommendation of what the speed definition ought to be, but Marsh hints at a new definition of 50/10 Mbps or 100/20 Mbps. My firm has also done a lot of surveys during the pandemic, and we routinely see a third or more of households that are unhappy with the upload speeds on urban cable company networks – which have typical upload speeds between 15 Mbps and 20 Mbps. AT&T is hoping that the FCC defines broadband with an upload speed of 10-20 Mbps – a speed that many homes already find inadequate today. That’s the only way that rural fixed cellular can qualify as broadband.

The 6G Hype is Already Starting

Even though 5G hasn’t yet made it onto any cellphone, the wireless vendor industry is already off and running looking at the next generation of wireless technology, which has been dubbed 6G. This recent article describes the European Union Hexa-X project that started in January to develop specifications for next-generation wireless technology using terahertz spectrum. The initiative will be led by Nokia Bell Labs and Ericsson. Similar research is being done around the world by companies such as Huawei, NTT, and Samsung.

6G wireless will explore using the high frequencies between 100 GHz and 1 THz (terahertz), which are collectively being referred to as terahertz frequencies. These are radio waves just below the frequencies of infrared light. The wavelengths are so short that, at the upper end, these frequencies could carry as much as 1,000 times more bandwidth than the frequencies used in cellphones today.
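The 1,000-times claim follows roughly from channel width: Shannon capacity grows linearly with bandwidth at a fixed signal-to-noise ratio, so a channel a thousand times wider carries roughly a thousand times more data. A simple illustration with made-up numbers:

```python
# Shannon capacity C = B * log2(1 + SNR) grows linearly with channel bandwidth B
# at a fixed signal-to-noise ratio, which is why terahertz channels could carry
# orders of magnitude more data. The SNR and bandwidths are illustrative only.
from math import log2

def capacity_gbps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * log2(1 + snr_linear) / 1e9

snr = 100                                  # about 20 dB, an arbitrary choice
print(capacity_gbps(100e6, snr))           # ~0.67 Gbps on a 100 MHz channel
print(capacity_gbps(100e9, snr))           # ~670 Gbps on a 100 GHz channel
```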

But there is a huge trade-off for that bandwidth capacity: these frequencies travel only short distances, measured in feet, before starting to dissipate. These frequencies will not pass through obstacles and need a clear line of sight.

It’s likely that any 6G technology will be used for indoor data transmission, and 6G could become the fastest delivery mechanism of bandwidth to use within a room between devices. The bandwidth capabilities of these superhigh frequencies could finally fully enable technologies like telepresence (I finally get a holodeck!), or cobots (interactive robots).

Of course, as with any new technology, the hype has already started. Samsung recently released a whitepaper saying that using terahertz waves for cellphones is ‘inevitable’. Long before we try to tame terahertz frequencies in the wild, we need to figure out millimeter-wave cellular technologies. The current use of millimeter-wave hotspots in downtown metropolitan areas has provided cover for cellular carriers to hype gigabit 5G speeds – but this is a miserable technology in terms of usefulness and reliability. Millimeter-wave spectrum is blocked by everything in the environment, including the body of the user.

More importantly, I’ve never heard anybody make a coherent description of why we need to deliver gigabit or faster speeds to cellphones. If we modify cellphones to process data that quickly we’ll need to find a way to recharge the phones every hour. While I understand why engineers go gaga over the idea of delivering a hundred or a thousand times more data to a cellphone, we need a reality check to ask why anybody would want to do that. Smartphones might be the most important technology developed in this century, but there seems to be little need to turn cellphones into a walking data center unless we want to also start carrying around small air-conditioning units to keep the chips cool.

It makes sense that device makers like Nokia and Ericsson would get excited over the next generation of wireless devices. It’s not hard to envision entirely new technologies twenty years from now that take advantage of terahertz frequencies. Seriously, who is not going to want a holodeck in their living room?

Interestingly, the introduction of 6G is likely to be of less value to the big cellular carriers. These companies have already started to lose the indoor battle for 5G. Verizon and AT&T once envisioned a world where homeowners would buy monthly 5G data plans for all of the wired devices in their homes. But the FCC already gutted that idea by releasing the 6 GHz spectrum for free use, which manufacturers are marrying to the new WiFi 6E standard. As is inevitable, a free solution that doesn’t require a monthly subscription is going to capture most of the indoor market. We’re not going to buy a 5G subscription for an 8K TV when we have WiFi 6E operating from a $100 router.

One has to imagine the same future for terahertz frequencies. The FCC will eventually create at least one band of terahertz frequency that anybody can use, and that’s the frequency that will power the superfast devices in our homes and offices.

One thing that the early 6G hype fails to mention is the fiber networks that will be needed to fuel superfast applications. We aren’t going to be operating a holodeck using a measly 1 Gbps broadband connection. Twenty years from now, techie households will be screaming for the delivery of 100 Gbps bandwidth to support their terahertz gaming applications.