Rural Redundancy

This short article details how a burning tree cut off fiber optic access for six small towns in Western Massachusetts: Ashfield, Colrain, Cummington, Heath, Plainfield, and Rowe. I’m not writing about this today because this fiber cut was extraordinary, but because it’s unfortunately very ordinary. Fiber cuts happen every day, and they isolate communities by cutting Internet access.

It’s not hard to understand why this happens in rural America. In much of the country, the fiber backbone lines that support Internet access to rural towns follow the same routes that were built years ago to support telephone service. The telephone network is configured as a hub and spoke, with all of the towns in a region served by a single fiber line into a central tandem switch that was the historic focal point for regional telephone switching.

Unfortunately, a hub-and-spoke network (which resembles the spokes of a wagon wheel) does not have any redundancy. Each little town or cluster of towns typically had a single path to reach the telephone tandem – and today to reach the Internet.

The problem is that an outage that historically would have interrupted telephone service now interrupts broadband. This one cut in Massachusetts is a perfect example of how reliant we’ve become on broadband. Many businesses shut down completely without broadband. Businesses take orders and connect with customers in the cloud. Credit card processing happens remotely in the cloud. Businesses are often connected to distant corporate servers that provide everything from software connectivity to voice over IP. A broadband outage cuts off students taking classes from home and adults working from home. An Internet outage cripples most work-from-home people who work for distant employers. A fiber cut in a rural area can also cripple cell service if the cellular carriers use the same fiber routes.

The bad news is that nobody is trying to fix the problem. The existing rural fiber routes are likely owned by the incumbent telephone companies, which are not interested in spending money to create redundancy. Redundancy in the fiber world means having a second fiber route into an area so that the Internet doesn’t go dead if the primary fiber is cut. One of the easiest ways to picture a redundant solution is a ring of fiber that would be equivalent to the rim of the wagon wheel. This fiber would connect all of the ‘spokes’ and provide an alternate route for Internet traffic.

To make things worse, the fiber lines reaching into rural America are aging. These were some of the earliest fiber routes built in the US, and fiber built in the 1980s was not functionally as good as modern fiber. Some of these fibers are already starting to die. We’re eventually going to face the scenario of fiber lines like the one in this article dying and possibly not being replaced. A telco could use a dying fiber line as a reason to finally walk away from obsolete copper DSL in a region and refuse to repair the fiber. That could isolate small communities for months or even years until somebody found the funding to replace the route.

There have been regions that have tackled the redundancy issue. I wrote a blog last year about Project Thor in northwest Colorado where communities banded together to create the needed redundant fiber routes. These communities immediately connected critical infrastructure like hospitals to the redundant fiber and over time will move to protect more and more Internet traffic in the communities from routine and crippling fiber cuts.

This is a problem that communities are going to have to solve on their own. That is not made easier by the current fixation on using grants only for last-mile connectivity rather than middle-mile fiber. All of the last-mile fiber in the world is useless if a community can’t reach the Internet.

Big Funding for Libraries

The $1.9 trillion American Rescue Plan Act (ARPA) includes a lot of interesting pockets of funding that are easy to miss due to the breadth of the Act. The Act quietly allocates significant funding to public libraries, which have been hit hard during the pandemic.

The ARPA first allocates $200 million to the Institute of Museum and Library Services. This is an independent federal agency that provides grant funding for libraries and museums. $178 million of the $200 million will be distributed through the states to libraries. Each state is guaranteed to get at least $2 million, with the rest distributed based upon population. This is by far the largest federal grant ever made directly for libraries.
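The floor-plus-proportional split described above can be sketched in a few lines. This is an illustration only: the three states and their populations below are invented, not real Census figures, and the bill’s actual allocation mechanics may differ in detail.

```python
def allocate_library_grants(pool, floor, populations):
    """Return a dict of state -> grant: each state gets the floor,
    and the remaining pool is split in proportion to population."""
    remainder = pool - floor * len(populations)
    total_pop = sum(populations.values())
    return {
        state: floor + remainder * pop / total_pop
        for state, pop in populations.items()
    }

# Hypothetical three-state example (populations in millions).
grants = allocate_library_grants(
    pool=178_000_000,
    floor=2_000_000,
    populations={"A": 1.0, "B": 5.0, "C": 10.0},
)
```

Whatever the populations, the grants sum to the full $178 million pool and no state falls below the $2 million floor.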

Libraries are also eligible to apply to the $7.172 billion Emergency Connectivity Fund that the ARPA is funding through the FCC’s E-Rate program. This program can be used to reimburse the cost of hotspots, modems, routers, laptops, and other devices that can be lent to students and library patrons to provide broadband.

The ARPA also includes $360 billion in funding that will go 60% to states and 40% directly to local governments and tribal governments. Among other things, this funding is aimed at offsetting cuts made during the pandemic to public health, safety, education, and library programs.

There is another $130 billion aimed at offsetting the costs associated with reopening K-12 schools to be used for hiring staff, reducing class sizes, and addressing student needs. The funds can also be invested in technology support for distance learning, including 20% that can be used to address learning loss during the pandemic. This funding will flow through the Department of Education based upon Title I funding that supports schools based upon the level of poverty.

Another $135 million will flow through the National Endowment for the Arts and Humanities to support state and regional arts and humanities agencies. At least 60% of this funding is designated for grants to libraries.

There is also tangential funding that could support libraries. This includes $39 billion for Child Care and Development Block Grants and Stabilization Fund plus $1 billion for Head Start that might involve partnerships with schools and libraries. There is also $9.1 billion to states and $21.9 billion for local programs for afterschool and summer programs to help students catch back up from what was a lost school year for many.

It’s good to see this funding flow to libraries. Many people may not understand the role that libraries play in many communities as the provider of broadband and technology for people who can’t afford home broadband. Libraries have struggled to maintain this role through the pandemic and the restrictions that kept patrons out of buildings. Libraries in many communities have become the focal point for the distribution of broadband devices during the pandemic.

One of the lessons that the pandemic has taught us is that we need to connect everybody to broadband. As hard as the pandemic has been on everybody, it’s been particularly hard on those that couldn’t connect during the pandemic. This continues today as many states have established vaccine portals completely online.

Communities everywhere owe a big thanks to librarians for the work they’ve done in the last year to keep our communities connected. When you get a chance, give an elbow bump to your local librarian.

Investing in Rural Broadband

There was a headline in a recent FierceTelecom article that I thought I’d never see – Jefferies analyst says rural broadband market is ripe for investment. In the article, analyst George Notter is quoted talking about how hot rural broadband is as an investment. He cites the large companies that have been making noise about investing in rural broadband.

Of course, that investment relies on getting significant rural grants. We’ve seen the likes of Charter, Frontier, CenturyLink, Windstream, and others win grants in the recent RDOF reverse auction. I have municipal clients who are having serious discussions with other large incumbents about partnering – when these incumbents wouldn’t return a call a year ago. It’s amazing how billions of dollars of federal grants can quickly change the market. Practically every large carrier in the country is looking at the upcoming broadband grants as a one-time opportunity to build broadband networks cheaply.

This is a seismic change for the industry. Dozens of subdivisions with lousy broadband have contacted me over the years wondering how to get the interest of the nearby cable incumbent. We’ve just gone through a decade with little expansion of cable company footprints. In many cases, the reluctance of a cable company to build only a few miles of fiber to reach a community of several hundred homes has been puzzling – these subdivisions often look like a good business opportunity to me. The first carrier to build broadband in such areas is likely to get 70% to 90% of the households as customers almost immediately.

The analyst also mentioned the newfound interest in rural broadband from the cellular carriers. It’s been a mystery to me over the last decade why AT&T, Verizon, and others didn’t take advantage of rural cellular towers to get new broadband customers. There are a lot of places in rural America where cellular broadband has been superior to rural DSL and satellite broadband. It’s odd to finally see these carriers want to build now, at a time when people are hoping for technologies that are faster than cellular broadband. The cellular carriers instead have poisoned the rural market by selling cellular hotspot plans with tiny data caps. I heard numerous stories during the pandemic of families spending $500 to $1,000 per month on a hotspot – with the alternative being throttled to dial-up speeds after hitting the small data caps. These customers are never going back to the cellular carriers if they get a different option.

Some of the sudden expansion of the big companies mystifies me. For example, Charter won $1.2 billion in the RDOF to expand into rural areas. The company is matching this with $3.8 billion of its own money. That means Charter is building rural broadband with a 24% federal grant. I’ve studied some of these same grant areas and I couldn’t see a way to build these rural communities without grants of at least 50% of the cost of construction. The RDOF might make sense when Charter is building to areas that are directly adjacent to an existing market. But Charter took grants in counties where it doesn’t have an existing customer. This makes me wonder how much the company is going to eventually like what it has bitten off. I’m betting we won’t see articles talking about rural investment opportunities after a few big ISPs bungle the expansion into rural areas.
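The grant-share arithmetic from the paragraph above is simple to verify: a $1.2 billion RDOF award against $3.8 billion of Charter’s own capital works out to roughly a 24% federal subsidy on the total build cost.

```python
# Charter's RDOF economics as described above.
rdof_grant = 1.2e9       # federal RDOF award, in dollars
charter_match = 3.8e9    # Charter's own capital
total_build = rdof_grant + charter_match

# Federal grant as a share of the total build cost.
grant_share = rdof_grant / total_build
print(f"Federal share of build cost: {grant_share:.0%}")  # 24%
```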

When talking about how rural properties are good investments due to grant money, I always wonder if the companies thinking about this are considering the extra operational costs in rural areas. Truck rolls are a lot longer than in an urban market. There are a lot of miles of cable plant that are subject to being cut. Before the pandemic, 16% of states and 35% of counties had a sustained population decrease. Even with grant funding, many rural communities are sparsely populated and often suffer from low household incomes. Even with grant funding, it’s hard to see an ISP doing much better than break even in many rural communities – something cooperatives and municipalities are willing to undertake but which is poison for publicly traded corporations.

Unfortunately, I think I know at least some of the reasons why some companies are attracted to the grants. The big telcos have been cutting the workforce and curtailing maintenance costs and efforts for decades. It’s a lot easier to make money with a grant-funded rural market if a carrier already plans to scrimp on needed maintenance expenditures. To me, that’s the subtle message not mentioned in the Jefferies analyst’s opinion – too many big carriers know how to milk grant money to gain a financial advantage. Unfortunately, those kinds of investors are going to do more long-term harm than good in rural America.

Public Reporting of Broadband Speeds

The FCC’s Acting Chairman Jessica Rosenworcel wrote a recent blog that talks about the progress the FCC is making towards revising the FCC mapping system. The blog concentrates on the upcoming consumer portal to provide input into the FCC maps.

It’s good to see progress finally being made on the maps – this has been discussed but not implemented for over two years. And it’s good that the public will have a way to provide input to the FCC database. Hopefully, the FCC will change the rules before the new mapping tools are implemented because the current rules don’t let the public provide any worthwhile input to the mapping data.

The current mapping rules were implemented in Docket FCC 21-20 on January 13 of this year – one of the last acts of outgoing Chairman Ajit Pai. Those rules outline a consumer input process to the mapping that is going to be a lot less impactful than what the public is hoping for.

The new FCC maps will require that ISPs draw ‘polygons’ around the areas where there is existing broadband coverage, or where the ISP can install broadband within 10 days of a consumer request. A consumer can challenge the availability of broadband at their home. If a consumer claims that broadband is not available at an address, the ISP is required to respond. If there is no broadband available at the address, the likely response of the ISP will be to amend the polygon to exclude the challenged address. I guess that consumers who can’t buy broadband from a given ISP can gain some satisfaction from having that ISP fix the maps to set the record straight. But the complaint is unlikely to get broadband to the home where broadband is not available.

Unfortunately, the challenge process is not going to help in the much more common situation where a household has dreadfully slow broadband. The ISP might be advertising speeds of ‘up to 25/3 Mbps’ but delivering only a tiny fraction of that speed. This is the normal situation for rural DSL and many fixed wireless connections – speeds customers see are much slower than what ISPs claim on the FCC maps.

Unless the FCC changes the rules established in this Docket, a consumer claiming slow broadband will see no change to the FCC map. The January rules allow ISPs to continue to claim marketing speeds in the new FCC mapping system. A rural ISP can continue to claim ‘up to 25/3 Mbps’ for an area with barely functioning broadband as long as the ISP advertises the faster up-to speed.

The FCC needs to change the rules established in the January Docket or they are going to witness a rural revolt. Consumers that are seeing broadband speeds that are barely faster than dial-up are going to flock to the new FCC reporting portal hoping for some change. Under the current rules, the FCC is going to side with the ISP that advertises speeds faster than it delivers.

The FCC has a real dilemma on how to change the public reporting process. The FCC can’t automatically side with each consumer. Any given consumer that reports slow speeds might be seeing the impact of an old and outdated WiFi router, or have some other issue inside the home that is killing the speed delivered by the ISP. But when multiple homes in a neighborhood report slow speeds, then the ISP is almost certainly delivering slow speeds.

Unfortunately, there is no way to report ‘actual’ speeds on an FCC map. If you ever ran a speed test multiple times during a day and night you know that the broadband speed at your home likely varies significantly during a day. What’s the ‘actual’ broadband data speed for a home that sees download speeds vary from 5 Mbps to 15 Mbps at different times of the day?
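One way to make the point concrete: with repeated speed tests, the best you can do is summarize the distribution, for example with the median and a low percentile. This is a sketch of one reasonable summary, not any official FCC methodology, and the sample readings below are invented for illustration.

```python
import statistics

def summarize_speed_tests(mbps_readings):
    """Return (median, rough 10th-percentile) download speed in Mbps
    from a list of repeated speed-test readings."""
    ordered = sorted(mbps_readings)
    # Crude 10th percentile: index into the sorted readings.
    p10_index = max(0, int(len(ordered) * 0.10) - 1)
    return statistics.median(ordered), ordered[p10_index]

# Invented readings for a home whose speed swings between 5 and 15 Mbps.
readings = [5.2, 7.8, 14.9, 12.3, 6.1, 15.0, 9.4, 11.7]
median, p10 = summarize_speed_tests(readings)
```

The median and the 10th percentile tell very different stories for the same home, which is exactly why a single ‘actual’ speed number is meaningless.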

The consumer challenge of FCC data was dreamed up to allow the public to tell a broadband story different than what the ISPs have been reporting to the FCC. Unfortunately, it’s not going to work to anybody’s satisfaction. The real culprit in this story is the idea that we can define broadband somehow by speed – that there is a functional difference between a broadband connection that delivers 5 Mbps or 15 Mbps. The fact is that both connections are dreadfully slow and should not be considered as broadband. But as long as we have grant programs that fund areas that have speeds under 10/1 Mbps or 25/3 Mbps, we’ll keep having these dumb processes that pretend that we know the actual speed on even a single rural broadband connection. The fact is – we don’t and we can’t.

AT&T Says No to Symmetrical Broadband

Since it seems obvious that the new FCC will take a hard look at the definition of broadband, we can expect big ISPs to start the lobbying effort to persuade the FCC to make any increase in the definition as painless as possible. The large ISPs seem to have abandoned any support for the existing definition of 25/3 Mbps because they know sticking with it gets them laughed out of the room. But many ISPs are worried that a fast definition of broadband will bypass their technologies – any technology that can’t meet a revised definition of broadband will not be eligible for future federal grants, and even more importantly can be overbuilt by federal grant recipients.

AT&T recently took the first shot I’ve seen in the speed definition battle. Joan Marsh, the Executive VP of Federal Regulatory Relations, wrote a recent blog that argues against using symmetrical speeds in the definition of broadband. AT&T is an interesting ISP because the company operates three different technologies. In urban and suburban areas AT&T has built fiber to pass over 14 million homes and businesses and says they are going to pass up to 3 million more over the next year or two. The fiber technology offers at least a symmetrical gigabit product. AT&T is also still a huge provider of DSL, but the company stopped installing DSL customers in October of last year. AT&T’s rural DSL has speeds far south of the FCC’s 25/3 Mbps definition of broadband, although U-verse DSL in larger towns has download speeds as fast as 50 Mbps.

The broadband product that prompted the blog is AT&T’s rural cellular product. This is the company’s replacement for DSL, and AT&T doesn’t want the FCC to declare the product as something less than broadband. AT&T rightfully needs to worry about this product not meeting the FCC definition of broadband – because in a lot of places it is slower than 25/3 Mbps.

Reviews.org looks at over one million cellular data connections per year and calculates the average data speeds for the three big cellular carriers. The report for early 2021 shows the following nationwide average speeds for cellular data. These speeds just barely qualify as broadband under the current 25/3 Mbps definition.

  • AT&T – 29.9 Mbps download, 9.4 Mbps upload
  • T-Mobile – 32.7 Mbps download, 12.9 Mbps upload
  • Verizon – 32.2 Mbps download, 10.0 Mbps upload

PC Magazine tests cellular speeds in 26 major cities each summer. In the summer of 2020, they showed the following speeds:

  • AT&T – 103.1 Mbps download, 19.3 Mbps upload
  • T-Mobile – 74.0 Mbps download, 25.8 Mbps upload
  • Verizon – 105.1 Mbps download, 21.6 Mbps upload

Cellular data speeds are faster in cities for several reasons. First, there are more cell sites in cities. The data speed a customer receives on cellular is largely a function of how far the customer is from a cell site, and in cities, most customers are within a mile of the closest cell site. The cellular carriers have also introduced additional bands of spectrum in urban areas that are not being used outside cities. The biggest boost to the AT&T and Verizon urban speeds comes from the deployment of millimeter-wave cellular hotspots in small areas of the downtowns in big cities – a product that doesn’t use traditional cell sites, but which helps to increase the average speeds.

Comparing the urban speeds to the average speeds tells us that rural speeds are even slower than the averages. In rural areas, cellular customers are generally a lot more than one mile from a cell tower, which really reduces speeds. My firm does speed tests, and I’ve never seen a rural fixed cellular broadband product with a download speed greater than 20 Mbps, and many are a lot slower.

The AT&T blog never makes a specific recommendation of what the speeds ought to be, but Marsh hints at a new definition of 50/10 Mbps or 100/20 Mbps. My firm has also done a lot of surveys during the pandemic, and we routinely see a third or more of households that are unhappy with the upload speeds on urban cable company networks – which have typical upload speeds between 15 Mbps and 20 Mbps. AT&T is hoping that the FCC defines broadband with an upload speed of 10-20 Mbps – a speed that many homes already find inadequate today. That’s the only way that rural fixed cellular can qualify as broadband.

The 6G Hype is Already Starting

Even though 5G hasn’t yet made it onto any cellphone, the wireless vendor industry is already off and running, looking at the next generation of wireless technology, which has been dubbed 6G. This recent article describes the European Union’s Hexa-X project, which started in January to develop specifications for next-generation wireless technology using terahertz spectrum. The initiative will be led by Nokia Bell Labs and Ericsson. Similar research is being done elsewhere around the world by companies such as Huawei, NTT, and Samsung.

6G wireless will explore using the high frequencies between 100 GHz and 1 THz (terahertz), which are collectively referred to as terahertz frequencies. These are radio waves just below the frequencies of infrared light. The wavelengths are so short that, at the upper end, the frequencies could carry as much as 1,000 times more bandwidth than the frequencies used in cellphones today.
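The bandwidth claim follows directly from information theory: Shannon capacity scales linearly with channel bandwidth, so a band 1,000 times wider supports roughly 1,000 times the bit rate at the same signal-to-noise ratio. The channel widths and the SNR below are illustrative assumptions, not actual 6G parameters.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 100  # 20 dB, an illustrative signal-to-noise ratio
cellular_band = shannon_capacity_bps(100e6, snr)   # a 100 MHz channel
terahertz_band = shannon_capacity_bps(100e9, snr)  # 100 GHz of spectrum
ratio = terahertz_band / cellular_band             # capacity scales 1,000x
```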

But there is a huge trade-off for the huge bandwidth capacity: these frequencies travel only short distances, measured in feet, before starting to dissipate. These frequencies will not pass through any obstacle and need a clear line of sight.

It’s likely that any 6G technology will be used for indoor data transmission, and 6G could become the fastest delivery mechanism of bandwidth to use within a room between devices. The bandwidth capabilities of these superhigh frequencies could finally fully enable technologies like telepresence (I finally get a holodeck!), or cobots (interactive robots).

Of course, as with any new technology, there is already hype. Samsung recently released a whitepaper saying that using terahertz waves for cellphones is ‘inevitable’. Long before we try to somehow tame terahertz frequencies in the wild, we need to first figure out millimeter-wave cellular technologies. The current use of millimeter-wave hotspots in downtown metropolitan areas has provided cover for cellular carriers to hype gigabit-speed 5G – but this is a miserable technology in terms of usefulness and reliability. Millimeter-wave spectrum is blocked by everything in the environment, including the body of the user.

More importantly, I’ve never heard anybody make a coherent description of why we need to deliver gigabit or faster speeds to cellphones. If we modify cellphones to process data that quickly we’ll need to find a way to recharge the phones every hour. While I understand why engineers go gaga over the idea of delivering a hundred or a thousand times more data to a cellphone, we need a reality check to ask why anybody would want to do that. Smartphones might be the most important technology developed in this century, but there seems to be little need to turn cellphones into a walking data center unless we want to also start carrying around small air-conditioning units to keep the chips cool.

It makes sense that device makers like Nokia and Ericsson would get excited over the next generation of wireless devices. It’s not hard to envision entirely new technologies twenty years from now that take advantage of terahertz frequencies. Seriously, who is not going to want a holodeck in their living room?

Interestingly, the introduction of 6G is likely going to be of less value to the big cellular carriers. These companies have already started to lose the indoor battle for 5G. Verizon and AT&T had once envisioned a world where homeowners would buy monthly 5G data plans for all of the wired devices in our home. But the FCC already gutted that idea by releasing 6 GHz spectrum for free use, which manufacturers are marrying to the new WiFi 6 standard. As is inevitable, a free solution that doesn’t require a monthly subscription is going to capture most of the indoor market. We’re not going to be buying a 5G subscription for our 8K TV when we have WiFi 6 operating from a $100 router.

One has to imagine the same future for terahertz frequencies. The FCC will eventually create at least one band of terahertz frequency that anybody can use, and that’s the frequency that will power the superfast devices in our homes and offices.

One thing that the early 6G hype fails to mention is the fiber networks that will be needed to fuel superfast applications. We aren’t going to be operating a holodeck using a measly 1 Gbps broadband connection. Twenty years from now, techie households will be screaming for the delivery of 100 Gbps bandwidth to support their terahertz gaming applications.

Is it Time to Kill Retransmission Rules?

Rep. Anna Eshoo (D-Calif.) and Rep. Steve Scalise (R-La.) recently introduced a bill in Congress, the Modern Television Act of 2021, that would largely do away with the retransmission consent rules for cable companies. They’ve introduced similar bills in recent years.

Retransmission rules require cable operators to carry local TV stations that are within over-the-air transmission range of a given area. That rule sounds benign enough but has been used by local stations to extract huge fees from cable companies for carrying local content. The fees paid to local stations are one of the primary reasons that cable TV rates have escalated so quickly over the last decade.

Fifteen years ago, it was rare for local stations to charge anything for carrying their signal. They were happy to be able to count cable viewers of their content when calculating advertising rates based upon ‘eyeballs’ for ads placed on their stations. But a handful of consultants convinced a few stations that the retransmission requirements were a valuable commodity, and stations started insisting on payments from cable companies to carry the content. Since then, the payments have climbed from zero to rates in the range of $4 or more per cable customer, per local station, per month. For a cable company, carrying even the four basic networks of ABC, CBS, FOX, and NBC means shelling out $16 or more per month to local stations for each cable subscriber.
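The per-subscriber math above compounds quickly at scale. This sketch uses the $4-per-station figure from the text; the 10,000-subscriber system size is a hypothetical example.

```python
# Retransmission fee arithmetic from the paragraph above.
fee_per_station = 4.00                        # dollars per subscriber per month
local_stations = ["ABC", "CBS", "FOX", "NBC"]

monthly_cost_per_sub = fee_per_station * len(local_stations)      # $16/month
annual_cost = monthly_cost_per_sub * 12 * 10_000  # hypothetical 10,000-sub system
```

For that modest hypothetical system, local-station fees alone come to nearly $2 million per year, all of which gets passed through to subscribers.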

It is these fees that led the big cable companies to create the local programming fees that are not part of basic rates. Cable companies may advertise a basic rate for a cable package at $50 but then tack on large hidden fees of $20 or more to cover local station fees along with some sports network fees.

The bill sponsors also blame high retransmission fees for the increasing blackouts of content that we’ve seen in recent years. When cable companies balk at paying increasing rates each year for local content, the local stations have adopted the tactic of shutting off access to their content until the cable company finally agrees to pay the ever-increasing rates.

Following are a few of the key provisions of the bill:

  • Eliminates the retransmission consent, mandatory copyright fees, and other provisions of current FCC rules (which were dictated by Congress). This should allow for real negotiations of rates – today the stations demand rates and there is little room for negotiation.
  • Adds a 60-day period where blackouts of content aren’t allowed when the local station and a cable operator are negotiating rates.
  • Gives the FCC the right to push a programming dispute into binding arbitration. Blackouts would be prohibited during the arbitration period.
  • Preempts federal, state, and local governments from regulating cable rates. This is an odd requirement since there is little or no rate regulation that I know of, but it must exist somewhere in the country.
  • Keeps the rule that cable networks and satellite providers must continue to carry local content.

As would be expected, local TV stations and the major networks are against these changes. Most of the money charged for retransmission consent ends up in the pockets of the major networks. Cable companies are obviously in favor of the proposed changes since it would give them an opportunity for real negotiations for content.

Congress created this original mess by mandating that cable companies must carry local content without allowing for mechanisms like the arbitration this bill brings to the process. But the runaway rates in the cable industry can be pinned on the greed of programmers who have raised programming charges far faster than inflation for two decades. The industry has driven cable rates so high that millions of households are cutting the cord annually and abandoning paying for content that includes local stations. If you were asked to imagine a scenario where an industry would self-destruct over time, it would be hard to think of a better example than retransmission fees in the TV industry.

Telling the Truth About 5G

I still run across articles that extol the supposed wonders of 5G. The most recent, published on Gizmodo, is titled “How 5G Could Replace Your Home Broadband Connection”. I was surprised to see an article like this on a tech-oriented site because the article gets most of the facts wrong about 5G – facts that are not hard to verify.

This article talks about 5G having “faster download speeds, faster upload speeds, more bandwidth, and lower latency” than landline broadband. The author talks about having gigabit speeds on 5G. The article is clearly talking about 5G cellular technology. The author talks about sticking a SIM card in a router and using this fast 5G instead of wired broadband. The article hints that 5G may be the savior for poor rural broadband. This all sounds like it came directly from the sales pitch that the big cellular carriers have been making to politicians for the last five years – 5G will transform the world.

The article talks about an AT&T cellular hotspot product that can handle data speeds up to 1 gigabit. The article mentions the T-Mobile Home Internet product and also mentions speeds up to 1 gigabit. Those two carriers mention the word gigabit in their advertising, but the author fails to understand that in urban areas these products might deliver speeds at something under 100 Mbps, and in rural America, where the products are aimed to serve, speeds are likely going to be south of 20 Mbps.

Finally, the article swallows the industry rhetoric and gives the label of 5G to the Verizon Home product – which is fiber-to-the-curb. The key word in that technology description is fiber – Verizon builds a fiber just outside of the home for this to work. This product is not even a distant cousin of cellular data.

And that’s where this author and a large number of other articles miss the boat about 5G. 5G is a cellular technology. The sole purpose of 5G is to make cell sites perform better. Today there is no 5G anywhere on the planet because the 5G features that will make cell sites perform better have not yet been incorporated into cell sites or into phones. We can expect to start seeing these features at cell sites over the next 3-4 years, and a few years longer before future generations of cellphones can use the new features.

The author has fallen for the carrier hype that 5G will be blazingly fast. It will not be fast in the vast majority of circumstances. The 5G specifications call for cell towers to reliably deliver 100 Mbps cellular data to large numbers of cellphones or devices. The industry vendors might find a way to outperform that goal – but there is no wireless engineer anywhere thinking we’ll be delivering gigabit speeds to cellphones using 5G.

The biggest trap the author fell into is buying into the carrier rhetoric about gigabit speeds. The carriers have wireless products with fast broadband using millimeter-wave spectrum. The first was mentioned above, which is Verizon’s Home product. The second comes from deployment of millimeter-wave hot spots in downtown areas. These hotspots are the equivalent of putting a faster hotspot like the ones used at a Starbucks on a pole and beaming broadband to anybody within 500 feet.

Both of these applications are fast. Both use millimeter-wave spectrum. But both require a customer to be within close proximity to a fiber. Most importantly, these technologies are not 5G. They don’t currently use and will never use any of the 5G technology improvements that will make cellular phones perform better. I’m sometimes tempted to post an entire blog that, reminiscent of Jack Torrance in The Shining, types over and over, “Millimeter-wave spectrum is not 5G. Millimeter-wave spectrum is not 5G”.

I occasionally reply to one of these articles, and this one is particularly egregious because it magnifies the false story that the carriers have been trying to sell to the public – that 5G is an amazing technology that will transform the world any day now, but not quite today. Such articles keep telling people to hold out for a technology that isn’t coming. Yes, there will be rural 5G hotspot products for households. But let’s please tell the truth – I’ll be surprised if the average rural home ever reaches 50 Mbps on the technology.

The Gigabit Wireless Controversy

One of the big controversies in the RDOF auction was that the FCC allowed three of the top ten grant winners to bid using gigabit wireless technology. These were Starry (Connect Everyone), Resound Networks, and Nextlink (AMG Technology). By bidding in the gigabit tier, these technologies were given the same technology and dollar weighting as somebody bidding to build fiber-to-the-premise. There was a big outcry from fiber providers, who claimed that these bidders gained an unfair advantage because the wireless technology will be unable to deliver gigabit speeds in rural areas.

Fiber providers say that bidding with gigabit wireless violates the intent of the grants. Bidding in the gigabit tier should mean that an ISP can deliver a gigabit product to every customer in an RDOF grant area. Customers don’t have to buy a gigabit product, but the capability to provide that speed to every customer must be there. This is something that comes baked in with fiber technology – a fiber network can deliver gigabit speeds (or 10-gigabit speeds these days) to any one customer, or easily to all customers.

There is no denying that there is wireless technology that can deliver gigabit speeds. For example, there are point-to-point radios using millimeter-wave spectrum that can deliver a gigabit path for up to two miles or a multi-gigabit path for perhaps a mile. But this technology delivers the bandwidth to only a single point. This is the technology that Starry and others use in downtown areas to beam a signal from rooftop to rooftop to serve apartment buildings, with the bandwidth shared among all of the tenants in the building. This technology delivers up to a gigabit to a building, but something less to tenants. We have a good idea of what this means in real life because Starry publishes the average speed of its customers. In March 2021, the Starry website said that its average customer received 232 Mbps download and 289 Mbps upload. That’s a good bandwidth product, but it is not gigabit broadband.
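The per-tenant arithmetic is simple to sketch. The numbers below are my own illustration, not Starry’s published figures, and the even-split model is a simplification (real networks see statistical multiplexing, since not everyone is active at once):

```python
# Illustrative only: a rooftop point-to-point link delivers a fixed
# capacity to the building, and active tenants share it.

def per_tenant_share_mbps(link_mbps: float, tenants: int,
                          active_fraction: float = 1.0) -> float:
    """Even split of a shared rooftop link among the tenants
    actively using it at a given moment."""
    active = max(1, round(tenants * active_fraction))
    return link_mbps / active

# 1 Gbps into a hypothetical 20-unit building, everybody active at once:
print(per_tenant_share_mbps(1000, 20))       # → 50.0 Mbps each
# Half the tenants active at once:
print(per_tenant_share_mbps(1000, 20, 0.5))  # → 100.0 Mbps each
```

Even under generous assumptions about how many tenants are online simultaneously, the shared link never lets every customer receive a gigabit at the same time – which is the point of the controversy.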

There is a newer technology that is more suited for areas outside of downtown metropolitan areas. Siklu has a wireless product that uses unlicensed spectrum in the V-band at 60 GHz and around 70 GHz. This uses a Qualcomm chip that was developed for the Facebook Terragraph technology. A wireless base station that is fiber-fed can serve up to 64 customers – but the catch is that the millimeter-wave spectrum used in this application travels only about a quarter of a mile. Further, this spectrum requires nearly perfect line-of-sight.

The interesting feature of this technology is that each customer receiver can also retransmit broadband to make an additional connection. Siklu envisions a network where four or five hops are made from each customer to extend broadband around the base transmitter. Siklu advertises this product as being ideal for small-town business districts where a single fiber-fed transmitter can reach the whole downtown area through the use of the secondary beams. With a handful of customers on a system, this could deliver a gigabit wireless product. But as you start adding secondary customers, this starts acting a lot like a big urban apartment building, and the shared speeds likely start looking like what Starry delivers in urban areas – fast broadband, but that doesn’t meet the definition that every customer can receive a gigabit.

The real catch for this technology comes in the deployment. The performance is pretty decent if every base transmitter is on fiber. But ISPs using the technology are likely going to cut costs by feeding additional base stations with wireless backhaul. That’s when the bandwidth starts to get chopped down. An RDOF winner would likely have to build a lot of fiber and have transmitters every mile to get the best broadband speeds – but if they dilute the backhaul by using wireless connections between transmitters, or by spacing base stations farther apart, then speeds will drop significantly.
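The backhaul dilution is easy to see with rough numbers. This is a sketch under my own assumptions (the loss fraction per hop is illustrative, not a vendor specification), but it shows why daisy-chaining base stations off one fiber-fed site eats capacity quickly:

```python
# Rough sketch: each wirelessly-fed base station receives only a
# fraction of its upstream feed, because the relay link shares air
# time and the upstream site also serves its own customers.

def capacity_after_hops_mbps(fiber_fed_mbps: float, hops: int,
                             retained_per_hop: float = 0.5) -> float:
    """Usable capacity at a base station that sits `hops` wireless
    relays away from the fiber-fed site, assuming each hop retains
    a fixed fraction of the upstream capacity (illustrative value)."""
    return fiber_fed_mbps * retained_per_hop ** hops

# Capacity remaining at each site in a chain fed by a 1 Gbps fiber site:
for n in range(4):
    print(f"hop {n}: {capacity_after_hops_mbps(1000, n):.0f} Mbps")
```

With half the capacity lost at every relay, a site three hops out is down to 125 Mbps for all of its customers combined – nowhere near a gigabit per customer.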

The other major issue with this technology is that it’s great for the small-town business district, but how will it overlay in the extremely rural RDOF areas? The RDOF grants cover some of the most sparsely populated areas in the country. The Siklu technology will be quickly neutered by the quarter-mile transmission distance when customers live more than a quarter-mile apart. Couple this with line-of-sight issues and it seems extremely challenging to reach a lot of the households in most RDOF areas with this technology.

I come down on the side of the fiber providers in this controversy. In my mind, an ISP doesn’t meet the grant requirements if it can’t reach every customer in an RDOF area. An ISP also doesn’t meet the gigabit grant requirements if only some customers can receive gigabit speeds. That’s the kind of bait-and-switch we’ve had for years, thanks to an FCC that has allowed an ISP to bring fast broadband to one customer in a Census block and declare that everybody in the block has access to fast speeds.

It’s a shame that I feel obligated to come to this conclusion because, deployed well, these wireless technologies can probably bring decent broadband to a lot of homes. But if these technologies can’t deliver a gigabit to everybody, then the ISPs gained an unfair advantage in the RDOF grant bidding. When I look at the widely spaced homes in many RDOF areas, I can’t picture a wireless network that can reach everybody while also delivering gigabit capabilities. The only way to make this work would be to build fiber close to every customer in an RDOF area – and at that point, the wireless technology would be nearly as costly as FTTH and a lot more complicated to maintain. I think the FCC bought the proverbial pig-in-a-poke when it approved rural gigabit wireless.

The Birth of an Incumbent

Dish Network wrote a recent letter to the FCC pointing out that T-Mobile had reversed its position over the last year on CBRS spectrum and other wireless issues. The opening paragraph of the letter contains the statement that is the genesis of today’s blog. Dish wrote, “As T-Mobile celebrates the one-year anniversary of its acquisition of Sprint, it is clear that the company’s worldview has transformed to that of an entrenched incumbent commensurate with its newfound size and scale”.

That sentence probably marks the date on which we should all start thinking of T-Mobile as an incumbent, with all that entails. In my mind, an incumbent in the telecom world is a carrier that acts like a monopoly. An incumbent does everything possible to maximize profits. Incumbents throw up barriers to entry to anybody that might compete with them.

The Dish letter points to that last behavior. T-Mobile had historically been a champion for opening up CBRS spectrum for rural use by small wireless companies. But as an incumbent, T-Mobile is suddenly against boosting the power levels for CBRS that would make it useful in a rural setting. This change of position demonstrates that T-Mobile is not willing to accept even the slightest amount of interference from rural use of CBRS, even though the spectrum rules are written to minimize such interference.

T-Mobile is positioned to be an incumbent. In 2020, after the merger with Sprint, T-Mobile had almost 25% of the cellular market, ahead of Verizon at 24%, but still behind AT&T at 35%.

It’s an interesting change at T-Mobile considering its history in the US market. T-Mobile spent years touting itself as the Un-carrier under CEO John Legere. The company painted itself as the cellular carrier that looked out for the public with low prices, faster speeds, and better features – all different than what was offered by AT&T and Verizon. It was an interesting marketing posture and helped T-Mobile grow from an 11% market share a decade ago to 16% before the merger with Sprint.

Economists say that it’s inevitable that any company that gains market power will trend toward acting like a monopoly. This tendency isn’t due only to changes of behavior in the boardroom but happens from top to bottom in big companies as employees take steps to capitalize on the company’s market advantages. Monopolies tend to reward employees for improving the bottom line, and things happen out of the sight of upper management. There is probably no better example of this than the many bizarre stories of overaggressive behavior by Comcast customer service. Much of this behavior has been blamed on regional service managers who took aggressive positions with the public to improve their bonuses. The same dynamic was one of the primary causes of the behavior at Wells Fargo, where employees added unrequested accounts for customers as a way to earn sales bonuses.

If T-Mobile has indeed become a monopolist, and economic history suggests that’s inevitable, then this is a good reason for the country to oppose mergers that create monopolies. Cellular customers in the US would have been better off in the long run with a hungry and separate T-Mobile and Sprint rather than letting the two combine to create another monopoly.

There is no question that the cellular industry is controlled by the three monopolies of AT&T, T-Mobile, and Verizon. The next largest cellular carrier is US Cellular with barely more than 1% of the market. Dish will be trying to carve a niche in the market, but that’s not going to be easy when there are three incumbents pushing for policies and rules that maintain their market power.

Realistically, T-Mobile became an incumbent on the day of the merger with Sprint. It took less than a year for somebody to officially call out T-Mobile at the FCC as an entrenched incumbent.