Time to Stop Talking about Unserved and Underserved

I constantly work with communities that want to know if they are unserved or underserved by broadband. I’ve started telling them to toss away those two terms, which are no longer a good way to think about broadband.

The first time I remember these two terms being used was in the broadband grant program created by the American Recovery & Reinvestment Act of 2009. Congress defined both terms in the language that created those grants: unserved meant any home or business with broadband speeds below 10/1 Mbps, while underserved meant homes with speeds above 10/1 Mbps but below 25/3 Mbps.

As far as I can tell, these terms have never been defined outside of broadband grant programs. However, the terms began to be widely used when talking about broadband availability. A decade ago, communities all wanted to know if they were unserved or underserved.

The terms began to show up in other grant programs after 2009. For example, the FCC’s CAF II grant program in 2015 gave money to the largest telephone companies in the country and funded ‘unserved’ locations that had speeds less than 10/1 Mbps.

The same definition was used in the ReConnect grants created by Congress in 2018 and 2019. Those grants made money available to bring better broadband to areas that had to be at least 90% unserved, using the 10/1 Mbps definition.

The biggest FCC grant program of 2020 has scrapped the old definitions. The $20.4 billion Rural Digital Opportunity Fund (RDOF) grant program limits eligibility to census blocks that are “entirely unserved by voice and with broadband speeds of at least 25/3 Mbps”. That seemingly redefines unserved to mean broadband of 25/3 Mbps or slower – at least for purposes of this federal grant program.

There are also states that define the two terms differently. For example, here is the official definition used when awarding broadband grants in Minnesota:

An unserved area is an area of Minnesota in which households or businesses lack access to wire-line broadband service at speeds that meet the FCC threshold of 25 megabits per second download and 3 megabits per second upload. An underserved area is an area of Minnesota in which households or businesses do receive service at or above the FCC threshold but lack access to wire-line broadband service at speeds of 100 megabits per second download and 20 megabits per second upload.

It must also be noted that there are states that define slower speeds as unserved. I’m aware of a few state broadband programs that still use 4/1 Mbps or 6/1 Mbps as the definition of unserved.

The main reason to scrap these terms is that they convey the idea that 25/3 Mbps ought to be an acceptable target speed for building new broadband. Urban America has moved far beyond the speeds being discussed as acceptable for rural broadband. Cable companies now have minimum speeds that vary between 100 Mbps and 200 Mbps. Almost 18% of homes in the US now buy broadband delivered over fiber. Cisco says the average achieved broadband speed in 2020 is in the range of 93 Mbps.

The time has come when we all need to refuse to talk about subsidizing broadband infrastructure that is obsolete before it’s constructed. We saw during the recent pandemic that homes need faster upload speeds in order to work or do schoolwork from home. We must refuse to accept new broadband construction that provides a 3 Mbps upload connection when something ten times faster than that would barely be acceptable.

Words have power, and the FCC still frames the national broadband discussion in terms of the ability to provide speeds of 25/3 Mbps. The FCC concentrated on 25/3 Mbps as the primary point of focus in its two recent broadband reports to Congress. By sticking with discussions of 25/3 Mbps, the FCC is able to declare that a lot of the US has acceptable broadband. If the FCC used a more realistic definition of broadband, like the one used in Minnesota, then the many millions of homes that can’t buy 100/20 Mbps broadband would properly be defined as underserved.

In the last few months, the FCC decided to allow slow technologies into the $16.4 billion RDOF grant program. For example, it has opened the door for telcos to bid to provide rural DSL that will supposedly offer 25/3 Mbps speeds. This comes after the complete failure of the CAF II program, where the big telcos largely failed to bring rural DSL speeds up to even a paltry 10/1 Mbps.

It’s time to kill the terms unserved and underserved, and it’s time to stop defining connections of 10/1 Mbps or 25/3 Mbps as broadband. When urban residents can buy broadband with speeds of 100 Mbps or faster, a connection of 25/3 should not be referred to as broadband.

An Update on LEO Satellites

A lot of rural America continues to hope that low Earth orbit (LEO) satellite service will provide a broadband alternative. It’s been a while since I’ve covered the status of the companies proposing to deploy constellations of broadband satellites.

In March, OneWeb filed for Chapter 11 restructuring when it became clear that the company could not raise enough cash to continue the research and development of its satellite product. In July, a bankruptcy court in New York approved a $1 billion offer, filed jointly by the British government and Bharti Airtel, to take over the company. Airtel is India’s largest cellular company. The restructured company will be owned in 45% stakes by Britain and Bharti Airtel, with the remaining 10% held by Softbank of Japan, the biggest original shareholder of OneWeb. Earlier investors like the founders, Intelsat, Totalplay Telecommunications of Mexico, and Coca-Cola have been closed out of ownership by the transaction.

There is speculation that the British government purchased the company to create tech jobs in the country and that all R&D and manufacturing for OneWeb would immediately shift to England from Florida.

Of more concern for rural broadband is speculation that the mission of the company will change. Greg Wyler, the original CEO of the company, had a vision of using the satellites to bring broadband to parts of the world that have none. He chose a polar orbit for the satellites and was going to launch the business by serving Alaska and northern Canadian territories like Nunavut. I’ve seen speculation that the revised company is likely to concentrate instead on wholesale connections to telcos and ISPs, such as providing backhaul for rural cell sites.

Elon Musk’s satellite venture Starlink was recently in the news when the company said it was going to raise ‘up to $1 billion’ to continue developing the business. The company still has a long and expensive road to success. It has raised over $3.5 billion to date before this latest raise, but a recent Bloomberg article estimates that the company will need to raise an additional $50 billion between now and 2033, which is when it is projected to become cash-positive.

Starlink now has over 540 satellites in orbit, but the business plan calls for over 4,000. Keeping the constellation in place will be an ongoing challenge since the satellites have an estimated life of 5 to 6 years. Starlink will forever be launching new satellites to replace failed ones.
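To put that replacement burden in perspective, here’s a back-of-the-envelope sketch. The constellation size and satellite life come from the paragraph above; the satellites-per-launch figure is my own assumption for illustration:

```python
# Back-of-the-envelope estimate of Starlink's steady-state replacement rate.
# Constellation size and satellite life come from the figures cited above;
# the satellites-per-launch count is an assumption for illustration.

CONSTELLATION_SIZE = 4_000    # planned satellites in orbit
SATELLITE_LIFE_YEARS = 5.5    # midpoint of the 5-6 year estimated life
SATELLITES_PER_LAUNCH = 60    # assumed satellites carried per launch

replacements_per_year = CONSTELLATION_SIZE / SATELLITE_LIFE_YEARS
launches_per_year = replacements_per_year / SATELLITES_PER_LAUNCH

print(f"Replacement satellites needed per year: {replacements_per_year:.0f}")
print(f"Launches per year just to stand still: {launches_per_year:.1f}")
# Roughly 730 satellites and a dozen launches every year, forever.
```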

The US government and the FCC seem to be in Starlink’s corner. The FCC is still evaluating whether it will allow Starlink to participate in the RDOF grant auction coming in October. It would be incredibly unusual to award giant federal grants for a product that is still on the drawing board and to an ISP that hasn’t raised 10% of its needed funding.

Starlink recently made a very public announcement that it was looking for beta customers – likely as a way to spur fundraising. Early Starlink customers will likely see blazingly fast speeds, which would happen for any broadband technology that could devote the full bandwidth of the network to only one or two customers. The bandwidth delivered on a fully-subscribed satellite network will be far less – but that won’t stop the company from using a beta test to set unrealistic expectations for future satellite broadband speeds.

The last LEO player that is still active is Jeff Bezos’ venture, which is still using the preliminary name Project Kuiper. The FCC recently approved the licensing for Project Kuiper to move forward. Immediately following the FCC approval, Bezos announced that he will be investing $10 billion in the business. This ability to self-fund likely gives Project Kuiper an advantage over the other competitors – it was reported that in the month of July alone, Bezos’s net worth climbed by $9 billion. Funding is going to be a constant hurdle for the other two major competitors, but Project Kuiper might be the fastest to deploy if funding is not an issue.

The FCC approval of Project Kuiper and the funding announcement by Bezos came at the same time that Starlink is seeking another round of financing and trying to get into the FCC auction. It’s going to be interesting to see how the battle between the two billionaires unfolds – my bet is on Amazon due to its easy access to funding.

Is the Line Between Wireless and Wireline Blurring?

At the Bernstein Strategic Conference in May, Ronan Dunne, EVP and CEO of Verizon Consumer, talked about his vision for the future of 5G. During that presentation, he made a statement that has been bugging me for weeks, so I finally had to write about it. He said that he can foresee a day when consumers will purchase home broadband in the same way that they buy wireless service today. He said that will happen because the line between the wireless and wireline businesses is blurring.

Dunne is talking about a future when 5G is ubiquitous and people won’t perceive a difference between landline broadband and 5G broadband. To use an economist’s term, Dunne foresees a day when wireless broadband is a pure substitute for landline broadband – when a customer won’t perceive a functional difference between the two products.

Verizon offers several wireless products, so let’s talk about them individually. The predominant Verizon product, available in every market, is cellular broadband. This uses cell sites to beam voice and data traffic to cellphones or other devices connected to a cellular data plan. Those cellular plans are incredibly stingy in terms of the amount of broadband that can be used in a month, with even the unlimited plans offering little more than 20 gigabytes of data before a user has to pay more or gets throttled. The specifications for 5G set a goal of 100 Mbps cellular broadband speeds within a decade. That kind of speed might be a substitute for landline broadband today from a speed perspective. But networks are not likely to achieve those speeds for at least five more years, and by then I think cable companies will be considering increasing urban broadband speeds to something like 250 Mbps. I have to question whether cellular broadband speeds can keep up with the speeds provided by landline connections.

Of more importance is that cellular speeds drop inside buildings. Anybody who has walked into a large building with a cellphone understands that cellular signals don’t perform as well indoors as outdoors. By the time I walk 100 feet into my neighborhood grocery store, I often have zero bars of data. While speeds don’t drop that drastically in most homes, when outdoor cellular speeds hit 100 Mbps, indoor speeds in most homes might hit half that number. With slower speeds and incredibly stingy data caps, it’s hard to see cellular broadband as a pure substitute for a landline broadband connection.

I also don’t think that the gimmick product that Verizon and others are selling in urban city centers, offering gigabit speeds using millimeter wave spectrum, is a landline substitute. The product requires closely spaced small cell sites fed by fiber – but the big gotcha is that millimeter wave spectrum won’t penetrate a building and barely makes it through a pane of glass. This is an outdoor product for which I still struggle to see a willing market. It’s certainly not a substitute for landline broadband, except perhaps for somebody who is always outdoors.

The newest wireless product is Verizon’s fixed wireless access (FWA), which beams a broadband signal into the home from a pole-mounted transmitter at the curb using millimeter wave spectrum. I suspect this is the product Dunne is talking about, and I would agree with him that it is a pure substitute for landline broadband. But that’s because this is just another variation of landline broadband. This technology has historically been referred to as fiber-to-the-curb. Verizon is using a wireless transmission instead of a fiber line for the last hundred feet to reach a home – but the technology requires building the same fiber into neighborhoods as fiber-to-the-home. This is not a wireless technology since 99% of the network still consists of fiber. Anybody using this service can walk to the curb and see the fiber that is carrying their broadband. The technology is a clear substitute for a landline fiber drop – but it’s not a wireless network other than for the last 100 feet to the home.

The other way to challenge Dunne’s vision is by comparing the volume of traffic carried by landline and wireless networks. The vast majority of data traffic is still carried over wires, and the gulf between the data carried by each technology is widening every year. Consider the following table from Cisco from 2019 that shows monthly data traffic volumes in North America by type, expressed in exabytes (one billion gigabytes).

Monthly Exabytes    2017    2018    2019    2020    2021    2022
Homes                 35      43      53      64      75      90
Cellular             1.3     1.8     2.5     3.4     4.5     5.9
Business             6.5     8.3    10.3    12.8    15.5    18.5
Total                 43      53      66      80      95     114

Both home and business broadband are carried on wires. In 2020, only a little more than 4% of all data traffic in North America is carried wirelessly. For wireless technology to be a pure substitute for wireline data, wireless networks would have to be capable of carrying many times the data they carry today. The laws of physics argue against that, particularly since landline data usage is growing at an exponential rate. It’s hard to envision wireless networks in our lifetime that can handle the same volumes of data as fiber-based landline networks.
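For anybody who wants to check the arithmetic, here’s a quick sketch using the Cisco figures from the table above:

```python
# Wireless share of North American data traffic, using the Cisco table above.
years    = [2017, 2018, 2019, 2020, 2021, 2022]
homes    = [35, 43, 53, 64, 75, 90]            # monthly exabytes
cellular = [1.3, 1.8, 2.5, 3.4, 4.5, 5.9]
business = [6.5, 8.3, 10.3, 12.8, 15.5, 18.5]

for y, h, c, b in zip(years, homes, cellular, business):
    total = h + c + b
    share = 100 * c / total    # cellular share of all traffic
    gap = (h + b) - c          # absolute wired-vs-wireless gulf
    print(f"{y}: cellular share {share:.1f}%, wired/wireless gap {gap:.0f} EB/month")

# In 2020 cellular carries a little over 4% of all traffic, and the absolute
# gap between wired and wireless volumes widens every year in the forecast.
```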

This is not intended as a major criticism of what Dunne said. The country will be better off if Verizon offers a competitive alternative to the cable companies. However, Verizon is like the other cellular companies and can’t talk about 5G without overstating its potential. I know Dunne has to keep hyping 5G for Wall Street, and I sympathize with that need. But we are still very far from a day when the average household will view landline and wireless data as pure substitutes.

Customers Still Flock to Promotional Rates

FierceVideo and others recently reported on a survey done in June by the research firm Cowen that looked at consumer use of promotional rates.

Cowen found that 20% of big ISP subscribers are on Internet plans that have promotional rates that will expire within the next 12 months. Another 13% of subscribers are on promotional plans that will expire in a time frame longer than 12 months. Surprisingly, 10% of subscribers have price-for-life guarantees. This leaves just 57% of subscribers paying full price for ISP services.

Promotional pricing is a sensitive topic for the industry and none of the big cable companies or telcos disclose the volume or amounts of discounts they give to customers. The big ISPs are all under a lot of pressure from Wall Street, and one of the key metrics used by analysts to track the big companies is ARPU – average revenue per user. ISPs have hard decisions to make. Giving too many discounts can kill ARPU, but not offering discounts can lose customers and revenues.

Some big ISPs have been working to curtail promotional pricing. AT&T has lost nearly three million video customers in the last year and claims the losses are mostly due to tightening the promotional pricing that DirecTV gave in the past. It’s also been reported that Charter has been tightening its policies on promotional prices, and in particular has been ending a huge volume of promotional pricing it inherited through the acquisition of Time Warner Cable.

The Cowen report highlighted how discount philosophy varies by ISP. For example, the report said that 45% of Altice customers have a promotional package, Comcast has 42%, and Charter is at 32%.

The big ISPs dole out promotional discounts in a few different ways. All of the incumbent ISPs advertise low prices on the web to attract new customers, and these new-customer discounts generally last for 12 to 24 months before customers are moved to normal pricing. The other big category is discounts negotiated with customers, often when customers threaten to leave an ISP.

The Cowen study confirmed something that we’ve always seen in the market. The promotional prices tend to go to younger subscribers, and older customers tend to pay full price for services. It takes real effort to either change ISPs or to renegotiate pricing every year or two, and only consumers willing to go through that hassle end up with a repetitive series of promotional deals.

The statistic that surprised me was that 10% of respondents in the survey said they had lifetime rates. ISPs have been somewhat leery of using the words ‘lifetime rate’, but over the years, as ISPs increased speeds and prices, they have often allowed customers to stick with slower and less expensive broadband – generally with the caveat that a customer with a grandfathered plan can make no changes without being moved to newer pricing. In my mind, there is a significant difference between grandfathering an existing plan with slower speeds and selling lifetime-rate promotions to new customers. One of the biggest advantages to the ISPs of grandfathered plans is that customers keep them for years, meaning no churn.

Small ISPs struggle with promotional rates. Some small ISPs that still carry video offer guaranteed bundled rates for customers who buy cable TV. But I know a number of small ISPs that have ceased offering bundled discounts since the margins on cable TV are too small to afford them.

Small ISPs also generally don’t like the hassle of constantly negotiating rates with customers seeking a discount. Negotiating with customers changes the culture in a call center and puts a lot of pressure on customer service reps – and is probably the number one reason why the public dislikes big ISP customer service.

Many small ISPs have also given up on the idea of having residential service contracts. It’s a major pain to collect from somebody who breaks a contract and drops service. Most of the small ISPs I know feel that their quality of service is superior to the competition and they don’t want to fight to keep unhappy customers.

Bandwidth Needed to Work from Home

The pandemic made it clear that the millions of homes with no broadband or poor broadband were cut off from taking the office or the school home. But the pandemic also showed many additional millions of homes that their current ISP connection isn’t up to snuff for working or doing schoolwork from home. Families often found that multiple adults and students couldn’t share the bandwidth at the same time.

The simplest explanation is that homes were suddenly expected to connect to school or work servers, use new services like Zoom, or make telemedicine connections to talk to doctors. These new requirements have significantly different bandwidth needs than in the past, when a home’s biggest bandwidth demand was watching multiple video streams at the same time. Consider the following bandwidth needs listed by Zoom:

Zoom says that a home should have a 2 Mbps connection, both upload and download, to sustain a Zoom session between just two people. The amount of download bandwidth increases with each person connected to the call, meaning Zoom recommends 6 Mbps download for a meeting with three other people.

Telemedicine connections tend to be even larger and also require the simultaneous use of both upload and download bandwidth. Connections to work and school servers vary in size depending upon the specific software being used, but the VPNs for these connections are typically as large as or larger than the requirements for Zoom.

Straight math shows fairly large requirements if three or four people are trying to make these same kinds of two-way simultaneous connections at the same time. But houses are also using traditional bandwidth during the pandemic for things like watching video, gaming, web browsing, and downloading large work files.

The simplistic way to look at bandwidth needs is to add up the various uses. For instance, if four people in a home each wanted to have a Zoom conversation with another person, the home would need a simultaneous connection of 8 Mbps both up and down. But bandwidth use in a house is not that simple, and a lot of other factors contribute to the quality of the connections within a home (a rough sizing sketch follows the list below). Consider all of the following:

  • WiFi Collisions. WiFi networks can be extremely inefficient when multiple people are trying to use the same WiFi channels at the same time. Today’s version of WiFi has only a few channels to choose from, so multiple connections on the WiFi network interfere with each other. It’s not unusual for the WiFi network to add 20% to 30% of overhead, meaning that collisions of WiFi signals effectively waste usable bandwidth. A lot of this problem is going to be fixed by WiFi 6 and the new 6 GHz spectrum, which together will add a lot of new channels inside the home.
  • Lack of Quality of Service (QoS). Home broadband networks don’t provide quality of service, which means that homes are unable to prioritize data streams. If you could prioritize a school connection, then any problems inside the network would affect other connections first while maintaining a steady connection to the school. Without QoS, degraded bandwidth is likely to affect everybody using the Internet. This is easily demonstrated when somebody in a home tries to upload a giant data file while somebody else is using Zoom – the Zoom connection can easily drop below the needed bandwidth threshold and either freeze or drop the connection.
  • Shared Neighborhood Bandwidth. Unfortunately, a home using DSL or a cable modem doesn’t only have to worry about how others in the home are using the bandwidth, because these services use shared networks within neighborhoods – as demand from the whole neighborhood increases, the quality of the bandwidth available to everybody degrades.
  • Physical Issues. ISPs don’t want to talk about it, but events like drop wires swinging in the wind can affect a DSL or cable modem connection. Cable broadband networks are also susceptible to radio interference – your connection will get a little worse when your neighbor is operating a blender or microwave oven.
  • ISP Limitations. All bandwidth is not the same. For example, the upload bandwidth in a cable company network uses the worst spectrum inside the cable network – the part that is most susceptible to interference. This never mattered in the past when everybody only cared about download bandwidth, but an interference-laden 10 Mbps upload stream is not going to deliver a reliable 10 Mbps connection. There are a half dozen similar limitations that ISPs never talk about that affect available bandwidth.
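Pulling the first couple of items on this list together, here is the rough sizing sketch promised above. The per-session figure is Zoom’s upload recommendation cited earlier; the overhead factor is the 20% to 30% WiFi collision estimate, and the household size is just an example:

```python
# Rough estimate of household upload need: the simple additive math from
# above, adjusted for the WiFi collision overhead described in the list.

ZOOM_UPLOAD_MBPS = 2.0     # Zoom's per-session upload recommendation
CONCURRENT_SESSIONS = 4    # example: two adults and two students online
WIFI_OVERHEAD = 0.30       # 20%-30% of bandwidth lost to WiFi collisions

raw_need = ZOOM_UPLOAD_MBPS * CONCURRENT_SESSIONS
effective_need = raw_need / (1 - WIFI_OVERHEAD)

print(f"Simple additive need:     {raw_need:.0f} Mbps upload")
print(f"Need after WiFi overhead: {effective_need:.1f} Mbps upload")
# 8 Mbps on paper becomes roughly 11.4 Mbps once collisions are counted,
# before leaving any headroom for video, gaming, or large file uploads.
```

If anything, the sketch understates the problem, since it ignores the neighborhood sharing, physical issues, and ISP limitations in the rest of the list.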

The average home experiencing problems working during the pandemic is unlikely to be able to fully diagnose the reasons for the poor bandwidth. It’s fairly obvious when the home upload speed isn’t fast enough to accommodate multiple simultaneous Zoom connections. But beyond a simple lack of capacity, it is not easy for a homeowner to understand the other local problems affecting their broadband experience. The easiest fix for home broadband problems is for an ISP to offer and deliver faster speeds, since excess capacity can overcome many of the other problems that might be plaguing a given home.

Regulating Cable TV versus OTT

Regulation often makes no sense, particularly in times when technology is transforming an industry. There is no better example of this than the way we regulate cable TV today.

Traditional cable TV is heavily regulated at the federal, state, and local levels. The FCC website has a nice summary of the history of federal cable regulation. The industry is less heavily regulated today than it was forty years ago, but there are still a lot of federal regulations that apply to cable TV. At the local level, franchise taxes levied on cable service are a huge revenue source for local government.

The FCC website includes a definition of cable television as follows: “Cable television is a video delivery service provided by a cable operator to subscribers via a coaxial cable or fiber optics.  Programming delivered without a wire via satellite or other facilities is not “cable television” under the Commission’s definitions.”

All of the federal cable regulations are aimed at cable TV signal that enters the home via a coaxial or fiber wire. Satellite or wireless delivery of television signal is not considered to be traditional cable TV, although the FCC does regulate satellite TV under a different set of rules.

The FCC has chosen to ignore its own definition of cable TV for programming delivered over the web. I’ve subscribed to the online cable alternatives Sling TV, PlayStation Vue, and YouTube TV. Over time those services have come to look more and more like traditional cable TV. My subscription to PlayStation Vue (before it folded) included all of the same local channels that I would receive with a traditional cable subscription. The service included a channel guide, and from a functional perspective, it was impossible to make any meaningful distinction between the PlayStation Vue product and the same product I might buy from a cable company.

From a technical perspective, it’s hard to see the difference between the online programming and traditional cable. Both come into the home over coaxial or fiber cables. Both offer a line-up of local channels and a similar mix of national programming. Both services offer options like DVR service to record programming to watch later. If you were to show both services to somebody who had never seen TV before, they’d probably not see any difference between the two.

But there is a huge regulatory difference between traditional cable TV and online programming, particularly at the local level. Franchise fees of up to 5% are levied onto traditional cable TV from Charter, Comcast, or AT&T – but no franchise fees are levied against Sling TV or YouTube TV. Cable companies are arguing that this difference alone gives online programming a competitive edge – and it’s hard to disagree with them.

To make matters even more confusing, there are now cable products that sit somewhere between traditional TV and online TV. ISPs are no longer building cable headends to download cable signal from satellites. Instead, they are buying cable channels wholesale – the entire channel line-up is pumped into an ISP over a big broadband connection. These channel line-ups look a lot like both traditional cable channels and online line-ups like YouTube TV. In the newest wholesale cable products the ISP doesn’t even need a traditional set-top box and can deliver straight to smart TVs or use something like a Roku stick.

For now, most ISPs that are reselling the wholesale TV are registering as cable providers and are collecting franchise fees. But I won’t be surprised if an ISP challenges this and argues that wholesale cable service is not the same as traditional cable TV.

From a regulatory perspective, our current treatment of cable service is closely analogous to the difference between traditional telephone service and voice over IP (VoIP). ISPs successfully fought to define VoIP as a non-regulated service, although there is no functional difference between the two products at the customer level. There is no discernible difference between a telephone line provided by AT&T over telephone wires and telephone service provided by Comcast over cable wires – but the products get drastically different regulatory treatment. It’s hard to think that we aren’t going to soon see legal challenges by cable companies trying to avoid collecting franchise fees – and I think there is a decent chance that courts will side with them.

Smart Cities and Surveillance

One of the linchpins in most proposals to provide smart city technology is the deployment of surveillance cameras. This is usually sold to cities as a way to improve security and to give police a leg up when responding to 911 calls.

A case in point is the city of Oakland. Oakland, along with other major US ports, received a grant to install security cameras in the port in an attempt to step up national security after 9/11. But the City Council decided to take the concept farther and voted to expand the port security system to cover the entire city and its 400,000 residents. The City justified the system because, in addition to providing video that might help to solve crimes, the system came with other bells and whistles like a gunshot detector that could pin down the origin of gunshots.

But the camera system in the City went beyond providing crime-fighting tools. For instance, the system purchased by the City included software that could read and record vehicle license plates as well as first-generation facial recognition software. In 2014, after a huge outcry from citizens about being watched, the City removed the surveillance system everywhere outside the port.

More recently, in May, Sidewalk Labs, a division of Google’s parent Alphabet, scrapped plans to build the city of the future on 800 acres along Lake Ontario in Toronto. Sidewalk Labs had proposed a smart city where sensors would be embedded everywhere. It envisioned a smart city that made life easier by melting snow from sidewalks, automatically deploying awnings to block the summer sun, and making sure that traffic always flowed without interruption. Sidewalk Labs envisioned a horde of robots using underground streets to deliver packages and remove trash. The public pushed back against the idea because they feared that Google would track everything done by residents and use that data to profile every aspect of the lives of people living in the new city.

More quietly, over 600 police departments have partnered with citizens who install Ring cameras at their homes. Citizens can register their Ring cameras with police departments, which can then use the cameras to see what’s happening on residential streets. In answering inquiries from Congress, Amazon admitted that police are free to use the video collected through Ring cameras in any way and can store and use the images forever.

Recently there are new concerns about surveillance as facial recognition software is maturing as a technology. The Boston City Council recently passed an ordinance that banned the use of facial recognition technology other than in limited personal-use cases such as allowing facial recognition as a tool for logging onto computers. The City Council worried that facial recognition is a massive invasion of privacy and a threat to civil liberties. Boston joined San Francisco, Oakland, and Cambridge, MA in banning the technology.

There is starting to be a lot of public pushback against facial recognition. Amazon recently announced that it would suspend police use of its facial recognition software. Microsoft made a similar pledge and said they won’t sell facial recognition technology to police departments until there is a federal law governing the use of the technology.

Not every city is against surveillance. Currently, 8 of the top 10 cities in the world with the most surveillance cameras are in China. China is rapidly migrating to a system where people can shop and pay for things using facial recognition technology – a person’s face is their credit card. Shoppers peer into a camera at check-out and are automatically charged for their purchases. The downside of the technology is that the state knows everywhere that people go, everywhere they shop, and everything they buy.

The other two cities in the top ten with the most surveillance cameras are London and Atlanta. London began installing cameras years ago as a result of security fears concerning Northern Ireland, but the camera systems were greatly expanded after several terrorist attacks in the city in recent years. Atlanta has installed a network of over 11,000 cameras, used by the police department under the name Operation Shield. The video surveillance system routinely identifies stolen cars by monitoring license plates. The City says that it curtails privacy abuses by limiting the ability of most police staff to use the system – but privacy advocates are not so sure. Interestingly, most of Atlanta’s network, estimated to cost $300 million, was privately funded.

Surveillance is a sticky topic and will likely become more so as more cities start using facial recognition software. My bet is that future deployment of smart city technology will depend upon where communities land on the surveillance issue.

Charter Considering RDOF Grants

Charter let the world know in its most recent 8-K filing with the Securities and Exchange Commission that it plans to pursue RDOF grant funding. The company says that it might pursue grant funding to build to ‘multi-million passings’ involving ‘multi-billion investments’. It’s an interesting strategy. Charter already serves rural county seats and other towns across the country, which puts it close to many of the areas where RDOF funding is available.

The RDOF grants cover the most rural and remote pockets of customers in the country. While there are some small rural towns included in the RDOF grant footprint, most of the customers covered by the grants are truly rural, consisting of farms and scattered homes in rural counties.

Charter will have to make some technology choices about how to serve rural America. The company can win the most money in the grant process if it files as a gigabit-speed provider. Gigabit speeds are available today with fiber technology and also with the hybrid fiber-coaxial (HFC) networks operated by Charter and other cable companies. The RDOF grants can be awarded to technologies that support speeds over 25/3 Mbps, but the grant rules include incentives favoring ISPs willing to deploy faster technologies.

Charter could pursue slower technologies, like fixed wireless, but that funding is harder to win. To date, none of the big cable companies have ventured into wireless technology, other than a few trials. It’s always been a bit of a mystery why Charter and other cable companies haven’t erected wireless antennas at the fringes of their networks to cheaply capture customers just out of reach of the HFC networks. My theory has always been that big cable companies are not nimble enough to handle drastically different technology platforms since all of their processes are designed around coaxial and fiber technologies.

Charter is likely considering building fiber-to-the-home networks if it wins RDOF grant funding. The hybrid fiber-coaxial technology that cable companies use in urban areas is poorly suited to serving scattered rural customers. The signal on an HFC network has to be boosted every two miles or so, and every time the signal is amplified some of the effective bandwidth carried on the network is lost. It would be a major challenge to maintain the gigabit speeds required by the grants on a rural HFC network – it would only be possible with lots of fiber and tiny neighborhood nodes serving only a few homes. Charter has often cited the technology challenges of using HFC in low-density areas as the reason it doesn’t expand outward from existing markets – and those reasons still hold true.

Charter claims to have added 1.5 million homes to its existing networks over the last two years, and in the 8-K filing says these are mostly rural customers. However, from what I’ve heard, most of these new Charter neighborhoods are small subdivisions surrounding existing Charter markets. Charter has not been building rural networks to reach 1.5 million farms.

Charter and the other big cable companies have quietly introduced last-mile fiber technology into their networks. When cable companies build into new subdivisions today, they mostly do so with fiber technology.

It would be interesting if Charter’s strategy is to use the grant money to build fiber to farms. I know plenty of other ISPs considering the same business plan in places where there is enough RDOF grant funding available to make a business case.

There is no guarantee that Charter will ultimately win any grant funding – filing the grant short form on July 15 only gives Charter the option to participate in the October auction. However, if the company bids in the auction, it will be good news for markets where Charter would build fiber. The big downside to the RDOF grant process is that in markets where no ISP proposes to build gigabit technology, the funding could end up going to satellite broadband providers – and there is no rural neighborhood that would prefer Viasat over Charter.

Will There Be a Credit Crunch?

ISPs are collectively going to be borrowing huge amounts of money over the next year as a result of the various federal and state grant programs. For example, the $16.4 billion RDOF grant program will likely drive ISPs to borrow many billions to match the grant awards. The federal ReConnect grants and the numerous state grant programs will also drive significant new debt.

I’ve followed the banking industry for decades and I’ve seen how banks react to economic stress. In my adult lifetime, I’ve witnessed several major economic downturns. The economy took a major downward turn in 1973-75, in 1981-82, after the dot-com crash in 2001, and most recently in 2007-09. In each of these cases, banks reacted by tightening credit, meaning that it became harder, or even impossible, to borrow money.

The COVID-19 pandemic is different than these other recessions in that the reaction to the virus crashed an otherwise healthy economy. The pre-pandemic economy was showing some signs that the decade of growth was slowing, but the economy at the beginning of this year was in relatively good shape. That pre-pandemic economy should easily have been able to support the loans needed for a major expansion of broadband.

The pandemic has stressed banks in unusual ways. For example, banks have generated a huge amount of loans to small businesses to support the Paycheck Protection Program that’s part of the recent stimulus relief plans. While these loans are ultimately backed by the government, they’ve eaten severely into bank cash reserves.

Banks are also seeing a lot of bad debt due to the pandemic. Tens of millions of people are currently out of work and many are having trouble making debt payments on mortgages, car loans, student loans, credit cards, you name it. Huge numbers of businesses have shut their doors, or even if still open have curtailed or stopped making rent or mortgage payments. I’ve read numerous predictions that there will be a business real estate crisis soon as landlords react to suddenly vacant buildings.

Banks have already started to react in ways that you would expect during any downturn. Small businesses that are still open have had lines of credit frozen. It’s gotten harder to apply for a home mortgage. Banks have already cut back on lending new money to small businesses.

During past downturns, banks also curtailed loans to larger businesses. I can remember several times when industry lenders like CoBank and RTFC either stopped lending or became far more selective in making loans. Just a decade ago there was a short period of time when even Fortune 500 companies had trouble borrowing money.

It’s really hard to predict bank behavior right now since this is not a ‘normal’ recession. Underneath all of the current ugly financial news is a hope that the economy can spring back to life if medical science develops a vaccine or effective treatment. Unfortunately, there are parts of the economy that are not likely to come back quickly, or even at all. Many of the small businesses that are still shut due to the pandemic are likely not coming back. We’ve seen a big string of major retailers fail, and that is going to cascade to kill shopping malls and shopping districts. A lot of businesses say that they intend to continue with work-from-home initiatives that were forced upon them during the pandemic – and that means a lot of empty business real estate.

What matters most to ISPs right now is what the banking industry is going to be like by the end of this year. What happens if many of the ISPs looking for matching funds for grants are unable to borrow? How might the FCC react if billions in grants fall on the floor due to a lending crisis?

I don’t have a crystal ball, and this blog is not meant as a prediction that borrowing is going to dry up. But I’ve seen enough recessions to know that lending is not going to continue unchanged. Anybody thinking about accepting large amounts of grant funding needs to think about a back-up plan if it becomes harder to borrow. The FCC and ISPs have all assumed that matching funds will be readily available for anybody that lands a large grant. It’s historically been relatively easy to borrow for projects funded largely by grants – but these are definitely not normal times.

The FCC Finally Tackles New Mapping

Almost a year after first approving the concept, the FCC recently started the process of developing new databases and maps. Last August the FCC approved the concept of having ISPs report broadband coverage by polygons, meaning that ISPs would draw lines around areas where they have active broadband customers or where they can install a customer within a week of a request for service.
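To illustrate what reporting by polygon means, here is a hypothetical sketch of one coverage polygon expressed as GeoJSON. The coordinates, provider name, and attribute names are all invented for illustration – the FCC had not published a final reporting schema at the time of writing:

```python
# Hypothetical sketch of an ISP coverage polygon expressed as GeoJSON.
# All names, speeds, and coordinates are invented for illustration only.
import json

coverage_report = {
    "type": "Feature",
    "geometry": {
        "type": "Polygon",
        # A ring of (longitude, latitude) points enclosing the claimed
        # service area; the first and last points must match to close it.
        "coordinates": [[
            [-93.52, 45.10], [-93.48, 45.10], [-93.48, 45.14],
            [-93.52, 45.14], [-93.52, 45.10],
        ]],
    },
    "properties": {
        "provider": "Example Rural ISP",   # hypothetical ISP
        "technology": "fiber-to-the-home",
        "max_download_mbps": 100,
        "max_upload_mbps": 20,
        # Per the FCC concept: the area has active customers, or service
        # can be installed within a week of a request.
        "serviceable_within_days": 7,
    },
}

print(json.dumps(coverage_report, indent=2))
```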

The FCC has been slow-rolling the process for the last year. It made announcements over a year ago that led rural America to think that better maps were coming that would make it easier to correctly identify areas with poor broadband. But last year’s big announcement only adopted the concept of better maps, and the recent vote took just the first step towards implementing it.

Even now, it’s not clear that the FCC is ready to implement the new maps, and the agency is still saying that it doesn’t have the money to change the ISP reporting process. This is hard to believe from an agency that is self-funded by fees and spectrum auctions – the agency could have required the industry to pay for the new mapping at any time – but the FCC wants a specific allocation of funding from Congress. This feels like another delaying tactic.

There are good reasons for the FCC to not want better mapping. The FCC is required by law to take action to solve any big glaring difference between broadband availability in urban and rural areas. The agency has been doing everything possible over the last decade to not have to take such extraordinary steps.

Everybody involved in rural broadband knows that the current maps are dreadful. ISPs are free to claim broadband coverage and speeds in any manner they want, and from my experience, most rural counties have areas where broadband coverage or speeds are overstated. In many cases, the overstatement of broadband is unbelievable. I was recently working with counties in Washington, New Mexico, and Minnesota where the FCC databases show 100% broadband coverage in rural areas when in real life there is almost zero broadband outside of towns.

This same mandate is the primary reason why the FCC doesn’t increase the definition of broadband, which has been set at 25/3 Mbps since 2015. Residents in well over half of the country, in cities and suburbs, have the option to buy broadband of 100 Mbps or faster. But the FCC sticks with the slower definition for rural America so that it doesn’t have to recognize that millions of rural homes, many in county seats in rural counties, don’t have broadband as good as in larger cities.

It is that same requirement to solve poor broadband that has driven the FCC to stick with mapping that FCC Commissioners all admit is inadequate. If the FCC fixes the maps, then many more millions of homes will become properly classified as not having broadband, and the FCC will be required to tackle the problem.

Unfortunately, I don’t hold out a lot of hope for the new broadband mapping process. The biggest reason that today’s mapping doesn’t work is that ISPs are not required to tell the truth. Drawing polygons might decrease some of the areas where ISPs claim coverage that doesn’t exist – but there is nothing in the new rules that forces ISPs to report honest speeds. A rural county is still going to have overstated broadband coverage if ISPs continue to claim imaginary, sometimes amazingly exaggerated, speeds. One of the counties I was recently working with has two wireless ISPs that claim countywide coverage of 100 Mbps broadband when it looks like the ISPs don’t even operate in the county. The new mapping is not going to fix anything if an ISP can draw false polygons or report imaginary speeds, and the new maps aren’t going to stop the exaggeration of rural DSL speeds by the big telcos.

Unfortunately, there are huge negative repercussions for areas where ISPs lie about broadband coverage. The best example is the current RDOF auction where the FCC is awarding $16.4 billion in grants. Areas where ISPs have lied about broadband coverage are excluded from that grant program, and they will stay excluded from future grants for as long as the ISPs keep lying about coverage.

Let’s not forget that ISPs have motivation for lying to the FCC about broadband coverage. Keeping grants out of rural areas shields the ISPs already operating there and protects rural ISPs that are selling 2 Mbps broadband for $70 per month – if these areas get grants, those ISPs lose their customers. The penalties for overstating broadband speeds and coverage ought to be immense. In my mind, if an ISP deprives a rural county of broadband grants, then the ISP ought to be liable for the lost grant funding. If the FCC were to assess huge penalties for cheating, the maps would be cleaned up overnight without having to switch to polygons.

As usual, the FCC is pursuing the wrong solution, and I suspect they know it. The big problem with the current maps is that ISPs lie about their coverage areas and about the speeds being delivered to customers. The FCC has the ability to require truthfulness and to fine ISPs that don’t follow its rules. The FCC could have implemented penalties for false reporting at any time in the last decade. Implementing new mapping without implementing penalties for lying just kicks the can down the road for a few more years so that the FCC won’t have to address the real rural broadband shortfalls in the country.