FCC Touts 6G

The FCC has seemingly joined forces with the marketing arm of the cellular industry in declaring that the spectrum between 7-16 GHz is now considered to be 6G. Chairwoman Jessica Rosenworcel recently announced that the agency would soon begin looking at uses for this spectrum for mobile broadband. Specifically, the agency will be looking at 550 MHz of spectrum between 12.7-13.25 GHz for what Rosenworcel characterized as airwaves for the 6G era.

This 7-16 GHz spectrum is already used for a wide range of purposes, including fixed point-to-point microwave links, radio astronomy, communications with airplanes, and various military uses. Probably the biggest current use of the spectrum is for communicating with satellites. Rosenworcel said the agency would consider ways to share some of the spectrum between satellite and terrestrial uses.

The use of the 6G description for this spectrum is a big departure from the recent past. It was just in 2019 that Verizon defined 5G to include millimeter-wave spectrum as high as 28-39 GHz. I’m sure most of you remember the never-ending TV commercials showing cellphones receiving 1-gigabit speeds. Verizon and a few other cellular carriers had deployed millimeter-wave spectrum in the downtown areas of a few major cities as a gimmick to show how fast 5G could be. Verizon labeled this as Ultra Wideband to distinguish it from the 4G LTE spectrum that Verizon and others were starting to label as 5G.

It has to be confusing to be a cellular customer – I try to follow this stuff, and even I can’t keep up with the cellular marketers. When Verizon used millimeter-wave spectrum and labeled it as Ultra Wideband, the company flashed a 5G UW icon to users to denote having access to the superfast speeds. But I’m hearing that people are now getting the 5G UW icon when connecting to Verizon’s C-Band spectrum, which is mid-range spectrum between 3.7-4.2 GHz.

The funny thing about everything that cellular marketers are doing is that 5G has nothing to do with any specific frequency range. 5G is a set of specifications that defines how cell towers work, and the specifications can be used with any spectrum. The 5G specification can work in the mid-range spectrum, in the band that the FCC just labeled as 6G, and in the higher millimeter-wave spectrum.

I’m mystified that the FCC would suddenly label the spectrum between 7-16 GHz as 6G. There will be no 6G specification for this band – anything we do in this spectrum will still use either the 4G LTE or 5G specifications. Wireless scientists around the world have started experimenting with what they are calling 6G using spectrum that ranges between 100 GHz and 1 THz. These high frequencies sit just below infrared light and can be harnessed to transmit huge amounts of data for short distances, such as inside superfast computer chips. Scientists expect to develop the new 6G specifications within the next decade.

Scientists understood that the 5G specifications would cover all spectrum up to 100 GHz. But apparently, we’re now going to carve up spectrum into tiny slices and label each slice as a new generation of G. I’ve always joked that we’re going to get to 10G before we know it – and it turns out that was no joke at all, and if anything, it was conservative.

Behind all of the confusion over mislabeling things as 5G and 6G is the fact that we will eventually need new cellular spectrum. Cellular networks seem robust today, but the demand for mobile data keeps growing. There are already a lot of complaints that the new spectrum labeled as 5G is overcrowded. The FCC knows it takes many years after designating new cellular spectrum until it shows up in the market. This is the time to look at new spectrum bands to put into use a decade from now. This is not going to be easy because satellite companies will be screaming loudly that cellular companies are trying to steal their spectrum. They aren’t completely wrong about this, and I don’t envy the FCC the job of refereeing between the competing uses of spectrum. Just recently, the FCC made it easier for satellite providers to share in existing spectrum bands. But when the FCC labeled this spectrum as 6G, I think we already know it ultimately favors the cellular companies.

Is Charter the Largest Rural ISP?

Once in a while, I see something in the industry press that gives me pause. Telecompetitor reported that Charter CEO Chris Winfrey said on the company’s first quarter earnings call that Charter is the “largest rural provider today.” As much as I work in and track the industry, I would never have connected the dots enough to think that.

I can see how Charter is on the way to being a big rural player. The company was the largest winner of the RDOF reverse auction in terms of passings and is slated to bring broadband to over 1 million rural homes and businesses. The company says it is ahead of schedule and has already built 40% of those passings. But does passing 400,000 homes make Charter the biggest rural provider in the country?

In the last few years, there has been an explosion of FWA fixed cellular wireless from T-Mobile and Verizon. At the end of 2022, T-Mobile had over 2.6 million FWA customers and added 524,000 in just the fourth quarter of 2022. Verizon had almost 1.5 million customers and added 389,000 in the fourth quarter. While not all of those customers are rural, it seems likely that both companies have a lot more rural customers than Charter.

It’s hard to get specific statistics from the big telcos, but it’s hard to imagine that CenturyLink and Frontier don’t still have more rural customers than Charter. In all fairness, the rural telco DSL customers are the prime target for Charter and everybody else who is building rural networks – but it’s unlikely that Charter has yet eclipsed them in customer counts.

Jonathan Chambers of Conexon wrote a recent blog that notes that electric cooperatives collectively have more rural customers than Charter.

Nobody knows who the eventual biggest rural winner will be. There are somewhere north of 10 million rural passings that will be tackled by the upcoming BEAD grants. Meanwhile, huge amounts of funding have been provided in rural America through CARES and ARPA funding administered through states or awarded by local governments. I think we’re going to have to wait for the BEAD grants to play out to see who will ultimately be the largest rural ISP.

And even at the end of those grants we might not know. I’ve been predicting that there will be a major roll-up of last-mile fiber networks, and there is no reason that won’t include rural properties. We might have to wait a decade to see who the biggest rural players will be.

I have to think that Winfrey knew his statement wasn’t factual, and I think he was making the point that Charter is now a major player in rural America. He caught the industry’s attention with a statement that was really aimed at Charter’s stockholders. We’re seeing big cable company customer counts level off after a decade of spectacular growth, and I think his message was that Charter is still a growing company.

One thing that Charter didn’t say is that whoever builds fiber in rural areas today is creating monopoly markets. It’s going to be hard for anybody to compete against rural fiber over the long run, and Charter and other companies pursuing grants are counting on being the monopoly provider across large swaths of rural areas. I see a lot of speculation asking why companies are pursuing rural broadband – and I think the appeal of having markets where a company will eventually have an 80%+ market penetration is something that pencils out well.
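The back-of-envelope math behind that appeal can be sketched out. All of the numbers below are hypothetical illustrations of my own, not figures from Charter or from any grant program:

```python
def payback_years(passings, cost_per_passing, grant_share,
                  penetration, arpu_monthly, operating_margin):
    """Years of operating cash flow needed to recover the builder's
    share of construction cost. Every input here is hypothetical."""
    net_capex = passings * cost_per_passing * (1 - grant_share)
    annual_cash_flow = passings * penetration * arpu_monthly * 12 * operating_margin
    return net_capex / annual_cash_flow

# 10,000 rural passings at $6,000 each, 75% grant-funded, $70 ARPU,
# 50% operating margin. A competitive-market 40% share versus a
# near-monopoly 80% share:
competitive = payback_years(10000, 6000, 0.75, 0.40, 70, 0.50)  # ~8.9 years
monopoly = payback_years(10000, 6000, 0.75, 0.80, 70, 0.50)     # ~4.5 years
```

Doubling the eventual penetration halves the payback period, which is why a market with no realistic long-term competitor pencils out so well.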

Unwinding the PSTN

This blog is aimed mostly at telephone companies and various CLECs who have been operating on the legacy Public Switched Telephone Network (PSTN). This network has been used for interconnection to the local incumbent offices and tandem switches, and for connecting to 911 centers, operator services, cellular carriers, and other neighboring carriers.

At CCG, we are finally starting to see that network being shut down, route by route and piece by piece. But like everything related to operating in the regulated legacy world, it’s not easy to disconnect the PSTN connections called trunks. The big incumbent telcos like AT&T, Verizon, CenturyLink, and others will continue to bill for these connections long after they stop being functional.

I don’t use this blog to make many pitches for my consulting practice, but I think we’re one of the few consultants left in the industry that can help to unwind and stop the billing of the old PSTN network arrangements. We spent many years helping ILECs and CLECs originally order these connections. The ordering process for the PSTN has always been complicated and cryptic. Carriers need to go through those same systems to cut a circuit dead. You often can’t stop the billing by calling or writing to the incumbents – network arrangements need to be unwound in the reverse manner they were built in the first place.

It’s not surprising that this is hard to do. The ordering system was made difficult on purpose after the big telcos decided they didn’t like the requirements of the Telecommunications Act of 1996 that required them to share their networks with other carriers. After that FCC order, big telcos purposefully made it hard to initiate a connection with them – and now it’s just as hard to disconnect. The big telcos will be glad to continue to bill for circuits for years after they no longer work.

I have no idea how long it’s going to take the PSTN to die, but it’s finally starting to be disassembled, piece by piece. In some ways, it’s a shame to see this network die because it was the first nationwide communication network. It was built right, and it was reliable. Outages came from the same issues that still plague networks, and a fiber cut has always been able to isolate a town or a region from the PSTN.

Sadly, the big telcos never spent the money to create route redundancy. Folks like me have shouted for decades that there was no way to justify multi-day rural network outages when we know how to solve the problem. These outages are still happening today – and the fibers that carry the PSTN are often the same fiber routes that act as the only broadband backbone route into a rural area.

I remember twenty years ago when I had a few small telephone company clients who were willing to solve the redundancy problem by building a new fiber route. We were shocked when Verizon and AT&T refused to connect the new routes into the PSTN. Apparently, the big telcos were more worried about being bypassed than they were about having a more reliable network.

Over time, and as a result of some orders from State regulators, the big telcos allowed route redundancy when somebody else paid for it. Today, large carriers like Level 3, Zayo, and many others cross the country with alternate transport routes, but unfortunately, there are still a lot of rural places where the only available fiber comes from the incumbents.

If you are having problems disconnecting or rearranging connections with other carriers, give us a shout. This could be connections with a large telco, with cellular towers, or to other local carriers. You can contact Derrel Duplechin at CCG at dduplechin@ccgcomm.com. We hate to see the PSTN starting to go. But even more, we hate to see folks who can’t figure out how to get a divorce from the big telcos.

Starlink’s New Business Broadband

Starlink has quietly updated its business broadband offerings. The original plan for businesses was $500 per month with a two-terabyte data cap. If a customer exceeded the data cap, the speed reduced to 1 Mbps for the remainder of the month unless a customer bought additional broadband at $1 per gigabyte. Starlink business comes with a premium high-performance (HP) antenna at a one-time cost of $2,500.

The new plans are:

  • 1 TB Data Cap. $250/month plus $2,500 equipment costs.
  • 2 TB Data Cap. $500/month plus $2,500 equipment costs.
  • 6 TB Data Cap. $1,500/month plus $2,500 equipment costs.

Extra data now costs $0.50 per additional gigabyte.
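The new pricing is simple to model: the monthly bill is the plan's base price plus $0.50 for each gigabyte over the cap. A minimal sketch of that math (the usage figure is just an example, and I'm treating 1 TB as 1,000 GB):

```python
def starlink_monthly_cost(base_price, cap_gb, usage_gb, overage_per_gb=0.50):
    """Base plan price plus $0.50 per GB of usage beyond the data cap."""
    overage_gb = max(0, usage_gb - cap_gb)
    return base_price + overage_gb * overage_per_gb

# A business on the 1 TB plan ($250/month) that uses 1,200 GB:
# $250 + 200 GB * $0.50/GB = $350
cost = starlink_monthly_cost(250, 1000, 1200)
```

At $0.50 per gigabyte, a business that routinely blows past the 1 TB cap by a full terabyte would pay $750 – more than simply stepping up to the 2 TB plan.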

Starlink promises faster speeds for businesses with the HP business antenna. This antenna has a 35% better field of view, is less sensitive to hot weather, handles rain better, and melts snow faster. The company now claims the following speeds on its website:

              Download       Upload
Residential   20-100 Mbps    5-15 Mbps
Business      40-220 Mbps    8-25 Mbps
RV            5-50 Mbps      2-10 Mbps

Interestingly, the speed claims above from the Starlink website are a lot slower than what was promised as recently as September 2022. For example, residential customers were told in 2022 that download speeds would be between 50 – 200 Mbps with upload speeds of 10 – 20 Mbps. Customers have been saying online that speeds are getting slower – something that has been validated by Ookla speed tests.

In the most recent FCC maps, Starlink claims speeds up to 350/40 Mbps. That matches the maximum speeds that Starlink promised to business customers in September 2022. We’ll have to see if the company drops the speeds claimed to the FCC now that it has dropped its maximum claimed speed down to 220/25 Mbps.

On May 9, Starlink notified customers that it would no longer deprioritize traffic after a customer hits the monthly data cap. Previously, customers were slowed to speeds as low as 1 Mbps. Now, customers can sign up to be automatically billed for excess data usage.

To some degree, the business offering is going to be a concern for some residential customers since business customers will get bandwidth priority. That might make a difference in neighborhoods with multiple business customers.

It will be interesting to see how Starlink performs over the long run. The company still has plans to add many thousands of satellites. But the company still has a waiting list of customers – and the company admits that it can easily get oversubscribed in a neighborhood.


In 2021, Elon Musk said that he foresaw a future where Starlink could provide backhaul bandwidth to rural cell towers. That may still be coming, but not with the current constellation. The speeds above are not nearly what a cell tower owner wants to buy. Even the most rural small cell site is going to want a steady 500 Mbps of bandwidth, with a more typical requirement of 1 Gbps. I would think that residential subscribers have to hope the company never sells to cell towers, or the coverage at peak times could suffer.

The Rural Cellular Crisis

Over the last few years, I have helped dozens of counties get ready for the upcoming giant broadband grants. We’ve been very successful in helping counties identify the places that don’t have broadband today – which is often drastically different from what is shown by the FCC maps. We then help county governments reach out to the ISPs in the region and open a dialog with the goal of making sure that all rural locations get better broadband. This takes a lot of work – but it’s satisfying to see counties that are on the way to finding a total broadband solution.

In working with these counties, one thing has become clear to me. Some of these counties have a bigger cellular coverage problem than a broadband problem. There is often a much larger number of homes in a county without adequate cellular coverage than homes that can’t buy broadband.

The counties I’ve helped have reached out to me – either directly or through an RFP looking for a consultant. Only a tiny number of the counties identified their cellular problem up front when they hired me. Yet, when I talk to residents and businesses in a county, I hear more horror stories about poor cellular coverage than about poor broadband coverage.

I always knew that the cellular coverage maps published by the big cellular carriers were overstated. You might recall back before cellular advertising was all about 5G that the cellular carriers would all claim to have the best cellular coverage. They would proudly show their coverage map in the background on ads and on their websites to show how they covered most of the country.

I’ve come to learn that those maps were pure garbage. They weren’t just an exaggeration – when you drilled down to look at specific counties, they were outright fabrications. I’ve worked recently with two counties that are home to major universities and one that holds a state capital. In all three of these counties, cellular coverage dies soon after people leave the biggest urban center.

If anything, I think that cellular coverage has gotten worse with the introduction of the spectrum that the carriers are all claiming as 5G. These are new frequency bands that have been introduced in the last few years to relieve the pressure on the 4G LTE networks. It makes sense that coverage would be reduced with the higher frequencies because one of the first rules of wireless technology is that higher frequencies tend to dissipate more quickly than lower frequencies. When I hear the complaints in these counties, I have to think that the 5G spectrum is not carrying as far into the rural areas.
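That rule of thumb can be quantified with the standard free-space path loss formula from any radio engineering textbook. This is a generic illustration, not anything from the FCC's data, and the frequencies below are just examples of low-band versus mid-band spectrum:

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Comparing a low-band 4G frequency (600 MHz) to a mid-band "5G"
# frequency (3,700 MHz) over the same 10 km rural path:
low_band = fspl_db(10, 600)    # ~108 dB
mid_band = fspl_db(10, 3700)   # ~124 dB
# The mid-band signal arrives roughly 16 dB weaker -- about 1/40th the
# power -- before even accounting for terrain, foliage, or antennas.
```

Free space is the best case; real rural terrain and vegetation punish the higher frequencies even harder, which is consistent with the complaints I hear.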

This is a problem that is well-known to everybody in the industry, including the FCC. Back before the pandemic, the FCC came up with a plan to spend $9 billion from the Universal Service Fund to build and equip new rural cellular towers – using a reverse auction much like RDOF. This process derailed quickly when the biggest cellular companies produced bogus maps that showed decent coverage in rural areas served by some of the smaller cellular carriers. The FCC was so disgusted by the lousy maps that it tabled the subsidy plan.

The FCC finally reconsidered this idea in 2021. Now the cellular carriers are required to produce maps every six months at the same time as ISPs report broadband coverage. If you haven’t noticed, you can see claimed cellular coverage on the same dashboard that shows the broadband map results. I haven’t spent much time digesting the new cellular maps since all of my clients are so focused on broadband. But I checked the maps in the region around where I live, and they still seem to exaggerate coverage. This is supposed to get better once wireless carriers are required to file heat maps of the coverage around each transmitter – we’ll have to see what that does to the claims. It’s going to get harder for a wireless carrier to claim to cover large swaths of a county when it’s only on a tiny handful of towers.

There is supposedly a way for folks to help fix the cellular maps. The FCC has a challenge process that requires taking a speed test using the FCC cellular speed test app. Unfortunately, this process requires a lot of speed tests in a given neighborhood before the FCC will even consider the results. I’m doubtful that most rural folks know of this app or are motivated enough to stop along the side of the road and repeatedly take speed tests. And frankly, who knows if it will make any real difference even if they do?

The big cellular companies have clearly not invested in many new rural cell towers over the last decade because they’d rather have the FCC fork out the funding. I haven’t the slightest idea if $9 billion is enough money to solve the problem or even put a dent in it. No doubt, the FCC will saddle the program with rules that add to the cost and result in fewer towers being built. But whatever is going to happen needs to start happening soon. We are now a mobile society, and it’s outrageous that a lot of people can’t make a call to 911, let alone use all of the features that are now controlled by our cell phones.

Using Fiber as a Sensor

I am an admitted science nerd. I love to spend time crawling through scientific research papers. I don’t always understand the nuances since scientific papers are often written in dense jargon, but I’ve always been fascinated by scientific research because it presages the technology of a few decades from now.

I ran across research by Nokia Bell Labs concerning using fiber as a sensor. Scientists there have been exploring ways to interpret the subtle changes that happen in a long strand of fiber. The world is suddenly full of fiber strands, and scientists want to know if they can discern any usable real-life data from measuring changes in fiber.

They are not looking at the data carried by the light inside the fiber. Fiber electronics have been designed to isolate the light signal from external stimuli. We don’t get a degraded signal when a fiber cable is swaying in the wind. We probably don’t marvel enough at the steady and predictable nature of a fiber light signal.

The research is exploring if the physical attributes of the fiber can be used to predict problems in the network before they occur. If a network operator knows that a certain stretch of fiber is under duress, then steps can be taken to address the issues long before there is a fiber outage. Developing ways to interpret the stresses on fiber would alone justify the research many times over.
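As a toy illustration of the idea – my own sketch, not Bell Labs' actual method – a monitoring system could flag a stretch of fiber whose sensor readings suddenly deviate from their recent baseline:

```python
import statistics

def detect_anomalies(readings, window=20, z_threshold=4.0):
    """Flag indices where a reading deviates sharply from the recent
    baseline -- a toy stand-in for the change detection a fiber-sensing
    system might run on strain or phase measurements."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline) or 1e-9  # avoid divide-by-zero
        if abs(readings[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged

# A quiet fiber segment with one sudden strain spike at sample 30:
events = detect_anomalies([0.0] * 30 + [5.0] + [0.0] * 10)  # -> [30]
```

In a real deployment, the interesting engineering is in deciding which deviations are wind and traffic noise and which are a slowly failing splice or a shifting bridge.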

But scientists can foresee a much wider range of sensor capabilities. Consider a fiber strung across a bridge. It’s hard to measure tiny shifts in the steel infrastructure in a bridge. However, a fiber cable across the bridge can sense and measure subtle changes in the tensions on the bridge and might be able to understand the way that a bridge is shifting long before it becomes physically obvious.

There is already some physical sensing used to monitor undersea fibers – but more can be done. The fiber can possibly measure changes in temperature, current flows, and seismic activity for the full length of these long fibers. Scientists have developed decent sensors for measuring underground faults on land, but it’s much harder to do in the depths of the open ocean.

To test the capabilities to measure and interpret changes to fiber, Bell Lab scientists built a 524-kilometer fiber route between Gothenburg and Karlstad in Sweden as the first test bed for the technology. This will allow them to try to measure a wide range of environmental data to see what can or cannot be done with the sensing technology.

It’s hard to know where this research might go, which is always the case with pure research. It’s not hard to imagine uses if the technology works as hoped. Fiber might be able to identify and pinpoint small forest fires before they’ve spread and grown larger. Fibers might serve as an early warning system for underground earthquakes long before we’d know about them in the traditional way. The sensing might be useful as a way to identify minor damage to fiber – we know about fiber cuts, but there is often no feedback today from lesser damage that can still grow to eventually cause an outage.

Will Congress Fund the ACP?

The clock is ticking on the Affordable Connectivity Program (ACP). Current estimates show the program may run out of funding as soon as the end of the first quarter of 2024, ten months from now. The ACP provides a $30 monthly discount to eligible households and up to a $75 monthly discount to households residing on Tribal lands. The program had a little over 9 million enrolled households at the start of 2022, and by March 2023 was up to over 18 million enrollees. You can see the enrollment statistics on this website.
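The arithmetic behind that deadline is straightforward. Here is a rough runway sketch – the remaining-balance figure below is a hypothetical input for illustration, not an official number:

```python
def acp_runway_months(remaining_funds, enrollees, avg_monthly_subsidy=30.0):
    """Months of funding left at the current enrollment level. This
    ignores the higher $75 Tribal subsidy and continued enrollment
    growth, both of which would shorten the actual runway."""
    monthly_burn = enrollees * avg_monthly_subsidy
    return remaining_funds / monthly_burn

# 18 million enrollees at $30/month burn $540M per month, so a
# hypothetical $5.4B remaining balance lasts about 10 months:
months_left = acp_runway_months(5.4e9, 18e6)  # -> 10.0
```

The burn rate is the real story: every million households added to the program shortens the runway by another $30 million per month.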

The only solution for keeping ACP operating is for Congress to refill the ACP funding bucket somehow. This topic was discussed at the recent House oversight hearings on broadband. Angela Siefer of NDIA (National Digital Inclusion Alliance) testified at that hearing and said that reauthorizing ACP was one of the biggest broadband issues on the plate for Congress. She talked about the many gains that have been made in getting broadband to low-income homes.

ACP was not created through a normal budget appropriations bill but was funded by $14.2 billion from the Infrastructure Investment and Jobs Act (IIJA). There was also rollover funding of $2.2 billion added from the previous Emergency Broadband Benefit program. That was a one-time funding event, and that means specific legislation is needed to keep the program running.

There has been talk of moving the responsibility of the ACP to the FCC’s Universal Service Fund. But that would mean the agency would have to find a new way to pay for it. The current fees levied on Internet telecommunications are not nearly large enough to absorb the ACP obligations. Congress has already been considering ways to eliminate the FCC’s Lifeline fund, so the FCC might not be a politically viable solution.

Big ISPs are in favor of the ACP. The largest recipient of the funding is Charter, and Comcast is the fourth largest. One of the things that makes it harder to continue the funding for ACP is that eleven of the top fifteen recipients of ACP are wireless carriers. There is some concern that there is fraud embedded in the claims of some of these companies, which gives ammunition to those who don’t want to see the subsidy continue.

For ISPs, one of the biggest issues that will arise from the end of the ACP is that the upcoming BEAD grants require ISPs to have a low-income plan. Most ISPs have been pointing to the ACP as their low-income solution. But if the ACP expires, ISPs will have to develop a self-funded discount plan in order to win grant funding.

Anybody who has been watching Congress this year understands the challenge of getting a divided Congress to agree to continue funding a subsidy program. Many DC pundits are convinced that there will be very little bipartisan legislation passed in 2023 and 2024. There has been a lot of recent effort aimed at getting more folks enrolled in ACP – but that effort will mean very little in the long run if the program runs out of money.

Our Uneven Regulatory System

The Florida Legislature recently passed a bill that brings poles under state jurisdiction for any electric cooperative that elects to enter the broadband business. That would be a change from current regulatory rules that exempt cooperatives and municipally-owned electric companies from federal and state oversight of pole regulation.

This blog isn’t going to debate the pros and cons of this specific legislation but will instead point to our uneven regulatory environment in the U.S. Over the years, State legislatures have passed laws that create regulatory rules that apply only to specific entities and not to everybody.

Broadband laws affecting electric cooperatives are a good example. Over the last several decades, a lot of states passed laws that prohibited electric cooperatives from entering the broadband business. These laws were clearly prompted by telephone and cable companies that didn’t want a new competitor. In the last few years, a lot of legislatures reversed these laws due to pressure from the public to allow local electric cooperatives to bring them fiber broadband. The new laws have been effective, and a Google search tells me that 250 of the 900 electric cooperatives are either offering broadband or have plans to do so.

Most of the legislation that created exceptions has been aimed at squelching broadband competition. A lot of states have laws that either prohibit or restrict municipalities from entering the broadband business. It’s hard to get a precise count these days because the laws against municipal broadband vary widely – from outright bans to rules that create hurdles for a municipality to enter the business that aren’t faced by other competitors.

Most of the restrictions against municipal broadband are written in such a way as to make it seem like there is a path for cities to provide broadband. But most such laws have a kicker that makes it extremely difficult for a municipality to comply.

As an example, North Carolina imposes a long list of requirements on a municipality that wants to offer broadband. A public entity must impute phantom costs into their rates, conduct a referendum before initiating service, forego popular financing mechanisms like bond financing, refrain from using typical industry pricing plans, and must make their commercially sensitive information available to their incumbent competitors. These rules collectively make it nearly impossible for a municipality to launch a broadband business.

As the new Florida law shows, it’s a never-ending battle with incumbents trying to legislate away competition. Last year there was a major push in Ohio for new legislation that would prohibit municipalities from providing broadband. The law was ultimately defeated, but it seems that several laws that create hurdles to market entry are introduced around the country every year.

Because of the push to get broadband to everybody, some restrictions have been relaxed in the last few years. It’s now a bit easier for municipalities in Arkansas to offer broadband. In Washington, there was a law prohibiting the municipal Public Utility Districts (county-wide electric companies) from offering any broadband except open-access. Last year, in a bizarre legal move, the Washington legislature passed two conflicting laws that allow PUDs to offer retail broadband. The Governor simultaneously signed the two bills rather than choose one of them, and now it’s unclear what the law is. Colorado just lifted the requirement that municipalities must hold a referendum before entering the broadband business.

It’s not surprising that there are laws that restrict some entities from becoming ISPs. Every textbook about monopoly power and abuse predicts that monopolies can’t resist the temptation to quash competitors.

Community-Wide WIFI

Somebody sent me an article from BocaNewsNow that talks about the trend that Hotwire is seeing in communities that want broadband everywhere. Residential communities in Florida are investing in outdoor WiFi networks that allow residents to connect to broadband from everywhere on a property, including tennis courts, lakefronts, and common community areas.

Communities are advertising ubiquitous broadband as an attractive amenity, and homeowners associations are investing in the technology at the prompting of residents.

It’s an interesting idea, but not a new one. Folks might remember the municipal WiFi craze of twenty years ago when cities everywhere were considering installing massive outdoor WiFi networks as a way to provide broadband to everybody. This was such a hot topic that there was even a magazine for municipal WiFi and conventions where folks came to learn about it. The largest such experiment was in Philadelphia, but there were many other cities that tried this on a smaller scale.

All of the early attempts at creating massive outdoor WiFi failed. The main reason for the failure was technical. The technology required deploying large numbers of pole- or building-mounted radios that operated in a mesh network. The radios were mounted fairly close together so that there was a radio every few blocks in all directions. The advantage of a giant mesh network was that a customer walking around a community never left the network and didn’t have to keep logging in to keep the same connection.

But there was a giant downside that was never solved. The mesh radios constantly communicated with neighboring radios so the network could reconfigure to avoid a faulty or overloaded radio. It turns out that large early mesh networks spent more bandwidth communicating between neighboring radios than in providing bandwidth to users. The whole concept crumbled once a few cities tried this on any scale.
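A toy model shows why the economics of those early meshes collapsed. This is my own illustration with made-up parameters, not measurements from any real deployment: each relay hop re-transmits on the same shared channel, and each neighboring radio consumes a slice of airtime for mesh coordination:

```python
def effective_throughput_mbps(link_mbps, hops, neighbors,
                              overhead_per_neighbor=0.05):
    """Toy model of a single-channel WiFi mesh: coordination chatter
    eats a fixed airtime fraction per neighbor (capped at 90%), and
    each extra relay hop halves the remaining usable capacity."""
    control_fraction = min(0.9, overhead_per_neighbor * neighbors)
    return link_mbps * (1 - control_fraction) / (2 ** (hops - 1))

# A single hop with no neighbors delivers the full nominal 6 Mbps:
direct = effective_throughput_mbps(6, 1, 0)       # 6.0 Mbps
# Four hops through a dense mesh with 8 chattering neighbors:
far_user = effective_throughput_mbps(6, 4, 8)     # 0.45 Mbps
```

Even with generous assumptions, a user a few hops from the wired gateway gets a tiny fraction of the nominal speed – which matches the 1-2 Mbps experience that doomed the concept.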

The other issue that killed the idea was that home broadband was improving drastically during this same time period. Speeds were climbing from cable companies and telcos, and folks were suddenly able to buy speeds of 6 Mbps to 12 Mbps, which quickly made the 1-2 Mbps speeds on wireless mesh networks feel glacial.

Over the years, outdoor WiFi technology has improved dramatically, like other technologies. Since the early days of the technology, the FCC has approved the 5 GHz band, and more recently the 6 GHz band, for use in WiFi networks. Outdoor hotspots that are fed with significant backhaul can now easily deliver speeds adequate for most of the broadband uses expected outdoors. Folks can watch videos, join Zoom calls, and use the outdoor WiFi network to stay connected.

Hotwire claims that the demand for outdoor WiFi has also grown due to people now working from home. It’s attractive for employees to take a laptop to the pool or a park rather than be tied to a desk all day.

I’ve talked to a lot of cities that have already expanded or are considering expanding public WiFi to parks and other public areas. The pandemic showed a lot of city officials that there are a lot of folks who need broadband access and don’t have it at home for some reason. It’s one of those amenities that, once you have it, you wonder how you lived without it.

Should We Trust the Companies that Created the Digital Divide?

For those of you who don’t know Bruce Kushnick, he’s been tracking the promises made and broken by Verizon since the 1990s and has written extensively on the issue. His latest article is “NTIA: Require Every State Broadband Agency to Investigate Those Responsible for Creating the State’s Digital Divide.”

Bruce has been arguing eloquently for years that the big telcos like Verizon, AT&T, and CenturyLink caused the rural digital divide by extracting profits from the regulated telephone and broadband businesses in rural and low-income areas while neglecting maintenance and not using any of the profits to modernize the technology. According to Bruce, the only reason we need massive federal grant programs today is to make the investments that the big telcos refused to make for the last several decades.

He argues that the NTIA should require states to investigate how the digital divide was created in rural areas and center cities. He uses the two examples of New Jersey and Los Angeles to make his point. He’s been tracking the promises made by Verizon to the State of New Jersey for the last thirty years. Verizon repeatedly sought regulatory relief through deregulation along with rate increases that were supposed to fund modernizing the network in the State – upgrades that were never done. When Verizon finally upgraded to fiber, it did so only in neighborhoods with the lowest costs, avoiding rural areas and most low-income neighborhoods.

I’ve been tracking this issue during my career as well. Consider West Virginia. I remember when Verizon was looking for a buyer for the telco network there as far back as the early 1990s. When big companies are trying to sell a property, they do what valuation folks call “dressing up the pig.” This means cutting expenses to make the property look more profitable. The cuts are usually deep and drop maintenance below the level needed to keep up with routine repairs.

Verizon didn’t end up selling the West Virginia network until the sale to Frontier in 2010. By then, the networks had been neglected for more than fifteen years. Frontier made only minimal upgrades to the properties they purchased – but it’s hard for an outsider to know if this was due to an intention to continue to milk cash flow out of the acquired network like Verizon had done, or due to a lack of capital and the burden of the heavy debt used to buy the property. In any case, the West Virginia network continued to degrade under Frontier’s ownership.

For years, Bruce has made the point that there has not been any financial or regulatory cost to the big telcos for their bad behavior. They’ve repeatedly broken promises made to states. They’ve routinely milked profits out of networks while ignoring customers as the properties deteriorate.

In fact, we’ve seen the opposite of penalties. For example, the big telcos were rewarded with over $10 billion of CAF-II subsidies to support dying and neglected rural DSL networks. That money was supposed to be used to increase rural data speeds to 10/1 Mbps at a time when that speed was already obsolete. We’ve seen far too many places where even that basic upgrade was not made.

Bruce’s conclusion is that it would be ludicrous to give grant funding now to the companies that caused the digital divide in the first place. That would be using public money to upgrade the networks for these companies when profits should have been used over the decades to do so. He makes a solid argument that giving money to these same companies will not solve the digital divide since there is no reason to think the big telcos won’t turn around and do it all over again.