Hearings on Broadband Grants

I’ve always tried to keep politics out of this blog. That hasn’t been too hard since I’ve found that getting better broadband for any community is almost always a non-partisan issue. I’ve worked with City and County Councils and Boards all over the country, and I haven’t seen that the political makeup of the local governing body has much impact on the degree to which they want to find better broadband for citizens.

That’s why I was surprised to see the newly seated House of Representatives immediately announce a set of hearings to look at broadband grants. You know from reading my blog that I think there is a lot of room for improvement in the BEAD grant program – due in large degree to the complicated grant rules established by Congress. I would welcome hearings that examine some of the over-the-top grant rules if the purpose was to create legislation that makes it easier to award and spend the BEAD grant funds.

But that doesn’t seem to be the intent of these hearings. The hearings are aimed at two issues. The first is to make sure that the grants are only used for connecting unserved locations and not for ‘overbuilding’. This has been a major talking point for the big cable companies for years – they don’t want to see any grant money used to encroach on areas they think of as their service territories. The whole idea of prohibiting overbuilding is ludicrous – there are very few homes in the country where no ISP can provide service – so every new broadband network that is constructed is overbuilding somebody.

The vast majority of the BEAD grants will be used in rural areas, and the idea that rural funding will be used for ‘overbuilding’ is far-fetched. I don’t know anybody who advocates using grant funding to overbuild rural fiber networks or other existing fast networks. All of the state grant programs I’ve worked with have a challenge process to prevent this, and the BEAD grants appear to have several crosschecks as well. Even if a BEAD grant were awarded in error, I would think a State Broadband office would yank the grant award before letting grant money be used to overbuild rural fiber.

The issue that has the big cable companies up in arms is that the IIJA grant legislation says that once a state has satisfied the needs of unserved and underserved locations, grant funding can be used to improve broadband in inner cities and other places that the big ISPs have ignored. There likely will not be a lot of BEAD grant money that goes to this purpose, but there will be some.

It’s hard to understand the reason for holding a hearing on this issue. The BEAD rules are clearly defined by language crafted and enacted by Congress. The hearings will likely involve grilling officials from the NTIA on this issue. That’s an absurd scenario to picture, because the NTIA has no choice but to follow the law as written by Congress. Any hearings on this issue will likely beat up on officials at the NTIA or FCC, but will really be Congress investigating its own law.

The other stated purpose of the hearings is to make sure that the grants don’t involve waste, fraud, or abuse. It’s going to be really interesting to see where this leads in the hearings. The only big historical case of grant waste and abuse I know of is the way the big telcos often took CAF II funding and made no upgrades. I don’t picture these hearings dredging up past abuses by the big ISPs, so I’m having a hard time imagining where else this line of inquiry might go.

I fear that the biggest repercussion of this kind of hearing is that it will make already-cautious grant officials even more cautious. The folks at the NTIA and State Broadband offices are going to worry that everything they do will be under a microscope from Congress – and they are going to get even more careful not to make mistakes in awarding grants. Nobody wants to be yanked in front of Congress in a year and grilled about a specific grant award. And perhaps that’s the real purpose of these hearings – to intimidate officials into funneling more grant funding to the safe choice of the big ISPs.

What puzzles me the most is why anybody would hold broadband hearings of this sort. Bringing better broadband to communities is immensely popular. In the many surveys we’ve administered on the issue, public support for better broadband has always been above 90%. This is true even in communities where a cable company already offers fast broadband – folks want competition. It’s hard to picture any headlines coming from these hearings that would benefit the politicians holding them.

These hearings only make sense as a way to appease the large ISPs that contribute heavily to politicians. It’s hard to imagine that these hearings will change anything. Congress can change the BEAD grant rules any time this year, but that would take bipartisan cooperation – something that seems to have disappeared from Washington DC. But the hearings will at least allow for the airing of big ISP grievances, and I guess that is something.

Are We Facing the Splinternet?

One of the consequences of the war between Russia and Ukraine is that Russia has largely stopped participating in many large worldwide web applications. Russia has blocked Facebook and Twitter. Companies like Apple, Microsoft, TikTok, and Netflix have withdrawn from Russia.

The European Union is in the process of trying to block Russian-generated content such as the state-owned news outlets RT (formerly Russia Today) and Sputnik. There are discussions of going so far as to block all Russian people and businesses from EU search engines.

Russia has responded by declaring Meta, the owner of Facebook, Instagram, and WhatsApp, to be an extremist organization. This has also led the Russian government to withdraw its participation in organizations that set international policies, such as the Council of Europe. The EU countered by suspending Russia from the European Broadcasting Union.

There is a new phrase being used for what is happening with Russia – the splinternet. In a full splinternet scenario, Russia could end up totally separated from the rest of the world as far as participation in the Internet.

There are already countries that don’t fully participate in the worldwide web. North Korea has blocked participation in much of the web. China and Iran block a lot of western content. However, these countries still participate in supporting the general structure and protocols of the Internet, and not all western applications are blocked.

The folks at the worldwide governing bodies that oversee Internet protocols are concerned that Russia, and perhaps China and Iran, could decide to fully withdraw from the web and develop their own protocols for use inside their borders. If the countries that have peeled off from the rest of the web don’t maintain the same protocols, then communication with the rest of the world eventually becomes difficult or impossible.

This would have a drastic impact on the web as an international means of communication. There are huge amounts of digital commerce between these countries and the rest of the world over and above social apps. Commerce between these countries and the world depends on email, messaging apps, and collaboration platforms. People and businesses in these countries participate in digital meetings in the same manner as the rest of the world. The economic impacts of large countries effectively withdrawing from worldwide e-commerce would be immense.

This is something that we’ve seen coming for many years. For example, Google and Facebook located servers in Russia so that content generated in Russia would stay in the country and not be stored in servers and data centers outside the country.

A Russian withdrawal from the Internet would be far more drastic than Chinese censoring of web content – it would cut communications with the outside world to zero. It’s hard to even imagine the impact this would have on Russian businesses, let alone the severing of ties between the Russian people and everybody else. This would create a digital Berlin Wall.

It doesn’t seem likely that having Russia or China withdraw from the web would have much impact on how the rest of the world uses it. It would mean that citizens in those countries would not benefit from the newest innovations on the web. But most countries already understand how important the web is for commerce, and that’s a good enough reason not to tinker with something that works.

From my perspective, the whole world suffers if folks stop participating in worldwide communications. The web is the great equalizer, where folks with similar interests from around the world get to know each other. But we live in a world full of politics and controversy, so it’s probably inevitable that this will eventually spill over to the Internet, as it does to many other parts of the world economy.

What Happened to Verizon Fiber-to-the-Curb?

Back in 2018, Verizon got a lot of press for the release of a fiber-to-the-curb (FTTC) technology it called Verizon Home. The first big test market was Sacramento. The company built fiber along residential streets and used wireless loops to reach homes. At the time, Verizon touted speeds of 300 Mbps but said that it was shooting for gigabit speeds using millimeter-wave spectrum. Verizon tried to make this a self-installed product, and customers got instructions on how to place the receiver in different windows facing the street to find the best reception and speeds.

There were quotes at the time saying that Verizon intended to pass 25 million homes with the technology by 2025. But then the product went quiet. In 2020, the Verizon Home product reappeared, but as a totally different product that uses cellular spectrum from cell towers to deliver broadband. This is the product that the industry now categorizes as FWA (fixed wireless access). The company no longer quotes a target broadband speed and instead says (https://www.verizon.com/5g/home/), “Verizon 5G Home is reliable and fast to power your whole home with lots of devices connected. So all of your TVs, tablets, phones, gaming consoles and more run on the ultra-fast and reliable Verizon network.” In looking through some Ookla speed tests for the FWA product, it looks like download speeds are in the 100 – 150 Mbps range – but like any cellular product, the speed varies by household according to the distance between a customer and the transmitter and other local conditions.

The new cellular-based product has gone gangbusters, and Verizon had over one million customers on the product by the end of the third quarter of 2022, after adding 342,000 new customers in that quarter alone. The relaunch of the product was confusing because the company took the unusual step of keeping the same product name and website when it switched to the wireless product. It even kept the same prices.

But the two products are night-and-day different. Verizon’s original plan was to pass millions of homes with a broadband product fast enough to be a serious competitor to cable broadband. Even if the product never quite achieved gigabit speeds, it was going to be fast enough to be a lower-priced competitor to the cable companies.

While the new Verizon Home product is selling quickly, it is not close in capabilities to the FTTC product. Cellular bandwidth is never going to be as reliable as a landline technology or one where fiber reaches as close as the curb. Verizon and T-Mobile have both made it clear that FWA customers will take second priority for bandwidth behind cell phone customers. I don’t know that these companies could do it any other way – they can’t risk making a hundred million cellular customers unhappy in order to serve a much smaller number of FWA customers.

I think everybody understands the way that cellular broadband capabilities change during the day. We all see it as the bars of 4G or 5G at our homes bounce up and down based on a variety of factors such as weather, temperature, and the general network usage in the immediate neighborhood. The most interesting thing about being a broadband customer on a cellular network is that the experience is unique to every customer. The reception will vary according to the distance from the cell tower or small cell and the amount of clutter and interference in a given neighborhood from foliage and other buildings.

I expect that large bandwidth users will get frustrated with the variability of the signal and eventually go back to a landline technology. The FWA product is mostly aimed at bringing broadband to rural customers who have no better broadband alternative or to folks in towns for whom saving money is more important than performance. There are a lot of people who have stuck with DSL for years rather than upgrading to the more expensive cable broadband, and they are the likely targets for FWA. In fact, FWA might finally let the telcos turn off DSL networks.

Verizon says it’s still on track with what it calls the One Fiber initiative, which is aimed at building Verizon-owned fiber to cell towers and small cell sites. This backbone was likely the planned starting point for neighborhood fiber, but for now it is mostly a cost-cutting step that lets Verizon stop paying for fiber leases.

Measuring Sustainability

I’ve seen folks around the country suggesting that State Broadband offices ought to put a priority on sustainability when selecting winners of broadband grant funding. It’s a concept that has instant appeal, but I immediately asked myself what it means. How do you measure sustainability in a way that can be used to score grant requests?

It’s likely that most folks would agree on the definition of sustainability. If we are going to use government grant money to build a broadband network, we want that network to be providing broadband service for as long as possible. We expect sustainability for other kinds of infrastructure, such as roads, bridges, and buildings, so why shouldn’t we expect the same thing from a grant-funded broadband network?

But what does sustainable mean for a broadband network? The first test of sustainability is the expected life of the assets being constructed.

  • The longest-lived asset that is being constructed with grants is conduit. There is no reason why a well-maintained conduit system shouldn’t still be fully functional a century from now.
  • There are big debates about the economic life of fiber. If you go by the economic lives allowed by IRS depreciation, then the expected life of fiber is 25 or 30 years. We know that’s ridiculous because there is plenty of forty-year-old fiber still chugging along in the field. We also know that fiber constructed today is far better than fiber built forty years ago. The manufacturers have learned to make higher-quality glass with fewer impurities. But the big change in the industry is that the folks who install fiber have learned techniques that minimize damage during construction. Poor handling of fiber manifests twenty years later as micro-fissures – and that means cloudy glass. Nobody will give an expected life for well-maintained fiber, but scientists at some of the manufacturers have privately told me that they think it’s at least 75 years – we’ll just have to wait to find out.
  • The assets that cause the most concern for sustainability are electronics – be that fiber electronics or fixed wireless electronics. All electronics must periodically be replaced. I’ve seen some fiber electronics last fifteen years – but that seems to be near the upper end of economic life. The general industry wisdom is that fixed wireless systems have to be replaced every 7 to 10 years.
  • We’ve largely eliminated some ISPs from grant eligibility due to poor sustainability. For example, low-orbit satellites like Starlink are designed to last only 5 to 7 years before falling from orbit. It’s hard to make an argument that grant funding buys great value with this kind of asset.

This all means that the sustainability of electronics must be a concern for all technologies. Any ISP that wins grant funding will likely be replacing some electronics within a decade. One test of any ISP on sustainability is the financial ability and willingness to replace those electronics. That’s hard to judge, but a grant reviewer could at least ask whether an applicant’s financial projections set aside money for the replacement cycle – see the rough sketch below.
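As an illustration of the kind of back-of-the-envelope math a grant reviewer might run, here is a minimal sketch in Python – every number in it is hypothetical, invented for this example:

```python
# Back-of-the-envelope electronics replacement reserve.
# All numbers are hypothetical, for illustration only.

passings = 2000                      # homes passed by the grant-funded network
electronics_cost_per_passing = 350   # OLTs, ONTs, switches, etc. (assumed)
electronics_life_years = 10          # rough economic life of the electronics

# Straight-line reserve: set aside enough each year to fund the next swap-out.
annual_reserve = passings * electronics_cost_per_passing / electronics_life_years
print(f"Annual reserve needed: ${annual_reserve:,.0f}")
print(f"Per passing per month: ${annual_reserve / passings / 12:.2f}")
```

With these made-up numbers, the reserve works out to a few dollars per passing per month – the point being that a reviewer can sanity-check whether an applicant’s pro forma carries any line item of that magnitude.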

There is another measure of sustainability that is even harder to quantify. A big factor in sustainability is the operating philosophy of the ISP that owns the network. We know there is a big range of what I would call corporate responsibility among ISPs.

If we go strictly by the past, then the ISPs that have the most likely chance of operating a sustainable network for the long term are cooperatives or other ISPs that expect to still be serving the same customers fifty years from now. But not all cooperatives are the same. We see this when looking at how some electric cooperatives have allowed their poles to deteriorate badly over time.

Next in line in trustworthiness might be small telcos that have been around for as long as a hundred years. But over the last few decades, a large percentage of these companies sold to larger ISPs – so the question for a grant reviewer is whether the small telco that gets a broadband grant today will still own the network a decade or two from now.

A big question mark for many folks is the large ISPs. We saw the big telephone companies let copper and DSL networks rot in place by basically ceasing all maintenance years ago. This was clearly done as a cost-saving measure. These companies will argue that there was no sense in continuing to support a dying technology, but we know that is nonsense. The copper networks in places like Germany were well-maintained and still offer DSL today, with speeds in many places over 100 Mbps. The big telcos decided to unilaterally cut costs at the expense of customers. Should a grant office award funding to a company that has already failed the public once before? I’m guessing that grant offices will make awards to the big companies by reasoning that fiber networks will last a long time, so maintenance doesn’t matter. But I would argue just the opposite. I think a fiber network can deteriorate even faster without good maintenance than a copper network because the technology is less forgiving. There are still 20-year-old DSL cards chugging away – something that likely won’t happen with fiber electronics. If an ISP ignores and doesn’t maintain fiber network electronics, a fiber network could quickly turn into a brick.

I’ve not said anything above that isn’t common knowledge. But I am at a loss for how to turn what we’ve learned from the past behavior of ISPs into a way to score sustainability when awarding grants. If sustainability were the most important factor in awarding a grant, I personally would give all of the money to cooperatives and none to the big ISPs. And I wouldn’t fund technologies that must be largely replaced within a decade. This is probably why nobody is asking me to award grants!

Will the FCC Maps Get Better?

It is unfortunate timing that the new FCC maps were issued in the middle of the process of determining the BEAD grant funding. Congress said that the amount of funding that will go to each state must be based upon the FCC maps – and the first draft of the FCC maps is clearly flawed. The maps whiffed in many cases in counting the locations of homes and businesses, and too many ISPs have clearly exaggerated both the coverage and the broadband speeds that are available to customers. This really bollixes the BEAD grant allocations, but I don’t know anybody who thought the first version of the new maps would be accurate.

Assuming the grant funding question gets resolved somehow, there remains the bigger issue of whether the new FCC maps will ever accurately portray broadband availability. Is there any hope for these maps to get better? Getting better maps requires fixing the three basic flaws of the new FCC maps – the mapping fabric that defines the location of possible customers, the claimed coverage that defines where broadband is available, and the broadband speeds available to customers.

The mapping fabric will get better over time if state and local governments decide this is something that is important to fix. Local folks understand the location of homes and businesses far better than CostQuest. But there are two reasons why the fabric might never be fixed. First, many rural counties do not have the staff or resources to tackle fixing the mapping fabric. There are still a lot of counties that don’t have a GIS mapping system that shows the details of every home, business, land plot, etc. And even counties with GIS systems are not easily able to count broadband passings. Questions like how to count cabins or farm buildings are always going to be vexing. One of the flaws of asking local governments to fix the maps is that local governments don’t spy on citizens to see which homes are occupied or how many months a year somebody uses a cabin. My bet is that once the BEAD funding has been allocated, state and local governments will quickly lose interest in the FCC mapping fabric. I expect a lot of counties will refuse to spend the time and money needed to fix a federal database.

The FCC has held out hope that the coverage areas claimed by ISPs will become more accurate over time. One of the new aspects of the FCC maps is an individual challenge by any homeowner who disputes that a given ISP can deliver broadband to their home. If Comcast incorrectly claims a home can get broadband, the homeowner can challenge this in the FCC map – and if the homeowner is correct, Comcast must fix its mapping claim. But I have to wonder how many homeowners will ever bother to tackle a broadband challenge. The real kicker is that there is no big benefit to a homeowner to make the challenge. Using this example, Comcast would fix the map, but that doesn’t mean that Comcast is likely to offer broadband to the homeowner who challenged the map – it just means the map gets fixed. Once folks realize that a challenge doesn’t change anything, I’m not sure how many people other than the broadband diehards will care much.

The coverage data is only going to get better if ISPs report honestly. Using this same example, there would not be much improvement in the FCC map if Comcast fixed a false coverage claim for a specific homeowner challenge unless Comcast also fixed the maps for neighboring homes – something that a challenge does not require.

The issue that most people care about is broadband speeds. Unfortunately, the new maps are as badly flawed on this issue as the old ones – maybe worse. ISPs are still allowed to claim marketing speeds instead of some approximation of actual speeds – and an ISP gets to define what it means by marketing speeds. For example, it’s hard to dispute a marketing speed if it’s something the ISP displays on its website.

Other than the challenge process, there is another possible remedy for fixing mapping problems. The Broadband Deployment Accuracy and Technological Availability (DATA) Act that created the new maps gives the FCC the ability to levy fines against ISPs that knowingly or recklessly submit inaccurate mapping data. But does anybody really think that the FCC is going to fine some small local WISP that exaggerates broadband speeds? I have a hard time thinking that the FCC will ever wade into the issue of disputing claims of marketing speeds versus actual speeds. Doing so would just highlight the fact that reporting marketing speeds is acceptable under the FCC rules.

The State of Vermont reacted quickly to the new FCC maps and showed the extent of the problems. The State sent a challenge letter to the FCC saying that 11% of the locations in the FCC mapping fabric don’t exist. Worse, Vermont says that 22% of actual locations are missing from the FCC map. Vermont also said the speeds portrayed in the new maps don’t align with its own local mapping effort. The new FCC map shows that over 95% of Vermont homes have access to broadband of at least 100/20 Mbps. The State’s broadband maps showed that only 71% of homes in the state could receive broadband at 100 Mbps or faster at the end of 2021.

I really hate to say this, but I doubt that the new maps will ever be significantly better than the old ones. I don’t enjoy being pessimistic, and I should probably let the various challenge processes run their course before complaining too loudly. But I think that after the flurry associated with allocating the BEAD grant funding ends, most people and local governments will quickly lose interest in the map challenge process. I can’t think of any reason why ISPs won’t continue to misreport broadband speed and coverage if they think it somehow benefits them. And I’m doubtful that the FCC will take any meaningful steps to make the maps better.

Challenging Cellular Data Speeds

There has been a lot of recent press about the new ability for households to challenge broadband coverage claimed at their homes by ISPs. The new FCC National Broadband Map also allows folks to challenge the coverage claimed by cellular carriers. Anybody who lives in rural areas knows that the big national cellular coverage maps have always been badly overstated.

The new FCC maps require each cellular carrier to separately declare where it provides 3G, 4G, and 5G coverage. You can easily see the claimed cellular broadband coverage at your house by toggling between Fixed Broadband and Mobile Broadband on the map. The FCC has plotted cellular coverage by neighborhood-sized hexagons on the map.
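For anybody who wants to poke at the hexagons themselves, this kind of hexagonal tiling is a standard geospatial indexing technique. Below is a minimal sketch of how a latitude/longitude lands in a hexagonal cell using the open-source h3 library – this is my own illustration assuming the h3-py v3 API, not the FCC’s actual tooling, and the FCC’s exact grid parameters may differ:

```python
# pip install h3  (this sketch assumes the h3-py v3 API names)
import h3

# A hypothetical rural location (lat, lon) -- not a real FCC test point.
lat, lon = 44.26, -72.58

# Resolution 8 cells average roughly 0.7 square kilometers -- my assumption
# for a "neighborhood-sized" hexagon.
cell = h3.geo_to_h3(lat, lon, 8)
print("hex cell id:", cell)

# The cell boundary comes back as six (lat, lon) vertices.
for vertex in h3.h3_to_geo_boundary(cell):
    print(vertex)
```

The appeal of this approach is that every phone reading taken anywhere inside the same hexagon rolls up to one cell id, which makes it easy to aggregate scattered speed tests into a coverage picture.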

There are two ways to challenge the claimed cellular coverage – by individuals or by local governments. The process of challenging the maps is not as easy as challenging the landline broadband map. The challenge process for individuals is as follows:

  • First, a challenger must download the FCC Speed Test App, which is available on the Google Play store for Android or the Apple App Store for iOS devices. This app has been around since 2013. The app is set not to use more than 1 gigabyte of data in a month without permission. Folks probably don’t realize that repeated speed tests can use a lot of data.
  • Tests should only be taken between 6:00 AM and 10:00 PM.
  • Users will have to make sure to disconnect from WiFi, since the goal is to test the cellular connection. Many people don’t realize that a cell phone moves data over your home broadband connection when it’s connected to WiFi.
  • The FCC provides only two options for taking the test – either outdoors and stationary, or in a moving car. You’ll have to verify that you are not taking the test indoors.
  • You can take the test anonymously. But if you want the FCC to consider the test results, you’ll have to provide your contact information and verify that you are the authorized user of the cellphone.
  • Individual speed tests are not automatically sent to the carriers until there are enough results in a given local area to create what the FCC is calling a crowdsourced data event.
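For the technically inclined, it’s also easy to keep your own informal log of speed tests from a laptop tethered to your phone. Here is a minimal sketch using the community speedtest-cli package – to be clear, this is my own illustration for personal record-keeping, not the FCC Speed Test App, and results gathered this way would not count toward a formal challenge:

```python
# pip install speedtest-cli
# A minimal personal speed-test logger. This is NOT the FCC Speed Test App;
# results logged this way do not feed the FCC challenge process.
import csv
from datetime import datetime

import speedtest

def run_test():
    st = speedtest.Speedtest()
    st.get_best_server()           # pick the lowest-latency test server
    down = st.download() / 1e6     # bits/sec -> Mbps
    up = st.upload() / 1e6
    return datetime.now().isoformat(), round(down, 1), round(up, 1)

# Append each result to a running CSV log.
with open("cellular_speeds.csv", "a", newline="") as f:
    csv.writer(f).writerow(run_test())

# A full test moves a lot of data at cellular speeds, so space out
# repeated runs to respect data caps.
```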

There are some major flaws in testing rural cellular coverage. If you are in an area where a carrier doesn’t provide service, you obviously can’t take a speed test at all, since you can’t make a cellular connection. You can also only challenge your own carrier – you can’t claim that another carrier doesn’t have the coverage claimed in the FCC map. On the plus side, you can take the speed test from anywhere, not just your home, and I picture folks taking the test just to help document cellular coverage.

The other flaw is the low thresholds that constitute a successful test. The tests are based on the FCC’s massively outdated definition of acceptable cellular broadband speeds. The test for acceptable 4G coverage is a paltry 5/1 Mbps. The FCC has two thresholds for 5G: 7/1 Mbps and 35/3 Mbps. These speed definitions are out of touch with actual cellular performance. According to Ookla’s nationwide speed tests, the national average cellular speed at the end of the third quarter of 2022 was 148 Mbps download and 16 Mbps upload. The national median speed (meaning half of tests are faster and half are slower) was 75 Mbps download and 9 Mbps upload. This is another outdated definition that probably won’t be updated unless the FCC gets the much-needed fifth Commissioner.

I don’t know how useful it is to find out that a carrier can deliver 5/1 Mbps to my home. That’s what AT&T claims at my home for 4G (the company is not yet claiming any 5G). A recent speed test from inside my house showed 173/10 Mbps. How can the FCC adopt any policies for cellular broadband if it only asks carriers to certify that they meet an absurdly low threshold?

Local governments can also initiate challenges. This can be done by coordinating multiple people to take the tests at various locations to paint a picture of the cellular coverage across a city or county. Local governments can also use engineering-quality devices to take the test, which provide more reliable results than a cell phone. Local governments also have the ability to document areas with no cellular coverage – something that would be hard to document without a huge number of individual speed tests.

The next time you’re driving in a place where the cellular coverage is lousy, stop by the side of the road, get out of your car, and take the speed test. It’s going to take all of us to document the real rural cellular coverage map. Also, let’s collectively push the FCC to increase its definition of acceptable broadband speeds. We talk about landline broadband speeds all of the time, but cellular coverage in rural areas is equally or even more important.

Telephony in 1927

This may seem like an odd topic to write about. The topic comes from browsing through the 2.8 million documents from 1927 that just entered the public domain as the original copyrights expired. The big national headlines focused on Winnie the Pooh, Disney’s Steamboat Willie, and Sherlock Holmes entering the public domain. But it’s fascinating to browse through the wide range of documents that provide a snapshot of the science, technology, and culture in 1927.

Today’s blog comes from a 12-page pamphlet on The Telephone: Its History and Methods of Operation, published in 1927 by the Illinois Committee on Public Utility Information. The document is a great primer on the state of the telephone industry that year. The most interesting impression for me was seeing how pervasive the telephone had become only 51 years after the first lab test of the technology.

Some of the more interesting facts from the pamphlet:

  • There were 18 million telephones in the U.S. that year, including 1.6 million in Illinois. That’s one telephone for every 15 people.
  • The U.S. had 61% of all working telephones in the world at the time. Europe had 27%, with only 12% of telephones for the rest of the world. Illinois had 1,189 central offices.
  • We talk about industry consolidation today, but in 1927 there were 39,000 telephone companies in the country, most serving small towns or neighborhoods.
  • There were 380,000 people employed by telephone companies in the U.S., including 15,000 girls employed as private branch exchange operators just in Illinois. Illinois had over 30,000 telephone-related manufacturing jobs.
  • A state-of-the-art handset is shown as the picture at the top of this blog.
  • Telephone service cost an average family less than 1% of household income, and it was the affordability that led to the rapid acceptance and deployment of the technology.

The pamphlet gushes about the deployment of telephone technology. “Yet behind this little act of calling by telephone and holding converse with some distant person, there is the story of a marvel so great as to almost put to shame the wonder of Aladdin’s Lamp. . . Beginning 50 years ago, the record and the development of the telephone has been so wonderful, so vital in the affairs of man, that it has actually changed the course of human history and has played no small part in the civilization of mankind.”

There were three types of central offices in 1927. Small exchanges used the magneto system that had a battery at the base of each telephone that was charged by turning a crank on the telephone. Larger telephone exchanges used a common battery system that supplied power to telephone sets over copper wires. This system alerted an operator that somebody wanted to place a call by lighting a small lamp on a switchboard. Operators manually placed calls to other exchanges to arrange the connection of a call. Large city exchanges were using the new switchboard technology that allowed an operator to complete calls by connecting a jack to the appropriate trunk line, eliminating most of the time-consuming labor needed to set up a call.

There is a fascinating section describing the network used to place transatlantic calls. A U.S. originating call used a voice path to Europe routed to Rocky Point, Long Island, where the calls were transferred to a powerful radio system that transmitted the call to Cupar, Scotland. The return voice path took a similar path from Rugby, England to Houlton, Maine.

Within seven years of this pamphlet, Congress passed the Communications Act of 1934, which put some regulatory restraints on the large Bell Telephone monopoly that was gobbling up telephone systems across the country. Looking at telephony from a 1927 perspective shows us a time when the technology was still new and a wonderment to most people.

Here is a look at all of the books and periodicals from 1927 that are now in the public domain. Here is the Telephone Pamphlet. Pay particular attention to the last section that instructs people how to talk on the telephone.

An FCC on Hold

It’s now 2023, and it’s been two years since Ajit Pai left the FCC and created an open Commissioner seat. It has always been routine for the Senate to fill open Commissioner seats within a reasonable time. Washington DC has always been partisan, but the Senate has routinely approved the nominee of the sitting president, giving the administration the swing vote at the FCC and other regulatory bodies like FERC, the SEC, and the FTC. This approval is sometimes given begrudgingly, but both political parties want the courtesy of choosing regulators when they hold the White House.

For the first time in my memory, the Senate does not have the votes to approve the nominated FCC Commissioner, Gigi Sohn. This is extraordinary, and it has meant that the FCC has been on hold, with a two-year deadlock between the two Democratic and two Republican Commissioners on any controversial issue.

I don’t know Gigi Sohn, but I’ve seen her speak many times, and she seems like a perfect nominee. She has more knowledge of the industry than most past new Commissioners. People from both parties who know her say she is fair-minded and that her main priorities are to look out for broadband consumers and to make sure that all voices are heard through open media. She has even drawn strong support from conservative organizations like Newsmax, which supports her nomination because she believes in open airwaves.

I don’t have any insight into why the Senate, with a Democratic majority, has been unable to muster the votes to approve the nomination. There is a long article in The Verge that postulates that the nomination has been blocked by heavy lobbying by Fox and Comcast. There are other articles saying that there is also heavy lobbying against the nomination from AT&T, Verizon, and T-Mobile. It makes perfect sense for ISPs to oppose a fifth Democratic Commissioner, since one of the first items on the agenda after seating a fifth Commissioner would be to reinstate net neutrality and broadband regulation. Large content providers want to delay adding a fifth Commissioner since media consolidation would also be high on the list of issues that a full FCC would investigate. I imagine that these big companies don’t have any personal objection to Gigi Sohn – they would just as strongly oppose any Democratic nominee.

The reality is that the big carriers and content providers are better off with a tied Commission regardless of which political party holds the White House. The best scenario for big corporations is a regulatory agency that doesn’t rock the boat and change regulatory rules. Big corporations hate regulatory uncertainty and regulatory change. This is true of all regulated industries, not just telecom. It says something about our body politic when lobbyists are strong enough to upset the long-standing consensus in the Senate that a White House ought to be able to select qualified candidates for open slots at regulatory agencies.

There are a number of FCC initiatives that are on hold until there is a fifth Commissioner. Consider some of the following:

  • The big issue is net neutrality, which says that there should be no discrimination used in delivering Internet content. But everybody understands that the real issue at stake in this discussion is the overall regulation of ISPs. Reintroducing net neutrality means having to reinstate Title II authority or some other similar mechanism to regulate broadband. Re-regulation of broadband is the issue that the big ISPs most strongly oppose. Broadband regulation could result in many new rules that big ISPs would hate, like perhaps outlawing data caps.
  • The FCC has needed for years to update the definition of landline broadband, which still sits at 25/3 Mbps. Equally out of touch is the FCC’s definition of acceptable 4G cellular broadband, set at 5/1 Mbps.
  • The FCC recently ordered broadband labels that are supposed to inform customers about their home broadband. The FCC got this authority through the IIJA legislation. But oddly, since the FCC doesn’t currently have the authority to directly regulate ISPs, the Commission can be stymied by ISPs that blatantly fail to honestly disclose the facts to the public.
  • The FCC is considering spending up to $9 billion on subsidies to improve rural cellular coverage. It’s a great idea, but there needs to be a fifth Commissioner to make sure this isn’t just a handout to the giant cellular carriers or another boondoggle like RDOF.
  • There are probably not the needed votes in the current Commission to impose penalties against ISPs that continue to falsely report to the FCC mapping database.

This article might have a short life if the newly seated Senate finally approves Gigi Sohn. But lobbying money carries a lot of weight in DC, and it’s possible that I’ll be publishing an update to this article next January.

Why the Complexity?

It’s been over a year since the BEAD grant program was announced. While there has been a lot of activity on BEAD, there is still a long way to go before this grant money is used to build new broadband infrastructure. Most of the delay is due to the incredible complexity of the BEAD grant rules.

I work with a lot of different state broadband grant programs, and I can’t help but notice the tremendous difference in the complexity of the process between state and BEAD grants. The priority for state grant programs is usually to quickly get the money out the door and spent on infrastructure. State legislators that approve grant funding want to see construction started no later than the year after the grant award, and hopefully sooner. State grant offices are generally given instructions to identify worthwhile projects and get the money approved and quickly into the hands of the ISPs to build networks.

The differences between state grants and BEAD are stunning. I have one client that won a $10 million state grant based on a simple grant application of less than 20 pages. The grant reviewers asked a few follow-up questions, but the whole process was relatively easy. The grant office was relying on the challenge process by ISPs to identify grants that were asking to overbuild areas that already have broadband. The challenge process seemed to work – a number of the grants filed in this particular program were successfully challenged. But the bottom line is that the funding was made available to start construction in less than a year from the date when the grant office originally solicited grant applications.

Why are the BEAD grants so complicated? It starts with Congress, and a lot of the complexity is directly specified in the IIJA legislation that created the grants. My pet theory is that the complexity was introduced by lobbyists for the large ISPs who wanted to make the grants unfriendly to everybody other than big ISPs with the resources to tackle the complex rules. It’s unfathomable to me that congressional staffers would have invented these complex rules on their own. I knew on my first reading of the IIJA legislation that the grants would favor big companies over small ones.

In the legislation, Congress decided to give the administration of the grants to the NTIA. The NTIA had a major decision to make on day one. The agency could have taken the approach of smoothing out the congressional language to make it as easy as possible for ISPs seeking the funding. The NTIA had political cover to take a light-touch approach since the legislation stressed the importance of quick action to solve the rural broadband crisis. The White House has also been urging federal agencies to speed up the process of turning IIJA funding into infrastructure projects.

Unfortunately, the NTIA didn’t take this approach. It looks like the agency did just the opposite – it embellished and strengthened the congressional language and made it even more complex to file for the grants.

I don’t think the NTIA had any agenda to make the grants more complicated. It’s impossible to think the agency had early discussions about how to make it harder to use the grant funding. But the agency did have an overriding desire to do these grants the right way. The general industry consensus is that the grants were given to the NTIA instead of the FCC because of the terribly botched RDOF subsidy program. It would be hard to design a federal broadband program more poorly handled than RDOF (except perhaps CAF II, which was also done by the FCC).

I think BEAD became more complex one topic at a time. I think folks at the NTIA looked at each congressionally mandated rule and asked how they could make sure that no money went to an unqualified ISP. Instead of softening grant requirements, the NTIA staff asked how they could be positive that no unworthy ISP sneaks through the BEAD process – something that clearly happened in the FCC’s RDOF process. The final NTIA BEAD rules are not a manual on how to get grant money spent efficiently – they are a manual on how to make sure that only qualified ISPs win the funding.

That doesn’t sound like a bad goal. Some of the ISPs that won RDOF funding were spectacularly unqualified – financially, managerially, or technically. But as each of the many BEAD rules was made as safe as possible, the combination of all of those super-safe rules creates major hurdles for ISPs. Almost every ISP I know is going to have a problem with at least a few of the rules – and I think many qualified ISPs are going to pass on the BEAD grants. There is something wrong with a grant program that has hundred-year-old telephone companies wondering if they can qualify for the grants.

There is still a chance for State broadband offices to smooth out the worst of the BEAD rules. A State broadband office can push back against the NTIA BEAD rules that make it too hard for ISPs to get funded. The NTIA can’t waive any specific mandate created by Congress – but it can relax and compromise on its interpretation of those rules.

The big challenge facing State grant offices is how hard they are willing to push back against the NTIA. Every State is under pressure to finally get the grant process underway, and any challenge will likely add time before funding is available. Every State broadband office already knows the ISPs it would like to see win the funding – the ISPs that will be conscientious in operating the network after it’s built. State broadband offices need to listen and react to the concerns these ISPs have about the grant process – because if they don’t, many of the best ISPs are going to take a pass on the grants.

What Ever Happened to IPv6?

It’s been over ten years since the launch of IPv6, the Internet addressing system that was supposed to give us a nearly infinite number of IP addresses. But after a decade of implementation, just over 21% of all websites worldwide support IPv6 addresses.

On the surface, this makes no sense. The original IPv4 standard only supports about 4.3 billion IP addresses. We clearly have far more people and devices connected to the Internet than that number. By contrast, IPv6 provides 340 trillion trillion trillion IP addresses, a number that, for all practical purposes, is unlimited.
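The arithmetic is easy to verify – IPv4 addresses are 32 bits and IPv6 addresses are 128 bits, so a quick check (my own illustration) confirms both numbers:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32     # 4,294,967,296 -- about 4.3 billion
ipv6_addresses = 2 ** 128    # about 3.4 x 10**38 -- 340 trillion trillion trillion

print(f"IPv4: {ipv4_addresses:,}")
print(f"IPv6: {ipv6_addresses:.3e}")
```

(A trillion is 10^12, so a trillion trillion trillion is 10^36 – and 2^128 is about 340 × 10^36.)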

Even though we exhausted the supply of IPv4 addresses years ago, it doesn’t look like there is any rush for most of the world to move to the new addresses. There are obvious barriers to making the conversion that most ISPs and businesses are not ready to tackle. Most of the barriers can be categorized as hardware limitations, lack of training, and the overall cost of the conversion.

It’s a little hard to believe after a decade, but many older computers, servers, and routers still will not recognize IPv6 addresses. One would think that we’ll eventually ditch the older devices, but there are apparently still a huge number in service that can’t process IPv6 addresses. The good news is that newer operating systems and devices handle the new addresses. But the world still has plenty of folks using older versions of Windows, Linux, Android, and iOS. Big corporations are reluctant to make the switch to IPv6 out of fear that older technology around the company would stop working. Smaller companies are not willing to make the change until they have no choice.
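If you’re curious about your own setup, Python’s standard socket module can answer two of the basic questions – whether your local stack was built with IPv6 support, and whether a given site publishes an IPv6 address. A minimal sketch (example.com is just a stand-in hostname):

```python
import socket

# Was the local Python/OS stack built with IPv6 support?
print("Local IPv6 support:", socket.has_ipv6)

# Does a hostname publish an IPv6 (AAAA) address?
def has_ipv6_address(host):
    try:
        socket.getaddrinfo(host, 443, socket.AF_INET6)
        return True
    except socket.gaierror:
        return False

print("example.com over IPv6:", has_ipv6_address("example.com"))
```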

This issue is compounded by the fact that direct communication between IPv4-only and IPv6-only devices is impossible – data exchanged between them must pass through an IPv4/IPv6 dual-stack or translation layer to enable communications. This was originally envisioned as a temporary fix, but as IPv4 continues to be used, it is looking permanent.
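On the hosting side, the usual coexistence trick is a dual-stack listener – a single IPv6 socket that also accepts IPv4 clients, which show up as IPv4-mapped addresses. A minimal sketch (platform behavior varies, and on some systems the IPV6_V6ONLY option defaults to on and must be cleared explicitly):

```python
import socket

# A dual-stack listener: one IPv6 socket that also accepts IPv4 clients,
# which appear as IPv4-mapped addresses like ::ffff:192.0.2.1.
srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
# Clear IPV6_V6ONLY so IPv4 connections are accepted too
# (not supported on every platform).
srv.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
srv.bind(("::", 8080))
srv.listen(5)
print("Listening on :: port 8080 (IPv6, plus mapped IPv4)")
```

Running both protocols side by side like this is exactly the “temporary” arrangement that has quietly become the long-term norm.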

Companies are also loath to tackle the cost and effort of the upgrade without some compelling reason to do so. Companies that have made the change report a number of unexpected problems with a conversion that can be disruptive, and companies are not willing to tackle something this complicated unless they have to.

It’s interesting to see how various countries have decided to make the switch to IPv6. Google has been collecting statistics on IPv6 conversions that are summarized on this map. At the time I wrote this blog, the world leaders in conversion to IPv6 are France (75%), India (68%), Germany (67%), Malaysia (62%), and Saudi Arabia (61%). Much of the rest of the world is far behind with the upgrade, including Russia (7%), China (3%), and much of Africa below 1%.

The US is just above 50% utilization of IPv6. Interestingly, the US backslid and was at a 56% IPv6 conversion rate in 2019. The resurgence of IPv4 is being credited to the huge flood of folks working at home during the pandemic – since residential ISPs have mostly not made the conversion.

Internet experts believe we’ll still be running dual IPv4 and IPv6 networks for at least a few more decades. We’ve found ways to work around the lack of IPv4 addresses, and very few companies or ISPs are seeing any urgency to rush toward a conversion. But as the worldwide penetration of broadband continues to grow and as we add more connected devices, the pressure will increase to eventually make the conversion. But don’t expect to see any headlines because it’s not happening any time soon.