Regional Broadband Matters

One of the more interesting trends I see coming out of the pandemic is that the quality of regional broadband is becoming a key component of economic development. Existing or prospective employers no longer care only about the broadband in a city, but also about the broadband in the area around it.

This is a radical departure from traditional economic development. Countless cities have spent time and money making sure that business parks get fiber broadband. That’s still important – nobody is going to move a business to, or even keep a business in, a location with poor broadband. A growing percentage of businesses are becoming reliant on broadband for day-to-day operations.

But broadband at the main headquarters of a business is no longer enough. A majority of businesses are going to want employees to be able to work from home, at least part-time. For some businesses, working from home is going to be a core component of the business, but it’s a rare business that doesn’t care about working from home.

I think most people would be surprised by the amount of automation that has been incorporated into businesses. I recently interviewed a brewer who is able to initiate a lot of the brewing steps remotely. The brewmaster is able to mix ingredients and add to the brewing at exactly the right time without having to travel to the brewery. He’s able to run numerous tests and diagnostics remotely to keep an eye on the fermentation process. I also talked to an ice factory owner who can initiate most of the processes of the factory from home. These are both businesses that ten years ago would have required somebody on site for all of these functions.

Because employers want employees to be able to work from home, they suddenly care about the quality of home broadband. The pandemic woke employers up to the dreadful state of home broadband for many employees. Companies sent home workers who live outside of town and found that many were unable to connect to the office servers. In most of the country, you don’t have to travel far outside of city limits to find poor broadband. In many cases, the only landline broadband option in the rural areas surrounding a city is slow rural DSL with download speeds of only a few Mbps. In some places, there is also a fixed wireless provider, but in many cases this service isn’t much faster than DSL. The only other option today for rural broadband is satellite, which offers speeds up to 50 Mbps but has super-high latency that makes it hard to engage in real-time applications like connecting to a work or school server.
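
The latency complaint is simple physics. Here's a back-of-the-envelope sketch – the altitude and speed-of-light figures are physical constants, but the framing is mine, not from any satellite ISP's spec sheet:

```python
# Why geostationary satellite latency is "super-high": the signal
# must travel to a satellite roughly 35,786 km up and back again
# before any terrestrial routing or processing delay is added.

GEO_ALTITUDE_KM = 35_786        # geostationary orbit altitude
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum

# One-way trip: ground -> satellite -> ground
one_way_ms = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S * 1000

# A request/response (e.g., fetching one page element) makes that
# trip twice.
round_trip_ms = 2 * one_way_ms

print(f"one-way: {one_way_ms:.0f} ms, round trip: {round_trip_ms:.0f} ms")
```

That works out to roughly a quarter second each way before the network does any actual work, which is why real-time applications struggle on geostationary satellite.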

The pandemic was a real wake-up call for employers, and they have suddenly become the biggest advocates for good broadband in the region around their business. This is going to come as a shock to a lot of towns and cities that think they have good broadband because the town is served by a cable company. It turns out that many homes with a cable company ISP struggled with upload connections when more than one person tried to work from home at the same time. And businesses everywhere were slapped in the face with the incredibly poor condition of rural broadband.

I’ve talked to businesses since the onset of the pandemic where a significant percentage of the workforce is unable to work from home. You can bet that when the pandemic is over, many of these businesses are going to be thinking about relocating if there is a community nearby with fast rural broadband. Community leaders who think they’ve solved the broadband issue will be shocked when existing businesses leave for greener pastures.

There is really only one solution to this problem. Communities need to take a regional approach for fixing broadband. Cities and towns need to start caring about the broadband five or ten miles outside of town if that is where people live who work at the businesses in a community. Cities that are thinking about building fiber need to think bigger and care about the broadband in a circle around the city. Cities that solve broadband regionally are going to have an economic development advantage over the many cities that are surrounded by broadband dead zones.

This concept flies in the face of the way communities have done economic development for the past fifty years. Cities took steps to make sure that broadband in the city, and particularly in business parks, was ideal for businesses. But that approach doesn’t address the new reality that businesses want employees to be able to work from home. Cities need to accept this new reality and adjust accordingly.

The Fantasy of Measuring Speeds

The FCC issued an order on January 19 that takes the next step towards implementing better data collection and mapping of broadband data. I’ll discuss some of the details of that order in an upcoming blog. But today’s blog asks a more fundamental question about basing broadband policies on broadband speeds. For most of the broadband technologies widely deployed in the US, it’s challenging or impossible to accurately measure broadband speed.

The two most challenging technologies to measure are DSL and fixed wireless. Consider the following issues that impact the speed of DSL at a given customer:

  • A telco might be using multiple vintages and types of DSL in the same market. How do you report speeds in a market when some types of DSL are five times faster than others?
  • DSL signal decreases over distance. A home at the end of a long city block might have significantly slower speeds than a home at the start of a block.
  • The size of the copper wire in the network influences speed. In a city, the network likely contains copper ranging from 16-gauge to 24-gauge.
  • The age and quality of the copper matter, since copper wire slowly degrades over time, particularly if it comes into contact with the elements. This is a local issue, house by house and block by block.
  • The backhaul network used to bring broadband to a neighborhood can be undersized. If there are too many customers being served in a node (oversubscription), then speed suffers.
  • Telcos don’t deploy technology consistently. Two adjoining neighborhoods might be using the same vintage of DSL, but one has newer and faster cards in the neighborhood cabinet.

You can make a similar list of issues that affect the speeds delivered to a customer using wireless technologies:

  • The specific spectrum being used matters because each band of spectrum carries varying amounts of data according to the wavelength.
  • Environmental factors like foliage or being blocked by a neighboring home have a huge impact on data speeds at a given customer. Speed also varies by outside temperatures, humidity, and weather events like rain.
  • Distance is important, just like with DSL. A customer who is further away from the transmitter will experience a slower speed.
  • Wireless technology is subject to varying degrees of interference that can vary widely during the day.
  • Lack of adequate backhaul and oversubscription can be deadly to wireless speeds.
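
The first three bullets can be made concrete with the standard free-space path loss formula, which grows with both frequency and distance. This is an idealized model – real links lose additional signal to foliage, rain, and interference – and the bands and distances below are illustrative, not drawn from any particular deployment:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB, for distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare two common fixed-wireless bands at two distances.
for freq_mhz in (900, 5800):       # 900 MHz vs 5.8 GHz
    for dist_km in (2, 10):
        loss = fspl_db(dist_km, freq_mhz)
        print(f"{freq_mhz} MHz at {dist_km} km: {loss:.1f} dB of path loss")
```

Since every additional 3 dB of loss halves the received power, a 5.8 GHz customer several miles out experiences a very different connection than a 900 MHz customer close to the tower – before any of the other factors above even come into play.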

We tend to think of cable company networks as being more homogeneous, but oftentimes they are not. We’ve done speed tests in cities and found some neighborhoods where customers get more than 100% of advertised speeds and other neighborhoods where homes get less than a quarter of the advertised speeds. A variety of network issues might cause a big difference in speeds. Will cable companies be honest about network inadequacies in FCC reporting and report the slower speeds? They aren’t that honest today.

All networks, including fiber, can be negatively impacted during times of heavy neighborhood usage. What’s the right speed on a broadband network? The speed that can be obtained at 4:00 in the morning when the network is empty or the speed at 8:30 in the evening when the network is bogged down with the heaviest neighborhood usage?

If you’ve never done it, I suggest you run multiple speed tests, back-to-back. I am on a Charter cable network, and I recently ran speed tests over the course of an hour and saw reported speeds vary by as much as 50%. What speed is my broadband connection? The cable company will claim it’s the fastest possible speed (or might even claim a marketing speed that is faster than my fastest measured speed). But is that really the speed? There is an argument that the slowest speed I encounter during the day defines a limit on how I can use broadband.
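
The back-to-back exercise is easy to reproduce. Here's a minimal sketch – the sample values are hypothetical stand-ins for whatever your speed-test tool of choice reports:

```python
# Eight hypothetical back-to-back download measurements (Mbps)
# taken over the course of an hour on one connection.
samples_mbps = [210, 180, 115, 195, 108, 190, 160, 140]

fastest, slowest = max(samples_mbps), min(samples_mbps)
swing_pct = (fastest - slowest) / fastest * 100

print(f"fastest: {fastest} Mbps, slowest: {slowest} Mbps")
print(f"swing: {swing_pct:.0f}% of the fastest measurement")
```

Which of those numbers is "the" speed of the connection? The ISP will report the fastest; the slowest is what actually limits how the household can use broadband at that moment.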

It’s absolutely impossible to define a speed, other than perhaps for a customer with a dedicated fiber connection where the ISP removes all of the factors that might decrease speeds (such a connection is expensive). The speed on all other broadband products varies – some a little, such as a GPON fiber connection, and some a lot, like DSL.

The FCC is about to embark on a grand new scheme to force ISPs to better define and report broadband speeds. It’s bound to fail. If I can’t figure out the speed on my cable modem connection, then the FCC is on a fool’s mission.

The trouble with the FCC’s approach is that the agency wants an ISP to report actual speed by clusters of homes – today it’s by Census block and soon it will be polygons. But this is a waste of everybody’s time when nobody can even define the speed for an individual home. Further, speed is not the only issue that affects broadband performance. The FCC is ignoring that latency and jitter can have more to do with a bad broadband experience than broadband speed. No matter what the FCC tries to do to improve reporting, any speeds reported by ISPs are going to mostly be pure fantasy – and that’s true even if ISPs strive for honesty, which nobody expects. We need to find a better way to define broadband because we are basing policies and grants on an imaginary set of reported broadband speeds.

FCC’s 2021 Broadband Deployment Report – Part 2

As mentioned in yesterday’s blog, the FCC is required to annually report on the state of broadband to Congress. The prior FCC chaired by Ajit Pai issued the 2021 report on January 19, the last day of Pai’s leadership. Following is the primary finding of the report:

Since the Commission’s last Broadband Deployment Report, the number of Americans living in areas without access to at least 25/3 Mbps (the Commission’s current benchmark) has dropped from more than 18.1 million Americans at the end of 2018 to fewer than 14.5 million Americans at the end of 2019, a decrease of more than 20%. Moreover, more than three-quarters of those in newly served areas, nearly 3.7 million, are located in rural areas, bringing the number of rural Americans in areas served by at least 25/3 Mbps to nearly 83%. Since 2016, the number of Americans living in rural areas lacking access to 25/3 Mbps service has fallen more than 46%. As a result, the rural–urban divide is rapidly closing; the gap between the percentage of urban Americans and the percentage of rural Americans with access to 25/3 Mbps fixed broadband has been nearly halved, falling from 30 points at the end of 2016 to just 16 points at the end of 2019.

This includes the extraordinary statement that 83% of rural Americans have broadband speeds of at least 25/3 Mbps. If that was even close to being true, then Chairman Pai would deserve to make this the crowning achievement of his tenure at the FCC. Unfortunately, it’s not even remotely close to being true.

If the real number is less than 83%, then what is the right number? Unfortunately, nobody knows, because the FCC statistics are based upon the big lie that’s embedded in the poor FCC mapping and data collection. It’s extraordinary that the FCC would issue a report to Congress without a huge bold-print caveat at the top of the first page saying, “The statistics in this report are all as reported to us by various ISPs. We have a large amount of anecdotal evidence that these numbers are overstated, and so Congress should not make any policy decisions based upon these numbers.”

There is a huge amount of anecdotal evidence. For example, the State of Georgia created its own broadband map and estimated that the number of homes in Georgia that didn’t meet the 25/3 Mbps threshold was twice the number claimed in the FCC data. My firm works with rural counties all across the country and helps them perform speed tests, and we have never once seen a speed test from a rural DSL customer of CenturyLink or Frontier that was receiving 25/3 Mbps service. In fact, in most counties we’ve studied we don’t even find any DSL customers getting speeds of 10/1 Mbps. Yet the FCC mapping data shows a lot of coverage of these two speeds in rural areas.

To make matters worse, we don’t see many urban DSL customers with speeds of 25/3 Mbps or greater. DSL can only achieve such speeds by upgrading to newer versions of DSL and by bonding two pairs of copper into one DSL circuit. In towns that haven’t upgraded to these faster technologies – and there are a lot of them – nobody has speeds of 25/3 Mbps.

The FCC statement claims that 37% of rural homes got upgrades between 2016 and 2019 that pushed broadband speeds over 25/3 Mbps. There were some such upgrades, notably in areas where smaller telcos upgraded speeds by installing fiber. But those upgrades don’t come close to accounting for 37% of all rural homes. I am positive there were very few, if any, rural homes in those years that saw this upgrade from CenturyLink, Frontier, AT&T, Verizon, or Windstream.

This means that the only way the rural statistics could have improved was from ISPs claiming to bring fast speeds to rural areas. We’ve seen evidence of that. I know of rural counties where WISPs claim speeds of 100 Mbps download or faster yet have virtually no customers, and the few customers they do have see speeds of only a few Mbps. It’s also likely that the big telcos claimed better coverage during this time period due to CAF II upgrades (which were largely imaginary). Recall that just before the RDOF grant process, Frontier tried to claim that over 16,000 census blocks suddenly had broadband speeds that would have exempted those areas from getting grants.

Don’t bother printing this FCC report because it would be a waste of paper. The report is full of statistics from beginning to end – and practically none of the statistics are believable, and many are outright lies.

FCC’s 2021 Broadband Deployment Report – Part 1

On the last day of Ajit Pai’s term as FCC Chairman, the agency issued the 2021 Broadband Deployment Report. This is a report that is mandated to be delivered to Congress each year. One of the most extraordinary things said in the report is the following:

We find that the current speed benchmark of 25/3 Mbps remains an appropriate measure by which to assess whether a fixed service is providing advanced telecommunications capability. We conclude that fixed services with speeds of 25/3 Mbps continue to meet the statutory definition of advanced telecommunications capability; that is, such services “enable[] users to originate and receive high-quality voice, data, graphics, and video telecommunications.” 

This statement displays a willful disdain for the public and completely ignores the pandemic that the country has been suffering through since March 2020. Nobody in the industry, other than this FCC, thinks that a 3 Mbps upload path is sufficient for families to work and tackle schoolwork from home. A home with a 3 Mbps upload path is not able to originate advanced telecommunications, and there is no rational way that a 3 Mbps upload stream can still be considered broadband.

It’s clear why the Pai FCC wants to stick with 25/3 Mbps as the definition of broadband. If the upload definition is increased to something even a little faster, such as 25 Mbps upload, then probably 80% to 90% of the country would no longer be considered to have broadband. Chairman Pai does not want to get the black eye for admitting that our national broadband was not up to the challenges of the pandemic. Rather than face a simple truth to which any homeowner can attest, the Pai FCC has decided to stick with the definition of broadband that was developed in 2015.

The 2020 deployment report wasn’t issued until June of 2020, so this latest 2021 report is extraordinarily early by FCC standards. The timing is not accidental: the report was issued on Chairman Pai’s last day in office. He rushed this report out the door so that his own FCC could pronounce that the state of broadband is good and getting better – instead of recognizing the reality that millions of homes that thought they had good broadband found out during the pandemic that they don’t.

Chairman Pai had an opportunity with this report to be bold and do the right thing. A recognition by Chairman Pai that upload speeds are inadequate would have set off a furious debate after he left. Cable companies won’t be prodded into improving upload speeds unless pushed to do so by the FCC. The entire purpose of the report to Congress is to let legislators know the state of US broadband. Every member of Congress has already heard loudly from their constituents that broadband is not good enough. It’s insulting to Congress to publish this whitewashed report, and it’s a huge disservice to homes still suffering through the pandemic to be told that the state of broadband is great.

It’s clear that the cable companies are quietly hoping that the upload issue blows away after the end of the pandemic. However, it’s looking more and more like millions of people will continue to work from home even after the pandemic ends – and the FCC’s definition of broadband needs to recognize the real world.

I don’t know if an incoming FCC can issue a revised report – but it should strongly consider doing so. As a country, we can’t properly tackle broadband issues if we’re not even willing to admit where problems exist. We now know that upload speeds on most technologies, including cable company networks, have been inadequate for a lot of homes over this last year.

Of course, the upload speed issue is not the only problem with this latest report – stay tuned for part 2 tomorrow.

Bill Gates on the Pandemic

Bill Gates has spent much of his time since leaving Microsoft working to eliminate diseases around the world. In 2015, Gates gave a TED talk discussing how the world was unprepared for a major disease outbreak. That talk was prophetic, and everything he predicted came to pass.

This blog looks at the broadband industry, but we can’t have any real discussion about the direction of broadband in 2021 without acknowledging the continuing impact of the pandemic. Gates has started a new podcast, and his first episode looks at how the pandemic is going to change the way we live. Some of his predictions have direct relevance to broadband, and ISPs ought to heed them when thinking about the future of their businesses.

Here are a few of his predictions and my thoughts on how the pandemic will affect broadband.

The pandemic is going to last longer than we all hope. Gates says that even after a vaccine tamps down the numbers in the US, we’ll continue to have flare-ups until the whole world gets the pandemic under control. He said this even before there was news that the virus seems to be mutating into more contagious forms. This means that while things will certainly get better than what we are experiencing in January 2021, ISPs need to expect to continue practicing pandemic protocols for a long time. Companies that get sloppy and careless are likely to pay the price by having to isolate employees. It’s unlikely that the world will control the pandemic completely during 2021, so brace for this lasting even longer.

Working at home is here to stay. Gates believes that many jobs will never return to the office. People will continue to work at home, many permanently. This has a bunch of implications for ISPs.

First, ISPs that have concentrated on downtown businesses are going to see a downturn. Gates thinks it’s likely that many downtown buildings will sit empty, which means that ISPs that made such buildings the focus of their business will need to expand elsewhere. It’s a really interesting twist to see ISP business plans turned upside down. A good case in point is CenturyLink. Before the merger with Level 3, the company was aggressively building fiber in residential neighborhoods. The company then pulled a 180 and concentrated on adding buildings to fiber rather than neighborhoods. In retrospect, the original direction might have been the right one.

This also has big implications for the cable companies. It’s clear that many households are unhappy with the upload capabilities of cable ISPs. In surveys that CCG has been doing this year, we’re seeing 30% to 50% of homes telling us that home broadband connections in cities are inadequate for working from home. I think the cable companies have been hoping this problem will blow over, but if a lot of people stay home to work, unhappiness with cable broadband is going to grow. Cable companies are going to have to invest in expensive upgrades to get faster broadband or be more vulnerable to fiber overbuilders.

Virtual meetings are here to stay. Gates predicts that the platforms used for online meetings will improve significantly over the next few years and that video meetings will be a permanent alternative to travel. A huge number of people in the broadband business are road warriors, and live meetings are no longer going to be expected, or for many people even acceptable. I’ve been giving serious thought to largely eliminating work travel – a drastic change for a consultant. But I’ve just spent a year proving that live meetings are not needed. I’ve actually gotten to know clients better through a string of video meetings than through a few live visits.

People are choosing where to live. Millions of people are pouring out of big cities where real estate and rents are too expensive and moving to suburbs, small towns, and rural areas instead. ISPs need to join the rest of the work world and consider remote employees. Obviously, employees that visit customers or who take care of networks must be local, but every ISP has a few functions that don’t require a person to be in the office to be effective. It’s more important to find the most talented people than it is to find people within commuting distance from your office.

A more mobile workforce has a lot of implications for employers. Employees who work from home will have options and are going to look for employers who treat them well and who offer interesting work. That means a lot of turnover for companies that don’t value employees – in the new economy many of your current employees can find other jobs without relocating. A bigger challenge for companies with remote staff is going to be creating a sense of company identity and fostering teamwork.

The Legacy of the Pai FCC

As is normal with a change of administration, there are articles in the press discussing the likely legacy of the outgoing administration. Leading the pack in singing his own praises is former FCC Chairman Ajit Pai, who recently published this document cataloging a huge list of accomplishments of the FCC under his chairmanship. Maybe it’s just me, but it feels unseemly for a public servant to publish an official self-praise document. The list of accomplishments is so long that I honestly read it twice to make sure Chairman Pai wasn’t taking credit for inventing 5G!

More disturbing to me are industry articles like this one that lists the primary achievements of the Pai FCC to include “the repeal of Title II regulation of the internet, rural broadband development, increased spectrum for 5G, decreasing waste in universal service funding, and better controlling robocalls.” I see some of those as failures and not accomplishments.

I find it unconscionable that the regulatory agency that is in charge of arguably the most important industry in the country would deregulate that industry. The ISP industry is largely controlled by a handful of near-monopolies. It’s easy to understand why the big ISPs don’t want to be regulated – every monopoly in every industry would love to escape regulation. It’s the government’s and the FCC’s role to protect the public against the worst abuses of monopolies. Lack of regulation means that carriers in the industry can no longer ask the FCC to settle disputes. It means that consumers have no place to seek redress from monopoly abuses. We’re within sight of $100 basic broadband, while the FCC has washed its hands of any oversight of the industry. Killing Title II regulation comes pretty close in my mind to fiddling while Rome burns.

We saw the results of broadband deregulation at the start of the pandemic. Had the FCC not deregulated broadband, Chairman Pai could have directed ISPs on how they must treat the public during the pandemic. Instead, the FCC had to beg ISPs to voluntarily sign on to the Keep Americans Connected pledge, which only lasted a few months and which some of the big ISPs seemingly violated before the ink dried. During this broadband crisis, the FCC stood by powerless due to its own decision to deregulate broadband. That is downright shameful, not praiseworthy.

Everywhere I look this FCC is getting praise for tackling the digital divide, and admittedly the FCC did some good things. There were some good winners of the CAF II reverse auction that will help rural households – but that was offset by awarding some of that grant to Viasat. The FCC did some good by increasing the Lifeline subsidy for tribal areas. But on the downside, the FCC decided to award a seventh year of CAF II subsidy of $2.4 billion to the big telcos – with zero obligations to use the money to expand broadband. The FCC knows full well that the original CAF II was mostly a sham and yet took no action in the last four years to investigate the failed program. The Pai FCC closed out its term by largely botching the RDOF grants.

The area where the FCC did the most good for rural broadband was in making more wireless spectrum available. This FCC missed a few chances early on, but in the last few years it nailed the issue. The FCC might have made its best long-term impact with the rulings on 6 GHz spectrum. Spectrum decisions might be the biggest lasting legacy of this FCC.

But we’re never really going to know how this FCC did in narrowing the rural broadband gap, because this FCC has no idea how many homes don’t have broadband. The lousy FCC mapping was already a big issue when Chairman Pai took over the FCC. There was a lot of gnashing of teeth about the issue under Chairman Pai, but in four years nothing was done to fix the problem, and if anything, the maps have gotten worse. It might not be so disturbing if the bad mapping was nothing more than lousy data – but the bad data has been used to justify bad policy and even worse, has been used to determine where federal grants should be awarded.

To add salt to the wound, the FCC issues a mandated report to Congress every year on the state of broadband. The reports from the Pai FCC are so full of imaginary numbers that they are closer to fiction than fact. About the most the FCC under Chairman Pai can say is that the imaginary number of people without broadband grew smaller under his watch. On Chairman Pai’s last day, the FCC released the latest report to Congress, which incorrectly concludes that broadband is being deployed to Americans “on a reasonable and timely basis”. This recent report also concludes yet again that 25/3 Mbps is still a reasonable definition of broadband – when homes with that speed were unable to function during the pandemic.

In looking back, it’s clear that this FCC tilted as far as possible in favor of the big ISPs. There is nothing wrong with regulators who work to strengthen the industry they regulate. But regulators also have a mandate to protect the public from monopoly abuses, and the FCC seems to have forgotten that half of its mandate. If any one event captures the essence of this FCC, it was the vote to allow Frontier to bill customers for an extra year for equipment that customers own. I didn’t see that accomplishment on Chairman Pai’s list.

Looking Back at 2020

I periodically take a look at broadband trends into the future. But as I was thinking about how unique 2020 was for everybody, I realized that there were some events during the year that we’re going to look back on a decade from now as important to the broadband industry. Interestingly, most of these events were not on anybody’s radar at the beginning of the year.

Upload Broadband Entered the Picture

For the first time, we all started caring about upload speeds due to the pandemic. Millions of homes that thought they had good broadband suddenly found that the home broadband connection wasn’t good enough for working or schooling. Millions of people reacted to this by upgrading to faster download broadband speeds, only to find in many cases that the upgrade still didn’t fix the upload speed problems.

It also appears that a lot of people will continue to work from home after the end of the pandemic, which means that the demand for upload speeds is not going to go away. This is going to put a lot of pressure on cable companies in markets where there is a fiber competitor. Fiber ISPs only need to advertise themselves as the work-from-home solution to snatch customers.

Charter Pursues Rural Broadband

Charter looks to be the only ISP out of the largest four that is adopting a strategy to expand to rural areas surrounding existing markets. Charter has been the fastest growing ISP over the last few years, and it looks like the company wants to continue that growth.

I think the rural telcos are going to look back in a decade and realize they made a big mistake. The telcos have had repeated opportunities to upgrade broadband and dominate the rural markets, where they could have been a permanent monopoly. Instead, Charter is going to sweep through many markets and take most of the customers. Charter is going to be aided in this expansion by the $1.22 billion they snagged out of the recent RDOF grant.

Windstream Decides to Chase Fiber

If you go by what they’re saying, Windstream is coming out of bankruptcy as a new company. The company has said recently that it intends to build fiber to cover at least half of its historic telephone serving areas. This will catch Windstream up to the smaller telcos that have largely migrated to fiber as the only chance for long term survival. Of course, this also means that half of Windstream’s markets are largely going to be abandoned. Windstream customers have to be wondering which half they live in.

Satellite Broadband Goes into Beta

After years of being somewhat theoretical, Starlink has beta-test customers who are loving download speeds between 50 Mbps and 150 Mbps. All of the satellite companies still have a long way to go in launching enough satellites to become viable competitors – but we now have proof of concept.

Rough Year for the Supply Chain

The telecom industry, like so many others, has largely taken the supply chain for granted without a lot of thought about where network components are manufactured. 2020 started with price pressure on electronics due to tariffs and went into a tailspin when the pandemic hit Wuhan, China, where the majority of laser technology is made.

Electronics vendors have spent much of 2020 developing new sources of manufacturing. This means a downside for the Chinese economy but an upside for many other places in the world. The new administration says it will fund an effort to move much of US chip manufacturing back to the US, and hopefully other electronic components will follow. The big advantage that the Far East has had over US manufacturing has been cheap labor, but that might be largely overcome by modern, highly robotized factories. Hopefully, telecom vendors will take the needed steps to make sure we aren’t caught flat-footed again.

And Now . . . Really Fast Internet

It was inevitable that ISPs would eventually start offering residential broadband speeds faster than 1 gigabit. It’s a little hard to believe it was so long ago, but it was back at the end of 2012 when Google Fiber announced it was bringing gigabit fiber to Kansas City. I know of a few small ISPs, like the municipal ISPs in Lafayette, LA and Chattanooga, TN that offered gigabit service before then – but Google Fiber was the first to make gigabit the only product.

Gigabit data speeds were revolutionary in 2012. At that time, the speed of the basic product on most cable company networks was ‘up to 30 Mbps’. Google leapfrogged market speeds with more than a 30X increase – the biggest jump since we went from dial-up to 1 Mbps DSL.
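The “30X” figure is just arithmetic on the two speeds quoted above – a gigabit against the ‘up to 30 Mbps’ cable baseline of 2012:

```python
# Rough ratio behind the "30X" claim: Google Fiber's gigabit product
# versus the 'up to 30 Mbps' basic cable product of 2012.
gigabit_mbps = 1000
cable_baseline_mbps = 30

speed_jump = gigabit_mbps / cable_baseline_mbps
print(f"About {speed_jump:.0f}X faster")  # About 33X faster
```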

At the time, the cable companies all scoffed at Google Fiber as a gimmick – but in markets where they competed against gigabit fiber, the cable companies scrambled to roll out gigabit products using DOCSIS 3.1. And the biggest cable companies like Comcast and Charter stepped up the broadband game and will have moved the speed of basic broadband from 30 Mbps to 200 Mbps in 2021.

It took a long time for users to buy into the gigabit speed tier. This was often due to price, since gigabit products on cable company networks are priced over $100 per month before adding the modem fee. But Google Fiber has stuck with the same $70 price it announced in 2012, and numerous other fiber ISPs now have gigabit products under $100. That price no longer looks high when standalone Comcast broadband plus the modem costs $90.

OpenVault tracks broadband usage and subscriptions and reports that over the last year residential gigabit subscriptions have climbed to 5.6% of all broadband subscriptions – a 124% increase from just a year earlier. Families sent home during the pandemic are deciding en masse that faster speeds are needed to support their new broadband needs.
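As a sanity check on those OpenVault figures (taking the 5.6% share and 124% growth quoted above at face value), simple arithmetic recovers both the implied prior-year share and the “one out of 18 homes” framing:

```python
# Back-of-the-envelope check on the OpenVault gigabit-subscription figures.
# Assumes the numbers quoted above: 5.6% of subscriptions are gigabit,
# up 124% from a year earlier.

gigabit_share = 0.056   # share of all broadband subscriptions today
yoy_increase = 1.24     # 124% year-over-year growth

# Implied share a year earlier: today's share divided by (1 + growth).
prior_year_share = gigabit_share / (1 + yoy_increase)

# "One out of N homes" framing: the reciprocal of today's share.
one_in_n_homes = 1 / gigabit_share

print(f"Prior-year gigabit share: {prior_year_share:.1%}")  # ~2.5%
print(f"Roughly one in {one_in_n_homes:.0f} homes")         # ~1 in 18
```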

Google Fiber has announced the introduction of a 2-gigabit product. For now, the product is only being offered in Huntsville, AL and Nashville, TN. One has to think that the company will eventually offer 2-gigabit service in its other 30 markets. Google Fiber has priced the 2-gigabit tier at $100. This comes with a new modem that is capable of handling the 2-gigabit speeds as well as WiFi 6 to efficiently transport large bandwidth applications around the home without interference.

An even faster product is now being offered by MLCG of Enderlin, North Dakota. The company announced a 5-gigabit product and a 2.5-gigabit product. The 5-gigabit product is priced at $199 per month and the 2.5-gigabit product at $150 per month. MLCG advertises the 5-gigabit product for “Multiple users simultaneously – Stream or edit 4K videos, upload/download LARGE files, gaming with NO worry about lag, multiple smart home devices, video chat, play online games, download files, streaming HD shows and movies, social media and web browsing”.

Skeptics will say that these new products are a gimmick and that nobody needs Internet access faster than gigabit speeds. That would have been a valid observation in 2012 when there was nothing that could be done over a residential internet connection that needed a gigabit of speed. But I know users who tell me they are stretching a gigabit product. I have a friend with several heavy gamers in the household who also backs up his office servers at home daily, and he tells me there are times when a gigabit feels a little slow. I have several clients who have told me that doctors are asking for something faster than a gigabit in order to be able to view 3D medical scans in real-time at home. Already today, one out of 18 homes in the country has upgraded to a gigabit product, and it’s not hard to imagine that some of those homes want more than a gigabit.

I’m sure that the initial penetration rates on the products faster than a gigabit will be minuscule at first. But let’s look back at this in five years when a lot of ISPs offer multi-gigabit products. ISPs that can’t offer gigabit speeds never miss an opportunity to pooh-pooh fast broadband – but over time the penetration rate for these new faster products will climb, just like it has for gigabit broadband.

The Politicization of the FCC

Before the citizens of Georgia elected two new Democratic Senators, it looked like a Republican Senate was on a path to lock up the FCC by not approving any new Commissioners. This was threatened by Mitch McConnell and other Senators who didn’t want the FCC to pursue the reintroduction of net neutrality and broadband regulation.

The current FCC was already politicized when, late last year, the President declined to reappoint Mike O’Reilly as a Commissioner after O’Reilly voiced his opinion that the FCC didn’t have the authority to overturn Section 230 of the Communications Decency Act – the rules that shield web companies from being sued over content posted by the public. O’Reilly thought that only Congress has that authority, and from what I can tell, he is right. The politicization continued when the President appointed Nathan Simington as the new FCC Commissioner – somebody with virtually no telecom experience, but who is a vocal supporter of eliminating the Section 230 rules.

The FCC has always been a little political, in that new administrations get to appoint a new Chairman who supposedly follows the political inclination of the new administration. But the FCC is an independent agency, and sometimes FCC Commissioners surprise the White House. For the most part, though, FCCs tend to follow the basic philosophy of the party in power. This is part of what I call the regulatory pendulum, where the FCC and other regulatory agencies tilt with politics toward the corporations they regulate or toward the public they are supposed to protect.

But in the past, the shifts that came with changes of administration have been subtle, because the vast majority of what the FCC does is not political or controversial. Probably 90% or more of the topics that make it onto the FCC’s dockets are not political but have to do with overseeing the telecom industry. There is nothing political about FCC actions like approving new cellular handsets or trying to stop robocalling.

To some extent, the current politicization of the FCC can be attributed to Congress, which has been too divided and partisan to pass a new telecom act. The current primary telecom rules were passed in 1996 when broadband access meant AOL and CompuServe, and the rules governing broadband are badly out of date. It’s like we’re regulating self-driving cars with horse and buggy rules.

Without updated directions from Congress, the FCC is forced to somehow fit desired policy changes inside of existing rules. That was the primary reason for the convoluted process the current FCC undertook to eliminate broadband regulation. The problem with these ad hoc workarounds is that a subsequent FCC can undo every workaround, and the new FCC in 2021 is likely to reimpose broadband regulation and net neutrality.

None of this regulatory back and forth is healthy for the FCC or for the country. When the FCC gets tainted by charges of political bias, the public and the industry lose faith in the FCC and anything it orders.

The courts ruled that the current FCC was within its regulatory powers to undo broadband regulation – and that same court ruling will mean that a new FCC has the power to undo anything the last FCC did. If you ask the executives of the largest ISPs what they most want out of regulation, they will tell you it’s consistency. The big ISPs were perfectly fine living with the net neutrality rules, and the CEO of every big ISP went on the record saying so. What they are not fine with is the FCC changing rules on net neutrality, privacy, and other important issues every time there is a change of administration.

Unfortunately, the power to stop this policy yo-yo is in the hands of Congress, and I don’t hold out any big hope that Congress can agree on important telecom issues to the extent needed to pass an updated Telecom Act. New telecom legislation would provide a clear set of policies that would apply to an FCC appointed by Democrats or Republicans. But maybe Congress will surprise us all and dig in on a bipartisan basis and figure this out. Broadband and related topics are too important to allow a big policy shift every time there is a change in the White House.

Powering the Future

For years there have been predictions that the world would be filled with small sensors that would revolutionize the way we live. Five years ago, there were numerous predictions that we’d be living in a cloud of sensors. The limitation on realizing that vision has been figuring out how to power sensors and the other electronics. Traditional batteries are too expensive and have a limited life. As you might expect, scientists from around the world have been working on better power technologies.

Self-Charging Batteries. The California company NDB has developed a self-charging battery that could remain viable for up to 28,000 years. Each battery contains a small piece of recycled radioactive carbon-14 that comes from recycled nuclear fuel rods. As the isotope decays, the battery uses a heat sink of lab-created carbon-12 diamond which captures the energetic particles of decay while acting as a tough physical barrier to contain the radiation.

The battery consists of multiple layers of radioactive material and diamond and can be fashioned into any standard battery size like a AAA. The overall radiation level of the battery is low – less than the natural radiation emitted by the human body. Each battery is effectively a small power generator in the shape of a traditional battery that never needs to be recharged. One of the most promising aspects of the technology is that nuclear power plants pay NDB to take the radioactive material.
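NDB hasn’t published the math behind the 28,000-year figure, but a rough decay calculation suggests where a number of that magnitude could come from: carbon-14 has a half-life of roughly 5,730 years, so even after 28,000 years a few percent of the original isotope is still decaying and producing energy. This is an illustrative sketch, not NDB’s own model:

```python
# Illustrative decay math for a carbon-14 power source.
# Carbon-14's half-life is ~5,730 years; the fraction of the isotope
# remaining after t years is 0.5 ** (t / half_life).
# This is a rough back-of-the-envelope sketch, not NDB's engineering model.

HALF_LIFE_C14 = 5730  # years

def fraction_remaining(years: float) -> float:
    """Fraction of the original carbon-14 left after `years` of decay."""
    return 0.5 ** (years / HALF_LIFE_C14)

# After the quoted 28,000-year lifetime, a few percent is still active.
print(f"{fraction_remaining(28_000):.1%} of the C-14 remains")  # 3.4%
```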

Printed Flexible Batteries. Scientists at the University of California San Diego have been researching batteries that use silver-oxide zinc chemistry. They’ve been able to create a flexible device that offers 10-times the energy density of lithium-ion batteries. The flexible material means that batteries can be shaped to fit devices instead of devices designed to fit batteries.

Silver–zinc batteries have been around for many years, and the breakthrough is that the scientists found a way to screen print the battery material, meaning a battery can be placed onto almost any surface. The printing process is done in a vacuum and layers on the current collectors, the zinc anode, the cathode, and the separator to create a polymer film that is stable up to almost 400 degrees Fahrenheit. The net result is a battery with ten times the power output of a lithium-ion battery of the same size.

Anti-Lasers. Science teams from around the world have been working to create anti-lasers. A laser operates by beaming photons, while an anti-laser sucks up photons from the environment. An anti-laser could be used in a laptop or cellphone to collect photons and use them to power the battery in the device.

The scientific name for the method being used is coherent perfect absorption (CPA). In practice, this requires one device that beams out a photon light beam and devices with CPA technology to absorb the beams. In the laboratory, scientists have been able to capture as much as 99.996% of the transmitted power, making this more energy-efficient than plugging a device into electric power. There are numerous possible uses for the technology, starting with the obvious ability to charge devices that aren’t plugged into electricity. But the CPA devices have other possible uses. For example, the devices are extremely sensitive to changes in photons in a room and could act as highly accurate motion sensors.

Battery-Free Sensors. In the most creative solution I’ve read about, MIT scientists started a new firm, Everactive, and have developed sensors that don’t require a battery or external power source. The key to the Everactive technology is the use of ultra-low power integrated circuits which are able to harvest energy from sources like low-light sources, background vibrations, or small temperature differentials.

Everactive is already deploying sensors in applications where it’s hard to change sensors, such as inside steam-generating equipment. The company also makes sensors that monitor rotating machinery and that are powered by the vibrations coming from the machinery. Everactive says its technology has a much lower lifetime cost than traditionally powered sensors when considering the equipment downtime and cost required to periodically replace batteries.