Interesting Science Summer 2023

I ran across two interesting new technologies.

Power Out of Humidity. Jun Yao, a professor of engineering at the University of Massachusetts Amherst, published a paper in Advanced Materials showing that energy can be pulled from the moisture in the air using material harvested from bacteria. The study says that almost any material can be used for this purpose as long as it can be smashed into tiny particles and then reformed to include microscopic pores that are less than 100 nanometers in size. The tiny holes work by letting water vapor pass through the device in a way that creates an electric charge imbalance between the top and bottom of the device. In effect, the device becomes a tiny battery that can continuously produce electricity.

The test devices created by Yao’s team have been labeled Air-gen. One Air-gen unit is about the size of a fingernail and as thin as a hair, and it produces a continuous trickle of electricity, enough to power a dot on a computer screen.

Jun Yao refers to the Air-gen as creating a tiny man-made cloud. The next step for the team is to determine the materials that produce the most electricity. There are also challenges to efficiently harvesting and storing the power from multiple Air-gen units. Making this into a useful technology will mean somehow stacking large numbers of these units together.

The potential for the technology is immense if it can ever be scaled. This would enable power to be generated locally in a way that produces no waste or byproducts. Since humidity drives the power generation, the technology would work best in places with high humidity rather than in deserts. The ideal clean energy technology has always been described as one that can pull power out of nature – and this might become that ideal source if it can pull electricity out of the water vapor in the air.

The Anti-Laser. Physicists at the Hebrew University of Jerusalem and the Vienna University of Technology have developed what is being dubbed the anti-laser. This is a device that traps light until it is fully absorbed.

There are a lot of uses for technologies that can absorb light. Photovoltaic cells would be more efficient if all incoming light could be absorbed and turned into electricity. Light sensors could be far more precise by eliminating stray light signals. The ability to capture faint images with a telescope could be enhanced by eliminating spurious light.

The technology takes advantage of the wave properties of electromagnetic radiation, where waveforms can undergo destructive interference if combined in exactly the right way. The scientists created a device that pulses light in a way that enhances the interference, paired with a set of mirrors and lenses that trap the light inside a cavity and bounce it back and forth until it is absorbed by light-absorbing materials.
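
For readers who want to see the cancellation in numbers, here is a minimal illustration in Python. It is not tied to the researchers’ actual setup; it simply shows that two waves of equal amplitude that are half a wavelength out of phase sum to zero everywhere:

    # Two identical waves, shifted by half a wavelength (pi radians), cancel exactly.
    import numpy as np

    t = np.linspace(0, 1, 1000)                 # one second of samples
    wave_a = np.sin(2 * np.pi * 5 * t)          # a 5 Hz reference wave
    wave_b = np.sin(2 * np.pi * 5 * t + np.pi)  # the same wave, shifted by pi

    print(np.max(np.abs(wave_a + wave_b)))      # ~0: the two waves cancel everywhere

Roughly speaking, the anti-laser arranges the incoming light and the cavity so that any wave headed back out the entrance is canceled in this way, leaving absorption as the only place for the energy to go.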

Interestingly, this mimics a phenomenon in nature. When a flashlight is shined in the eyes of animals with night vision, like owls, the light appears to be reflected back. This is due to a reflective layer that sits behind the retina of such animals. Reflecting the light back out allows two chances for the retina to capture what is being seen in the near-dark.

When the researchers started this experiment, they found that light entering the trap from different angles was not fully absorbed, and some light escaped. They solved the problem by arranging the mirrors in a way to force all light into a circular path until it is fully absorbed.

The FCC’s 12 GHz Decision

One of the hardest things that the FCC does is to decide spectrum policy. The agency has full authority to determine the details of how we use each slice of available spectrum. Most importantly, the agency can determine who can use spectrum – and that’s why the task is challenging.

In the last decade, it’s hard to think of any spectrum deliberation and decision that didn’t have to weigh the interests of multiple spectrum users. There is almost always somebody using spectrum that must be considered. The FCC must decide if there is more national benefit in allowing others to use the spectrum, and in doing so, the FCC has to decide if the current users can somehow stay in place. If not, the FCC has to find existing users a new slice of spectrum and cover the cost of moving existing users to the new frequencies.

There are multiple users of spectrum that want more spectrum than they have today. Probably first on this list are the cellular carriers who say they need scads more spectrum to keep up with the demands of our connected world. Satellite carriers are now clamoring for spectrum as they continue to add more users onto satellite broadband – and as they contemplate launching IoT and cellular services. The U.S. government and the military insist on having bands of spectrum for a wide variety of uses. WISPs want more spectrum for rural broadband. The companies that make WiFi equipment want more free spectrum for public use. Then there are the important niche players like connected automobiles, GPS, weather satellites, etc.

Finally, as odd as it sounds, there are also investors who have purchased spectrum in the past and who lobby the FCC to increase the value of their ownership – only in America would this be one of the underlying reasons to deliberate on the use of spectrum.

The recent FCC decision on the use of the lower 12 GHz spectrum is a good example of the FCC deliberation process on spectrum. This spectrum sits in the middle of the range of spectrum that the FCC recently dubbed as 6G. This spectrum has great characteristics – it can carry a lot of data while still being transmitted for decent distances. In general, the higher the frequency, the shorter the effective distance of a broadcast transmission.
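
The frequency-versus-distance tradeoff is easy to see with the standard free-space path loss formula. The sketch below is my own illustration with arbitrary example frequencies and a 5 km path, and it ignores real-world factors like rain fade, foliage, and antenna design, but it shows how much more signal is lost at 12 GHz than at lower cellular frequencies over the same distance:

    # Free-space path loss in dB for a distance in kilometers and a frequency in GHz.
    import math

    def fspl_db(distance_km, freq_ghz):
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    for f_ghz in (0.6, 3.5, 12.0):   # 600 MHz low-band, 3.5 GHz mid-band, 12 GHz
        print(f"{f_ghz:>4} GHz over 5 km: {fspl_db(5, f_ghz):.0f} dB of path loss")

    # Each doubling of frequency adds about 6 dB of loss, which is why higher
    # frequencies deliver a usable signal over shorter distances for the same power.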

This spectrum has been used for satellite broadband connections. At the prompting of others in the industry, the FCC decided to investigate if there are other ways to use this spectrum to satisfy more national needs.

  • Investment vehicles backed by Michael Dell owned a lot of the 12 GHz spectrum licenses and were lobbying to expand the permitted uses of the spectrum to improve their value.
  • DISH was hoping to use the 12 GHz spectrum as part of its nationwide roll-out of a new cellular network.
  • The other big cell companies jumped in with the suggestion that the spectrum be sold at auction for FWA broadband.
  • WISPs jumped in and suggested they could coexist with the other users and use the spectrum for rural broadband.
  • The WiFi coalition asked that the spectrum be allowed for free indoor usage.

As is usual in FCC spectrum proceedings, the various parties all filed testimony from experts demonstrating that their proposed use could work. In this case, many of the proposals tried to show that the FCC could allow terrestrial use of the spectrum without interfering with existing satellite receivers. The experts on each side said that the other side’s analysis was incorrect.

The spectrum engineers at the FCC are left to somehow glean the truth from the conflicting arguments. Meanwhile, the FCC commissioners have to wrangle with the policy and lobbying aspects of the issue since all of the players do their best to bring pressure to bear on such FCC decisions.

The FCC decision was that the lower 12 GHz spectrum should continue to be used for satellite broadband. The big winner in the decision was Starlink, and the biggest loser was DISH.

But the FCC left the door open to other uses and will continue its investigation. The FCC is still interested in hearing more about the use for point-to-point and point-to-multipoint wireless connections. That would serve as backhaul between towers and could be used to connect FWA and WISP customers. The FCC is also willing to consider the free unlicensed use of the spectrum for indoor use. So, as is often the case, the debate continues.

Another Lumen Reinvention?

Lumen has been the hardest big telco to figure out. Verizon, AT&T, Frontier, Windstream, and others have clearly decided that building fiber is the future path to survival. Consequently, those telcos are far ahead of Lumen in terms of fiber passings. In a recent investor webcast, CEO Kate Johnson talked about Lumen’s upcoming fiber plans. In doing so, she mentioned that Lumen only covers 12% of its passings with fiber – far behind the other telcos.

CenturyLink was one of the first big telcos after Verizon to embrace fiber. In 2017, under CEO Glen Post, the company had plans to pass 900,000 homes and businesses with fiber, with similar plans in upcoming years. Post, with a long telco background, had a clear vision of CenturyLink becoming a fiber-based ISP, at least in the many large cities it served.

However, at the end of 2017, the company took a sharp turn when it acquired Level 3. To nobody’s surprise, Jeff Storey from Level 3 took over as CEO, and the company changed its focus from residential fiber expansion to a focus on large business customers and small cell sites – the bread and butter of Level 3. By 2019, new fiber construction had dropped to 300,000 passings, with many of those coming from connecting large buildings to the network.

In 2020, the company’s stated focus was on adding large buildings to the network, and it added 18,000 buildings to its fiber network while adding only 400,000 new fiber passings in total. That’s a low number of new passings for a company that had 4.5 million broadband customers that year. 2020 was also the year when the company rebranded to become Lumen, a move to distance itself from identification as a copper telco.

Instead of expanding fiber, Lumen decided to ditch copper assets and announced the sale of its last-mile networks in twenty states to Apollo Global Management in 2021. This brought a cash infusion needed for expansion and got rid of deteriorating copper networks.

Last year, the company announced that its major expansion thrust was to beef up its large intercity fiber network across the country, with the goal of adding over 6 million miles of fiber strand by 2026. The original CenturyLink fiber network was starting to show its age, with many routes built forty years earlier. The company planned to upgrade to the newest fiber from Corning that can support 400-gigabit electronics. The new long-haul routes use fiber bundles of between 432 and 864 strands – much larger than the historical networks that used 96 to 144 fibers.

Lumen has been penalized for the many changes in its direction by seeing its stock price go into the tank. CenturyLink stock peaked at almost $49 in 2007. By 2017, the stock had slipped to the mid-$20 range. Since then, the stock has dropped steadily and recently hit $1.80 per share after sitting at $10 per share a year earlier.

CEO Kate Johnson admitted that the company needs to do something different. The company eliminated its dividends to shareholders in the fourth quarter of 2022. It is instead going to reinvest that money into building new fiber passings and plans to connect 500,000 homes and businesses in 2023, building deeply into six major metro areas.

The company needs to reinvent itself. Lumen lost 253,000 broadband customers in 2022 – 7.7% of its broadband base. The company lost another 56,000 broadband customers in the first quarter of this year, dropping below 3 million broadband customers and falling to eighth place among the largest ISPs after being surpassed by T-Mobile.

It’s not hard to understand, in retrospect, why the company has lost value. The company has seemingly reinvented itself every year since 2017 by changing its primary focus each year. Some of the changes, like adding more business buildings and beefing up the long-haul fiber network, will likely generate a lot of cash and value in the long run. But Wall Street has clearly told the company to pick a future and stick to it.

Urban and Rural Speed Parity

I recently wrote a blog about the trend where over 81% of U.S. households are now subscribed to broadband speeds of at least 200 Mbps. I got a lot of comments about that post, mostly from ISPs who think we are too fixated on speed and that consumers don’t need faster speeds – they believe the marketing departments of the big ISPs have simply convinced folks that faster speeds are important.

But when talking about rural versus urban broadband speeds, the discussion can’t only be about what people need or don’t need. There was an edict from Congress in the Telecommunications Act of 1996 that directed the FCC to have parity between urban and rural broadband. There has been no change of law that has softened this mandate, so it’s still something that the FCC should be following:

ACCESS IN RURAL AND HIGH COST AREAS.—Consumers in all regions of the Nation, including low-income consumers and those in rural, insular, and high cost areas, should have access to telecommunications and information services, including interexchange services and advanced telecommunications and information services, that are reasonably comparable to those services provided in urban areas and that are available at rates that are reasonably comparable to rates charged for similar services in urban areas.

The FCC has repeatedly ignored this mandate. Probably the most extreme example was CAF II, when the agency gave over $11 billion to the biggest telephone companies to supposedly upgrade DSL to 10/1 Mbps. This was done at a time when cable companies had mostly upgraded to DOCSIS 3.0, and most urban areas had access to speeds between 100 Mbps and 250 Mbps. By the time the CAF II subsidy ended, cable companies had mostly upgraded to DOCSIS 3.1, and urban speed capabilities in most places had reached 1 gigabit.

If anything surpasses the absurdity of CAF II, it’s the national definition of broadband that still sits at a ridiculous 25/3 Mbps. According to the latest OpenVault statistics, only 4.7% of households with broadband are subscribed to speeds under 50 Mbps. That number doesn’t include rural households who can’t buy broadband because there is no reasonable option where they live – but still, the number of households using slow speeds has become a small fraction of broadband users.

For the last two years, FCC Chairwoman Jessica Rosenworcel has suggested that the definition of broadband should be updated to 100 Mbps download. The OpenVault statistics now put that speed in the rearview mirror. Any federal definition of broadband has to be at least 200 Mbps. I don’t need to put forth any elaborate argument why this is so – the statistics make the point for me. Over 80% of U.S. households are now subscribed to speeds of at least 200 Mbps download. The language in the 1996 Act makes it clear that rural residents ought to have access to broadband that is reasonably comparable to the speeds offered in urban areas. Any interpretation of the phrase “reasonably comparable” would conclude that rural speeds ought to at least be at the low end of subscribed urban broadband speeds – and 200 Mbps is the minimum speed for over 80% of households.

The fact that over 80% of households are already subscribed to speeds of 200 Mbps or faster (40% of households are subscribed to 500 Mbps or faster) means that all of our hand-wringing over counting homes with speeds of at least 100/20 Mbps is largely a joke. Under any reasonable current definition of broadband, homes at 100/20 Mbps shouldn’t be counted as served if the FCC were doing its job.

I really hate the numbers game with broadband, and no matter how we define broadband or set a cutoff for grant eligibility, there will be ISPs that will exaggerate the speeds of their current or planned technology to try to game the system. ISPs naturally work to try to protect their service areas from grant funding competition. Other ISPs want to be given grants for technologies that don’t reliably deliver broadband.

But the one thing we should stop doing is measuring broadband by standards that the real world has already left behind. All of the angst, arguments, and fighting over whether areas are underserved with 100/20 Mbps broadband or slower ought to be scrapped – but unfortunately, the momentum of existing grant rules will keep us squabbling about the wrong things for years to come.


Another Twist in The BEAD Grant Process?

Word has been circulating that the NTIA recently informed State Broadband Offices that they must submit a final BEAD plan to the NTIA one year after receiving approval of the Initial Proposal of grant rules. That’s not a surprise since this language comes straight out of the legislation and the NOFO for BEAD: “An Eligible Entity may initiate its competitive subgrantee selection process upon approval of its Initial Proposal and will have up to one year to conduct additional local coordination, complete the selection process, and submit a Final Proposal to NTIA.”

The ugly twist is that the NTIA is expecting the Final Proposal to include a final list of all BEAD grant winners. Everybody has always assumed that the Final Proposal would be just that – a proposal that describes and fine-tunes the rules being used to award grants. Most State Grant Offices have assumed that they would have multiple years to pick BEAD grant winners.

Consider what has to happen once a state gets approval of its Initial Proposal:

  • A State Broadband Office must finalize the rules for awarding grants through attorneys and state leadership. Some states are going to be required to get the Legislature involved to approve grant rules. This will likely take 3-4 months for most states, but a few will take much longer.
  • The Grant Office would then be ready to announce the date for the first round of grant applications. They would typically give applicants 60-90 days to submit grant applications.
  • A Grant Office will need at least 30 days for the initial review of applications and to provide time to ask for clarifications from applicants.
  • Next, the detailed grant scoring must be done. The BEAD grants are complex, and it’s hard to see a state scoring and ranking grant applications in less than 60 days. There is a lot of complicated due diligence needed by grant offices that are often manned by first-time grant reviewers.
  • The state will then have to allow 15-30 days for posting the grant applications and taking protests and challenges, plus another 30-60 days to resolve protests.
  • Finally, grant awards are announced, and it can easily take three months to negotiate contracts with grant winners. Inevitably, some winners will back out during this process.

The timeline above totals 16 months – and that’s if everything goes smoothly. The BEAD grants are complex, and reviewing and resolving grants that ask to serve overlapping areas is going to add a lot of complication to the process. To put this timeline into perspective, my state of North Carolina is 18 months into the $350 million ARPA grant process and still has not finished identifying all of the grant winners. And that’s with a capable and experienced Grant Office – some states are new to the grant process. The BEAD grants are for more dollars, are more complicated, and will take more time to review than ARPA grants.
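
For anybody who wants to check that math, here is the simple tally behind the 16 months, using the durations from the list above (the month equivalents are my own rounding of the day ranges):

    # Rough best-case and worst-case tally of the steps listed above, in months.
    steps = {
        "finalize grant rules":              (3.0, 4.0),
        "grant application window":          (2.0, 3.0),   # 60-90 days
        "initial review and clarifications": (1.0, 1.0),   # ~30 days
        "detailed scoring and ranking":      (2.0, 2.0),   # ~60 days
        "posting for protests":              (0.5, 1.0),   # 15-30 days
        "resolving protests":                (1.0, 2.0),   # 30-60 days
        "contract negotiations":             (3.0, 3.0),
    }

    low = sum(lo for lo, hi in steps.values())
    high = sum(hi for lo, hi in steps.values())
    print(f"best case: {low} months, more realistic case: {high} months")  # 12.5 and 16.0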

The above timeline doesn’t reflect the added rules that are specific to BEAD. State Broadband offices have a mandate to bring broadband to every unserved location. They also must contend with special handling of high-cost areas. Both of these processes will require a lot more time than listed above for Broadband Offices to reach out to and negotiate with ISPs. States that are lucky enough to fund all unserved and underserved areas will need more time to figure out what comes next.

I’m fairly certain that any pressure to speed up the grant time frame comes from the recent White House emphasis on getting infrastructure money out the door quickly. I think everybody in the industry thinks that the BEAD grant process should have gone faster. But the BEAD process has been glacially slow and it’s been 19 months since the IIJA legislation was signed. It’s absurd that we are just now announcing the amount of money that states will get.

But we can’t make up for the glacial process of launching the BEAD grants by rushing at the end so that the money is shoved out the door without taking time to make sure that each State is getting the best long-term solution. States have been having a lot of internal debates about the technologies and types of ISPs they hope will win funding – any deliberation and chance of directing the funds responsibly will be cut short if the process is hurried. One of the most important parts of any grant process is to give worthy applicants a chance to refine and amend a grant request in a subsequent round. The BEAD grants are the first grants in my memory where the States had to reach out to stakeholders to get public feedback. If we rush, all that was learned in that process will be tossed aside.

If the NTIA really insists on a speedy timeline, it will be creating an RDOF-type disaster. The only way to get this process done in a year (or even 18 months) would be through a single round of grants – done hastily. With a tight time frame, the grants won’t be reviewed closely and grants that include errors will be pushed through. ISPs that aren’t really qualified will sneak through.

Having only one round of grants will feel a lot like the RDOF reverse auction. A giant pile of grants will be shoved into the funnel, and it’s likely that the grants will go to ISPs that ask for the lowest percentage of grant funding. A friend of mine has jokingly been saying that 95% of BEAD money will go to the large incumbent providers, and if there is a single-round grant process, he might not be far from the truth.

I’m hoping that this is just a trial balloon being circulated by the NTIA to get feedback, and if so, every State Broadband Office needs to push back hard. If the grants are going to be hurried, we’re going to end up with yet another disastrous federal grant program. I was hopeful that BEAD would avoid the mistakes of the past since the money was given to the States. But if the NTIA forces State Broadband Offices to rush the grant process, we’ll be watching a slow-motion train wreck over the next year.

Should Grant-funded Networks be Open-Access?

There was recently an interesting political effort in the Washington legislature to expand the use of open-access networks. Substitute House Bill 1147 included language that would have required any network funded with BEAD grants to become open-access and available to other ISPs.

Open-access has been a topic in Washington for many years. The state long prohibited Public Utility Districts (PUDs) from offering retail broadband. These county-wide government-owned utilities wanted to bring better broadband and settled for building open-access networks. Over the last few decades, a number of PUDs have launched open-access fiber networks, some of them with tens of thousands of fiber customers.

For those not familiar with open-access, it is a network where multiple ISPs can buy access to reach customers. This gives customers a choice among multiple ISPs on the same network. All reports are that customers like the extra choices they get. Every broadband survey my firm has ever conducted has shown a huge public preference for having a choice of ISPs.

The legislature finally relaxed the prohibition for PUDs last year, but in a bizarre fashion. The legislature passed two conflicting bills that allow PUDs to provide retail broadband services. Rather than choose between the two bills, the Governor signed both simultaneously (a pen in each hand) so that both bills went into effect. As might be imagined, this created as much confusion as clarity over the issue.

I doubt that anybody will be surprised that the biggest ISPs in the state vehemently opposed this legislation. The big cable companies have always immediately fought any suggestion that they allow other ISPs to use their networks. The big telcos were forced to sell unbundled copper loops starting with the Telecommunications Act of 1996, but that requirement continues to wane as the amount of copper keeps shrinking. The telcos started fighting the unbundling rules as soon as they were enacted and over the years succeeded in greatly weakening the ability of outsiders to use their copper.

I don’t think anybody will be surprised to find out that the big ISPs in Washington succeeded in killing this idea. The big ISPs threatened not to pursue any grant funding if the proposal became law, and some even made veiled threats to stop investing in the state.

But it’s an interesting concept. The BEAD grant rules have a clear preference for open-access networks, and any carrier promising an open network will get extra points on a grant application. But the open-access preference is only a suggestion and not a requirement – something the big ISPs in Washington all pointed out.

Requiring open-access is not a far-fetched idea because open-access is required on all of the middle-mile networks that were announced this week as recipients of NTIA grants. But the whole point of the NTIA middle-mile networks is to build networks to places where backbone connections are unavailable or unaffordable. Requiring the grant recipient to sell affordable connections to everybody is a good use of federal grant dollars.

But this raises a much larger question. I know there are a lot of open-access proponents in the country who think that any network funded with government dollars ought to be made open-access to provide the most value to the taxpayers who are funding it. That is exactly what was suggested in Washington, but it didn’t take very long for the big ISPs to kill the idea.

Many industry folks want to take this idea even further. I don’t think I’ve seen a thread on this topic that doesn’t include somebody who thinks government should own all grant-funded fiber infrastructure, which should then be made available to all ISPs that want to use it. Obviously, the BEAD grant rules weren’t written that way, and with the sway that big ISPs hold in D.C., that probably will never happen. But it is something Congress could do if it ever has the will to enact it. We’re starting to see cities that are adopting this idea, so we’re going to keep seeing new open-access networks come to life. I have to think that the citizens in every city close to an open-access network are going to be asking why they can’t have the same thing.

The FCC to Look at Data Caps

FCC Chairwoman Jessica Rosenworcel asked the other Commissioners to join her in opening an investigation into broadband data caps. According to FCC rules, a majority of Commissioners must agree to open any official proceeding. For those not familiar with the data cap concept, it’s where an ISP bills extra for using more than a defined amount of broadband in a month.

Not all ISPs use data caps. The ISP that gets the worst press about data caps is Comcast, but it doesn’t bill data caps in all markets – seemingly only where it doesn’t have a lot of competition. Charter would love to bill data caps, but it has been prohibited from doing so because of an arrangement reached with the FCC when it got approval to buy Time Warner. That agreement just lapsed on May 18 of this year. We’ll have to wait to see if Charter will impose data caps – but it seems likely it will do so since the company asked permission from the FCC to impose data caps in 2021.

AT&T imposes data caps on DSL and on some fiber connections. Astound Broadband imposes data caps in Washington, Oregon, and California. Cox has data caps that kick in after a user exceeds 1.25 terabytes per month. Mediacom imposes data caps on many of its plans. All of the products of the high-orbit satellite companies, HughesNet and Viasat, have severe data caps. So do cellular hot spot data plans.
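
For anybody who hasn’t run into one, the billing mechanics of a data cap are simple. Here is a minimal sketch. The 1.25-terabyte cap comes from the Cox example above, but the $10-per-50-gigabyte overage block and the $100 monthly ceiling are illustrative assumptions, not any particular ISP’s published rates:

    # A sketch of a typical data-cap overage calculation (the prices are assumptions).
    import math

    def overage_charge(usage_gb, cap_gb=1280, block_gb=50, price_per_block=10.0, max_charge=100.0):
        if usage_gb <= cap_gb:
            return 0.0
        blocks_over = math.ceil((usage_gb - cap_gb) / block_gb)
        return min(blocks_over * price_per_block, max_charge)

    # A home that uses 1,500 GB in a month is 220 GB over a 1,280 GB (1.25 TB) cap,
    # which rounds up to 5 overage blocks and a $50 surcharge.
    print(overage_charge(1500))   # 50.0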

It’s an interesting request by the Chairwoman. Under current FCC rules, the FCC has no authority to do anything about data caps. This authority went away when the previous FCC under Chairman Ajit Pai eliminated the regulation of broadband by killing Title II authority. Chairman Pai went even further and pushed remaining vestiges of any broadband regulation to the Federal Trade Commission.

This makes me wonder why Chairwoman Rosenworcel would try to open this docket. I can see several possibilities. First, this could just be done to show that the FCC cares about an unpopular ISP practice. It’s clear that the public hates data caps, and the press that covers the FCC flooded the topic with coverage as soon as this was announced. I would hope the Chairwoman would not be so callous as to investigate something about which the FCC is powerless to do anything.

That leads to the second possibility that Chairwoman Rosenworcel believes that adding a fifth Commissioner will provide the votes needed to reinstate Title II authority or some updated version of it. Starting the investigation into data caps now might sync up well with renewed FCC regulatory authority and let the FCC make a popular change in the future to ban or modify data caps.

I’ve written several blogs over the years that make the argument that data caps are nothing more than a way for ISPs to extract extra payments from customers. There is zero justification from a cost perspective – residential customers who use more data than average cause no significant incremental cost for an ISP. ISPs buy wholesale broadband based on the busiest usage times of the month. Within the pile of broadband purchased to meet that peak need, it doesn’t matter how much data customers use as long as it doesn’t push up the busy hour for the month.
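
A quick sketch makes the point. Wholesale transit is typically billed on peak usage, often a 95th-percentile measurement, rather than on total gigabytes moved. The hourly profiles below are invented for illustration, but they show why a customer who moves far more data in a month adds nothing to an ISP’s busy-hour cost as long as the evening peak is unchanged:

    # Two customers with the same evening peak but very different total usage.
    def monthly_gb(mbps_by_hour, days=30):
        # Mbps sustained during each hour of a typical day, converted to GB per month.
        return sum(mbps_by_hour) * 3600 / 8 / 1000 * days

    light_user = [1] * 18 + [5] * 6   # modest daytime use, 5 Mbps during the evening peak
    heavy_user = [4] * 18 + [5] * 6   # heavy daytime streaming and backups, same 5 Mbps peak

    for name, profile in (("light", light_user), ("heavy", heavy_user)):
        print(f"{name}: peak {max(profile)} Mbps, about {monthly_gb(profile):,.0f} GB per month")

    # Both customers contribute the same 5 Mbps to the network's busy hour, so they
    # cost the ISP essentially the same even though one moves more than twice the data.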

Additionally, the big ISPs that use data caps also engage in peering arrangements where they directly hand off broadband traffic to the largest web services like Google, Netflix, Facebook, and Microsoft. While there is a cost to create the peering points, once they are established, sending data through peering arrangements saves a huge amount of money compared to shipping the same traffic through Internet transit.

It’s harder each year for affected homes to avoid data caps. Data caps accumulate both download and upload usage, and homes are increasingly using upload bandwidth in ways that most folks don’t even realize. According to OpenVault, the average home in the U.S. now uses over 560 gigabytes of data per month, an amount that keeps climbing. The household average as recently as 2019 was only 273 gigabytes per month.

This will be an interesting process to watch. Chairwoman Rosenworcel has created a form for folks to describe their data cap stories. I’m sure that everybody who does so will be hoping that the FCC can help them – but that remains to be seen. Doing anything about data caps means getting a fifth Commissioner who is willing to reintroduce broadband regulation – something that is going to face a lot of opposition.

The Remaining RDOF Funds

The FCC originally budgeted $20.4 billion for the RDOF subsidy program to be spent over ten years. The original RDOF reverse auction offered $16 billion in subsidies. But in a story that is now well known, some entities bid RDOF markets down to ridiculously low subsidy levels, and only $9.2 billion was claimed in the auction. $2.8 billion of that funding ended up in default, including some of the bidders who had driven the prices so low.

That means that only $6.4 billion of the original $20.4 billion has been allocated. The question I’m asking today is what the FCC will do with the remaining $14 billion.
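
The arithmetic behind those numbers is straightforward (all figures in billions):

    # Back-of-envelope on the RDOF dollars discussed above.
    budget    = 20.4   # original ten-year RDOF budget
    offered   = 16.0   # offered in the Phase I reverse auction
    claimed   = 9.2    # won in the auction
    defaulted = 2.8    # winning bids that later defaulted

    allocated = claimed - defaulted
    remaining = budget - allocated
    print(f"allocated: ${allocated:.1f} billion, unspent: ${remaining:.1f} billion")  # $6.4 and $14.0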

It seems unlikely that there will ever be another RDOF-like reverse auction. RDOF was meant to bring broadband to areas that were unserved according to the FCC’s broadband maps at the time of the reverse auction – meaning areas where no ISP claimed broadband speeds of at least 25/3 Mbps. But since ISPs are able to claim marketing speeds under the FCC mapping rules instead of actual broadband speeds, many millions of unserved locations were left out of the RDOF process.

Since the RDOF auction, many billions have been spent to bring broadband to unserved areas through ReConnect grants, local ARPA grants, state broadband grants, and several smaller grant programs. To understand how poor the original RDOF maps were, consider that even after all of these grants, the latest FCC maps still show over 8 million unserved locations. Folks like me who look at the maps at a granular level think there are even more areas that are still mistakenly shown as having 25/3 Mbps broadband but don’t in real life.

To be fair, RDOF is doing some good things. A lot of electric coops, telephone coops, telephone companies, and independent fiber overbuilders are building networks using the RDOF subsidy as the financial basis for the build. Charter and a few other larger ISPs are also building networks using RDOF funding.

But the RDOF awards also left behind a lot of messes. First, it took too long to eliminate the default bidders. Areas claimed by these bidders were off-limits to other federal grants and most state grants – many of these areas would have fiber today had they not been in RDOF limbo.

The bigger problem is that the FCC made an absolute mess by awarding RDOF in what can best be called a checkerboard of serving areas. A map of awards in a real county shows what this looks like. In one county I examined, the areas to the east have no people due to large parklands, but in the rural areas where people live, the RDOF awards covered some Census blocks while skipping adjacent ones. The Census blocks that were not awarded have the same lousy broadband options and were left out of the RDOF award due to the mapping problems discussed earlier.

This creates a real challenge for anybody now trying to get a BEAD or other grant to serve what is left. The areas left after RDOF don’t form a coherent serving area but a jumbled mess of remaining Census blocks. For somebody building a fiber network, these checkerboard areas are a nightmare because a builder must pass through RDOF areas to reach the remaining ones. It’s one more factor that will drive up the cost of the BEAD grants in counties that got a lot of RDOF funding.

The FCC is dreadful at awarding grants and subsidies. The RDOF process was used so the FCC didn’t have to review traditional grants where ISPs proposed coherent grant serving areas. This is the same FCC that gave over $11 billion to the biggest telcos for CAF II to upgrade DSL to 10/1 Mbps.

Now that states have broadband offices, the easiest way to award the remaining RDOF billions would be to let those offices do the heavy lifting. It would be one more tool for state broadband offices – one that hopefully would not follow the complicated BEAD rules. The worst possible way to use the money would be for the FCC to take some easy path to shovel the money out the door again – please don’t give us RDOF II!

New Broadband Trends

The latest Broadband Insights Report from OpenVault is out, providing statistics on average broadband usage at the end of the first quarter of 2023. In looking over the latest numbers, I’m starting to see some interesting trends.

The average household used 560.5 gigabytes of broadband per month by the end of the quarter. That is the combination of 524.8 gigabytes of download and 35.7 gigabytes of upload. I also looked back over past years, and I think a new trend of broadband growth is emerging. Consider the following simple table based upon the average household usage at the end of the first quarter in each year since 2019.

            Gigabytes   Growth
1Q 2019       273.5
1Q 2020       402.5       47%
1Q 2021       461.7       15%
1Q 2022       513.8       11%
1Q 2023       560.5        9%

2020 growth was crazy due to the pandemic, and that level of growth is likely never going to be seen again absent some other similar catastrophic event. Since then, growth has slowed a bit year after year. We’re settling into a pattern where the average household is using approximately 50 gigabytes more per month than the year before.
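
The year-over-year math from the table above bears that out:

    # Year-over-year changes in average household usage (GB per month, 1Q figures).
    usage = {2019: 273.5, 2020: 402.5, 2021: 461.7, 2022: 513.8, 2023: 560.5}

    years = sorted(usage)
    for prev, curr in zip(years, years[1:]):
        added = usage[curr] - usage[prev]
        pct = 100 * added / usage[prev]
        print(f"1Q {curr}: +{added:.1f} GB ({pct:.0f}%)")

    # Outside of the pandemic spike, each year adds roughly 50 GB of monthly usage.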

This is easy to understand. We are moving more and more things online. We’re storing more pictures and files each year. We’re watching more video, and that video is growing more data intensive as we migrate to 4K. Most of the software we use is now in the cloud. The devices in our homes are often connected to the cloud.

This is starting to feel like a new trend. We used to have a paradigm that broadband usage doubled every 3-4 years. Once we’ve moved most of our data lives to the cloud, there is no longer the likelihood of explosive growth in household usage. I think more and more homes are settling into a mature stage of having moved to the cloud. The only thing that can upset this would be some new widely used data function that uses a lot more data, or another event similar to the pandemic.

The other trend is that people have wholeheartedly decided that they want faster broadband speeds. There are a lot of folks in the industry who will argue vehemently that households don’t need more than 25 Mbps – but it doesn’t matter what they think. Huge numbers of families believe they should have faster speeds and are upgrading. Consider the following table that compares the percentage of subscribers at various speed tiers in the first quarters of 2022 and 2023.

Subscribers        1Q 2022   1Q 2023
Under 50 Mbps         7.6%      4.7%
50 – 99 Mbps          6.3%      4.8%
100 – 199 Mbps       17.0%      9.3%
200 – 499 Mbps       49.7%     41.1%
500 – 999 Mbps        6.1%     22.0%
1 Gbps+              13.4%     18.1%
200 Mbps +           69.2%     81.2%

The percentage of homes that are subscribed to 200 Mbps or faster has skyrocketed in one year from 69% of homes to 81% of homes. Some of this increase comes from ISPs arbitrarily increasing speeds for customers, but a lot of the growth comes from people deciding to upgrade. This is clearly now a major trend.

The day of talking about 100 Mbps being an acceptable broadband speed is now behind us when 81% of the homes in the country are subscribed to speeds at least double that. As usual, the politicians who wrote the rules for the BEAD and other federal grants are far behind the real-life curve. Grants that allow somebody to build a network that can deliver only 100 Mbps are investing in obsolete technology. By the time those grant networks are constructed, any new networks that deliver only 100 Mbps will be years behind the rest of the broadband in the country. Let’s hope that broadband offices pay attention to this trend and require technologies that are forward-looking rather than buried in the past before they are even constructed.

The Benefits of Thinner Fiber

Fiber manufacturers are always trying to make it easier to deploy fiber. One of the most interesting trends is the increasing migration from 250-micron fiber to 200-micron fiber. For those not familiar with the metric system, a micron is one-thousandth of a millimeter. A 250-micron fiber has a diameter of 0.25 millimeters, while a 200-micron fiber has a diameter of 0.2 millimeters.

That may not sound like a big difference, but when each fiber is thinner, the overall size of a fiber bundle is smaller. A large bundle of 200-micron fiber can be 20-30% smaller than the equivalent bundle of 250-micron fiber.
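
The geometry explains where the savings comes from. This is a rough back-of-envelope comparison, not a cable manufacturer’s spec, since real cable diameters also depend on buffer tubes, strength members, and jacketing:

    # Rough comparison of 250-micron and 200-micron coated fibers.
    d_250, d_200 = 0.250, 0.200   # coated fiber diameters in millimeters

    area_saving = 1 - (d_200 / d_250) ** 2
    diameter_saving = 1 - d_200 / d_250

    print(f"per-fiber cross-section: {area_saving:.0%} smaller")      # about 36% less area
    print(f"bundle diameter: roughly {diameter_saving:.0%} smaller")  # about 20% thinner

    # A bundle whose diameter scales with the fibers inside it ends up on the order of
    # 20% thinner, which is consistent with the 20-30% range cited above.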

200-micron fiber has been around for five years, but it’s growing rapidly in popularity. It’s estimated that as much as 10% of all fiber sold is now 200-microns.

Interestingly, the core glass in the two types of fiber is identical, with a 9-micron core and 125 microns of surrounding glass. The difference in the overall fiber size comes from different coatings. The identical glass means that it’s easy to splice a 200-micron fiber to a 250-micron fiber – allowing a fiber builder to mix the two kinds of fiber in a network.

There are big benefits to using smaller fiber. The smaller size means smaller fiber bundles that are far easier to handle during the construction process. There is the added benefit of getting a lot more of the thinner fiber on a reel, meaning fewer reel changes.

One of the more interesting benefits comes from using a bendable version of the thinner fiber, known as bend-insensitive fiber. Bendable fiber is more tolerant of bending during the construction process, leading to less damage and stress on the fiber. Thinner fibers also make it easier to install indoor fiber in places with a lot of 90-degree bends.

Perhaps the ultimate benefit is that smaller diameter fiber makes it easier to use microduct conduits. With 200-micron fiber, it’s possible to fit a 432-fiber cable into a 10-millimeter microduct or an 864-fiber cable into a 14-millimeter microduct. This opens up the possibility of installing huge numbers of fibers along a street in those situations where it’s needed.