Deploying 10-Gigabit PON

From a cost perspective, there is no longer any practical difference between the price of XGS-PON, which offers a 10-gigabit data path, and traditional GPON. A number of my clients are now installing XGS-PON, and we recommend it for new fiber projects. That has me curious about how ISPs are going to deploy the technology in residential and small-business neighborhoods.

GPON has been the fiber technology of choice for well over a decade. GPON delivers a 2.4-gigabit download path to each neighborhood PON. Most of my clients have deployed GPON in groups of up to 32 customers per neighborhood PON, and in practice most pack a few fewer than 32 onto the typical GPON card.

I’m curious about how ISPs will deploy XGS-PON. From a pure math perspective, an XGS-PON network delivers roughly four times as much bandwidth to each neighborhood as GPON. An ISP could maintain the same level of service as GPON by packing 128 customers onto each XGS-PON card. But network engineering is never that nicely linear, and there are a number of factors to consider when designing a new network.

All ISPs rely on oversubscription when deciding how much bandwidth is needed for a given portion of a network. Oversubscription is shorthand for the observation that customers in a given neighborhood rarely use all the bandwidth they’ve been sold, and never all use it at the same time. It allows an ISP to feel safe selling gigabit broadband to 32 customers in a GPON network, knowing that collectively they will not ask to use more than 2.4 gigabits at the same time. For a more detailed description of oversubscription, see this earlier blog. There are ISPs today that put 64 or more customers on a GPON card – the current capacity is up to 128 customers. ISPs understand that putting too many customers on a PON card starts to emulate the poor behavior we see in cable company networks that sometimes bog down at busy times.
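To make the oversubscription arithmetic concrete, here is a minimal sketch in Python. The gigabit plan speed and customer counts mirror the examples above; the function name and framing are mine, purely for illustration.

```python
# Oversubscription ratio: total bandwidth sold versus actual PON capacity.
GPON_GBPS = 2.4     # GPON downstream capacity per neighborhood PON
XGSPON_GBPS = 10.0  # XGS-PON downstream capacity

def oversubscription(customers: int, plan_gbps: float, capacity_gbps: float) -> float:
    """Ratio of bandwidth sold to customers against real PON capacity."""
    return (customers * plan_gbps) / capacity_gbps

# 32 gigabit customers on GPON sell 32 Gbps against 2.4 Gbps of capacity.
print(f"GPON, 32 customers: {oversubscription(32, 1.0, GPON_GBPS):.1f}:1")       # ~13.3:1
# 128 gigabit customers on XGS-PON produce roughly the same ratio.
print(f"XGS-PON, 128 customers: {oversubscription(128, 1.0, XGSPON_GBPS):.1f}:1")  # 12.8:1
```

The nearly identical ratios are why 128 customers per XGS-PON is the naive equivalent of today’s 32 customers per GPON.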

Most GPON networks today are not overstressed. Most of my clients tell me that they can comfortably fit 32 customers onto a GPON card and only rarely see a neighborhood maxed out in bandwidth. But ISPs do sometimes see a PON get overstretched when there are more than a few heavy users in the same PON. The easiest solution today is to reduce the number of customers in a busy PON – such as splitting it into two 16-customer PONs. This isn’t an expensive fix because over-busy PONs are still a rarity.

ISPs understand that, year after year, customers use more bandwidth and engage in more data-intensive tasks. Certainly, a PON with half a dozen people now working from home is a lot busier than it was before the pandemic. It might be years before a lot of neighborhood PONs get overstressed, but eventually, the growth in bandwidth demand will catch up to the GPON capacity. As a reminder, the PON engineering decision is based on the amount of demand at the busiest times of the day. That busy-hour level of traffic is not growing as quickly as the overall bandwidth used by homes – which more than doubled in just the last three years.
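As a rough sketch of how growth eats into PON headroom, the snippet below projects busy-hour demand forward with compound growth. The starting busy-hour demand and the growth rate are hypothetical placeholders, not measurements from any real network.

```python
# Years until busy-hour demand catches up to PON capacity (hypothetical inputs).
def years_until_full(busy_hour_gbps: float, capacity_gbps: float, annual_growth: float) -> int:
    """Count the years of compound growth before demand meets capacity."""
    years = 0
    while busy_hour_gbps < capacity_gbps:
        busy_hour_gbps *= 1 + annual_growth
        years += 1
    return years

# A PON whose busy hour already hits 1.2 Gbps, with busy-hour traffic
# growing at an assumed 15% per year:
print(years_until_full(1.2, 2.4, 0.15))   # 5 years before GPON is full
print(years_until_full(1.2, 10.0, 0.15))  # 16 years on XGS-PON
```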

There are other considerations in designing an XGS-PON network. Today, the worst that happens when a PON card fails is that 32 customers lose broadband. It feels riskier from a business perspective to have 128 customers sharing a PON card – that’s a much more significant network outage.

There is no magic metric for an ISP to use. You can’t fully trust vendors, who will happily sell more PON cards to an ISP that is extremely conservative and puts only 32 customers on a 10-gigabit PON. At the same time, ISP owners might not feel comfortable leaping to 128 or more customers on a PON. There are worse decisions to have to make, because almost any configuration of PON oversubscription will work on a 10-gigabit network. The right solution will balance the need to make sure that customers get the bandwidth they request without being so conservative that the PON cards are massively underutilized. Over time, ISPs will develop internal metrics that fit their service philosophy and the demands of their customer base.

The Battle for IoT

There is an interesting battle going on to be the technology that monetizes the control of Internet of Things devices. Like a lot of tech hype, IoT has developed a lot more slowly than originally predicted – but it’s finally becoming a big business. I think back to a decade ago when tech prognosticators said we’d soon be living in a virtual cloud of tiny sensors that would monitor everything in our lives. According to those early predictions, our farm fields should already be fully automated, and we should all be living in the smart home envisioned by the Jetsons. Those predictions probably say more about the tech press that hypes new technologies than about IoT.

I’ve been noticing an increasing number of press releases and articles talking about different approaches to monetizing IoT traffic. The one we’ve all heard the most about is 5G. The cellular companies told Wall Street five years ago that the monitoring of IoT devices was going to fuel the 5G business plan. The wireless companies envisioned households all buying a second cellular subscription to monitor devices.

Except for a few minor examples, this business plan never materialized. I was reminded of it this week when I saw AT&T partnering with Smart Meter to provide patient monitoring for chronic conditions like diabetes and high blood pressure. The monitoring devices worn by patients include a SIM card, and patients can be monitored anywhere within range of a cellular signal. It’s a great way for AT&T to monetize IoT subscriptions – in this case, with monthly fees likely covered by health insurance. It sounds like an awesome product.

Another player in the IoT world is LEO satellites. In August of last year, SpaceX, the parent of Starlink, made a rare acquisition by buying Swarm. Swarm envisions using satellites to monitor outdoor IoT devices anywhere in the world. The Swarm satellites weigh less than a pound each, and the Swarm website says the goal is to have three of these small satellites in range of every point on earth by the end of 2022. That timeline slowed due to the acquisition, but this could be a huge additional revenue stream for the company. Swarm envisions putting small receivers in places like farm fields. As with Starlink, customers must buy the receivers, and there is an IoT data monitoring plan that allows the collection of 750 data packets per month for a price of $60 per year.

A number of companies promoting LoRaWAN technology are also still pursuing this market. The technology uses tall towers or blimps and low-power unlicensed spectrum to communicate with IoT sensors over a large geographic area. The companies developing this technology can be found at the LoRa Alliance.

Of course, the current king of IoT is WiFi. Charter recently said that it connects to 5 billion devices on its WiFi network. WiFi has the advantage of providing IoT connections at no extra cost beyond the price of a broadband connection.

Each of these technologies has a natural market niche. The AT&T health monitoring system only makes sense on a cellular network since patients need to be monitored everywhere they go during the day. Cellular should be the go-to technology for mobile monitoring. The battle between LoRaWAN and satellites will be interesting and will likely come down to price. Both technologies can reach farm fields where cellular coverage is likely never to be ubiquitous. WiFi is likely to carry the signals from the devices in our homes – the AT&T vision of everybody buying an IoT cellular data plan sounds extremely unlikely since we can all have the same thing for the cost of a WiFi router.

Add Affordability to the Definition of Broadband

We spend an inordinate amount of time in the industry fixating on whether broadband connections meet certain speeds. There are rural communities all over the country asking residents to take speed tests right now hoping to prove that their broadband is inadequate. What I find sad is that every community that is doing this already knows the answer before they start gathering speed tests – they know their rural broadband is inadequate for the way that people want to use it.

I was glad to be reminded recently by a Leichtman Research Group survey that 45% of homes have no idea what broadband speed they are supposed to be getting – or are actually getting. I’ve seen the same thing in the numerous surveys we’ve done – people don’t fixate on Internet speeds, and most just care whether their home Internet works.

The surveys we coordinate for communities show one other issue that people care about, which is the price of broadband. In the hundreds of communities where we’ve done surveys, we almost always see that at least half of respondents think broadband is too expensive. In some communities, more than 90% of residents tell us that broadband is too expensive. When we ask people why they don’t have home broadband, the primary response in every survey is the cost of broadband.

I’ve had a half-baked idea floating around my brain for some time that might help address this problem. I’ve been wondering if price shouldn’t be part of the definition of broadband. There is a lot of discussion about setting a new definition of broadband speed at 100/20 Mbps. But I’ve learned in working with dozens of communities that there is a huge difference between a 100/20 Mbps connection that costs $55 and one that costs $85. As far as the public is concerned, these are not the same product – but we pretend that they are.

We know that as broadband prices increase, households are forced to drop broadband or downgrade to a slow-speed but cheaper alternative like DSL. One thing I’ve learned through surveys is that people in cities keep DSL because it costs less. People are not happy with the inferior performance of DSL, but they choose to spend $50 for a slow DSL connection because they can’t afford a 100/20 Mbps connection from Comcast or Charter.

Some of the new federal grants pay lip service to affordable rates, but none of these rules have any teeth. In fact, the giant BEAD grant legislation specifically prohibits the NTIA from suggesting broadband rates. (Any bet that this language came from the cable companies?) Some grants reward applicants for having a low-income broadband product, but they don’t insist that ISPs put any energy into marketing a low-income plan.

I don’t have specific metrics in mind, but there must be a way to weave affordability into the definition of broadband. Perhaps it’s something like the following: if 100/20 Mbps is the speed definition of broadband, then maybe connections priced under $60 would be considered served, those priced between $60 and $80 would be underserved, and those priced over $80 would be considered unserved. The price tiers I’ve chosen are arbitrary and open for debate, but the concept is that grants and subsidies ought to favor ISPs with affordable rates.
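As a toy illustration of how a price-aware definition might work, here is a sketch that layers the arbitrary price tiers above on top of the familiar 25/3 and 100/20 Mbps speed definitions. The function and labels are hypothetical, not a proposal for actual grant rules.

```python
# Hypothetical broadband classification that considers both speed and price.
def classify(down_mbps: float, up_mbps: float, price_dollars: float) -> str:
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"       # fails even the 25/3 Mbps floor
    if down_mbps < 100 or up_mbps < 20:
        return "underserved"    # below the 100/20 Mbps definition
    # Speed qualifies; apply the affordability tiers proposed above.
    if price_dollars < 60:
        return "served"
    if price_dollars <= 80:
        return "underserved"    # fast enough, but marginally affordable
    return "unserved"           # fast, but priced out of reach

print(classify(200, 20, 55))  # served
print(classify(100, 20, 70))  # underserved
print(classify(100, 20, 90))  # unserved on price alone
```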

I know that real life is a lot more complex than my simple example. The big cable companies disguise broadband prices by hiding them in bundles. ISPs have jacked up the cost of the modem so that the basic broadband price sounds cheaper. In those markets lucky enough to have competition, the big ISPs offer special low rates for those willing to ask, while still billing the full rates to everybody else.

There is nothing that scares the big cable companies more than talking about regulating broadband prices. This was the main motivation for deregulating broadband at the FCC – cable companies don’t want regulators looking too closely at broadband prices. But prices matter at least as much as speeds, and for millions of homes, price is everything.

I know the big ISPs will say that the new Affordable Connectivity Program (ACP) will take care of this issue – but it doesn’t. Not all ISPs are going to take part in the program, and many of the ones that do will not market it to customers. Besides, how good a deal is it to get $30 off a basic Comcast broadband product that costs $90? The price after the ACP subsidy is still out of reach for many homes and is still more expensive than urban DSL – and many of the homes without broadband today can’t even afford DSL. And there is no guarantee that the ACP program won’t die in a few years when the funding runs dry.

I have no easy answer for this issue, but I hope this blog might plant a seed with somebody who can figure this out. I know that we have to stop ignoring the fact that prices matter. I’ve said for years that the big cable companies are on a path towards $100 broadband, and they are getting closer every year. Let’s stop pretending that $90 or $100 broadband is the same product as $50 or $60 broadband – even if the speeds are the same.

Mapping and Broadband Grants

Hopefully by now, most communities with poor broadband will have heard about the gigantic federal grants on the way to provide broadband solutions. The largest is the $42.5 billion BEAD (Broadband Equity, Access, and Deployment) grant program that will be administered by states, with the funding and the rules established by the NTIA.

There is one provision in the enabling legislation that established these grants that makes me nervous and should concern everybody. The federal grants give priority to locations that are unserved (broadband speeds under 25/3 Mbps) and can also be used to fund underserved locations (speeds between 25/3 and 100/20 Mbps). The troubling provision is that Section 60102 of the legislation makes it clear that the determination of eligible locations will rely upon the FCC maps.

Commerce Secretary Gina Raimondo, whose agency oversees the NTIA and the grant program, acknowledges that this is a problem. In an interview with CNBC, the Secretary admitted that grants might not be awarded until sometime in 2023, after the FCC maps have been updated.

I think it’s a huge problem if we need corrected FCC maps before we can decide which parts of the country are eligible for these grants. I fully expect the first version of the new FCC maps to be a disaster. ISPs will struggle with changing from reporting simple tallies by Census block to drawing complicated polygons around customers who are served or who can be served within ten days after a request. There will be a lot of honest mistakes made in the first few iterations of the new mapping as ISPs adapt to the new reporting methodology. It might take a few rounds of reporting until ISPs get the new maps right.

But the real problem will come from the big telcos who distort the current broadband maps by over-reporting broadband speed capabilities. Several of the big telcos have notoriously been reporting speeds of 25/3 Mbps or greater to shield monopoly areas from grants. ISPs today are largely free to claim any speeds they want. The current FCC rules say that ISPs can report marketing speeds – something the ISP can determine. There are huge parts of the country where speeds of 25/3 Mbps are claimed in the FCC maps when actual speeds might be a few Mbps.

The new FCC maps will not stop this practice. The big telcos can still claim fast speeds that don’t exist. In fact, since claiming 25/3 Mbps capability can block the award of BEAD grants, the telcos are likely to report even more areas at that speed. Recall that just before the RDOF reverse auction, Frontier and CenturyLink tried to change tens of thousands of Census blocks to speeds of 25/3 Mbps, which would have kept those areas out of the RDOF auction. The FCC rejected most of these claims, but the attempt demonstrates the blatant deceptions the big ISPs are willing to attempt to keep customers and revenues.

There is currently no penalty at the FCC for overreporting speeds, and some of the big telcos have found this to be a convenient tool for maintaining monopoly service areas. The BEAD grants might be the last big grant program for a long time, so I can’t see any motivation for the big telcos to suddenly become honest and create accurate maps.

We’ve seen in the challenges to the current NTIA grants that the big telcos have no shame. The telcos have been challenging grant eligibility in huge numbers of Census blocks where they know that speeds are poor – these grant challenges are all about keeping out competition.

If there is any glimmer of hope, it’s that the BEAD grant funds will funnel through the states. Many of the states that already have grant programs have grown tired of the games played by telcos and don’t pay them much heed. Some states have created their own broadband maps that they believe to be more accurate than the current FCC maps. We’ll have to see how much leeway the NTIA will allow for states to use the better mapping data. Unfortunately, the grant language in the BEAD legislation is fairly clear that the mapping that matters for these grants is the FCC mapping. The grant legislation says that broadband data maps are the maps created under section 802(c)(1) of the Communications Act of 1934 (47 U.S.C. 642(c)(1)).

It’s already a shame that the mapping issue is going to delay the BEAD grants from being awarded this year when millions of homes are waiting for better broadband. This mapping issue will easily add six months to a year before grants are awarded – and that means a longer time until a broadband solution is deployed. The worst travesty will come if parts of the country continue to be denied grant funding due to dishonest maps. I think we’re only going to have one chance to get this right – and I’m not optimistic about this first basic step of defining who is eligible for the grants. I hope somebody proves me wrong.

Top Broadband Stories of 2021

Every year I write a blog talking about the trends that I think we’re likely to see in the coming year. But 2021 was such an unusual year for all of us that I thought it would also be useful to talk about what we accomplished in the industry over the last year while fending off a pandemic. All in all, it was quite a year.

Broadband Funding. This was the year when the federal government finally reacted to the poor broadband in many parts of the country and pulled the trigger on huge broadband grants. Even before the year started, we saw funding through the CARES Act in 2020. The American Rescue Plan Act sent money to every community that can be used for broadband. That legislation also funded grant programs at the RUS and the NTIA. The big grant announcement was the $42.5 billion in broadband infrastructure funding from the Infrastructure Investment and Jobs Act. States across the country have chipped in with state broadband grants funded by legislatures. 2021 was the year when the funding spigots were opened wide.

Technology Getting Faster. 2021 was the year when XGS-PON became price competitive with GPON, and we’re starting to routinely talk about new FTTP networks as 10-gigabit capable. Fixed wireless technology has been improving, but the jury is still out on 2021 claims of delivering rural gigabit wireless. Several cable companies did field trials of early DOCSIS 4.0 technology last year – the technology that will bring gigabit upload speeds to coaxial networks. Starlink showed us last year that satellite broadband doesn’t have to suck.

Supply Chain Becomes an Issue. I don’t recall ever hearing the term supply chain in the broadband industry before 2021 – and now it’s on everybody’s lips. Supply chain issues became real in 2021. ISPs ran into sudden long waits for basic electronics like switches and servers. During the year, the delivery times for fiber grew longer. And as always happens in times of shortage, by the end of the year it became obvious that the biggest ISPs are still getting what they need while new ISPs at the bottom of the supply chain are told to wait a year to buy fiber.

Big ISPs Interested in Rural America. I find it dismaying and somewhat ironic to see the big telcos and even a few of the big cable companies taking a sudden serious interest in serving rural America. The telcos started to ignore rural copper networks as far back as the 1980s, and their collective neglect led directly to the current dreadful state of rural broadband that we are now attempting to fix. The new interest in rural America is clearly due to the gigantic grant programs. Since the big grants are going to be funded through states, I guess we’ll find out if anybody wants to trust these companies yet another time.

Expansion of WiFi. When we look back twenty years from now, the expansion of WiFi spectrum might be seen as the most important development of 2021. The FCC originally voted to add 6 GHz spectrum to WiFi in April 2020, but the order was appealed by cellular carriers that wanted the spectrum for 5G. At the very end of 2021, the courts sided with the FCC, finally allowing the use of 6 GHz WiFi to move ahead. WiFi is the wireless solution the world needs. You buy a box and can transmit wireless broadband around the home or office – the alternative is to buy subscriptions from cellular carriers. WiFi 6 and the 6 GHz spectrum are going to take the technology to a new level.

Private Equity Finds Broadband. 2021 saw private equity money pouring into the broadband market. The big flashy announcement was Apollo Global Management buying the copper assets of CenturyLink in twenty states. But more quietly, private equity money is being used to buy smaller ISPs and to launch new fiber networks. It’s an interesting phenomenon when you consider that none of the fundamental aspects of the market have changed. Broadband networks are never likely to earn more than infrastructure returns, and the sudden interest in investing in slow, low-return businesses is baffling.

Regulation Went Nowhere. There was big anticipation at the end of 2020 that a change of administration meant a new regulatory direction for the broadband industry. But inexplicably, almost all of 2021 went by without the White House naming a fifth FCC Commissioner or a new head of NTIA. 2021 was a year of regulatory status quo where the FCC concentrated on issues that have bipartisan support like awarding new spectrum and trying to fix robocalling.

ARPA is Not Just for Rural Broadband

FCC Commissioner Brendan Carr released an extraordinary statement the other day that is worth reading. Carr takes exception to the final rules from the Treasury Department concerning how communities can use the $350 billion in funding from the American Rescue Plan Act (ARPA). Carr is asking states to somehow intervene in the way that cities, counties, and towns elect to use these funds.

As a reminder, the $350 billion he is talking about is funding that is being given directly to states, cities, counties, and townships. The money is not just for broadband and is intended to help local governments combat issues related to the pandemic.

Broadband is listed as an acceptable use of these funds since most communities had broadband-related problems during the pandemic as many millions were sent to work and school from home. But the money can also be used for many other purposes such as supporting the public health response to the pandemic, addressing negative economic impacts, replacing lost local government tax revenues that came as a result of the pandemic, covering premium pay for essential workers, and making investments in water and sewer infrastructure. The large majority of this funding is going to go to needs other than broadband.

Commissioner Carr starts with the statement that “the Administration’s rules green-light spending to overbuild existing, high-speed networks in communities that already have fast Internet service, rather than directing those dollars to the rural and other communities that lack access to any broadband service today.”

I take exception to this sentence for several reasons. First, I think the final Treasury rules are following the intent of Congress that wrote the enabling legislation. Congress included broadband as a possible use for the funds. If Congress had intended this funding to be used only for rural broadband, the legislation would have said so. But broadband is listed as an acceptable use for every community, including cities. I’m not sure how Commissioner Carr thinks that ARPA money given to Detroit, Baltimore, or New York City could be used to support rural broadband.

A lot of the funding is going to rural communities, and I know many communities are aiming this funding at areas with poor broadband. But I think cities contemplating using this funding also believe they are helping to solve the digital divide. In every city, there are places where cable companies never built broadband, and there are many millions more homes that can’t afford broadband. Most of the urban initiatives I’ve seen for using ARPA funding are aimed at building infrastructure to serve public housing or at bringing broadband to students who don’t have home broadband. Commissioner Carr says those kinds of projects deviate from the intent of ARPA, and I have to disagree.

Commissioner Carr also doesn’t think this money should be used for overbuilding. I always get my hackles up when I hear that word, because the big ISPs have been using the word overbuilding as a pejorative for many years. Looking back to the days when there were federal grants that were earmarked to bring better broadband to areas with broadband speeds under 10/1 Mbps, the big ISPs fretted that the money would be used to overbuild existing rural ISPs. The big ISPs don’t think any federal funding should be used to ever overbuild any existing ISP – the big ISPs are in favor of maintaining monopolies. Whenever I see the word overbuild coming from a big ISP I just substitute the correct word – competition. When Congress added broadband as an acceptable use for the ARPA funding, it obviously intended that the money could be used to compete (overbuild) against ISPs that weren’t delivering the broadband households needed during the pandemic.

I must admit that I got a good laugh out of Commissioner Carr’s warning that “the Treasury rules allow these billions of dollars to be spent based on bad data.” The final Treasury rules do the exact opposite by allowing communities to ignore the FCC’s notoriously bad broadband data when determining where to spend the money.

I opened the blog by calling this an extraordinary statement because I’m not sure why he wrote it. Commissioner Carr’s plea to the states doesn’t mean much since local communities are free to use the ARPA funds without any approval from the states. It’s just a guess, but perhaps Commissioner Carr is upset that the FCC has no role in this spending. This funding was created by Congress and given to the Treasury Department and to communities directly in what looks like a deliberate snub of the FCC. The FCC got snubbed again more recently when Congress decided to send the $42.5 billion in BEAD grants to the states to spend.

When Will We See Real 5G?

The wireless industry’s non-stop claims that we’ve moved from 4G to 5G finally slowed to the point that I stopped paying attention to them during the last year. There is an interesting article in PC Magazine that explains why 5G has dropped off the front burner.

The article cites interviews with Ari Pouttu of Finland’s University of Oulu about the current state and future of 5G. That university has been at the forefront of the development of 5G technology and is already looking at 6G technology.

Pouttu reminds us that there is a new “G” generation of wireless technology about every ten years, but that it takes twenty years for the market to fully embrace all of the benefits of a new generation of wireless technology.

We are just now entering the heyday of 4G. The term 4G has been bandied about by wireless marketing folks for so long that it’s hard to believe we didn’t see a fully functional 4G cell site until late in 2018. Since then, the cellular companies have beefed up 4G in two ways. First, the technology is now spread through cell sites everywhere. But more importantly, 4G systems have been bolstered by the addition of new bands of cellular spectrum. The marketing folks have gleefully been labeling this new spectrum as 5G, but the new spectrum is doing nothing more than supporting the 4G network.

I venture to guess that almost nobody thinks their life has been drastically improved because 4G cellphone speeds have climbed in cities over the last few years from 30 Mbps to over 100 Mbps. I can see that faster speed on my cellphone if I take a speed test, but I haven’t really noticed much difference between the performance of my phone today compared to four years ago.

There are two major benefits from the beefed-up 4G. The first benefits everybody but has gone unnoticed. The traditional spectrum bands used for 4G were getting badly overloaded, particularly in metropolitan areas. The new bands of spectrum have relieved the pressure on cell sites and are supporting the continued growth in cellular data use. Without the new spectrum, our 4G experience would be deteriorating.

The new spectrum has also enabled the cellular carriers to all launch rural fixed cellular broadband products. Before the new spectrum, there was not enough bandwidth on rural cell sites to support both cellphones and fixed cellular customers. The many rural homes that can finally buy cellular broadband that is faster than rural DSL are the biggest winners.

But those improvements have nothing to do with 5G. The article points out what has always been the case. The promise of 5G has never been about better cellphone performance. It’s always been about applications like using wireless spectrum in complex settings like factories where feedback from huge numbers of sensors needs to be coordinated in real-time.

The cellular industry marketing machine did a real number on all of us – but perhaps most of all on the politicians. We’ve had the White House, Congress, and state politicians all talking about how the U.S. needed to win the 5G war with China – and there is still some of that talk going around today. This hype was pure rubbish. What the cellular carriers needed was more spectrum from the FCC to stave off the collapse of the cellular networks. But no cellular company wanted to crawl to Congress begging for more spectrum, because doing so would have meant the collapse of cellular company stock prices. Instead, we were fed a steady diet of false rhetoric about how 5G was going to transform the world.

The message from the University of Oulu is that most 5G features are probably still five or six years away. But even when they finally get here, 5G is not going to bring much benefit or change to our daily cellphone usage. It was never intended to do that. We already have 100 Mbps cellular data speeds with no idea how to use the extra speed on our cellphones.

Perhaps all we’ve learned from this experience is that the big cellular companies have a huge amount of political and social clout and were able to pull the wool over everybody’s eyes. They told us that the sky was falling and could only be fixed with 5G. I guess we’ll find out in a few years if we learned any lesson from this, because we can’t be far off from hearing the hype about 6G. This time it will be 100% hype, because 6G deals with extremely high frequencies that will never be used in outdoor cellular networks. But I have a feeling that we’ll find ourselves in a 6G war with China before we know it.

Challenges to Broadband Grants

One of the most annoying aspects of the current federal broadband grants is the ability of incumbent ISPs to challenge the validity of grant requests. In the typical challenge, the incumbents claim that they are offering fast broadband and that an applicant should not be able to overbuild them.

This is another issue that can be laid squarely at the feet of the lousy FCC broadband maps. ISPs are largely free to claim any broadband speeds they want, and grant challenges give them a lever in grant programs like this one. The challenges put a burden on anybody filing for a grant, since the applicant must somehow prove that incumbent broadband speeds are slower than 25/3 Mbps.

This is not new behavior by the incumbents. You might recall that before the RDOF auction in 2020, Frontier and CenturyLink together tried to claim they were delivering speeds of at least 25/3 Mbps in tens of thousands of additional Census blocks. The goal was to eliminate these locations from the RDOF auction so that the telcos could preserve their broadband monopoly. The FCC largely rejected the last-minute changes by the telcos. But there were already huge areas where telco speed capabilities were overstated before RDOF, with the result that a huge number of Census blocks were incorrectly kept out of the RDOF auction.

A recent article by Karl Bode for Community Networks highlights some specific examples of challenges bogging down the current round of NTIA broadband grants. He cites the example of a grant application made in Grafton County, New Hampshire, where the incumbents challenged the speeds for 3,000 Census blocks in a grant covering 4,000 blocks.

Grafton County had collected speed tests that showed that existing broadband speeds are mostly far below the 25/3 Mbps threshold. But this still puts the burden on the grant applicant to somehow document broadband speeds for each of the many Census blocks. The incumbents are using the challenges to weaponize the lousy data included in the FCC’s broadband maps.

This is often a ludicrous situation. Applicants like Grafton County are seeking to build fiber broadband because they have already heard repeatedly from residents about the poor broadband in the area.

There are easy and obvious fixes to this. One simple fix would be that grants that ask to build fiber over existing DSL should be free from challenges. There is no place in rural America where DSL is delivering adequate broadband.

Another easy fix would be to stop talking about 25/3 Mbps as a meaningful definition of broadband. If these grants only allowed challenges against claims of 100/20 Mbps, then all of the challenges from telcos would be neutered. But there would still be battles like the one in Grafton County, where the cable companies are delivering slow speeds and challenging the grants. Setting the definition of broadband at a faster speed, even if only for the purposes of these grants, would eliminate much of the wasted energy in handing out grant funding. The folks taking the brunt of these challenges are the staff in the various broadband grant offices. The shame of the challenge process is that there probably are some legitimate challenges being made, but they get lost in the huge volume of harassment challenges.

Unfortunately, these challenges are in place for a reason that surprises nobody. When the legislation enabling grants comes through Congress, the incumbents get to sneak innocuous-sounding language into the grant rules that is then manifested as chaos during the grant process. Unfortunately, the upcoming BEAD grant rules include a challenge process, so we’re going to see this repeated. If there were a huge number of challenges in the $288 million NTIA grant program, it’s hard to imagine what we’re going to see with the $42.5 billion BEAD grant program that’s handing out roughly 150 times more funding.

Is Space Getting Too Busy?

Satellite broadband made the news again recently when the Chinese government said it had to adjust the orbit of its space station to avoid collisions with Starlink satellites. China claims it had to make the adjustments in July and October of last year.

The Chinese are not the only ones making this claim. In 2020, the CEO of Rocket Lab said that it is becoming increasingly difficult to plot a clear trajectory when launching a rocket. The head of the European Space Agency recently accused Starlink of “making the rules” for everybody else in the way the company is launching satellites. Elon Musk’s recent reaction to these criticisms is that space is huge and can accommodate tens of billions of satellites.

What seems to be in play here is that there are no international regulations in place to define parameters for space launches. The last international treaty on space is over fifty years old and never envisioned the huge number of satellites we’re already starting to see. Starlink alone already has over 1,700 satellites and plans to launch new satellites twice per month throughout 2022. One earlier Starlink business plan called for over 30,000 satellites.

There have already been a few notable collisions between satellites. The most recent was when the Chinese Yunhai-1 satellite was apparently destroyed in March 2021 by debris from a Russian satellite. There is a huge amount of space debris. There are over a million pieces of debris between 1 and 10 centimeters (roughly 0.4 to 4 inches) in size. The U.S. Space Surveillance Network was actively tracking 15,000 objects larger than 4 centimeters as of November 2021.

Debris matters because orbiting objects are moving fast – at 150 miles above the earth, a satellite needs to be going 17,500 miles per hour to maintain orbit. A collision with even a small object can be devastating.
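For anyone curious where the quoted speed comes from, it falls out of the standard circular-orbit formula v = sqrt(GM/r). Here is a quick sketch that checks the number using the usual published constants:

```python
import math

# Circular orbital velocity: v = sqrt(GM / r)
GM_EARTH = 3.986e14          # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6.371e6     # mean Earth radius, meters
MILES_TO_METERS = 1609.34

altitude_m = 150 * MILES_TO_METERS       # 150 miles in meters
r = EARTH_RADIUS_M + altitude_m          # orbital radius from Earth's center
v_ms = math.sqrt(GM_EARTH / r)           # orbital velocity in m/s
v_mph = v_ms * 3600 / MILES_TO_METERS

print(round(v_mph))   # ~17,400 mph, matching the ballpark figure above
```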

Scientists have been warning about space debris for a long time. In 1978, NASA scientist Donald Kessler warned that collisions in space could result in a cloud of debris that would create an effective barrier to launching rockets or sending people into space.

This is no longer a theoretical problem, since much of what we do on earth now relies on satellites. Most cable TV signals are distributed from space. GPS relies on a series of satellites. Ships and airplanes navigate with support from satellites. Satellites are used to track weather patterns. There are now satellites tracking and monitoring everything from the movement of foreign armies to the water temperature of the oceans. There will soon be millions of broadband customers using low-orbit satellites.

It’s hard for any layman to understand the real risks. Some of the controversy likely stems from international wrangling between nations. But there are also a lot of notable scientists who worry that we might make space unusable.

It will be ironic if the world solves rural broadband with satellites only to find one day that there is too much debris to launch more satellites. It seems like a remote possibility, but some scientists say it’s possible. It makes sense for the international community to come together and work out rules that everybody can agree to.

Pushing Back Against Municipal Broadband

As a cautionary tale to any city that provides broadband, the incumbent ISPs are always going to push back on city initiatives. The following is a story from the summer that slipped off my radar. The city of Tucson, Arizona, launched a free wireless network to bring broadband to students in homes without broadband. As would be expected, the incumbent cable company, Cox Communications, fought against the city-provided broadband.

The city recognized the need for the network when it got requests for over 7,000 wireless access points from students during the pandemic. The city decided that the best long-term solution to the large numbers of unserved students was to create a private network using CBRS spectrum. We tend to think of municipal wireless networks as slow, but the city’s network rivals the broadband speeds offered by other cellular carriers in the city.

The city is using 4G LTE technology, which provides the same indoor coverage that cell phones receive. The city identified the 20 square miles with the greatest number of students without home broadband. The initial network consists of 40 small cell sites, and there are plans to add more. Broadband is received in the home through a typical cellular receiver with a SIM card that identifies the network. Broadband speeds are more than adequate to support a single student, with download speeds over 50 Mbps and upload speeds over 3 Mbps. The network avoids the problem of multiple students in a household sharing a connection because it provides a receiver for each student.
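As a back-of-the-envelope check on the deployment density described above, here is a quick sketch. The assumption of roughly uniform, circular coverage is mine; real cell placement would follow student density and terrain.

```python
import math

# Rough coverage math for the initial Tucson deployment described above,
# assuming (my assumption) roughly uniform coverage of the 20 square miles.
coverage_sq_miles = 20
cell_sites = 40

area_per_site = coverage_sq_miles / cell_sites        # 0.5 sq mi per site
radius_miles = math.sqrt(area_per_site / math.pi)     # ~0.4 mi if cells were circular

print(f"{area_per_site:.1f} sq mi per site, ~{radius_miles:.2f} mi cell radius")
```

Cell radii that small would be consistent with a network designed for indoor reception on mid-band CBRS spectrum.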

The network has some interesting features. It supports basic network slicing which gives the school board the ability to prioritize school broadband traffic over other uses by students. The city is now looking at how to use this network for smart city purposes since the network provides broadband everywhere. The city is considering using the technology for monitoring the water system (critical infrastructure in arid Tucson), for providing ubiquitous broadband in parks, for connecting to all firefighters and other first responders, and for controlling traffic lights.
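The article doesn’t describe how the network slicing is actually configured, but conceptually it amounts to always serving one traffic class ahead of another. A toy sketch of that idea follows; the traffic classes and priorities here are purely hypothetical.

```python
import heapq

# Toy priority scheduler illustrating the slicing idea: school traffic is
# always dequeued ahead of general traffic. Classes/weights are hypothetical.
PRIORITY = {"school": 0, "general": 1}   # lower number = dequeued first

queue = []
counter = 0  # tie-breaker so equal-priority packets stay FIFO

def enqueue(traffic_class: str, packet: str):
    global counter
    heapq.heappush(queue, (PRIORITY[traffic_class], counter, packet))
    counter += 1

enqueue("general", "video stream chunk")
enqueue("school", "homework portal request")
enqueue("general", "game update")

while queue:
    _, _, packet = heapq.heappop(queue)
    print(packet)   # school traffic drains first
```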

As might be expected, Cox Communications, the incumbent cable company, pushed back against the city network. When the wireless network was first discussed publicly, Cox made a proposal to provide 10 Mbps broadband to students in some selected parts of the city. When told that the wireless network would be delivering speeds of at least 50 Mbps, Cox countered that it would also be able to match the higher speed. But the first Cox offer is typical of most cable company low-income broadband programs – the speeds offered are far slower than what is delivered to a basic broadband customer.

Cox also sent a letter to the Tucson city council that warned about the problems that would be caused by broadband competition from the city. The letter included the same refrains we have seen elsewhere. The city shouldn’t be competing against the private sector. Cox warned that the city would have a hard time maintaining its new network. Cox also offered to partner with the city to build broadband in parts of the city not reached by Cox (with the city paying for the expansion).

I’m not sure that we should expect incumbents to act differently. As the cable company, Cox has a virtual monopoly on broadband since it largely competes only against DSL – and monopolies always fight to maintain monopoly power. Cable companies fight against all competition. They try every trick in the book to delay new commercial ISPs from building networks. But cable companies roll out a full-court press against city initiatives because they hope there is a political pressure point that will cause the city to reconsider. They know it’s a smart tactic because many cities have canceled broadband plans after heavy lobbying by the incumbents.

In this case, the city didn’t back down and has launched the first phase of the wireless network. This became much easier for the city to finance after it received ARPA money from Congress that can be used to pay for broadband infrastructure. I am positive that the city will derive huge benefits from this network far past the day when the pandemic is behind us.