Good Enough Broadband

I’ve lately been asked by several local politicians why they should pursue getting grant funding for their county since Starlink satellite and FWA cellular broadband seem like good broadband alternatives that are already here today. It’s a reasonable question to ask since they have likely heard from rural households that are happy with both technologies. The question usually includes some degree of wishful thinking because the officials want to be able to tell constituents that good broadband is already available and that the broadband gap has been solved.

I hate to tell them that these technologies are not a good permanent solution. At the same time, I stress that they should promote these technologies so that folks know there are better alternatives available today than the extremely slow broadband options many rural households are stuck with. But neither technology is a long-term broadband solution.

FWA cellular broadband is home broadband that is delivered by cellular companies from cellular towers. It uses the same technology as the broadband delivered to cellphones, with the only real difference being that there is an in-home receiver that can be used for home broadband.

The primary problem with thinking of FWA cellular as a permanent solution is the reach of the technology. Somebody living right under a tower might be able to get 200 Mbps broadband today, and for somebody who has been suffering with rural DSL or cellular hotspots, this is an amazing upgrade. But the strong cellular service doesn’t carry far from a given tower. Speeds drop rapidly with the distance between a customer and the cell tower. A customer living a mile away from a tower might see maximum speeds of 100 Mbps, but after that, speeds drop precipitously until the product looks like other current slow broadband technologies.
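
To illustrate how quickly FWA speeds can fall off, here’s a minimal sketch using a purely hypothetical throughput curve. The shape and the numbers are assumptions chosen to mirror the rough figures above (about 200 Mbps under the tower and about 100 Mbps at a mile), not measurements of any carrier’s network.

```python
# Toy model: illustrative FWA download speed versus distance from a tower.
# The halving-per-mile curve and the 200 Mbps peak are assumptions for
# illustration only, not measurements of a real FWA network.
def fwa_speed_mbps(distance_miles: float, peak_mbps: float = 200.0) -> float:
    """Hypothetical speed that halves with each additional mile from the tower."""
    return peak_mbps / (2 ** distance_miles)

for miles in (0, 1, 2, 3, 5):
    print(f"{miles} mile(s) from the tower: ~{fwa_speed_mbps(miles):.0f} Mbps")
```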

The distance issue wouldn’t be a big problem if rural counties were peppered with cell towers – but most rural counties don’t have nearly enough towers to support this technology. In fact, in most rural counties I’ve worked in, a lot of the county doesn’t have good enough cellular coverage for voice calls. There doesn’t seem to be any mad rush to build new towers to support FWA – and I wouldn’t expect a cellular carrier to want to be on a tower that might only see a few dozen potential customers.

A final issue with FWA is that cellular carriers give priority to cell phones over home broadband. If cellphone traffic gets heavy, then the carriers will throttle the FWA speeds. This is probably less of an issue in a rural area than in a city, but it means that the broadband is not fully reliable.

Satellite broadband is also not a great long-term solution for several reasons. Starlink has already said that it will only serve some fixed number of customers in a given geographic area – a number it won’t disclose. That makes sense to any network engineer because the bandwidth from a single satellite overhead is shared by all homes using the service. This means that if too many households try to use a satellite at the same time, broadband speeds will bog down. Starlink is never going to be willing to serve all of the rural customers in a county – when it reaches its target number of customers, it won’t sell more connections.
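
To see why an oversubscribed satellite bogs down, consider a back-of-the-envelope sketch like the one below. The per-satellite capacity and the user counts are hypothetical round numbers used only for illustration – Starlink hasn’t disclosed its actual per-area limits.

```python
# Back-of-the-envelope: per-household throughput when one satellite's capacity
# is shared by everyone using it at the same time. The capacity figure and the
# subscriber counts are hypothetical illustrations, not disclosed Starlink numbers.
beam_capacity_mbps = 20_000   # assume roughly 20 Gbps of usable capacity overhead

for active_households in (50, 200, 500, 1_000):
    per_home = beam_capacity_mbps / active_households
    print(f"{active_households:>5} simultaneous users: ~{per_home:,.0f} Mbps each")
```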

The other issue with satellite broadband is that customers need a clear view of the sky. Homes located amidst trees or near hills or mountains may not be able to get the service at all, or may get a slower connection.

The final issue with both technologies is the speed being delivered. FWA today typically delivers only 50-100 Mbps to households that are within range of a tower. The speed tests for Starlink show a similar range of 50-150 Mbps. These are amazing speeds for a home with no broadband alternatives. But these speeds are already at the low end of acceptable broadband today – particularly since these technologies have much higher latency than fiber.

In twenty years, we’ve grown from DSL and cable modems that delivered 1 Mbps to fiber technology that can deliver multiple gigabit speeds today. There are those who claim that the fastest speeds are just marketing gimmicks, but I’m hearing from more households over time that need the faster speeds. The reality of the marketplace is that new applications spring up to take advantage of faster broadband. We’re already seeing 8K TVs today, and telepresence should be here in the near future. A rural customer receiving 50-100 Mbps will be locked out of future faster applications.

Any county that decides not to pursue the grants to get faster broadband will regret the decision in a decade when neighboring counties have blazingly fast broadband and are the places where folks will want to live. We’ve learned that fast home broadband now equates to economic development due to the work-at-home phenomenon. I worked with a county recently where 30% of the homes include at least one person working full time from home. That means higher incomes which translates into local prosperity.

I really like both of these technologies, and I recommend them to rural folks all of the time. But these are not the broadband solution that a county needs for long-term prosperity.

Let’s Stop Talking About Technology Neutral

A few weeks ago, I wrote a blog about the misuse of the term overbuilding. Big ISPs use the term to give politicians a phrase they can use to shield the big companies from competition. The argument is always that federal funds shouldn’t be used to overbuild where an ISP is already providing fast broadband. What the big ISPs really mean is that they don’t want competition anywhere, even where they still offer outdated technologies or where they have neglected networks.

Today I want to take on the phrase ‘technology neutral’. This phrase is being used to justify building technologies that are clearly not as good as fiber. The argument has come up a lot in recent years to say that grants should be technology neutral so as not to favor only fiber. The phrase was used to justify allowing Starlink into the RDOF reverse auction, it has been used to justify allowing fixed wireless technology to win grants, and lately, it’s being used more specifically to argue for letting fixed wireless using unlicensed spectrum into the BEAD grants.

The argument is that technologies like satellite or fixed wireless using unlicensed spectrum should be eligible for grants since those technologies are ‘good enough’ when compared to the requirements in the grant rules.

I have two arguments to counter that justification. The first is that the only reason the technology neutral argument can be raised is that politicians set the speed requirements for grants at ridiculously low levels. Consider all of the current grants that set the speed requirement for technology at 100/20 Mbps. The 100 Mbps speed requirement is an example of what I’ve recently called underbuilding – it allows for building a technology that is already too slow today. At least 80% of folks in the country today can buy broadband from a cable company or fiber company. Almost all of the cable companies offer download speeds as fast as a gigabit. Even in older cable systems, the maximum speeds are faster than 100 Mbps. Setting a grant speed requirement of only 100 Mbps download is saying to rural folks that they don’t deserve broadband as good as what is available to the large majority of people in the country.

The upload speed requirement of 20 Mbps was a total political sellout. This was set to appease the cable companies, many of which struggle to beat that speed. Interestingly, the big cable companies all recognize that their biggest market weakness is slow upload speeds, and most of them are working on plans to implement a mid-split upgrade or else some early version of DOCSIS 4.0 to significantly improve upload speeds. Within just a few years, the 20 Mbps upload speed limit is going to feel like ancient history.

The BEAD requirement of only needing to provide 20 Mbps upload is ironic for two reasons. First, in cities, the cable companies will have much faster upload speeds implemented by the time that anybody builds a BEAD network. Second, the cable companies that are pursuing grants are almost universally using fiber to satisfy those grants. Cable companies are rarely building coaxial copper plant for new construction. This means the 20 Mbps speed was set to protect cable companies against overbuilding – not set as a technology neutral speed that is forward looking.

The second argument against the technology neutral argument is that some technologies are clearly not good enough to justify receiving grant dollars. Consider Starlink satellite broadband. It’s a godsend to folks who have no alternatives, and many people rave about how it has solved their broadband problems. But the overall speeds are far slower than what was promised before the technology was launched. I’ve seen a huge number of speed tests for Starlink that don’t come close to the 100/20 Mbps speed required by the BEAD grants.

The same can be said for FWA wireless using cellular spectrum. It’s pretty decent broadband for folks who live within a mile or two of a tower, and I’ve talked to customers who are seeing speeds significantly in excess of 100/20 Mbps. But customers just a mile further away from a tower tell a different story, with download speeds far under 100 Mbps. A technology that has such a small coverage area does not meet the technology neutral test unless a cellular company promises to pepper an area with new cell towers.

Finally – and this is a comment that always gets pushback from WISPs – fixed wireless technology using unlicensed spectrum has plainly not been adequate in most places. Interference from the many users of unlicensed spectrum means the broadband speeds vary depending on whatever is happening with the spectrum at a given moment. Interference also means higher latency and much higher packet loss than landline technologies.

I’ve argued until I am blue in the face that grant speed requirements should be set for the speeds we expect a decade from now and not for the bare minimum that makes sense today. It’s ludicrous to award grant funding to a technology that barely meets the 100/20 Mbps grant requirement when that network probably won’t be built until 2025. The real test for the right technology for grant funding is what the average urban customer will be able to buy in 2032. It’s hard to think that speed won’t be something like 2 Gbps/200 Mbps. If that’s what will be available to a large majority of households in a decade, it ought to be the technology neutral definition of speed to qualify for grants.
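
For a sense of the arithmetic behind that kind of forward-looking target, here’s a quick compounding sketch. The 35% annual growth rate is purely an illustrative assumption – broadly in line with the climb from 1 Mbps to gigabit speeds over the last twenty years – not a number from any grant rule.

```python
# Quick compounding check: what a 100 Mbps baseline becomes over time under an
# assumed annual growth rate. The 35% figure is an illustrative assumption only.
baseline_mbps = 100
annual_growth = 0.35

for years in (5, 10):
    projected = baseline_mbps * (1 + annual_growth) ** years
    print(f"In {years} years: ~{projected:,.0f} Mbps")
```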

Packet Loss and Broadband Performance

In a recent article in FierceWireless, Joe Madden looked at the various wireless technologies he has used at his home in rural central California. Over time, he subscribed to a fixed wireless network using WiFi spectrum, cellular LTE broadband, Starlink, and a fixed wireless provider using CBRS spectrum. A lot of rural folks can describe a similar path where they have tried all of the broadband technologies available to them.

Since Joe is a wireless expert who works at Mobile Experts, he was able to analyze his broadband performance in ways that are not easily understood by the average subscriber. Joe came to an interesting conclusion – the difference in performance between various broadband technologies has less to do with speed than with the consistency of the broadband signal.

The average speed tests on the various products varied from 10/2 Mbps on fixed wireless using WiFi, to 117/13 Mbps on Starlink. But what Joe found was that there was a huge difference in consistency as measured by packet loss. Fixed wireless on WiFi had packet loss of 8.5%, while the packet loss on fixed wireless using CBRS spectrum dropped to 0.1%. The difference is stark and is due to the interference that affects using unlicensed spectrum compared to a cleaner signal on licensed spectrum.

But just measuring packet loss is not enough to describe the difference in the performance of the various broadband connections. Joe also looked at the number of lost packets that took longer than 250 milliseconds to be delivered. That will require some explanation. Packet loss in general describes the percentage of data packets that are not delivered on time. In an Internet transmission, some packets are always lost somewhere in the routing to customers – although most packets are lost due to the local technology at the user end.

When a packet doesn’t show up as expected, the Internet routing protocols ask for that packet to be sent again. If the second packet gets to the user quickly enough, it’s the same, from a user perspective, as if that packet was delivered on time. Joe says that re-sent packets that don’t arrive until after 250 milliseconds are worthless because by then, the application has already moved on without them. The easiest way to visualize this is to look at the performance of Zoom calls for folks using rural technologies. Packets that don’t make it on time result in a gap in the video signal that manifests as fuzziness and unclear resolution on the video picture.
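
As a rough illustration of this kind of measurement, the sketch below computes the share of packets that never arrive and the share that arrive too late from a small, made-up log of send and receive times. The data and the simple 250 ms cutoff are my own simplification, not the methodology from Joe’s article.

```python
# Sketch: from a hypothetical log of (sent_ms, received_ms) pairs, compute the
# share of packets that never arrived and the share that arrived after 250 ms.
# The log entries are invented purely for illustration.
LATE_THRESHOLD_MS = 250

log = [
    (0, 20), (10, 32), (20, None), (30, 310),   # one packet lost, one very late
    (40, 60), (50, 71), (60, 95), (70, None),
]

never_arrived = [s for s, r in log if r is None]
arrived_late = [(s, r) for s, r in log if r is not None and (r - s) > LATE_THRESHOLD_MS]

print(f"Packets never delivered: {100 * len(never_arrived) / len(log):.1f}%")
print(f"Packets arriving after {LATE_THRESHOLD_MS} ms: {100 * len(arrived_late) / len(log):.1f}%")
```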

Packet loss is the primary culprit for poor Zoom calls. Not receiving all of the video packets on time is why somebody on a Zoom call looks fuzzy or pixelated. If the packet loss is high enough, the user is booted from the Zoom call.

The difference in the percentage of packets that are delivered late between the different technologies is eye-opening. In the fixed wireless using WiFi spectrum an astounding 65% of re-sent packets took longer than 250 ms. Cellular LTE broadband was almost as bad at 57%. Starlink was better at 14%, while fixed wireless using CBRS was lowest at 5%.

Joe is careful to point out that these figures only represent his home and not the technologies as deployed everywhere. But with that said, there are easily explainable technology reasons for the different levels of packet delay. General interference plays havoc with broadband networks using unlicensed spectrum. Starlink has delay just from the extra time it takes signals to travel between the ground and the satellites in both directions. The low packet loss on a CBRS network might be due to having very few other neighbors using the new service.

Joe’s comparison doesn’t include other major broadband technologies. I’ve seen some cable networks with high packet loss due to years of accumulated repairs and unresolved issues in the network. The winner of the packet loss comparison is fiber, which typically has an incredibly low packet loss and also a quick recovery rate for lost packets.

The bottom line from the article is that speed isn’t everything. It’s just one of the characteristics that define a good broadband connection, but we’ve unfortunately locked onto speed as the only important characteristic.

Space Weather and Broadband

There was an interesting phenomenon in February 2022, when Starlink launched 49 new satellites. The satellites were successfully deployed by the rocket, but as they were being maneuvered to their final orbital slots, a geomagnetic storm caused 38 of them to fall back to earth.

Space storms happen when radiation affects the magnetosphere that surrounds the earth. This is a band of particles that are held close to the planet by the earth’s magnetic field. A geomagnetic storm occurs when there is an exchange of energy from outer space to the orbiting particles. The biggest storms are caused by mass ejections of particles and energy that occur during large solar flares. These solar flares release radiation and highly charged particles into space, which, during a storm, interact with the magnetosphere.

It is the charged particles from the storms that manifest as the Aurora Borealis, or northern lights. The extra energy from the storms can also play havoc with GPS and other space-based communications. The earth’s atmosphere keeps most of the radiation from solar flares away from the planet, but strong storms can wreak havoc with radio communications and can even induce currents in long-haul electric lines that can disrupt the power grid.

During a geomagnetic storm, energy is pushed from the particles in the magnetosphere to the upper reaches of the ionosphere. This can temporarily heat the upper atmosphere and increase its density, which is what happened to the satellites. They met unexpected atmospheric drag that the tiny thrusters on the small satellites were unable to overcome.

Scientists have been looking at ways to better predict solar flares and the ensuing storms. In this case, with a warning, the satellite launch would have been delayed until the storm had passed. It’s a big challenge to predict the size and location of solar flares. The sun has an eleven-year cycle for the period of the heaviest solar flare activity, but a solar flare can erupt at any time.

Scientists around the world have been studying the sun using NASA’s Solar Dynamics Observatory. Scientists in China have had some success by tracking changes in the magnetic field of the sun, particularly in how that manifests in changes on the sun’s surface. They say that the temperature temporarily drops on the surface of the sun in the area where a flare is about to erupt. They have predicted several solar flares as much as 48 hours before an eruption, but they have a long way to go for this to be accurate. Even when we get to the point of successfully predicting solar flares, it’s an even bigger challenge to predict if the particles from the flare will hit the earth. The worst impacts come when our planet is in the direct path of the ejected particles.

Tracking space weather matters since we are becoming reliant on space technologies. We’ve all incorporated GPS and satellite weather into our daily routines. We use space monitors for scientific research, to study farm fields, and to keep an eye on the various militaries around the planet. And suddenly, we have a lot of people using satellites for broadband. It was costly to Starlink to lose most of the satellites from a launch. But the potential damage from space storms is going to increase dramatically as we use space more and more. Starlink alone keeps talking about having 30,000 broadband satellites.

It’s not hard to picture the impact of losing these technologies for a few days up to a week. How many of you still carry an atlas in your car in case GPS doesn’t work? Businesses of all types plan outdoor work based on weather predictions that use data gathered by satellites. And having multi-day broadband outages can be devastating, particularly for rural businesses or people working from home. Space technology has become everyday technology, but it’s too easy to take for granted and to assume it will always work.

Update on Satellite Broadband

It’s been a busy few weeks with announcements from the satellite broadband industry. The industry keeps moving us closer to a time when almost anybody in the world will potentially have access to broadband.

The first announcement came from OneWeb. The company successfully launched 36 new satellites with rockets supplied by NewSpace India Limited. This rocket company was formed in 2019 as a public sector undertaking sponsored by the Indian government and is the commercial arm of the Indian Space Research Organisation. This launch is a reminder that many parts of the world are now interested in the space business.

These new satellites bring the OneWeb fleet up to 462 satellites. The company says it will ultimately launch 648 satellites. OneWeb intends to open up the constellation to global coverage soon. OneWeb’s business plan is to reach the remotest places in the world. The company has also been hinting at using the satellites to bring broadband to remote cell towers and to remote outposts for governments and militaries around the world.

Project Kuiper, owned by Amazon and Jeff Bezos, is finally ready to hit the skies and plans to launch its first two prototype satellites in early 2023. The company has an ultimate goal of launching a total of 3,236 satellites. The first launch will use the new Vulcan Centaur rocket from the United Launch Alliance (ULA). Project Kuiper has already secured 38 additional launches on the Vulcan Centaur rockets, but the majority of its satellites will be deployed using the ULA Atlas V rockets. The company is rumored to have secured as many as 92 rocket launches.

One of the most interesting pieces of news comes from subscribers of Starlink. The company recently added new language to the terms of service for both residential and business customers that introduces the idea of a data cap. The new terms of service say that customers will get a monthly limit of ‘priority access’, and once that limit is reached, the customer will no longer be prioritized over traffic generated by other customers.

This is interesting from several perspectives. First, Starlink said in the early days of the business that it would never put a cap on usage. And with this announcement, it still hasn’t done that since customers will be free to continue to use broadband for the remainder of the billing cycle.

This feels eerily reminiscent of plans offered by the high-orbit satellite companies where usage slows down after customers reach a monthly usage limit.

Numerous engineers have speculated that any satellite constellation will have a finite capacity to move data, and this announcement hints that that data limit is already foreseeable for Starlink. Of course, the company can continue to launch more satellites and has plans on the drawing board to have as many as 30,000 satellites in its constellation. But for now, with a little over 2,300 satellites, this announcement says that the constellation is probably already getting over-busy at times. The ability to slow down customers is a classic way to serve more customers than a network has the capacity to support. The technique has been used for years by cellular carriers, and the supposedly unlimited cellular data plans are not really unlimited because user speeds get significantly slowed when a customer reaches the subscribed data limit.
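
A crude way to picture this kind of ‘priority access’ cap is the sketch below: traffic is never blocked, but once a customer passes a monthly allotment, their packets simply get tagged as lower priority. The 1 terabyte figure and the tagging logic are my own illustrative assumptions, not Starlink’s actual terms or implementation.

```python
# Sketch of a 'priority access' cap: usage is never blocked, but once a customer
# passes a monthly allotment their traffic is marked as deprioritized. The cap
# size and the logic are illustrative assumptions only.
PRIORITY_CAP_GB = 1_000   # hypothetical monthly priority allotment

class Subscriber:
    def __init__(self, name: str):
        self.name = name
        self.used_gb = 0.0

    def transfer(self, gb: float) -> str:
        self.used_gb += gb
        tier = "priority" if self.used_gb <= PRIORITY_CAP_GB else "deprioritized"
        return f"{self.name}: {self.used_gb:,.0f} GB used this month -> {tier}"

sub = Subscriber("rural household")
for monthly_chunk_gb in (300, 400, 400):
    print(sub.transfer(monthly_chunk_gb))
```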

Satellite providers face the same dilemma as all ISPs in that the average broadband data consumption by consumers continues to grow at a torrid pace. According to Ookla, the average monthly broadband usage in the US has grown from 215 gigabytes per month in early 2018 to 481 gigabytes in June of this year. This growth puts a strain on all networks, but it has to be more of a problem for a satellite constellation which is going to have more backhaul restrictions than a landline network fed by fiber.

Broadband Satellite Issues

One of the most interesting aspects of serving broadband from low-orbit satellites is that it brings issues related to space into the broadband discussion. Space issues were less important for high earth orbit satellites that sit more than 22,000 miles above the earth. Other than an occasional impact from sunspots, there wasn’t much of note. But there are two recent events that highlight our new focus on low-earth orbit satellites. I would never have imagined a decade ago that I would be interested in these topics in terms of the impact on broadband.

The first is a piece of legislation introduced by Senators Maria Cantwell (D-WA), John Hickenlooper (D-CO), Cynthia Lummis (R-WY), and Roger Wicker (R-MS). The legislation is called the Orbital Sustainability (ORBITS) Act. The bill is intended to begin the development of a technology called active debris removal (ADR) that would be used to remove dangerous debris from low earth orbit.

The risk of space debris has been well documented by NASA and others. There are over one hundred million pieces of debris orbiting the earth today. These range in size from dust-sized particles up to out-of-service satellites and rocket boosters. Space will be getting a lot more crowded as the industry plans to launch tens of thousands of additional satellites in the coming years.

So why is debris a problem? The issue was described by NASA scientist Donald Kessler in 1978. He postulated that as mankind put more objects into orbit, collisions would become more and more likely, and over time there would be more and more debris. This is easy to understand when you realize that every piece of debris is circling the earth at roughly 17,000 miles per hour. When objects collide, even more debris is created, and Kessler postulated that there would eventually be a cloud of debris that would destroy anything in orbit, making low earth orbit unusable.
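
Kessler’s argument is essentially a feedback loop, and a toy calculation makes the shape of it clear. Every number in the sketch below – the starting object count, the collision odds, the fragments per collision – is invented purely to show how the growth compounds; this is not a real orbital-debris model.

```python
# Toy illustration of the Kessler feedback loop: collision odds grow with the
# square of the number of objects (any two can collide), and every collision
# adds fragments that feed back into the count. All numbers are invented.
objects = 10_000                 # hypothetical count of trackable objects
collisions_per_pair = 2e-9       # hypothetical yearly collision odds per pair of objects
fragments_per_collision = 1_000  # hypothetical new fragments per collision

for year in range(1, 101):
    pairs = objects * (objects - 1) / 2
    collisions = pairs * collisions_per_pair
    objects += collisions * fragments_per_collision
    if year % 25 == 0:
        print(f"After {year} years: ~{objects:,.0f} objects in orbit")
```

The numbers are meaningless, but the shape is the point – growth starts slowly and then runs away.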

The legislation would fund research into different technologies that can be used to clean debris, with NASA tackling some of the trials. The hope is for an eventual system that scrubs space of debris as it is created to keep the valuable low-orbit space usable.

In other news, President Putin of Russia has threatened to destroy Starlink and other satellites that are helping Ukraine in the war between the two countries. Targeting satellites as part of war is an idea that has been used by Hollywood for years. The first such movie I remember is Moonraker, the James Bond movie that sent the British secret service agent into space.

In September, a Russian diplomat said at the United Nations that satellites could be legitimate military targets. He argued that civilian satellites that provide broadband might be a violation of the Outer Space Treaty that provides for only peaceful uses of satellite technology. He is obviously aiming his comments at Starlink, although in a few years, there will be multiple companies in the same category.

Russia has already been targeting Starlink with cyberwarfare hacking to try to corrupt the satellite software. It’s been reported that Russia was also looking for a way to identify the location of the satellite receivers on the ground. But it was clear from recent threats that Russia is hinting at some method of crippling or destroying satellites in orbit.

The earth has become massively reliant on satellite technology. It’s now becoming a source of broadband, but there are many other vital uses such as GPS technology, weather forecasting, studying and tracking resources like water and minerals, and numerous other uses.

The idea of attacks on satellites is scary. The attacks might range from some sort of hunter satellites that attack other satellites to something more indiscriminate, like a nuclear blast that would disable all electronics in orbit. But the investment in satellites is huge and would not easily be replaced. The bigger question raised is whether it is worth spending money on satellites that can be destroyed.

It’s likely that the threats are just rhetoric because every country depends on satellites for a lot of everyday functions. But countries have done insane things in wartime before, so it’s not off the table.

Starlink and RDOF

In August, the FCC denied the SpaceX (Starlink) bid to receive $885 million over ten years through the RDOF subsidy. This is something that Starlink won in a reverse auction in December 2020.

In the press release for the rejection, FCC Chairman Jessica Rosenworcel was quoted as saying, “After careful legal, technical, and policy review, we are rejecting these applications. Consumers deserve reliable and affordable high-speed broadband. We must put scarce universal service dollars to their best possible use as we move into a digital future that demands ever more powerful and faster networks. We cannot afford to subsidize ventures that are not delivering the promised speeds or are not likely to meet program requirements.”

The FCC went on to say in the order that there were several technical reasons for the Starlink rejection. First was that Starlink is a “nascent” technology, and the FCC doubted the company’s ability to deliver broadband to 642,925 locations in the RDOF areas along with serving non-RDOF areas. The FCC also cited the Ookla speed tests that show that Starlink speeds decreased from 2021 into 2022.

Not surprisingly, Starlink appealed the FCC ruling this month. In the Starlink appeal, the company argued, “This decision is so broken that it is hard not to see it as an improper attempt to undo the commission’s earlier decision, made under the previous administration, to permit satellite broadband service providers to participate in the RDOF program. It appears to have been rendered in service to a clear bias towards fiber, rather than a merits-based decision to actually connect unserved Americans”.

Rather than focus on the facts in dispute in the appeal, today’s blog looks at the implications for the broadband industry during the appeal process. Current federal grant rules don’t allow federal subsidies to be given to any area that is slated to get another federal broadband subsidy. This has meant that the RDOF areas have been off-limits to other federal grants since the end of 2020. This has included NTIA grants, USDA ReConnect grants, and others. Federal grant applicants for the last few years have had to carefully avoid the RDOF areas for Starlink and any other unresolved RDOF award areas.

As a reminder, the RDOF areas were assigned by Census block and not in large contiguous areas. The RDOF award areas have often been referred to as Swiss cheese, meaning that Census blocks that were eligible for RDOF were often mixed with nearby ineligible Census blocks. A lot of the Swiss cheese pattern was caused by faulty FCC maps that excluded many rural Census blocks from RDOF that should have been eligible, but for which a telco or somebody else was probably falsely claiming speeds of at least 25/3 Mbps.

ISPs that have been contemplating grant applications in the unresolved RDOF areas were relieved when Starlink and other ISPs like LTD Broadband were rejected by the FCC. It’s difficult enough to justify building rural broadband, but it’s even harder when the area to be built is not a neat contiguous study area.

The big question now is what happens with the Starlink areas during an appeal. It seems likely that these areas will go back into the holding tank and remain off-limits to other federal grants. We’re likely going to need a definitive ruling on this from grant agencies like the USDA to verify, but logic would say that these areas still need to be on hold in case Starlink wins the appeal.

Unfortunately, there is no defined timeline for the appeal process. I don’t understand the full range of possibilities of such an appeal. If Starlink loses this appeal at the FCC, can the company take the appeal on to a court? Perhaps an FCC-savvy lawyer can weigh in on this question in the blog comments. But there is little doubt that an appeal can take some time. And during that time, ISPs operating near the widespread Starlink grant areas are probably still on hold in terms of creating plans for future grants.

The 12 GHz Battle

A big piece of what the FCC does is to weigh competing claims to use spectrum. It seems like there have been non-stop industry fights over the last decade on who gets to use various bands of spectrum. One of the latest fights, which is the continuation of a fight going on since 2018, is for the use of the 12 GHz spectrum.

The big wrestling match pits Starlink, which wants to use the spectrum to communicate with its low-orbit satellites, against cellular carriers and WISPs who want to use the spectrum for rural broadband. Starlink uses this spectrum to connect its ground-based terminals to satellites. Wireless carriers argue that the spectrum should also be shared to enhance rural broadband networks.

The 12 GHz band is attractive to Starlink because it contains 500 MHz of contiguous spectrum with 100 MHz channels – a big data pipe for reaching between satellites and earth. The spectrum is attractive to wireless ISPs for these same reasons, along with other characteristics. The 12 GHz spectrum will carry twice as far as other spectrum commonly used in point-to-multipoint broadband networks, meaning it can cover four times the area from a given tower. The spectrum is also clear of any federal or military encumbrance – something that restricts other spectrum like CBRS. The spectrum is also being used for cellular purposes internationally, which makes for an easy path to find the radios and receivers to use it.
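
The twice-the-distance, four-times-the-area point is just the geometry of a circle, as the quick sketch below shows. The three-mile and six-mile reach figures are hypothetical; the only point is that coverage area scales with the square of the radius.

```python
# Coverage area grows with the square of the radius, so doubling a radio's
# reach quadruples the area one tower can serve. The mile figures are hypothetical.
import math

for reach_miles in (3, 6):
    area_sq_miles = math.pi * reach_miles ** 2
    print(f"Reach of {reach_miles} miles covers ~{area_sq_miles:,.0f} square miles")
```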

In the current fight, Starlink wants exclusive use of the spectrum, while wireless carriers say that both sides can share the spectrum without much interference. These are always the hardest fights for the FCC to figure out because most of the facts presented by both sides are largely theoretical. The only true way to find out about interference is in real-world situations – something that is hard to simulate any other way.

A few wireless ISPs are already using the 12 GHz spectrum. One is Starry, which has recently joined the 12 GHz Coalition, the group lobbying for terrestrial use of the spectrum. This coalition also includes other members like Dish Network, various WISPs, and the consumer group Public Knowledge. Starry is one of the few wireless ISPs currently using millimeter-wave spectrum for broadband. The company added almost 10,000 customers to its wireless networks in the second quarter and is poised to grow a lot faster. If the FCC opens the 12 GHz spectrum to all terrestrial uses, it seems likely that the spectrum would quickly be put to use in many rural areas.

As seems usual these days, both sides in the spectrum fight say that the other side is wrong about everything they are saying to the FCC. This must drive the engineers at the FCC crazy since they have to wade through the claims made by both sides to get to the truth. The 12 GHz Coalition has engineering studies that show that the spectrum could coexist with satellite usage with a 99.85% assurance of no interference. Starlink, of course, says that engineering study is flawed and that there will be significant interference. Starlink wants no terrestrial use of the spectrum.

On the flip side, the terrestrial ISPs say that the spectrum in dispute is only 3% of the spectrum portfolio available to Starlink, and the company has plenty of bandwidth and is being greedy.

I expect that the real story is somewhere in between the stories told by both sides. It’s these arguments that make me appreciate the FCC technical staff. It seems every spectrum fight has two totally different stories defending why each side should be the one to win use of spectrum.

Satellite Cell Service

T-Mobile and Starlink made a joint announcement recently about an arrangement where Starlink will enable voice and texting capabilities to T-Mobile cellphones by the end of 2023. This is a service that would work with existing cell phones and would supposedly kick in when a phone can’t find a signal from a cell tower. Starlink said the technology would be enabled by new satellites that have significantly larger antennae than the current satellites in the constellation. In the press release, Elon Musk touted this as being able to reach people lost in the wilderness, but the much bigger use will be to fill in cellular coverage in rural areas for T-Mobile.

While the two companies made a big splashy announcement about the arrangement, they are late to the game as other industry players already have similar plans underway.

AST SpaceMobile has been working on deploying satellites aimed specifically at the cellular market. The company plans to launch its first five satellites in 2024. The company’s business plan is to launch fairly large satellites weighing over 3,300 pounds to create a constellation dedicated to cellular coverage. The company has already created partnerships with more than 25 mobile operators around the world, including the giant cellular company Vodafone.

Lynk is taking a different approach and will launch small satellites around the size of a pizza box. The company has one test satellite in orbit with another scheduled for this December. The company plans to have 50 satellites in orbit by the end of 2023. Lynk already has 14 commercial agreements in place and will support large corporations and governments as well as mobile providers.

Just yesterday, Apple announced that it will offer a texting service for those lost in the wilderness in a partnership with Globalstar. This service is going to be text only and is going to be exceedingly slow, but it will supposedly work for folks who have the latest iPhone and who also are able to point the phone directly at the satellite. There will be an app that will tell a user where the satellite can be found.

All of these plans raise a lot of questions that we won’t get answered until somebody has a working satellite product. For example, could somebody inside a vehicle connect to a satellite? I have no problem connecting to the Sirius XM satellite service, so this might not be a problem. Will these connections somehow roam and connect back to cellular carriers when the user is in reach of a cell tower? That would be really complicated, and my guess is that this won’t work. Mike Sievert, the CEO of T-Mobile, said this project is like putting a cell site in the sky, but much harder – and I believe him. I’ve been trying to picture how the satellites will pick out the right calls because filtering through the many billions of cellphone calls to find the right ones sounds like a huge data processing challenge.

The service would certainly be a boon to somebody lost in the woods, but it is also a much-needed service for a lot of other people. My consulting firm does surveys, and it’s not unusual to find rural counties today where 30% or more of homes say they have no cellular coverage at their homes. The national coverage maps of the big cellular companies are a joke in many rural places.

T-Mobile and Starlink said that these connections would be only for voice calls and texting at first but that using cellular data might be on the horizon. That would be a significant accomplishment since a receiver many times larger than a cell phone is needed today to communicate with a satellite.

The real potential for this product is not in the U.S. and Europe where a large percentage of folks can connect today to cellular networks. The real market is the many parts of the world where modern cellular towers are a rarity. Most Americans probably don’t understand or appreciate that there is still a lot of the world where folks are not connected, or perhaps only connected through one universal connection that is shared by a whole community.

FCC Nixes Starlink and LTD Broadband

On August 10, the FCC issued a press release denying the long-form applications of Starlink and LTD Broadband in the RDOF reverse auction. This is big news because these are two of the biggest winners of the reverse auction. LTD Broadband was the largest winner of the auction at $1.32 billion, while Starlink had claimed over $885 million.

The FCC press release quoted FCC chairman Jessica Rosenworcel asking why the FCC should subsidize Starlink since it’s a “still developing technology” that requires customers to pay for a $600 dish, even with the FCC subsidy. I have to imagine that the FCC was relying, at least in part, on Ookla speed tests that show that Starlink’s performance has been worsening over time as more customers come onto the network. The speed tests also show that Starlink doesn’t meet the 20 Mbps upload speed that Starlink pledged to meet in the auction. We may not know the full reasoning behind the rejection unless the FCC follows this press release with a longer document.

The release says that the FCC rejected LTD Broadband because the agency deemed that the company was not capable of deploying a network of the scope, scale, and size required to satisfy the RDOF buildout requirements. This is not surprising since LTD is a small regional WISP in Minnesota that promised to build a fiber network that would cost many billions of dollars. LTD was already having problems and had failed to win state approval for Eligible Telecommunications Carrier status in seven of the fifteen states where the company won the RDOF auction. There is also an open proceeding at the Minnesota Public Utilities Commission asking to revoke the company’s existing ETC status.

These two RDOF rejections will have a significant ripple effect through the rest of the carrier world. The areas that were claimed in the RDOF auction have been off-limits for other federal grants like ReConnect. This ruling means that any areas that were claimed by these two companies can now be included in future federal grants.

The other issue caused by RDOF is that the awards were made by Census block, and this resulted in award areas that have been described as Swiss cheese. This meant that the RDOF awards were not contiguous but were often a scattering of Census blocks mixed in with areas that seemed to be identical but were mysteriously not included in RDOF – largely as a result of faulty FCC maps. This made it nearly impossible in some cases for other ISPs to seek grants for the areas not covered by RDOF since the areas are scattered.

I’m only speculating, but I suspect that the pending BEAD grants have a lot to do with the FCC decision. If the FCC had awarded the RDOF, then folks living in the Starlink areas would have been precluded from getting fiber or other broadband that is faster than Starlink. This was a particularly troublesome situation in my part of the world, where Starlink won the RDOF reverse auction in some of the western mountainous wooded counties in North Carolina. We now have a lot of evidence that Starlink struggles in heavily wooded areas.

The risk of awarding the RDOF to LTD Broadband is that the company would fail to execute on the fiber buildout. It wouldn’t be evident for a number of years if the buildout wasn’t going to succeed, and by that time, all of the current state and federal broadband grants would be long gone. I think this rejection shows that the federal government is really hoping that the BEAD grants will bring broadband to all rural areas.

There are still a few other large RDOF winners that have not been awarded. These are companies that are proposing gigabit wireless capability. The FCC is obviously not yet ready to make the awards to these companies, but it’s also apparently not ready to reject them. The clock is ticking for these areas. ISPs and local governments need to know if these areas won’t get RDOF since it takes time to plan for the BEAD grants, so it’s important for the FCC to approve or reject the remaining RDOF applications soon.