Is Online Programming Too Expensive?

I’ve read several articles recently conjecturing that online programming services that mimic cable company TV are in trouble because they are too expensive. This matters when trying to understand the cord-cutting trend because homes are less likely to leave traditional cable if they have to spend just as much elsewhere to get the networks they still want to watch. I haven’t looked in a while, so I thought I’d make a new comparison. My local cable company is Charter Spectrum, so I compared the price of Charter cable TV to the online alternatives.

Charter’s base TV plan is called TV Select, and a new Charter subscriber gets a 12-month special price as follows:

$49.99 – 12-month advertised promotional price

$16.45 – Broadcast TV charge

$  6.99 – Set-top box

$73.43 – 12-month promotion total price

After 12 months the base price for TV Select goes from $49.99 to $73.99, a $24 increase – and the full monthly fee jumps to $97.43 at the end of the one-year promotion. I’m a sports fan, and to get all of the channels I want I’d have to subscribe to Charter’s TV Silver plan. That package is $20 more expensive than the Select plan, or $93.43 for 12 months, and then $117.43 after the end of the promotion period.

Charter’s Broadcast TV Charge has been widely labeled as a hidden fee in that Charter never mentions the fee in any advertising about the cable product. Charter just raised the fee to $16.45 in August, up from $13.50, making it the highest such fee among the big cable companies. But Comcast is not far behind at $14.95 per month and that fee is likely to increase soon. This fee is where the big cable companies are aggregating the charges for local programming from network affiliates of ABC, CBS, FOX, and NBC.

Comcast, AT&T, and some other big cable companies also charge a Regional Sports Fee, but so far Charter covers this in its base cable costs. The bottom line is that as a Charter customer, my cheapest alternative that includes a full array of network cable channels will cost $73.43 per month for a year and then go up by $24.

How does this compare with the online alternatives?

  • The cheapest online alternative might be Sling TV. They have two basic small packages that cost $25 each or both for $45. Sling TV has a balanced number of sports and non-sports channels, but in my case doesn’t carry every sports network I want to see. There are also $5 add-on packages that can drive the cost up to $60 to see the network channels most homes probably want to watch. Sling TV doesn’t carry a full array of local network affiliates.
  • Next up in price is Fubo TV, priced at $54.99 per month. This is a sports-centric network that is especially attractive to soccer fans since the network carries a wide array of international sports. Strangely, Fubo TV doesn’t carry ESPN (meaning they also don’t carry ABC or Disney).
  • At the same price of $54.99 is Hulu + Live TV. They carry all of the sports networks I am looking for and a wide array of other network channels. They also carry the local network affiliate channels for most major markets. For $60.99 you can get this service without commercials, which requires downloading shows to watch the commercial-free versions. Hulu + Live TV also lets families and friends network together to watch shows at the same time.
  • YouTube TV is perhaps the closest online product to compare to Charter’s cable TV plans. It is priced at $64.99 per month. As a sports fan, the YouTube TV lineup provides all of the channels I want to follow my Maryland Terrapins. YouTube TV carries the same local network affiliates for my market that are available on Charter.

All of the online TV options allow subscribers to drop or add the service easily at any time, although none of them give a refund for time already paid. This means no contracts and no term commitment.

It’s easy to see why homes think that online programming is too expensive, particularly since Charter falsely advertises its cable product at $49.99. But it costs almost $20 per month more to buy TV from Charter, even with the 12-month promotional price, and then $42 more per month after the end of the promotion period. It still mystifies me why homes with decent broadband don’t do the math and leave Charter for Hulu or YouTube TV.
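A quick back-of-the-envelope sketch in Python, using the prices cited above, shows where those differences come from:

```python
# Charter TV Select monthly cost, built from the advertised price plus
# the fees described above.
CHARTER_PROMO = 49.99 + 16.45 + 6.99   # promo base + Broadcast TV charge + set-top box
CHARTER_FULL  = 73.99 + 16.45 + 6.99   # post-promo base + the same fees
HULU_LIVE_TV  = 54.99                  # Hulu + Live TV monthly price

print(f"Charter promo total:     ${CHARTER_PROMO:.2f}")                 # $73.43
print(f"Charter post-promo:      ${CHARTER_FULL:.2f}")                  # $97.43
print(f"Promo premium over Hulu: ${CHARTER_PROMO - HULU_LIVE_TV:.2f}")  # $18.44
print(f"Full premium over Hulu:  ${CHARTER_FULL - HULU_LIVE_TV:.2f}")   # $42.44
```

Even during the promotion, Charter costs about $18 more per month than Hulu + Live TV; after the promotion, the gap widens to over $42.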

K12 Education During the Pandemic

Pew Stateline published a recent article discussing the widely disparate state of educating K12 students during the pandemic. Every school system has students without home broadband or home computers, and school districts and states are dealing with these issues in widely different ways.

There are major challenges in educating students outside of the classroom. The Stateline article points out that there are issues beyond providing broadband and computers, and that kids still need adults to help direct their learning. But students without computers or broadband have virtually no chance of keeping up in an environment that relies fully or partially on learning from home.

The article cites a recent study by the Annenberg Institute at Brown University that looks at the impact of the pandemic in the spring semester of this year. The study estimates that students returning to school this fall will have made only 63% to 68% of the reading gains that would normally be expected from a school year, and only 37% to 50% of the expected gains in math. It’s hard to imagine what happens to current students if virtual or interrupted education carries through much of the current school year. I’ve seen articles where various educators are already calling 2020 a ‘lost year’.

As part of my ongoing work with community broadband, I’ve talked to communities with a wide range of circumstances and proposed solutions. For example, I talked to the school administrator of a small rural school district that has roughly 600 students. The area is a broadband desert where most homes have no good home broadband option – even traditional satellite service barely works in the community, where homes are nestled into canyons and valleys.

This small school district is trying the full range of solutions we hear about from across the country. The district has scrambled to find computers for students that don’t have them at home. The school district has obtained cellular hotspots for many rural students, although there are a lot of places in the county with little or no cellular coverage. The local government has tried to fill in the gap in cellular coverage by deploying a number of public hotspots to provide places where students and home workers can find broadband. But probably the most important thing they are doing is that the superintendent of schools called every student in the district and is trying to find individual solutions for students that are having problems learning.

Even with all this effort, the school district acknowledges that this is not a solution that will work with all students and that some students are going to fall far behind. This school district is only able to tackle the above solutions due to the small number of students in the district. It’s hard to imagine how school districts with thousands of students can even attempt to provide individual solutions.

The pandemic has also shown us that ‘normal’ broadband is not adequate for homes with multiple students and adults trying to work from home at the same time. Even expensive cable broadband subscriptions can be inadequate when more than two people try to share the small upload bandwidth. Emergency home and public hotspots share the same problems and can easily get overwhelmed.

I don’t have any proposed solutions for the problem, and as a country we’re going to have to somehow deal with a whole generation of students that have fallen behind the expected education progression. I do not doubt that when school gets back to normal, many school districts will figure this out.

For now, local communities have to try to take all of the steps needed to at least try to help students. I talked to somebody who does broadband mapping and was surprised to hear that many school districts are just now trying to figure out which students don’t have computers or home broadband. It’s been six months since the start of the pandemic and it’s hard to believe that school districts didn’t gather these basic facts before now.

States and localities everywhere have scrambled to create WiFi hotspots, but nobody should rest on their laurels and think that solves the problem. Many states and localities have used CARES money to buy computers, and as important as that is, it is only a piece of the solution. I’ve read that school districts scrambled all summer to adapt curriculum to an online format, but that also doesn’t fix the problem. The bare minimum answer is that school districts need to find ways to do all of the above, and more – and even with that, students are going to fall behind this school year. But what other choice do we have? As the Stateline article points out, some lucky families will hire tutors to keep students up to speed – but that’s not going to help the vast majority of students in the coming school year.

Gaming and Broadband Demand

Broadband usage has spiked across the US this year as students and employees suddenly found themselves working from home and needing broadband to connect to school and work servers. But there is another quickly growing demand for broadband coming from gaming.

We’ve had online gaming of some sort for the last decade, but gaming has not been a data-intensive activity for ISPs. Until recently, the brains for gaming were provided by special gaming computers or game boxes run locally by each gamer. These devices and the game software supplied the intensive video and sound experience, and the Internet was only used to exchange game commands between gamers. Command files are not large and contain the same information that is exchanged between a game controller and a gaming computer. In the past, gamers would exchange the command files across the Internet, and local software would interpret and activate the commands being exchanged.

But the nature of online gaming is changing rapidly. Already, before the pandemic, game platforms had been migrating online. Game companies are now running the core software for games in a data center and not on local PCs or game consoles. The bandwidth path required between the data center core and a gamer is much larger than the command files that used to be exchanged since the data path now carries the full video and music signals as well as 2-way communications between gamers.

There is a big benefit of online gaming for gamers, assuming they have enough bandwidth to participate. Putting the gaming brains in a data center reduces latency, meaning that game commands can be activated more quickly. Latency is signal delay, and the majority of the delay in any Internet transmission happens inside the wires and electronics of the local ISP network. With online gaming, a signal between a gamer and the game server only has to cross the gamer’s local ISP network. Before online gaming, that signal had to pass through the local ISP networks of both gamers.

There are advantages for gaming companies to move online. They can release a new title instantly to the whole country. Game companies don’t have to manufacture and distribute copies of games. Games can now be sold to gamers who can’t afford the expensive game boxes or computers. Gamers benefit because gaming can now be played on any device and a gamer isn’t forced into buying an expensive gaming computer and then only playing in that one location. Game companies can now sell a gaming experience that can be played from anywhere, not just sitting at a gamer’s computer.

A gaming stream is far more demanding on the network than a video stream from Netflix. Netflix feeds out the video signal in advance of what a viewer is watching, and the local TV or PC stores video content for the next few minutes of viewing. This was a brilliant move by video streamers because streaming ahead of what viewers are watching largely eliminated the delays and pixelation of video streams that were common when Netflix was new. By streaming in advance, Netflix has time to resend any missed packets so that the video has ideal quality by the time a viewer catches up to the stream.

Gaming doesn’t have this same luxury because gaming is played in real time. The gamers at both ends of a game need to experience the game at the same time. This greatly changes the demand on the broadband network. Online gaming means a simultaneous stream being sent from a data center to both gamers, and it’s vital that both gamers receive the signal at the same time. Gaming requires a higher quality of download path than Netflix because there isn’t time to resend missed data packets. A gamer needs a quality downstream path to receive a quality video transmission in real-time.
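The buffering difference can be illustrated with a toy simulation. This is not a model of any real streaming protocol – the packet count, loss rate, round-trip time, buffer depth, and frame deadline here are all made-up numbers – but it shows why a buffered stream can hide packet loss while a real-time stream cannot:

```python
import random

random.seed(42)

PACKETS = 10_000
LOSS_RATE = 0.02      # assumed 2% packet loss on the path
RTT_MS = 30           # time to request and receive a resend
BUFFER_MS = 5_000     # buffered stream: ~5 seconds of video held ahead
DEADLINE_MS = 20      # a real-time game frame must arrive almost immediately

lost = [random.random() < LOSS_RATE for _ in range(PACKETS)]

# Buffered stream: a lost packet can be resent, and the resend arrives
# long before the viewer reaches that point in the buffer.
buffered_glitches = sum(1 for p in lost if p and RTT_MS > BUFFER_MS)

# Real-time stream: a resend takes a full round trip, which misses the
# frame deadline, so every lost packet shows up as a glitch.
realtime_glitches = sum(1 for p in lost if p and RTT_MS > DEADLINE_MS)

print(f"Buffered stream glitches:  {buffered_glitches}")   # 0
print(f"Real-time stream glitches: {realtime_glitches}")   # every lost packet
```

With seconds of buffer in hand, every resend beats the playback point; with a per-frame deadline of a few milliseconds, none of them do.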

Gaming adds a second big demand in that latency becomes critical. A player that receives the signal just a little faster than an opponent has an advantage. A friend of mine has symmetrical gigabit Verizon FiOS fiber broadband at his home, which is capable of delivering the best possible gaming data stream. Yet his son is driving his mother crazy by running category 6 cables between the gaming display and the FiOS modem. He swears that bypassing the home WiFi lowers the latency and gives him an edge over other gamers. From a gamer’s perspective, network latency is becoming possibly more important than download speed. A gamer on fiber has an automatic advantage over a gamer on a cable company network.

At the same time as the gaming experience has gotten more demanding for network operators, the volume of gaming has exploded during the pandemic as people stuck at home have turned to gaming. All of the major game companies are reporting record earnings. The NPD Group, which tracks the gaming industry, reports that spending on gaming was up 30% in the second quarter of this year compared to 2019.

ISPs are already well aware of gamers, who are the harshest critics of broadband network performance. Gamers understand that little network glitches, hiccups, and burps that other users may not even notice can cost them a game, and so gamers closely monitor network performance. Most ISPs know their gamers, who are the first to complain loudly about network problems.

Data Usage Remains Robust in 2Q20

OpenVault recently published its Broadband Insights Report for the second quarter of 2020. Since OpenVault’s software is used to track usage in major Internet POPs, the company has a unique perspective on broadband usage in the country.

The report says that the peak of data usage this year was in March when people first reacted to the pandemic. Data usage is down slightly compared with the first quarter, but still much higher than data usage a year ago. In the second quarter the average home used 380 gigabytes of data per month. This is down 6% compared to the average usage in March 2020 of 403 gigabytes. But second quarter data usage is up 36% over the average of 280 gigabytes per household in the second quarter of 2019. Before the pandemic, household broadband usage was growing at a rate just above 20% annually, so the 36% growth in a year demonstrates the huge impact of the pandemic on broadband.

Median data usage has increased even faster than average usage. The median measures the middle point where half of homes use less and half of homes use more broadband. The median usage in the second quarter of 2019 was 144 gigabytes and has grown 54% in a year to 223 gigabytes. This indicates that even households that previously were light data users are using a lot more data during the pandemic. This likely comes both from increased cord-cutting and from students and adults working from home.
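The growth percentages follow directly from the gigabyte figures in the report (the median calculation works out to just under 55%, which the report rounds to 54%). A quick sanity check:

```python
# OpenVault figures cited above, in gigabytes per household per month.
avg_q2_2019 = 280
avg_mar_2020 = 403
avg_q2_2020 = 380
median_q2_2019 = 144
median_q2_2020 = 223

drop_from_march = (avg_mar_2020 - avg_q2_2020) / avg_mar_2020
avg_growth = (avg_q2_2020 - avg_q2_2019) / avg_q2_2019
median_growth = (median_q2_2020 - median_q2_2019) / median_q2_2019

print(f"Down from the March peak: {drop_from_march:.1%}")   # 5.7%
print(f"Average, year over year:  {avg_growth:.1%}")        # 35.7%
print(f"Median, year over year:   {median_growth:.1%}")     # 54.9%
```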

OpenVault reports that usage for homes with unlimited broadband plans (no data caps) grew even faster and increased by 42% over 2019. The company surmises that the big increase is at least partially because the big ISPs are not enforcing data caps during the pandemic. However, part of this increase is also likely due to an increase of what OpenVault calls power users. These are homes that use more than 1 terabyte of data per month.

In the second quarter, 8.7% of homes used at least 1 terabyte of data per month, more than double the 4.1% of terabyte homes a year earlier. This now includes 1% of all homes that are using more than 2 terabytes of data, triple the share of a year earlier.

One reason for the higher data usage might be that households are subscribing to faster data plans. At the end of the second quarter, 4.9% of homes were subscribed to gigabit data speeds, more than double the 2.1% of gigabit subscribers in the second quarter of 2019. Over 61% of homes in the country are now subscribed to broadband speeds greater than 100 Mbps. That includes 37.8% subscribed to plans between 100 Mbps and 200 Mbps, 13.5% subscribed to plans between 200 Mbps and 400 Mbps, 5% subscribed to speeds between 400 Mbps and 900 Mbps, and 4.9% subscribed to gigabit speeds. Less than 20% of homes nationwide are subscribed to plans slower than 40 Mbps.

There is one segment of broadband usage that continued to increase in the second quarter of 2020: upload usage from homes is up 56% over a year earlier. Upload demand is directly related to the need for homes to connect to school and work servers and to take part in Zoom and other video conferencing services. It’s likely that before the pandemic many homes had never had much need for the upload link from home.

What is most intriguing about the continued increase in upload demand is that upload usage continued to grow even after school semesters were ending for the year. During the second quarter tens of millions of upload links to school servers would have gone quiet as school semesters ended, and yet upload demand continued to grow. It’s going to be interesting to see what the fall school semester does to broadband usage.

A Resilient Internet

Anybody who lives on the East Coast has likely experienced a major Internet outage due to catastrophic weather like hurricanes, storm surges, flooding, nor’easters, or snow blizzards. There have also been Internet outages from electrical brownouts and blackouts during hot weather. Outages can also come from non-weather events, and some of you might remember the fire in the tunnel in Baltimore that incinerated a lot of major fiber cables.

I talked to David Theodore of Climate Resilient Internet. David’s been around since the beginning of the Internet and cut his teeth designing some of the first wireless networks for MCI. The primary premise of resilient Internet is that we should design the critical broadband infrastructure in vulnerable communities to withstand outages. The company has created a simple video to explain the concept.

The cost of a community completely losing the Internet is immense. We’ve seen examples of major Internet outages in recent memory. Hurricane Sandy knocked out broadband to a large part of New York City in 2012. Hurricane Irma knocked out broadband in parts of Miami for weeks in 2017. I experienced this firsthand when I lived in the Virgin Islands and Hurricane Otto knocked out my broadband and phone for six weeks. Some of you may remember what happened when 60 Hudson Street shut down in Manhattan after 9/11.

There is probably no worse time to lose the Internet than during a disaster, since people want to communicate with loved ones and want to start taking steps to get back to normal. Society is increasingly reliant on uninterrupted Internet access. If Hurricane Sandy hit today, the economic consequences would be far worse than what we experienced just eight years ago in 2012, since our business and personal lives have migrated extensively to the cloud since then. We are more reliant on broadband access each year as more of our daily routines involve Internet access. An Internet outage completely cuts us off from the outside world and takes away things we’ve come to take for granted.

Climate Resilient Internet recommends that vulnerable communities and key infrastructure in vulnerable communities incorporate resilient Internet links as part of the core Internet infrastructure. This means using powerful millimeter-wave radio links that are built to hurricane standards to beam broadband from key buildings to data centers away from flood plains and coastal flooding. It means putting those radio transmitters in secure places like rooftops where they can be bolted down to withstand hurricane winds. It means having onsite microgrid and backup power sources that don’t rely on the commercial power grid. And it means avoiding all wires between the radio transmitter and the data centers.

This doesn’t have to mean a new layer of extra expense. Theodore recommends that large broadband users incorporate radio links into their daily broadband usage so that some of their Internet traffic always travels via the wireless link. Large businesses and critical anchor institutions like hospitals should have diverse routing to reach the Internet. Unfortunately, as many have found out during outages, routes that are promised to be diverse often are not if they eventually converge or share physical switching points. Having a backup connection using wireless links is one of the only sure ways to guarantee diverse routing.

Oddly, wireless has gotten a bad name over the past twenty years as carriers only wanted to talk about fiber. Some of that bad name is deserved as manufacturers flooded the market with inexpensive radios that don’t meet carrier class standards. But carrier class radios are still some of the most secure and reliable technology we’ve ever deployed. I know of some microwave links that have hummed along for over forty years. Recent technology improvements and the use of higher frequencies mean that radios today can carry multiple gigabits of data.

While cities in hurricane and flood plains have the highest risk of having outages, this concept is worth exploring for critical infrastructure everywhere. Large swaths of the country are vulnerable to tornados. Any city can have a devastating fire. Bad things can happen to even the most secure data center. Some areas of the country stand at risk for earthquakes. Every large city has businesses and key infrastructure that ought to be protected by diverse Internet routing. In too many cases diverse routing using fiber is unavailable, is too expensive, or turns out to not be truly diverse. ISPs everywhere ought to take another look at mixing resilient wireless routing into their networks and as part of their service offerings.

The Other Homework Gap

I snagged today’s blog title from Christopher Ali, a professor in the Department of Media Studies at the University of Virginia. He recently wrote an article for the Benton Institute for Broadband & Society that reminds us that there is a second homework gap in addition to the one in K12 schools. There are almost 20 million college and graduate students across the country, most of whom have recently been notified that most or all of the fall semester this year will be done online.

Higher education has already been in the process of migrating online. Eduventures estimated that the percentage of students already tackling an online degree before the pandemic was 29% of those pursuing an associate’s degree, 42% for a bachelor’s degree, 27% for a master’s degree, and 3% of those working towards a doctorate. In the fall of 2020, nearly all college and university students will have some or all of the curriculum online.

Most college and university campuses have good broadband. Most campuses across the country are connected with fiber, due in part to the effort by the folks at Internet2, which connects 321 universities and transmits data between campuses at gigabit speeds. Most college campuses have good broadband to classrooms and dorms, along with campus-wide WiFi that enables students to easily connect to university data networks.

But the pandemic has sent college students home for the fall semester, where they will have to take coursework online. Far too many students come from homes without good broadband. We’ve known for years that there are millions of rural homes without good broadband. But it’s easy to forget that 10% to 30% of the homes in various urban markets have no broadband at home, mostly due to affordability issues. Ali says there are still 42 million Americans without home broadband.

In many states, school systems are finding broadband solutions for K12 students without broadband. Almost every state and county I’ve talked to since the start of the pandemic has one or more programs to connect K12 students. Many are providing cellular hotspots. Unfortunately, this is not always a great solution since many rural homes also don’t have a good cellular signal. Other schools are spreading hotspots around the community so that students can drive or walk to get broadband access. But nobody is making these same efforts for college students. These students are largely on their own, and there is no doubt that the lack of broadband will cause some students to drop out of school.

Since broadband research is Ali’s field, he’s sensitive to the plight of his students and has designed a curriculum that will work for students who can get only rudimentary access to broadband. He’s prerecording classes so that students can download files rather than having to make a 2-way video connection. He’s gone old-school and has enabled group chats as a low-bandwidth way to have a dialogue with students.

But most college professors are not accommodating students without broadband. I have a daughter who is a senior at Texas Tech, and she tells me about the challenges of doing classes online. For example, she took a class in American Sign Language in the spring semester that became extremely challenging when moved online in the middle of the semester. Her professor is deaf and all communication during the course is done using sign language – which is hard to make work with twenty students online at the same time. She has also been taking science classes with labs that have been watered down by going online. There are some aspects of college courses that will never translate well into an online format. It’s hard to picture how students taking a dance class, an anatomy dissection lab, or an advanced electronics lab can transition easily to online. Some topics require hands-on experience.

At some point we’ll be out of the pandemic and back to normal, whatever that might come to mean. A big concern for universities is that they might lose a substantial portion of their current student population who are unable to keep up online. There are no easy answers to this dilemma, other than perhaps the kinds of steps that Ali is taking to accommodate students with low bandwidth. Universities can’t easily tackle the same solutions as K12 schools because their student base is likely dispersed widely. Universities are scrambling to figure this out, but if they don’t have a broadband contingency plan in place by now it’s too late for this school year.

Walking Away from Copper

It’s been clear for many years that the big telcos are looking for ways to walk away from legacy copper networks. Big telco copper is getting old – most of it was built in the 1950s and 1960s. All of this copper is far past the 40-year expected life that the telcos claimed when they built the networks. Even old copper can be made to work if it has been well-maintained, but the big telcos stopped doing routine maintenance on copper decades ago. For years, the big telco maintenance policy has been to patch problems without improving or fixing network issues.

In some cases, the big telcos have gone through the formal FCC process of retiring copper. This requires giving customers a 90-day notice that copper will be deactivated and providing customers an alternative to copper. For example, Verizon posts notices of copper retirement on this web site. There have been no announced retirements this year, likely due to the COVID-19 pandemic, but Verizon was active last year, as in this September notification for Massachusetts. CenturyLink has made similar notices in parts of Arizona, Colorado, Minnesota, Nebraska, Oregon, Utah, and Washington.

In all of these cases, Verizon and CenturyLink made the announcements in communities where the carriers can provide fiber-to-the-home. It’s a natural technological progression to replace old copper technology with new fiber, and customers who lose copper but gain fiber have little room to complain.

But what about all of the places where the telcos never plan to offer fiber? There are still huge areas, including big parts of major cities, where the telcos have no plans of migrating to fiber. What will happen to folks in regions where the copper is rotting past the point of usefulness, as described in this article from last year about Fauquier County, Virginia? In that county the copper barely works for voice, which is sadly becoming the norm rather than the exception.

There is nothing the big telcos can do with copper that has gone past the point of no return. No telco is going to replace bad copper and none of the big telcos are going to extend fiber into rural America or into urban neighborhoods where construction is too expensive. Verizon might be replacing copper with fiber around Boston, as indicated by the above filing, but the telco has no plans for building fiber in western Massachusetts, Cape Cod, or the many other places in the state where it never built FiOS fiber.

We might have gotten a glimpse into Verizon’s strategy when the company recently unveiled a 4G fixed wireless product. This 4G Home product promises to deliver 25 Mbps broadband using the 4G cellular network and Verizon could point to this product as a justification to abandon DSL over copper.

On paper, the 4G Home product will outperform rural DSL, which typically has speeds well under 10 Mbps. But the Fauquier County article pointed out another ugly truth – much of rural America has poor cellular coverage to go along with outdated copper. The 4G Home product is not going to work for homes that are more than a mile or two from a cell tower. Nor is 4G Home going to be a reasonable substitute for DSL in communities like the towns on Cape Cod, where the housing density is too high to support a lot of subscribers using 4G data as a landline data substitute – even a small customer penetration would swamp the 4G LTE network in populated areas.

AT&T has a similar fixed wireless product it introduced during the past year as the solution for meeting the company’s rural CAF II requirements. I’ve been tracking this product on the web and still don’t see local articles or chatter from many folks who have changed to the wireless product. AT&T has implemented this product to satisfy the FCC (and to keep the CAF II grant funding), but for some reason the company doesn’t seem to be pushing the product very hard.

The bottom line is that these telcos will have to walk away from copper at some point within the next decade for the simple reason that the networks will stop functioning. From what I can see, both the FCC and many state regulatory commissions refuse to acknowledge that copper is dying and keep pretending that the telcos can somehow make this work. These networks are dying. The telcos might toss a bone to regulators by halfheartedly offering a wireless substitute for DSL. But the telcos are under no obligation to offer a replacement if the copper dies. Sadly, we’re going to look up five years from now and find a lot of rural homes without either a telephone line or a cellular connection. There was a time when that was unthinkable, but it’s the coming reality.

Keeping Track of Satellites

The topic of satellite broadband has been heating up lately. Elon Musk’s Starlink now has over 540 broadband satellites in the sky and is talking about starting a few beta tests of the technology with customers. OneWeb went into bankruptcy but is being bought out by a team consisting of the British government and Bharti Airtel, the largest cellular company in India. Jeff Bezos has continued to move forward with Project Kuiper, and the FCC recently gave the nod for the company to move ahead.

These companies have grandiose plans to launch large numbers of satellites. Starlink’s first constellation will have over 4,000 satellites – and the FCC has given approval for up to 12,000 satellites. Elon Musk says the company might eventually grow to over 30,000 satellites. Project Kuiper told the FCC it has plans for over 3,300 satellites. The original OneWeb plan called for over 1,200 satellites. Telesat has announced a goal of launching over 500 satellites. A big unknown is Samsung, which announced a plan a year ago to launch over 4,600 satellites. Even if these companies don’t fully meet their goals, there are going to be a lot of satellites in the sky over the next decade.

To put these huge numbers into perspective, consider the number of satellites ever shot into space. The United Nations Office for Outer Space Affairs (UNOOSA) has been tracking space launches for decades. It reported at the end of 2019 that there have been 8,378 objects put into space since the first Sputnik in 1957. As of the beginning of 2019, there were 4,987 satellites still in orbit, although only 1,957 were still operational.

There is a lot of concern in the scientific community about satellite collisions and space junk. Low-earth satellites travel at a speed of about 17,500 miles per hour to maintain orbit. Satellites that collide at that speed create many new pieces of space junk, also traveling at high speed. NASA estimates there are currently over 128 million pieces of orbiting debris smaller than 1 centimeter, 900,000 objects between 1 and 10 centimeters, and 22,000 pieces of debris larger than 4 inches.
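That quoted speed follows directly from basic orbital mechanics – the velocity needed to hold a circular orbit is v = √(GM/r). As a rough sanity check (the 550 km altitude below is an assumption, roughly where the Starlink constellation flies), a short Python sketch:

```python
import math

# Earth's standard gravitational parameter (GM) and mean radius
GM = 3.986004418e14     # m^3 / s^2
EARTH_RADIUS = 6.371e6  # meters

def circular_orbital_speed_mph(altitude_m: float) -> float:
    """Speed required to maintain a circular orbit at a given altitude."""
    r = EARTH_RADIUS + altitude_m
    v_ms = math.sqrt(GM / r)   # v = sqrt(GM / r), in meters per second
    return v_ms * 2.23694      # convert m/s to miles per hour

# Low-earth orbit at ~550 km comes out near 17,000 mph
print(round(circular_orbital_speed_mph(550_000)))
```

The result lands close to the 17,500 mph figure; the small difference is just altitude – lower orbits require slightly higher speeds.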

NASA scientist Donald Kessler described the dangers of space debris in 1978 in what’s now known as the Kessler syndrome. Every space collision creates more debris, and eventually there could be a cloud of circling debris that makes it nearly impossible to maintain satellites in space. While scientists think that such a cloud is almost inevitable, some worry that a major collision between two large satellites, or malicious destruction by a bad-actor government, could accelerate the process and quickly knock out all of the satellites in a given orbit.

There has only been one known satellite collision, when a dead Russian satellite collided with an Iridium communications satellite over a decade ago. That collision kicked off hundreds of pieces of large debris. There have been numerous near misses, including with the manned Space Station. There was another near miss in January between the defunct Poppy VII-B military satellite from the 1960s and a retired IRAS satellite that was used for infrared astronomy in the 1980s. It was recently reported that Russia launched a new satellite that passed through one of Starlink’s newly launched swarms.

The key to avoiding collisions is smart software that tracks satellite trajectories and gives satellite owners ample time to correct their orbital paths and avoid a collision. Historically, that tracking role has been filled by the US military – but the Pentagon has made it clear that it is not willing to continue in this role. No software is going to help avoid collisions between dead satellites like the close call in January. However, all newer satellites should be maneuverable enough to avoid collisions as long as sufficient notice is provided.
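At its core, that kind of screening software answers a simple geometric question: given two objects’ positions and velocities, when do they pass closest, and how close? A hedged sketch in Python – real conjunction screening propagates full orbits, but over a short window the motion can be approximated as linear, and the numbers below are made up purely for illustration:

```python
import math

def closest_approach(pos_a, vel_a, pos_b, vel_b):
    """Time (seconds) and miss distance (meters) of closest approach,
    assuming straight-line motion over a short screening window."""
    # Relative position and velocity of B with respect to A
    r = [b - a for a, b in zip(pos_a, pos_b)]
    v = [b - a for a, b in zip(vel_a, vel_b)]
    v_sq = sum(c * c for c in v)
    if v_sq == 0:  # identical velocities: the separation never changes
        return 0.0, math.sqrt(sum(c * c for c in r))
    # Minimize |r + v*t|^2  ->  t = -(r . v) / |v|^2
    t = max(0.0, -sum(rc * vc for rc, vc in zip(r, v)) / v_sq)
    miss = math.sqrt(sum((rc + vc * t) ** 2 for rc, vc in zip(r, v)))
    return t, miss

# Hypothetical pair: 10 km apart along-track, closing at 1 km/s,
# offset 1 km cross-track -> closest approach in 10 s at 1,000 m
t, miss = closest_approach((0, 0, 0), (7500, 0, 0),
                           (10_000, 1_000, 0), (6_500, 0, 0))
print(f"closest approach in {t:.0f} s at {miss:.0f} m")
```

If the predicted miss distance falls below a safety threshold, the operator of a maneuverable satellite gets an alert with enough lead time to nudge the orbit – exactly the service the Pentagon has been providing.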

A few years ago, the White House issued a directive that would give the tracking responsibility to the Commerce Department under a new Office of Space Commerce. However, some in Congress think the proper agency to track satellites is the Federal Aviation Administration, which already tracks everything in the sky at lower altitudes. Somebody in government needs to take on this role soon, because the Pentagon warns that its technology is obsolete, having been in place for thirty years.

The need for tracking is vital. Congress needs to decide soon how this is to be done and provide the funding to implement a new tracking system. It would be ironic if the world solves the rural broadband problem using low orbit satellites, only to see those satellites disappear in a cloud of debris. If the debris cloud is allowed to form it could take centuries for it to dissipate.

Broadband Choice for Apartment Buildings

In the 1996 Telecommunications Act, Congress established a goal that rural residents ought to have an opportunity for broadband speeds equal to urban residents. It is this goal that forces the FCC to measure broadband speeds to determine if the whole country has adequate broadband.

It’s clear that urban and suburban single-family homes have the overall best broadband choices in the country. Most are served by a cable company with basic speeds between 100 Mbps and 200 Mbps. Urban homes also have the option of telco broadband that can range from 15 Mbps DSL to fiber. A few lucky markets also have fiber overbuilders from companies like Google Fiber, municipalities, and a handful of other entrepreneurs like US Internet in Minneapolis.

It’s easy to forget that a lot of urban residents have not shared in the improved broadband seen by single-family homes. A little less than one-third of Americans live in multi-dwelling units (MDUs), which include apartment buildings, condominiums, and assisted living housing. A hodgepodge of federal regulations governs MDU broadband, resulting in a wide range of broadband quality for apartment residents. There are apartment buildings served by fiber that provide better broadband than the average single-family home, but there are other apartments with practically no broadband options.

This situation arose from a string of regulatory rulings that established that apartment building owners have the right to deny access to ISPs. Landlords also have the right to negotiate with one or more ISPs. Some of the big cable companies took advantage of the emerging rules and got apartment owners to sign exclusive agreements that took away future options. The FCC stepped in and abolished the most abusive exclusive contracts, but the general principle still stands that apartment owners can grant or deny access to ISPs.

There is also a wide range in the ways that landlords allow tenants to buy broadband. Some allow tenants to contract directly with any ISP that has pre-wired the building, and many apartment dwellers have the choice between a cable company and a telco. But many landlords have inserted themselves as a middleman and force tenants to use whatever broadband the landlord has arranged. Landlord broadband can be embedded in the rent or charged a la carte. Nothing stops an apartment owner from buying a single broadband connection and providing a weak WiFi connection as the only source of broadband.

You might think the market would punish landlords that offer poor broadband options. But the reality is that there is often an apartment shortage for low-income tenants, and landlords that serve low-income tenants tend not to negotiate for gigabit broadband on fiber.

Just as the COVID-19 crisis has uncovered the sad state of rural communities without broadband, the pandemic has also uncovered the large number of urban apartments without adequate broadband for students and workers to function from home. I’ve talked to several large cities since the pandemic began, and some are reporting large numbers of urban students who are unable to participate in remote schoolwork.

Ryland Sherman recently wrote an article for the Benton Foundation that rightfully argues that this is another broadband gap we need to close. He recommends that Congress act to change the rules that allow landlords to block ISPs from their buildings. He also points out that any meaningful change will require eliminating the ability of ISPs and landlords to negotiate exclusive contracts that block other ISPs from entering buildings. His final recommendation is that any federal law on the issue should prohibit states from erecting barriers that would keep ISPs out of apartment buildings.

These are all great ideas and they’ve been on my wish list for years should there ever be another telecommunications act coming out of Congress. Only Congress can make the needed changes since the FCC has its hands tied by the messy history of court rulings on the subject over the last few decades.

Unfortunately, Mr. Sherman’s recommended changes alone won’t fix all of the problems. These changes will allow ISPs to enter buildings they’ve previously been precluded from entering. But no law can force ISPs to enter apartment buildings. The reality is that it’s expensive for a new ISP to rewire many apartment buildings, and many ISPs have only agreed to spend the money to wire a building in exchange for an exclusive contract. ISPs won’t enter buildings in a competitive environment when the math doesn’t work. It’s hard to imagine that removing barriers alone is going to entice ISPs to serve apartments with low-income tenants.

The recommendations made by Mr. Sherman are needed. Allowing ISPs to enter buildings more freely will spur competition in both speeds and prices. We need to come up with new ideas to get ISPs to serve buildings that are expensive to wire or that serve low-income tenants. This will likely need to be a local solution since every market is different. We can’t rely on the private sector to provide good broadband in all MDUs – the incumbents have already been accused in many cities of redlining to avoid low-income neighborhoods. We absolutely should remove all barriers that keep ISPs out of MDUs. But we need to go a lot further to find ways to get ISPs to serve all MDUs.

The Impact of a Work-at-Home Economy

Analysts at the Federal Reserve Bank of Atlanta looked at the long-term impact of working from home on the economy and ranked different parts of the economy on two factors related to working at home – the likelihood that an area will generate a lot of work at home opportunities, and the ability of an area to support a work-at-home economy.

The premise behind the study was that we are likely to see much higher levels of people working from home after the end of the pandemic. The pandemic has allowed employers of all types and sizes to see the impact on their business of having people work from home. It’s been widely reported that many businesses have seen no negative effects from having employees work from home, and many have reported increased productivity. Employers that are able to continue with work-at-home policies have been sharpening their pencils and realizing how much money they can save by downsizing or eliminating costly office space.

Businesses have also found that many employees like working from home. Workers are enjoying the savings from eliminating costly commutes, from not having to dress for work, and from not spending money on lunches. People love the gained freedom to take care of home tasks during the day while still working, and the benefit of being near family and kids instead of at a job site.

The study began by looking at what it called the impacted employment share. Researchers looked at the sectors that are being most negatively impacted by the pandemic, such as tourism, hospitality, travel, manufacturing, and agriculture. The sector impacts were then layered onto states to see which states are having the biggest negative job impact from the pandemic. The states with the biggest percentage employment impact are Wyoming and Nevada, with 62% of employees in sectors that are affected negatively by the pandemic. At the other end of the scale was New York where only 42% of jobs are in sectors that are negatively impacted by the pandemic.

The study then looked at the percentage of jobs that might reasonably be moved home in each part of the country. This was also done by sector, and the sectors of the economy that can most easily accommodate moving workers home include financial services, information technology, and knowledge-based businesses. This sector analysis was also layered onto states and individual markets. The analysis showed that the areas least able to migrate jobs to the home include Yakima, WA, and Salinas, CA, where only 20% of existing jobs can be done from home. The best-positioned places include San Jose-Sunnyvale-Santa Clara, CA, Bloomington, IL, and the Washington DC metropolitan area, where 32% of jobs could be transitioned to working from home.

Finally, the study combined the analyses to identify the states that are best and worst positioned to handle a work-at-home economy. This final analysis brought in data such as the availability of broadband good enough to support working from home. The researchers judged states not only by the availability of broadband but also by broadband subscription rates. Markets around the US vary between 12% and 23% in terms of homes without a broadband subscription, and that tracks well with income and poverty levels. The researchers reasoned that local economies with low broadband subscription rates are less likely to support a work-from-home economy.

The study also considered the percentage of homes that only connect to the Internet by cellphone, which ranged from a low of 7% in New Hampshire to a high of over 20% in Mississippi. They reasoned that people using cellular as the only source of broadband are not positioned to work from home. Finally, the study recognized that there are many rural communities without access to good broadband.

The bottom-line conclusion of the report is that states differ significantly in their ability to move jobs home to help weather the pandemic. A second conclusion is that no state is fully ready to handle the pandemic. For example, New York has over 41% of jobs negatively impacted by the pandemic but less than 30% of jobs that could be transitioned to home. But New York is still far better off than states like Wyoming and Nevada, where over 62% of jobs are negatively impacted by the pandemic and less than 25% of the jobs in the state could be transitioned to working from home.

The study doesn’t draw any conclusions beyond compiling the facts, but it has clearly identified the states that are going to have the hardest time coping if the pandemic continues. What we know from past economic upheavals is that people follow jobs. If the US economy is going to have a larger percentage of people working from home in the future, it stands to reason that people are going to want to live in places where they can be hired for the work-at-home economy and where there is sufficient and affordable broadband to allow them to do so.

A trend towards working from home is going to change migration patterns within the country. We’ve seen decades of people migrating south to find manufacturing jobs as factories in the north went under and new jobs were created in the south. People may not have to contemplate such long-distance migration in the future, but the work at home trend presages increased short-distance migration from areas with poor broadband to areas with good broadband. A community without good broadband has to view the work-at-home trend with dread because the data in this report hints at a continued outflow of workers, a continued brain drain of young people, lower housing values, and all of the other negative aspects of an area in economic decline. We no longer need a strong economic argument for improving rural broadband – it’s staring us right in the face.