Broadband and New Factories

There is a lot of talk across the political spectrum about the need to bring manufacturing back to the US. The pandemic has made it clear that the US is far too dependent on other countries for the things we need. I found it painful back in March and April watching governors plead with foreign countries to ship us the basic supplies needed to test for the coronavirus.

Medical supplies are just the tip of the iceberg, and as a country, we’ve outsourced goods across the spectrum. It’s disappointing to look at the iconic American companies that no longer make their goods in the US. We’ve outsourced Schwinn bikes, Rawlings baseballs, Levi’s jeans, Converse All Star sneakers, Fisher-Price toys, Samsonite luggage, Brach’s candy, Fender guitars, Dell computers, Black & Decker and Craftsman tools, Radio Flyer red wagons, and even American Girl and Barbie dolls.

Over 60,000 US factories have shut since 2001, when China joined the WTO. Manufacturing jobs at the end of WWII represented over 60% of all jobs in the US economy, and that has dropped today to under 9%. The reasons we’ve lost American factories are complex. While much of it can be blamed on manufacturers chasing higher margins through lower labor costs, many US factories also grew old and obsolete as owners didn’t put profits back into modernization. The strong US dollar has often contributed to US-made goods being at a disadvantage on world markets.

The current administration has made it a priority to create American manufacturing jobs and has succeeded in adding back about 900,000 manufacturing jobs since the start of 2017. In his recent speech accepting the Democratic presidential nomination, Joe Biden talked about policies that would create 5 million new manufacturing jobs. The pandemic has made it clear to politicians on both sides of the aisle that we need to manufacture critical goods like drugs and electronics in this country again. It’s insane for the country to have to rely on others for basic commodities like medicines.

The question I ask today is whether communities in America are ready for new factories. New factories are different from traditional factories. They will almost universally include at least some level of automation, and they will require a fast and secure broadband connection. Factories today are tied into the cloud for much of the software they use. They use the Internet to interface in real time with suppliers and customers. Factories are often connected to other branches of the company that collaborate over broadband in real time.

Any community that wants to attract new factories must have great business broadband. That means not only fiber to the business parks where factories are located, but also diverse fiber routing so that a factory doesn’t lose broadband if somebody cuts a fiber inside the city. It also means having diverse Internet routes leaving the city so that a single fiber cut doesn’t isolate the community. Factories are not going to locate in places where Internet connections are not iron-clad.

Many communities I work with are still working to solve the first issue, which is to build the basic fiber infrastructure. We always hear about communities that have made the big plunge to build fiber to everybody in town, but there are far more communities that have quietly found ways to bring fiber to industrial parks and other key employers.

However, building fiber to business parks is only half of the needed solution. It’s just as important to a community that the fiber connection between the community and the Internet is secure. Factories really don’t care if the reason for fiber outages is inside or outside the community – they want to locate in places where broadband connections are virtually guaranteed.

Unfortunately, many communities are served by poor middle-mile networks that make them susceptible to Internet outages. This blog from May talks about the counties in northwest Colorado that have suffered as a region every time there has been an outage on CenturyLink’s middle-mile fiber. It was fairly common for a single fiber outage to knock out broadband to the whole region, including key infrastructure like hospitals, law enforcement, and factories. These communities banded together to construct Project THOR – a fiber network built to guarantee that a fiber cut or an electronics outage doesn’t disrupt broadband.

If we are going to see a resurgence of new factories, then communities need to make an honest assessment of the local and regional broadband capabilities and vulnerabilities. Cities that have sound broadband infrastructure need to be crowing about it, and communities with gaps in Internet capability need to get in gear and find ways to solve broadband problems. If we indeed see a flood of new factories being built, it might be a once-in-a-generation event, and cities don’t want to miss out due to not having decent basic fiber infrastructure.

Who’s Chasing RDOF Grants?

There is a veritable Who’s Who of big companies that have registered for the upcoming RDOF auction. All of the hundreds of small potential bidders to the auction have to be a bit nervous seeing the list of companies they could end up bidding against.

As a reminder, RDOF stands for Rural Digital Opportunity Fund and is an auction that starts in October that will award up to $16.4 billion in broadband funding. The money will be awarded by reverse auction in a process that favors faster technologies, but also favors those willing to take the lowest amount of grant per customer. The areas that are eligible for the funding are among the most remote places in the country, which is why the list of potential large bidders is puzzling.
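To make those mechanics a little more concrete, here is a minimal sketch in Python of how a weighted reverse auction can favor both faster technology and lower requested support. The bidders and tier weights below are made up for illustration – they are not the FCC’s actual values.

```python
# Illustrative only: hypothetical bidders and hypothetical tier weights.
from dataclasses import dataclass

@dataclass
class Bid:
    bidder: str
    pct_of_reserve: float   # support requested, as a percent of the reserve price
    tier_weight: float      # penalty added for slower or higher-latency technology

    @property
    def score(self) -> float:
        # Lower score wins: asking for less money and offering faster service both help.
        return self.pct_of_reserve + self.tier_weight

bids = [
    Bid("Fiber overbuilder", pct_of_reserve=70, tier_weight=0),
    Bid("Fixed wireless WISP", pct_of_reserve=55, tier_weight=20),
    Bid("DSL upgrade", pct_of_reserve=50, tier_weight=45),
]

for bid in sorted(bids, key=lambda b: b.score):
    print(f"{bid.bidder}: weighted score {bid.score}")
```

In this toy example, the fiber bid wins even though it asked for the most support, because the technology weight more than offsets the price difference.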

There are some big cable companies on the list: Altice, Charter Communications, Cox Communications, Atlantic Broadband, Midco, and Mediacom Communications. These companies serve many of the county seats and other towns near the RDOF areas. One has to wonder what these companies have in mind. The only one that has chased any significant federal grants in the past is Midco in Minnesota and North Dakota. Midco has been using grant money to extend fiber backhaul to connect its smallest markets, to build last-mile broadband in some tiny towns, and to build fixed wireless in rural areas surrounding its cable markets.

One has to wonder if the other cable companies have a similar plan. It’s incredibly inefficient to build traditional hybrid fiber-coaxial networks in rural areas, so it’s unlikely that the cable companies will be extending their existing networks. The RDOF auction is being done by Census blocks, which in rural areas can cover a large area. The winner of the auction for a given Census block must offer service to everybody in that block. I also have a hard time envisioning all of these big cable companies getting into the wireless business like Midco is doing, so their presence in the auction is a bit of a mystery.

Then there are the traditional large telcos, including Frontier, Windstream, Consolidated Communications, and CenturyLink. These companies already serve many of the areas that are covered by the reverse auction. These are the rural areas where these companies have largely neglected the old copper wiring and either offer no broadband or dreadfully slow DSL. The minimum technology allowed in the auction must deliver 25/3 Mbps broadband. It’s almost painful to think that these companies would chase the funding and promise to upgrade DSL to 25/3 Mbps after they largely botched the upgrade to 10/1 Mbps DSL in the just-ending CAF II program. The cynic in me says they are willing to pretend to upgrade DSL all over again if that means substantial grant money. I have to think that some of these companies are considering deploying fixed wireless. To the extent any of these companies is willing to take on new debt or use equity, they could also build fiber. None of these companies has built a substantial amount of fiber to truly rural places, but maybe these grants are the inducement they were waiting for.

Verizon and U.S. Cellular have registered for the auction. You have to think the cellular carriers will be deploying fixed cellular broadband like the 4G FWA product that Verizon recently announced. These companies already have equipment on towers in many of the RDOF grant areas and would love to grab a subsidy to roll out a product they might be selling in these areas anyway.

Then there are the satellite companies SpaceX, Hughes Network Systems, and Viasat. Viasat has won federal grant money before for selling broadband from its high-altitude satellites. SpaceX is the wildcard since nobody knows anything about the pricing or real speeds they can provide. We know that Elon Musk has been lobbying the FCC to let him have a shot at the billions up for grabs in this auction.

There is another interesting wildcard in Starry. Their current business is selling fixed wireless to large apartment buildings in city centers, and they’ve developed a proprietary technology that’s perfect for that application. They must have something else in mind in chasing grant money in remote areas that are 180 degrees removed from their normal business model. Starry founder Chet Kanojia is incredibly creative, so he probably has a new technology in mind if he wins auction funding.

There may be other big players in the auction as well, since many of the registered bidders are participating under partnerships or corporations that disguise their identity for now. One thing is clear – some of the rural ISPs and cooperatives who think nobody else is interested in their markets will get a surprise early in the auction. These big companies didn’t register for the grant auction to sit on the sidelines.

Western Governors Take a Stance on Broadband

The Western Governors’ Association (WGA) represents the states west of a line running from Texas north to North Dakota, plus Alaska, Hawaii, and the western American territories. The association has been exploring issues of joint interest to the region for many years. In July, the WGA issued a policy position paper that lays out broadband goals for 2020 through 2028. This is one of the more concise descriptions I’ve seen of the problems caused by poor broadband, as well as a concise list of needed broadband solutions.

The policy paper begins with the simple statement that broadband is critical infrastructure. I don’t know that I’ve ever seen the term ‘critical’ used at the FCC, which doesn’t share the same sense of urgency as the western governors.

The WGA states that the 25/3 Mbps definition of broadband is obsolete and that western economies and communities need faster broadband to prosper. The FCC recently proposed to stick with the 25/3 Mbps definition of broadband for yet another year and will still fund grants that would build new broadband networks that provide that speed. The governors support a faster definition of broadband that is scalable and that will increase as needed in the future.

The West has a lot of characteristics that make it difficult to find broadband solutions. Communities are often far apart, and there are often few opportunities to monetize the middle-mile fiber needed to connect them. The West also has significant tracts of federal land that present a challenge for building broadband infrastructure. Federal agencies like the US Forest Service, Bureau of Land Management, and Bureau of Indian Affairs play a crucial role in allowing (or hindering) the siting of broadband infrastructure. The governors ask for a more coordinated effort from these agencies to remove roadblocks for broadband deployment.

The WGA position paper tackles the inadequacy of the FCC’s Form 477 data gathering. They note that errors in this mapping are particularly egregious in sparsely populated rural areas, where the FCC gathers the maximum advertised speeds instead of actual broadband speeds – which greatly overstates the availability of broadband throughout the West. The WGA supports Congressional funding to produce better data collection.

The WGA complains that there is a maze of broadband grants and financial assistance available from numerous federal agencies but that local communities are rarely able to sort through the opportunities and grants often go unclaimed. The governors support better coordination between federal agencies and states to identify grant opportunities.

The governors strongly support the use of more spectrum for rural broadband.

The governors support an expansion of the eligibility of telephone and electric cooperatives to build new broadband since these are entities that are tackling a lot of western broadband gaps.

The governors ask for better coordination of efforts to bring broadband solutions to tribal lands, which are still far behind the rest of the West in broadband availability and adoption.

The governors ask the federal government to provide more aid in the form of block grants, saying that states know better than federal agencies how to solve local broadband problems.

It’s an interesting document in that it is factual and completely non-partisan. As I’ve been contending for many years, broadband is not and should never be a partisan issue. The document reflects the ability of western governors from both parties to reach agreement on what’s needed for the region.

A New Fiber Optic Speed Record

Researchers at University College London (UCL) have set a new record for data transmission over fiber optics. They were able to communicate through a fiber optic cable at over 178 terabits per second, or 178,000 gigabits per second. The research was done in collaboration with the fiber optic firms Xtera and KDDI Research. The press release announcing the achievement claims this is 20% faster than the previous record.

The achieved speed has almost reached the Shannon limit, which defines the maximum amount of error-free data that can be sent over a communications channel. Perhaps the most impressive thing about the announcement was that UCL scientists achieved this speed over existing fiber optic cables and didn’t use pristine fiber installed in a laboratory.
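For a sense of what ‘almost reached the Shannon limit’ means, the limit ties capacity to bandwidth and signal-to-noise ratio as C = B × log2(1 + SNR). Here is a rough back-of-envelope check – the 16.8 THz optical bandwidth figure is an assumption drawn from coverage of the experiment, so treat the numbers as approximate:

```python
# Back-of-envelope Shannon capacity check (approximate, illustrative numbers).
import math

bandwidth_hz = 16.8e12      # assumed usable optical bandwidth across the combined bands
throughput_bps = 178e12     # the reported 178 Tbps record

spectral_efficiency = throughput_bps / bandwidth_hz     # bits per second per Hz
required_snr = 2 ** spectral_efficiency - 1             # from C = B * log2(1 + SNR)

print(f"Spectral efficiency: {spectral_efficiency:.1f} b/s/Hz")
print(f"Implied signal-to-noise ratio: {10 * math.log10(required_snr):.0f} dB")
```

Squeezing more than ten bits per second out of every hertz of spectrum is what makes the result remarkable.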

The fast signal throughput was achieved by combining several techniques. First, the lasers use Raman amplification, in which pump photons injected into the fiber transfer energy to the signal through stimulated scattering. This scattering is predictable and can be tailored to the characteristics needed for light to travel optimally through glass fiber.

The researchers also used erbium-doped fiber amplifiers. For those who have forgotten the periodic table, erbium is a metal found in nature with an atomic number of 68. Erbium has a key characteristic needed for fiber optic amplifiers: the metal efficiently amplifies light at the wavelengths used by fiber optic lasers.

Finally, the researchers used semiconductor optical amplifiers (SOAs). These are diodes that have been treated with anti-reflection coatings so that the laser light signal can pass through with the least amount of scattering. The net result of all of these techniques is that the scientists were able to reduce the amount of light scattered during transmission through a glass fiber cable, thus maximizing data throughput.

UCL also used a wider range of wavelengths than is normally used in fiber optics. Most fiber optic transmission technologies leave empty guard bands around each wavelength being used (much like we do with radio transmissions). The UCL scientists used all of the spectrum, without separation bands, and applied several techniques to minimize interference between the bands of light.

This short description of the technology is not meant to intimidate a non-technical reader, but rather to show the level of complexity in today’s fiber optic technology. It’s a technology that we all take for granted, but it is far more complex than most people realize. Fiber optics might be the most lab-driven technology in daily use since it came from research labs and scientists have been steadily improving it for decades.

We’re not going to see multi-terabit lasers in regular use in our networks anytime soon, and that’s not the purpose of this kind of research. UCL says that the most immediate benefit of their research is that they can use some of these same techniques to improve the efficiency of existing fiber repeaters.

Depending upon the kind of glass being used and the spectrum utilized, current long-haul fiber technology requires having the signals amplified every 25 to 60 miles. That means a lot of amplifiers are needed for long-haul fiber routes between cities. Without amplification, the laser light signals get scattered to the point where they can’t be interpreted at the receiving end of the light transmission. As implied by their name, amplifiers boost the power of light signals, but their more important function is to reorder the light signals into the right format to keep the signal coherent.

Each amplification site adds to the latency in long-haul fiber routes since fibers must be spliced into amplifiers and passed through the amplifier electronics. The amplification process also introduces errors into the data stream, meaning some data has to be sent a second time. Each amplifier site must also be powered and housed in a cooled hut or building. Reducing the number of amplifier sites would reduce the cost and the power requirement and increase the efficiency of long-haul fiber.
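As a rough illustration of what that means in practice (the route length here is hypothetical), the amplifier count on a long route adds up quickly:

```python
# Hypothetical route length; the 25- and 60-mile figures are the spacing range cited above.
import math

route_miles = 2000
for spacing_miles in (25, 60):
    sites = math.ceil(route_miles / spacing_miles) - 1   # amplifiers between the endpoints
    print(f"{spacing_miles}-mile spacing: roughly {sites} powered amplifier sites")
```

Every one of those sites is a hut that has to be powered, cooled, and maintained, which is why stretching the distance between amplifiers matters so much.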

America’s First Broadband Blimp?

The Wabash Heartland Innovation Network (WHIN) announced plans to launch an aerostat blimp as part of its mission to monitor IoT sensors for digital agriculture and next-generation manufacturing in a 10-county region of Indiana. The project will be managed by RTO Wireless.

WHIN is headquartered in West Lafayette, Indiana. The group is a consortium of ten counties that is supported by the Lilly Endowment and draws on expertise from Purdue University and Ivy Tech, the local community college. The consortium is a scientific, educational, and charitable 501(c)(3) that is supported by growers, manufacturers, and tech partners such as Wabash National, Nucor, AgriNovus, Demeter, Myers Spring, and Oerlikon.

The concept behind WHIN is to deploy LoRaWAN technology to communicate with Internet of Things sensors for digital agriculture and manufacturing. LoRaWAN uses low-power radio signals that can cover a large area with a tall enough transmitter. The technology uses a unique spread-spectrum scheme that is ideal for communicating with IoT sensors, which only transmit data intermittently. A single LoRaWAN radio is theoretically able to communicate with huge numbers of sensors. One of the biggest promises of the technology is that it enables low-power and inexpensive sensors – something that is a challenge when using cellular or other wireless technologies.

The blimp will be tethered to the ground, making it an aerostat. Aerostat blimps are helium-filled and carry the needed electronics to power the LoRaWAN technology and CMRS radios. The press release says the first blimp will hover around 1,000 feet above ground level, which should provide line-of-sight to a large portion of the WHIN territory. The tether includes a power connection to a ground station to operate the electronics as well as a fiber optic cable to carry data traffic to and from the blimp electronics.

This is a creative solution for connecting farm sensors. A blimp stationed at 1,000 feet or higher will cover a much larger agricultural footprint than putting the transmitters on much shorter rural cellular towers. Using one transmitter to communicate with huge numbers of sensors is needed to make farm sensors affordable.
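A quick way to see why height matters is the standard radio-horizon rule of thumb, distance in miles ≈ 1.23 × √(height in feet). This is pure geometry – terrain, foliage, and sensor placement will shrink the real-world footprint – but a minimal sketch using the 1,000-foot figure from the press release shows the scale:

```python
# Rough line-of-sight estimate; real LoRaWAN coverage depends on terrain and sensor placement.
import math

height_ft = 1000
horizon_miles = 1.23 * math.sqrt(height_ft)       # standard radio-horizon approximation
coverage_sq_miles = math.pi * horizon_miles ** 2  # idealized circular footprint

print(f"Radio horizon from {height_ft} feet: about {horizon_miles:.0f} miles")
print(f"Idealized coverage circle: about {coverage_sq_miles:,.0f} square miles")
```

That works out to a radius of nearly 40 miles, roughly double the reach of a 250-foot rural tower.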

The IoT devices must be designed to communicate at the same frequency and with the same technology as the transmitter. Agricultural monitoring is done today with a hodgepodge of technologies. Dairy, hog, and chicken farmers often deploy a local WiFi network since they are usually monitoring a fairly small footprint. There are sensors today that can be read using cellular technology – but one of the big hurdles for cellular companies capturing this market is the ability to cover the huge swaths of farmland. I think one of the drivers behind the FCC’s deferred 5G grant program was to add cellular towers in remote farmlands.

The blimp launch will also test the idea of providing wireless connections for rural broadband. Fixed wireless electronics are typically installed on towers – and the taller the tower, the better for establishing line-of-sight communications with customers. Putting radios at 1,000 feet is five to seven times higher than the normal fixed wireless radio deployment.

There will be technology issues with establishing a fixed broadband link since a blimp moves around with the winds. One of the features that enables maximum bandwidth from a normal fixed wireless deployment is that the electronics are fixed in place at both ends, enabling a focused beam between the tower and a customer. It will be interesting to hear how WHIN handles the non-stationary transmitter on the blimp.

If this deployment is successful, we might see similar blimps appearing all over farming communities. We now know that the use of agricultural sensors can improve farm yields while better protecting the environment by reducing the amount of chemicals and insecticides used in farming. What’s been lacking is a platform and system for affordably monitoring sensors.

A New Push to Tax Broadband?

In August, four cities in Indiana – Indianapolis, Evansville, Valparaiso, and Fishers – sued Netflix, Hulu, DirecTV, and Dish Network, claiming that the online video services offered by these companies should have to pay the same franchise fees that cable companies pay for using local rights-of-way.

I’ve been covering in this blog how cord-cutting has been accelerating, especially this year, and cities are seeing a huge drop in cable franchise fees. These fees are generally levied against the fees charged for traditional cable TV service and are ostensibly to compensate the cities for using the public rights-of-way to deliver TV service.

These fees are a significant source of tax revenue for many communities, which is not hard to understand when you realize that a fee of 3% to 6% is added to the cost of every traditional cable TV bill. Most big cable companies say that average cable bills are trending toward $100 per month.

Cities have gotten spoiled by these fees because the amount of franchise fees collected has skyrocketed. For over a decade, cable companies raised cable rates by 9% or more per year, and those rate increases automatically meant franchise tax revenue increases for cities. While franchise fees might have been relatively small when first imposed, the tax revenues have grown large as the average cable TV bill approaches $100 per month.
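To put rough numbers on it (everything here is a made-up example, not data from any particular city), the arithmetic behind a franchise fee looks like this:

```python
# Hypothetical city: all inputs are illustrative, not actual figures.
cable_households = 40_000     # subscribers still buying traditional cable TV
avg_monthly_bill = 100.00     # the roughly $100 bill cited above
franchise_rate = 0.05         # somewhere in the 3% to 6% range

annual_franchise_revenue = cable_households * avg_monthly_bill * franchise_rate * 12
print(f"Annual franchise revenue: ${annual_franchise_revenue:,.0f}")
```

A few million dollars a year is real money to a city budget, and every cord-cutter chips away at it.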

In a recent blog, I talked about how homes are doing more than just cord-cutting. A survey by Roku showed 25% of TV subscribers are now cord-shavers who have trimmed the size of their cable bill by downgrading packages or dropping extras like movie channels. Cord-shaving also trims franchise tax collections, and franchise revenues at cities have to be in free fall.

Taxes that are imposed unevenly are usually challenged eventually. The cable industry has complained about franchise fees for years but never seriously tried to eliminate them. However, the big cable companies have recently been crying foul about competing with online video services that don’t have to collect franchise fees.

The franchise fees have always been hard to justify from a fairness perspective. If a telephone company or a fiber provider uses the same rights-of-way but doesn’t carry cable TV, its customers are not charged this same expensive tax. Cities could have charged a franchise fee more fairly on some other basis, such as per mile of cable installed. But cities latched onto a cable tax at a time when cable TV was a growing industry.

These Indiana cities are treading into dangerous legal waters because if the courts decide that Netflix doesn’t have to charge the franchise fee, that might provide a legal basis for the cable companies to claim that they also shouldn’t pay.

It would be a disturbing ruling if the online video companies end up having to pay a franchise fee. If Netflix has to pay to use the rights-of-way to reach homes, then why wouldn’t this apply to every other online subscription like newspapers, sports boards, etc.? There is nothing particularly different about Netflix’s video signals compared to the numerous other sources of video on the web. Bits of online video data are identical to every other bit of data delivered across ISP networks. From a functional perspective, if the cities win this lawsuit, they will be imposing a tax on some, but not all, bits passed between an ISP and a customer. That’s a line that I hope we don’t cross.

It’s not hard to understand why cities are unhappy about a drop in cable franchise tax revenues. But any tax that is specific to a given technology is going to change over time. Traditional cable TV as we’ve known it is fading away and could even completely disappear over the next decade. A tax on cable might seem as strange in a decade as a tax in the past on the proverbial buggy whip.

Is Telemedicine Here to Stay?

It’s going to be interesting to see if telemedicine stays after the end of the pandemic. In the past months, telemedicine visits have skyrocketed. During March and April, the billings for telemedicine were almost $4 billion, compared to only $60 million for the same months a year earlier.

As soon as Medicare and other insurance plans agreed to cover telemedicine, a lot of doctors insisted on remote visits during the first few months of the pandemic. In those early months, we didn’t know a lot about the virus, and doctor offices were exercising extreme caution about seeing patients. But now, only four months later, a lot of doctor’s offices are back to somewhat normal patient volumes, while screening patients at the door for temperature and symptoms.

I had two telemedicine visits during April, and the experience felt familiar since I was spending a lot of my day on Zoom meetings that month. These were Zoom-like connections using specialized software to protect patient confidentiality, but with a clearly higher resolution camera (and more bandwidth used) at the doctor’s end. I was put on hold waiting for the doctor just as I would have been in the doctor’s office. One of my two sessions dropped in the middle when the doctor’s office experienced a ‘glitch’ in bandwidth. That particular doctor’s office buys broadband from the local cable incumbent, and I wasn’t surprised to hear that they were having trouble maintaining multiple simultaneous telemedicine connections. It’s the same problem lots of homes have had during the pandemic when multiple family members try to connect to school and work servers at the same time.

One of my two telemedicine sessions was a little less than fully satisfactory. I got an infected finger from digging in the dirt, something many gardeners get occasionally. The visit would have been easier with a live doctor who could have physically looked at my finger. It was not easy trying to convey the degree of the problem to the doctor over a computer connection. The second visit was to talk with a doctor about test results, and during that telemedicine visit I wondered why all such doctor meetings aren’t done remotely. It seems unnecessary to march patients through waiting rooms full of sick patients just to have a chat with a doctor.

There was a recent article on the topic in Forbes that postulates that the future of telemedicine will be determined by a combination of acceptance by doctors and insurance companies. Many doctors have now had a taste of the technology. The doctors who saw me said that the technology was so new to them at the time that they hadn’t yet formed an opinion of the experience. It also seems likely that the telemedicine platforms in place now will get a lot of feedback from doctors and will improve in the next round of software upgrades.

The recent experience is also going to lead a lot of doctor’s offices to look harder at their broadband provider. Like most of us, a doctor’s office historically relied a lot more on download speed than upload speed. I think many doctor’s offices are going to find themselves unhappy with the cable modem service or DSL broadband that has been satisfactory in the past. Doctors will join the chorus of those advocating for faster broadband speeds – particularly upload speeds.

Telemedicine also means a change for patients. In both sessions, the doctor wanted to know my basic metrics – blood pressure, temperature, and oxygen level. It so happens that we already had the devices at home needed to answer those questions, but I have to think that most households do not.

I don’t think anybody is in a position to predict how insurance companies will deal with telemedicine. Most of them now allow it, and some have already expanded the use of telemedicine visits through the end of the year. The Forbes article suggests that insurance companies might want to compensate doctors at a lower rate for telemedicine visits, and if so, that’s probably not a good sign for doctors continuing the practice.

My prediction is that telemedicine visits will not stay at the current high level, but that they will be here to stay. I think when somebody books a visit to a doctor that they’ll be given a telemedicine option when the reason for the visit doesn’t require an examination. The big issue that will continue to arise is the number of homes without adequate bandwidth to hold a telemedicine session. We know there are millions of people in rural America who can’t make and maintain a secure connection for this purpose. There are likely equal millions in cities that either don’t have a home computer or a home broadband connection. And there will be many homes with so-so broadband that will have trouble maintaining a telemedicine connection. Telemedicine is going to lay bare all of our broadband shortcomings.

Is Online Programming Too Expensive?

I’ve read several articles recently conjecturing that the online programming services that mimic cable company TV are in trouble because they are too expensive. This matters when trying to understand the cord-cutting trend because homes are less likely to bolt traditional cable if they have to spend as much elsewhere to get the networks they still want to watch. I haven’t looked in a while, so I thought I’d make a new comparison. My local cable company is Charter Spectrum, so I compared the price of Charter cable TV to the online alternatives.

Charter’s base TV plan is called TV Select, and a new Charter subscriber gets a 12-month special price as follows:

  • $49.99 – 12-month advertised promotional price
  • $16.45 – Broadcast TV charge
  • $6.99 – Set-top box
  • $73.43 – Total monthly price during the 12-month promotion

After 12 months, the base price for TV Select goes from $49.99 to $73.99, a $24 increase – and the full monthly fee jumps to $97.43 after the end of the one-year promotion. I’m a sports fan, and to get all of the channels I want I’d have to subscribe to Charter’s TV Silver plan. That package is $20 more expensive than the Select plan – $93.43 per month for 12 months, and then $117.43 after the end of the promotion period.

Charter’s Broadcast TV Charge has been widely labeled as a hidden fee in that Charter never mentions the fee in any advertising about the cable product. Charter just raised the fee to $16.45 in August, up from $13.50, making it the highest such fee among the big cable companies. But Comcast is not far behind at $14.95 per month and that fee is likely to increase soon. This fee is where the big cable companies are aggregating the charges for local programming from network affiliates of ABC, CBS, FOX, and NBC.

Comcast, AT&T, and some other big cable companies also charge a Regional Sports Fee, but so far Charter covers this in its base cable price. The bottom line for a Charter customer is that my cheapest alternative that includes a full array of network cable channels will cost $73.43 per month for a year and then go up by $24 per month.

How does this compare with the online alternatives?

  • The cheapest online alternative might be Sling TV. They have two basic small packages that cost $25 each or both for $45. Sling TV has a balanced number of sports and non-sports channels, but in my case doesn’t carry every sports network I want to see. There are also $5 add-on packages that can drive the cost up to $60 to see the network channels most homes probably want to watch. Sling TV doesn’t carry a full array of local network affiliates.
  • Next up in price is Fubo TV, priced at $54.99 per month. This is a sports-centric network that is especially attractive to soccer fans since the network carries a wide array of international sports. Strangely, Fubo TV doesn’t carry ESPN (meaning they also don’t carry ABC or Disney).
  • At the same price of $54.99 is Hulu + Live TV. They carry all of the sports networks I am looking for and a wide array of other network channels. They also carry the local network affiliate channels for most major markets. For $60.99 you can get this service without commercials, which requires downloading shows to watch the commercial-free versions. Hulu + Live TV also lets families and friends network together to watch shows at the same time.
  • YouTube TV is perhaps the closest online product to compare to Charter’s cable TV plans. This is priced at $64.99 per month. As a sports fan, the YouTube TV lineup provides all of the channels I want to follow my Maryland Terrapins. YouTube TV carries the same local network affiliates for my market that are available on Charter.

All of the online TV options allow subscribers to drop or add the service easily at any time, although none of them give a refund for time already paid. This means no contracts and no term commitment.

It’s easy to see why homes think that online programming is too expensive, particularly since Charter falsely advertises its cable product at $49.99. But it costs almost $20 per month more to buy TV from Charter, even with the 12-month promotional price, and then $42 more per month at the end of the promotion period. It still mystifies me why homes with decent broadband don’t do the math and leave Charter for Hulu or YouTube TV.
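Here is the arithmetic behind that comparison in one place – a minimal sketch using the prices listed above, which change frequently:

```python
# Prices as listed in this post; check current rates before relying on them.
charter_promo = 49.99 + 16.45 + 6.99     # TV Select promo + Broadcast TV charge + set-top box
charter_regular = 73.99 + 16.45 + 6.99   # after the 12-month promotion ends

streaming = {"Sling TV (both base packages)": 45.00, "Fubo TV": 54.99,
             "Hulu + Live TV": 54.99, "YouTube TV": 64.99}

for service, price in streaming.items():
    print(f"{service}: saves ${charter_promo - price:.2f}/month vs. the Charter promo, "
          f"${charter_regular - price:.2f}/month after the promo ends")
```

The Hulu + Live TV line is where the roughly $20 and $42 monthly differences cited above come from.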

K12 Education During the Pandemic

Pew Stateline published a recent article talking about the widely disparate state of educating K12 students during the pandemic. Every school system has students without home broadband or home computers and school districts and states are dealing with these issues in widely different ways.

There are major challenges in educating students outside of the classroom. The Stateline article points out that there are issues beyond providing broadband and computers, and that kids still need adults to help direct their learning. But students without computers or broadband have virtually no chance of keeping up in an environment that relies fully or partially on learning from home.

The article cites a recent study by the Annenberg Institute at Brown University that looks at the impact of the pandemic in the spring semester of this year. The study estimates that students returning to school this fall will have made only between 63% and 68% of the gains in reading that would normally have been expected from the last school year. Students will have made only between 37% and 50% of the expected gains in math. It’s hard to imagine what happens to current students if virtual or interrupted education carries through much of the current school year. I’ve seen articles where various educators are already calling 2020 a ‘lost year’.

As part of my ongoing work with community broadband, I’ve talked to communities with a wide range of circumstances and proposed solutions. For example, I talked to the school administrator of a small rural school district that has roughly 600 students. The area resides in a broadband desert and most homes have no good home broadband option – even traditional satellite service barely works in the community where homes are nestled into canyons and valleys.

This small school district is trying the full range of solutions we hear about from across the country. The district has scrambled to find computers for students who don’t have them at home. The school district has obtained cellular hotspots for many rural students, although there are a lot of places in the county with little or no cellular coverage. The local government has tried to fill in the gaps in cellular coverage by deploying a number of public hotspots to provide places where students and home workers can find broadband. But probably the most important thing they are doing is that the superintendent of schools called every student in the district and is trying to find individual solutions for students who are having problems learning.

Even with all this effort, the school district acknowledges that this is not a solution that will work with all students and that some students are going to fall far behind. This school district is only able to tackle the above solutions due to the small number of students in the district. It’s hard to imagine how school districts with thousands of students can even attempt to provide individual solutions.

The pandemic has also shown us that ‘normal’ broadband is not adequate for homes with multiple students and adults trying to work from home at the same time. Even expensive cable broadband subscriptions can be inadequate when more than two people try to share the small upload bandwidth. Emergency home and public hotspots share the same problems and can easily get overwhelmed.

I don’t have any proposed solutions for the problem, and as a country, we’re going to have to somehow deal with a whole generation of students who have fallen behind the expected educational progression. I do not doubt that many school districts will figure this out when school gets back to normal.

For now, local communities have to try to take all of the steps needed to at least try to help students. I talked to somebody who does broadband mapping and was surprised to hear that many school districts are just now trying to figure out which students don’t have computers or home broadband. It’s been six months since the start of the pandemic and it’s hard to believe that school districts didn’t gather these basic facts before now.

States and localities everywhere have scrambled to create WiFi hotspots, but nobody should rest on their laurels and think that solves the problem. Many states and localities have used CARES Act money to buy computers, and as important as that is, it is only a piece of the solution. I’ve read that school districts scrambled all summer to adapt curriculum to an online format, but that also doesn’t fix the problem. The bare minimum answer is that school districts need to find ways to do all of the above, and more – and even with that, students are going to fall behind this school year. But what other choice do we have? As the Stateline article points out, some lucky families will hire tutors to keep students up to speed – but that’s not going to help the vast majority of students in the coming school year.

Gaming and Broadband Demand

Broadband usage has spiked across the US this year as students and employees suddenly found themselves working from home and needing broadband to connect to school and work servers. But there is another quickly growing demand for broadband coming from gaming.

We’ve had online gaming of some sort over the last decade, but gaming has not been a data-intensive activity for ISPs. Until recently, the brains for gaming have been provided by special gaming computers or game boxes run locally by each gamer. These devices and the game software supplied the intensive video and sound experience, and the Internet was only used to exchange game commands between gamers. Command files are not large and contain the same information that is exchanged between a game controller and a gaming computer. In the past, gamers would exchange the command files across the Internet, and local software would interpret and act on the commands being exchanged.

But the nature of online gaming is changing rapidly. Already, before the pandemic, game platforms had been migrating online. Game companies are now running the core software for games in a data center and not on local PCs or game consoles. The bandwidth required between the data center core and a gamer is much larger than for the old command-file exchange, since the data path now carries the full video and audio signals as well as two-way communications between gamers.

There is a big benefit of online gaming for gamers, assuming they have enough bandwidth to participate. Putting the gaming brains in a data center reduces latency, meaning that game commands can be acted on more quickly. Latency is signal delay, and the majority of the delay in any Internet transmission happens inside the wires and electronics of the local ISP network. With online gaming, a signal only has to cross the gamer’s local ISP network to reach the data center. Before online gaming, that signal had to pass through the local ISP networks of both gamers.
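A minimal sketch with made-up per-segment latencies shows why taking one ISP access network out of the path matters:

```python
# Illustrative latencies only; real numbers vary widely by ISP and technology.
segment_latency_ms = {
    "gamer A access network": 15,
    "internet backbone": 10,
    "gamer B access network": 15,
}

peer_to_peer = sum(segment_latency_ms.values())   # old model: commands cross both access networks
to_data_center = segment_latency_ms["gamer A access network"] + segment_latency_ms["internet backbone"]

print(f"Gamer-to-gamer path: ~{peer_to_peer} ms one way")
print(f"Gamer-to-data-center path: ~{to_data_center} ms one way")
```

In this toy example, moving the game brains into a data center shaves one entire access network off the one-way path.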

There are advantages for gaming companies to move online. They can release a new title instantly to the whole country. Game companies don’t have to manufacture and distribute copies of games. Games can now be sold to gamers who can’t afford the expensive game boxes or computers. Gamers benefit because gaming can now be played on any device and a gamer isn’t forced into buying an expensive gaming computer and then only playing in that one location. Game companies can now sell a gaming experience that can be played from anywhere, not just sitting at a gamer’s computer.

A gaming stream is far more demanding on the network than a video stream from Netflix. Netflix feeds out the video signal in advance of what a viewer is watching, and the local TV or PC stores video content for the next few minutes of viewing. This was a brilliant move by video streamers because streaming ahead of what viewers are watching largely eliminated the delays and pixelation that were common when Netflix was new. By streaming in advance of what a viewer is watching, Netflix has time to resend any missed packets so that the video has ideal quality by the time the viewer catches up to the stream.

Gaming doesn’t have this same luxury because gaming is played in real time. The gamers at both ends of a game need to experience the game at the same time. This greatly changes the demand on the broadband network. Online gaming means a simultaneous stream being sent from a data center to both gamers, and it’s vital that both gamers receive the signal at the same time. Gaming requires a higher quality of download path than Netflix because there isn’t time to resend missed data packets. A gamer needs a quality downstream path to receive a quality video transmission in real-time.
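A simple way to see the difference in error-recovery budgets, using assumed values for the buffer depth and game update rate:

```python
# Assumed values: a ~30-second client buffer for streamed video, a 60 Hz update rate for a game.
stream_buffer_seconds = 30
game_ticks_per_second = 60

streaming_budget_ms = stream_buffer_seconds * 1000      # time available to resend a lost packet
gaming_budget_ms = 1000 / game_ticks_per_second         # time before the next game update is due

print(f"Buffered video stream: ~{streaming_budget_ms:,.0f} ms to recover a lost packet")
print(f"Real-time game stream: ~{gaming_budget_ms:.0f} ms before the next update is due")
```

Thirty seconds versus roughly seventeen milliseconds is the gap between a stream that can quietly fix itself and one that can’t.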

Gaming adds a second big demand in that latency becomes critical. A player who receives the signal just a little faster than an opponent has an advantage. A friend of mine has symmetrical gigabit Verizon FiOS fiber broadband at his home, which is capable of delivering the best possible gaming data stream. Yet his son is driving his mother crazy by running Category 6 cables between the gaming display and the FiOS modem. He swears that bypassing the home WiFi lowers the latency and gives him an edge over other gamers. From a gamer perspective, network latency is becoming possibly more important than download speed. A gamer on fiber has an automatic advantage over a gamer on a cable company network.

At the same time as the gaming experience has gotten more demanding for network operators, the volume of gaming has exploded during the pandemic as people stuck at home have turned to gaming. All of the major game companies are reporting record earnings. The NPD Group, which tracks the gaming industry, reports that spending on gaming was up 30% in the second quarter of this year compared to 2019.

ISPs are already well aware of gamers, who are the harshest critics of broadband network performance. Gamers understand that the little network glitches, hiccups, and burps that other users may not even notice can cost them a game, and so gamers closely monitor network performance. Most ISPs know their gamers, who are the first to complain loudly about network problems.