Broadband Interference

Jon Brodkin of Ars Technica published an amusing story about how the DSL went out of service in a 400-resident village in Wales each morning at 7:00 am. It turns out that one of the residents turned on an ancient television that interfered with the DSL signal to the extent that the network collapsed. The ISP finally figured this out by walking around the village in the morning with a spectrum analyzer until they found the source of the interference.

It’s easy to think that the story points out another weakness of old DSL technology, but interference can be a problem for a lot of other technologies.

This same problem is common on cable company hybrid-fiber coaxial networks. The easiest way to understand this is to think back to the old days when we all watched analog TV. Anybody who watched programming on channels 2 through 5 remembers times when the channels got fuzzy or even became unwatchable. It turns out that a lot of different devices interfere with the frequencies used for these channels, including things like microwave ovens, certain motors like power tools and lawnmowers, and other devices like blenders. It was a common household occurrence for one of these channels to go fuzzy when somebody in the house, or even in a neighboring home, used one of these devices.

This same interference carries forward into cable TV networks. Cable companies originally used the same frequencies for TV channels inside the coaxial wires that were used over the air, and the low TV channels sat between 5 MHz and 42 MHz. It turns out that long stretches of coaxial wires on poles act as a great antenna, so cable systems pick up the same kinds of interference that happen in homes. It was pretty routine for channels 2 and 3, in particular, to be fuzzy in an analog cable network.

You’d think that this interference might have gone away when cable companies converted TV signals to digital. The TV transmissions for channels 2 through 5 got crystal clear because cable companies relocated the digital version of these channels to better frequencies. When broadband was added to cable systems, the cable companies continued to use the low frequencies – CableLabs elected to use these frequencies for the upload portion of broadband. There is still plenty of interference in cable networks today, probably even more than years ago, as coaxial networks have aged and have more points for interference to seep into the wires. Until the pandemic, we didn’t care much about upload bandwidth, but it turns out that one of the major reasons that cable companies struggle to deliver reliable upload speeds is that they are using the noisiest spectrum for the upload function.

The DSL in the village suffered from the same issue since the telephone copper wires also act as a big outdoor antenna. In this village, the frequency emanating from the old TV exactly matched the frequencies used for DSL.

Another common kind of interference is seen in fixed wireless networks in a situation where there are multiple ISPs using the same frequencies in a given rural footprint. I know of counties where there are as many as five or six different wireless ISPs, and most use the same frequencies since most WISPs rely on a handful of channels in the traditional WiFi bands at 2.4 GHz and 5 GHz. I’ve heard of situations where WiFi is so crowded that the performance of all WISPs suffers.

WiFi also suffers from local interference in the home. The WiFi standard says that all devices have an equal chance of using the frequencies. This means that a home WiFi router will cycle through all the signals from all devices trying to make a WiFi connection. When a WiFi router connects with an authorized device inside the home it allows for a burst of data, but then the router disconnects that signal and tries the next signal – cycling through all of the possible sources of WiFi.
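The airtime-sharing behavior described above can be sketched as a toy model (all numbers are invented for illustration): because the router gives every active device an equal chance at the channel, each device's share of throughput shrinks as more devices contend.

```python
# Toy model of WiFi airtime sharing: the router grants short, equal
# bursts to every active device, so per-device throughput falls as
# the device count grows. The 100 Mbps channel figure is illustrative.

def per_device_throughput(channel_mbps: float, active_devices: int) -> float:
    """Each active device gets an equal slice of channel airtime."""
    if active_devices < 1:
        raise ValueError("need at least one device")
    return channel_mbps / active_devices

# A nominal 100 Mbps channel shared by a growing household:
for n in (1, 4, 10, 20):
    print(f"{n:>2} devices -> {per_device_throughput(100, n):.1f} Mbps each")
```

This is of course a simplification – real WiFi contention also burns airtime on collisions and retransmissions, so actual per-device throughput degrades even faster than this equal-split model suggests.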

This is the same issue that is seen by people using WiFi in a high-rise apartment building or a hotel where many users are trying to connect to WiFi at the same time. Luckily this problem ought to improve. The FCC has authorized the use of 6 GHz spectrum for home broadband which opens up numerous new channels. Interference will only occur between devices trying to share a channel, but that will be far fewer cases of interference than today.

The technology that has no such interference is fiber. Nothing interferes with the light signal between a fiber hub and a customer. However, once customers connect the broadband signal to their home WiFi network, the same interference issues arise. I looked recently and can see over twenty other home WiFi networks from my office – a setup ripe for interference. Before making too much fun of the folks in the Welsh village, there is a good chance that you are subject to significant interference in your home broadband today.

K12 Education During the Pandemic

Pew Stateline published a recent article talking about the widely disparate state of educating K12 students during the pandemic. Every school system has students without home broadband or home computers, and school districts and states are dealing with these issues in widely different ways.

There are major challenges in educating students outside of the classroom. The Stateline article points out that there are issues beyond providing broadband and computers, and that kids still need adults to help direct their learning. But students without computers or broadband have virtually no chance of keeping up in an environment that relies fully or partially on learning from home.

The article cites a recent study by the Annenberg Institute of Brown University that looks at the impact of the pandemic in the spring semester of this year. The study estimates that students returning to school this fall will have made only 63% to 68% of the gains in reading that would normally have been expected from the last school year, and only 37% to 50% of the expected gains in math. It’s hard to imagine what happens to current students if virtual or interrupted education carries through much of the current school year. I’ve seen articles where various educators are already calling 2020 a ‘lost year’.

As part of my ongoing work with community broadband, I’ve talked to communities with a wide range of circumstances and proposed solutions. For example, I talked to the school administrator of a small rural school district that has roughly 600 students. The area resides in a broadband desert and most homes have no good home broadband option – even traditional satellite service barely works in the community where homes are nestled into canyons and valleys.

This small school district is trying the full range of solutions we hear from across the country. The district has scrambled to find computers for students that don’t have them at home. The school district has obtained cellular hotspots for many rural students, although there are a lot of places in the county with little or no cellular coverage. The local government has tried to fill in the gap in cellular coverage by deploying a number of public hotspots to provide places where students and home workers can find broadband. But probably the most important thing they are doing is that the superintendent of schools called every student in the district and is trying to find individual solutions for students who are having problems learning.

Even with all this effort, the school district acknowledges that this is not a solution that will work with all students and that some students are going to fall far behind. This school district is only able to tackle the above solutions due to the small number of students in the district. It’s hard to imagine how school districts with thousands of students can even attempt to provide individual solutions.

The pandemic has also shown us that ‘normal’ broadband is not adequate for homes with multiple students and adults trying to work from home at the same time. Even expensive cable broadband subscriptions can be inadequate when more than two people try to share the small upload bandwidth. Emergency home and public hotspots share the same problems and can easily get overwhelmed.

I don’t have any proposed solutions for the problem, and as a country, we’re going to somehow have to deal with a whole generation of students that have fallen behind the expected education progression. I don’t doubt that when school gets back to normal, many school districts will figure this out.

For now, local communities have to try to take all of the steps needed to at least try to help students. I talked to somebody who does broadband mapping and was surprised to hear that many school districts are just now trying to figure out which students don’t have computers or home broadband. It’s been six months since the start of the pandemic and it’s hard to believe that school districts didn’t gather these basic facts before now.

States and localities everywhere have scrambled to create WiFi hotspots, but nobody should rest on their laurels and think that solves the problem. Many states and localities have used CARES Act money to buy computers, and as important as that is, it is only a piece of the solution. I’ve read that school districts scrambled all summer to adapt curriculum to an online format, but that also doesn’t fix the problem. The bare minimum answer is that school districts need to find ways to do all of the above, and more – and even with that, students are going to fall behind this school year. But what other choice do we have? As the Stateline article points out, some lucky families will hire tutors to keep students up to speed – but that’s not going to help the vast majority of students in the coming school year.

Gaming and Broadband Demand

Broadband usage has spiked across the US this year as students and employees suddenly found themselves working from home and needing broadband to connect to school and work servers. But there is another quickly growing demand for broadband coming from gaming.

We’ve had online gaming of some sort over the last decade, but gaming has not been a data-intensive activity for ISPs. Until recently, the brains for gaming have been provided by special gaming computers or game boxes run locally by each gamer. These devices and the game software supplied the intensive video and sound experience, and the Internet was only used to exchange game commands between gamers. Command files are not large and contain the same information that is exchanged between a game controller and a gaming computer. In the past, gamers would exchange the command files across the Internet, and local software would interpret and activate the commands being exchanged.

But the nature of online gaming is changing rapidly. Already, before the pandemic, game platforms had been migrating online. Game companies are now running the core software for games in a data center and not on local PCs or game consoles. The bandwidth path required between the data center core and a gamer is much larger than the command files that used to be exchanged since the data path now carries the full video and music signals as well as 2-way communications between gamers.

There is a big benefit of online gaming for gamers, assuming they have enough bandwidth to participate. Putting the gaming brains in a data center reduces the latency, meaning that game commands can be activated more quickly. Latency is signal delay, and the majority of the delay in any internet transmission happens inside the wires and electronics of the local ISP network. With online gaming, a signal only has to cross the gamer’s own local ISP network to reach the data center. Before online gaming, that signal had to pass through the local ISP networks of both gamers.
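Under this model, the latency advantage can be worked out with simple arithmetic (the millisecond figures here are invented purely for illustration, not measurements):

```python
# Rough latency comparison under the model above: most delay lives in
# each gamer's local ISP network. All millisecond values are assumed
# illustrative numbers, not real measurements.

LOCAL_ISP_MS = 15   # assumed delay inside one local ISP network
BACKBONE_MS = 5     # assumed delay across the internet backbone

def peer_to_peer_latency() -> int:
    # Old model: a command crosses both gamers' local ISP networks
    # plus the backbone between them.
    return LOCAL_ISP_MS + BACKBONE_MS + LOCAL_ISP_MS

def data_center_latency() -> int:
    # New model: a command only crosses the sending gamer's local
    # ISP network to reach the game's data center.
    return LOCAL_ISP_MS + BACKBONE_MS

print(peer_to_peer_latency())  # 35 (ms)
print(data_center_latency())   # 20 (ms)
```

With these assumed numbers, hosting the game in a data center removes one full traversal of a local ISP network, which is exactly where the model says most of the delay lives.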

There are advantages for gaming companies to move online. They can release a new title instantly to the whole country. Game companies don’t have to manufacture and distribute copies of games. Games can now be sold to gamers who can’t afford the expensive game boxes or computers. Gamers benefit because gaming can now be played on any device and a gamer isn’t forced into buying an expensive gaming computer and then only playing in that one location. Game companies can now sell a gaming experience that can be played from anywhere, not just sitting at a gamer’s computer.

A gaming stream is far more demanding on the network than a video stream from Netflix. Netflix feeds out the video signal in advance of what a viewer is watching, and the local TV or PC stores video content for the next few minutes of viewing. This was a brilliant move by video streamers because streaming ahead of what viewers are watching largely eliminated the delays and pixelation of video streams that were common when Netflix was new. By streaming in advance of what a viewer is watching, Netflix has time to resend any missed packets so that the video viewing experience has ideal quality by the time a viewer catches up to the stream.
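The buffer-ahead trick can be sketched as a tiny model (the numbers are illustrative assumptions, not real player parameters): a lost packet only causes a visible glitch if its resend takes longer than the amount of unplayed video sitting in the buffer.

```python
# Sketch of why buffer-ahead streaming tolerates packet loss: the
# player holds video it hasn't shown yet, so a re-sent packet can
# still arrive before its playback deadline. Numbers are illustrative.

def glitches(lost_packets: int, resend_ms: float, buffered_ms: float) -> int:
    """A lost packet only causes a visible glitch if its resend takes
    longer than the amount of unplayed video in the buffer."""
    return lost_packets if resend_ms > buffered_ms else 0

# A Netflix-style stream with ~30 seconds buffered absorbs every
# resend; a real-time game with no buffer shows every loss:
print(glitches(lost_packets=20, resend_ms=500, buffered_ms=30_000))  # 0
print(glitches(lost_packets=20, resend_ms=500, buffered_ms=0))       # 20
```

The second call is exactly the situation the next paragraph describes: real-time gaming has no buffer to hide behind, so there is no time to resend missed packets.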

Gaming doesn’t have this same luxury because gaming is played in real time. The gamers at both ends of a game need to experience the game at the same time. This greatly changes the demand on the broadband network. Online gaming means a simultaneous stream being sent from a data center to both gamers, and it’s vital that both gamers receive the signal at the same time. Gaming requires a higher quality of download path than Netflix because there isn’t time to resend missed data packets. A gamer needs a quality downstream path to receive a quality video transmission in real-time.

Gaming adds a second big demand in that latency becomes critical. A player that receives the signal just a little faster than an opponent has an advantage. A friend of mine has symmetrical gigabit Verizon FiOS fiber broadband at his home, which is capable of delivering the best possible gaming data stream. Yet his son is driving his mother crazy by running category 6 cables between the gaming display and the FiOS modem. He swears that bypassing the home WiFi lowers the latency and gives him an edge over other gamers. From a gamer’s perspective, network latency is possibly becoming more important than download speed. A gamer on fiber has an automatic advantage over a gamer on a cable company network.

At the same time as the gaming experience has gotten more demanding for network operators, the volume of gaming has exploded during the pandemic as people stuck at home have turned to gaming. All of the major game companies are reporting record earnings. The NPD Group, which tracks the gaming industry, reports that spending on gaming was up 30% in the second quarter of this year compared to 2019.

ISPs are already well aware of gamers, who are the harshest critics of broadband network performance. Gamers understand that little network glitches, hiccups, and burps that other users may not even notice can cost them a game, and so gamers closely monitor network performance. Most ISPs know their gamers, who are the first to complain loudly about network problems.

AT&T Argues for Broadband Reform

Ed Gillespie, the Senior Executive Vice President of External & Legislative Affairs at AT&T, posted a policy position on the AT&T website that argues for major policy reform to better bring broadband to low-income homes and rural areas.

It’s hard for any broadband advocate to not agree with the suggestions Mr. Gillespie is making:

  • He wants Congress to finish funding the new FCC mapping program to identify homes without access to broadband.
  • He supports additional broadband grant funding for programs like the $20 billion RDOF grants.
  • He supports Lifeline reform and says that it should be as easy for low-income homes to apply a Lifeline discount as it is to use a card to buy food from the SNAP program.
  • He thinks funding should be increased for the Lifeline program and should be funded by Congress rather than funded through a 26% tax on interstate telephony.

I hope AT&T is serious about these proposals because having them lobby for these ideas would help to move the needle on digital inclusion. It’s just odd to see these positions from AT&T since they have spent a lot of effort and dollars arguing against some of these policies.

Mr. Gillespie complains that a lot of the current $9.25 Lifeline discount program is used by MVNOs and other carriers that have not built networks. That’s an ironic argument for AT&T to make since the company has done its best to walk away from the Lifeline program. AT&T no longer offers Lifeline in 14 states – AL, AR, FL, IN, KS, KY, LA, MS, NC, NV, SC, TN, and WI. AT&T still participates in Lifeline in 6 states, but only because those states refuse to allow the company to exit the Lifeline program.

Of course, this would not be an AT&T policy paper if the company didn’t pat itself on the back a bit. Mr. Gillespie brags that the ISP networks in the country weathered the big increase in web traffic due to the pandemic even though predictions were made that networks would collapse. He claims that AT&T made it through the pandemic due to light touch regulation. The fact is, once it was understood that the new traffic on the web was coming during the daytime when the network wasn’t busy, I don’t know any network engineer who thought that the web would collapse. I also wonder why AT&T would claim to have weathered the pandemic well – I would challenge AT&T to bring forth happy customers using AT&T DSL and ask for their testimonials on how the AT&T network enabled multiple people to work from home at the same time.

Mr. Gillespie is also calling for an expansion of the concepts used in the RDOF grants. Those grants provide funding for new broadband networks in rural areas that have the worst broadband. Before supporting an expansion of that grant program, I think many of us are reserving judgment on the RDOF reverse auction process. While I think it’s likely that there will be beneficial grants given to those willing to build rural fiber networks, I also fear that a huge amount of these grants are going to be wasted on satellite broadband or other technologies that don’t bring rural broadband in line with urban broadband. I’m not ready to bless that grant program until we see how the reverse auction allocates money. I also can’t help being suspicious that AT&T’s position in favor of more grants reflects a hope to win billions of new grant dollars.

Interestingly, even though he never says it, the reforms that Mr. Gillespie is asking for require new broadband regulation. I’m sure that Mr. Gillespie must realize that bills needed from Congress for these reforms are not likely to stop with just AT&T’s wish list. We are long overdue for a new telecommunications act that brings broadband regulation in line with today’s reality. The last such law was passed at a time when people were flocking to AOL for dial-up access. It’s highly likely that new telecom legislation is going to go beyond what AT&T is calling for. It’s likely that new legislation will give some broadband regulating authority back to the FCC and will likely include some version of net neutrality. It’s ironic to see arguments for a stronger FCC when the FCC walked away from regulating broadband at the urging of AT&T and the other giant ISPs. Perhaps even AT&T knows it went too far with deregulation.

Will Congress Fund Rural Broadband?

Members of Congress seem to be competing to sponsor bills that will fund rural broadband. There are so many competing bills that it’s getting hard to keep track of them all. Hopefully, some effort will be made to consolidate the bills together into one coherent broadband funding bill.

The latest bill is the Accessible, Affordable Internet for All Act, introduced in the House of Representatives. This is part of a plan to provide $1.5 trillion of infrastructure funding that would include $100 billion for rural broadband. $80 billion of the funding would be used to directly construct rural broadband. It’s worth looking at the details of this bill since it’s similar to some of the other ideas floating around Congress.

The bill focuses on affordability. In addition to building broadband it would:

  • Require ISPs to offer an affordable service plan to every consumer.
  • Provide a $50 monthly discount on internet plans for low-income households and $75 for those on tribal lands.
  • Give a preference to networks that will offer open access to give more choice to consumers.
  • Direct the FCC to collect data on broadband prices and to make that data widely available to other Federal agencies, researchers, and public interest groups.
  • Direct the Office of Internet Connectivity and Growth to conduct a biennial study to measure the extent to which cost remains a barrier to broadband adoption.
  • Provide over $1 billion to establish two new grant programs: the State Digital Equity Capacity Program, an annual grant program for states to create and implement comprehensive digital equity plans to help close gaps in broadband adoption and digital skills, and the Digital Equity Competitive Grant Program, which will promote digital inclusion projects undertaken by individual organizations and local communities.
  • Provide $5 billion for the rapid deployment of home internet service or mobile hotspots for students without a home Internet connection.

This bill also guarantees the right of local governments, public-private partnerships, and cooperatives to deliver broadband service – which would seemingly override the barriers in place today in 21 states that block municipal broadband and the remaining states that don’t allow electric cooperatives to be ISPs.

This and the other bills have some downsides. The biggest downside is the use of a reverse auction. There are two big problems with reverse auctions that the FCC doesn’t seem to want to acknowledge. First, a reverse auction requires the FCC to predetermine the areas that are eligible for grants – and that means relying on their lousy data. Just this month I was working with three different rural counties where the FCC records show the entire county has good broadband because of over-reporting of speeds by a wireless ISP. In one county, a WISP claimed countywide availability of 300 Mbps broadband. In another county, a WISP claimed countywide coverage of 100 Mbps symmetrical broadband, when their closest transmitter was a county and several mountain ranges away. Until these kinds of mapping issues are fixed, any FCC auctions are going to leave out a lot of areas that should be eligible for grants. The people living in these areas should not suffer due to poor FCC data collection.

Second, there are not enough shovel-ready projects ready to chase $80 billion in grant funding. If there is no decent ISP ready to build in a predetermined area, the funding is likely to revert to a satellite provider, as happened when Viasat was one of the largest winners in the CAF II reverse auction. The FCC also recently opened the door to allowing rural DSL into the upcoming RDOF grant – a likely giveaway to the big incumbent telcos.

This particular bill has a lot of focus on affordability, and I am a huge fan of getting broadband to everybody. But policymakers have to know that this comes at a cost. If a grant recipient is going to offer affordable prices and even lower prices for low-income households then the amount of grant funding for a given project has to be higher than what we saw with RDOF. There also has to be some kind of permanent funding in place if ISPs are to provide discounts of $50 to $75 for low-income households – that’s not sustainable out of an ISP revenue stream.

The idea of creating huge numbers of rural open-access networks is also an interesting one. The big problem with this concept is that there are many places in the country where there are few, or even no, local ISPs. Is it an open-access network if only one ISP, or even none, shows up to compete on a rural network?

Another problem with awarding this much money all at once is that there are not enough good construction companies to build this many rural broadband networks in a hurry. In today’s environment, that kind of construction spending would superheat the market and drive up the cost of construction labor by 30% to 50%. It would be just as hard to find good engineers and good construction managers in an overheated market – $80 billion is a lot of construction projects.

Don’t take my negative comments to mean I am against massive funding for rural broadband. But if we do it poorly, a lot of the money might as well just be poured into a ditch. This much money used wisely could solve a giant portion of the rural broadband problem. But done poorly, many rural communities with poor broadband probably won’t get a solution. Congress has the right idea, but I hope that they don’t dictate how to disperse the money without talking first to rural industry experts, or this will be another federal program with huge amounts of wasted and poorly spent money.

An End to Data Caps?

All of the major ISPs that were enforcing data caps have lifted those caps in response to the COVID-19 crisis. This includes AT&T, Comcast, Cox, Mediacom, and CenturyLink. All of these companies justified data caps as a network management tool that was in place to discourage overuse of the network. That argument no longer holds water if these ISPs eliminate the caps during a crisis that is overtaxing networks more than we are likely to ever see again.

These companies eliminated the caps as a result of political pressure and from a mass public outcry. The caps were eliminated to make broadband more affordable in a time when millions of people are becoming unemployed. By eliminating the caps during this crisis, these ISPs have publicly admitted that the caps were about making money and not for any issues related to network traffic.

The lame justification these ISPs gave for data caps was always weak when other large ISPs like Charter, Verizon, Altice, Frontier, and Windstream never implemented data caps. A few ISPs on that list like Frontier and Windstream have some of the weakest networks in the country yet never sought to implement data caps as a traffic control measure.

AT&T has been the king of data caps. They have data caps that kick in as low as 150 gigabytes of monthly broadband usage on DSL lines. AT&T’s fixed wireless product for rural markets has a data cap that kicks in at 250 GB. Interestingly, customers buying the 300 Mbps product on fiber have a 1 terabyte data cap, while customers buying gigabit broadband on fiber are allowed unlimited usage. This also proves that the data caps aren’t about traffic control – the caps are removed from the largest data users. The AT&T caps are there to encourage somebody buying 300 Mbps to upgrade to the faster service. AT&T is also the king of overage charges. For DSL and fixed wireless, the overage charges are $10 for each 50 GB, with a maximum monthly overage of a whopping $200. The monthly dollar cap on 300 Mbps service is $100.
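The billing rule described above is easy to work through as arithmetic. Here is a small sketch of that rule (a model of the plan terms as described, not AT&T's actual billing code):

```python
import math

def overage_charge(usage_gb: float, cap_gb: float,
                   block_gb: float = 50, block_price: float = 10,
                   monthly_max: float = 200) -> float:
    """Model of the overage rule described above: $10 per 50 GB block
    over the cap, with a $200 monthly maximum. Partial blocks are
    assumed to round up, which typical overage billing does."""
    if usage_gb <= cap_gb:
        return 0.0
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)
    return min(blocks * block_price, monthly_max)

# A 150 GB DSL cap with 400 GB of usage: 250 GB over = 5 blocks = $50.
print(overage_charge(400, 150))   # 50.0
# Heavy usage hits the $200 monthly maximum (37 blocks would be $370):
print(overage_charge(2000, 150))  # 200.0
```

Run against the OpenVault figures cited below, this rule shows why the caps are lucrative: a DSL household at the 344 GB average usage would already owe $40 a month in overage charges on a 150 GB cap.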

Mediacom had the next lowest data caps at 400 gigabytes. Comcast and Cox have had data caps at 1 TB. It’s been reported by customers that the companies aggressively enforce the caps. CenturyLink has mostly not billed the data caps, but the fact that they eliminated the caps during this crisis likely means they were billing them to some customers.

To put these data caps in context, OpenVault says that at the end of 2019 the average household used 344 gigabytes of data, up from 275 gigabytes a year earlier. More germane to data caps, OpenVault says that nearly 1% of homes now use 2 terabytes of data per month and 7.7% use over 1 terabyte per month. The percentage of homes using over 1 terabyte climbed from 4% a year earlier. AT&T has likely been cleaning up with data cap charges while Comcast was just starting to see some real revenue from the caps.

What remains to be seen is if these ISPs reintroduce data caps sometime later this year. They can no longer make a straight-faced claim that data caps are in place to dissuade overuse of the network. If data caps had that kind of impact on networks, then during the crisis the ISPs should have tightened the data cap threshold to protect the many new households that are working from home or doing schoolwork remotely. The data caps have always been about money, nothing else.

Unfortunately, we have no recourse other than a loud public outcry if these ISPs renew the data caps. The FCC has completely washed its hands of broadband regulation and killed its authority to do anything about data caps. Most tellingly, when FCC Chairman Ajit Pai released a plea to ISPs to “Keep Americans Connected”, that plea didn’t even mention data caps. Chairman Pai asked ISPs not to disconnect customers for not paying and asked ISPs to provide more public hotspots.

I bet that when this crisis is over the big ISPs will quietly introduce data caps again. Even before this crisis, almost 9% of homes routinely used more than a terabyte of data, and the data caps are a huge moneymaker for the big ISPs that they are not willingly going to give up. During this crisis, a lot of routine functions have gone virtual, and I expect a lot of them will stay virtual when the crisis is over. It wouldn’t be surprising a year from now to see 20% of homes routinely exceeding a terabyte of usage each month.

I think these ISPs will be making a huge mistake if they reintroduce data caps a few months, or even a year, from now. Millions of people found themselves unable to work or attend school from home due to poor broadband. In the current environment, a big public outcry against bad ISP behavior has a high chance of bringing Congressional action. Almost nobody except the most partisan politicians would vote against a bill that bans data caps. The ISPs should be afraid of other restrictions that might come along with such a bill.

The Fragile Supply Chain

The recent outbreak of the coronavirus reminded us how fragile the supply chain is for telecom. As it turns out, the Hubei province of China is where much of the world’s optics and lasers are built, the key components in every device used to communicate in a fiber network. Within days after the reality of the virus became apparent, the stocks of tech companies that rely on lasers took a hit.

The supply chain for electronics manufacturing stretches worldwide. The lasers are made in one place. The chips in devices are made somewhere else. Other electronic components come from a third geographic source. Components like cellphone screens and other non-electric components come from yet a different place. And the raw materials to make all of these devices come from markets all over the world.

The virus scare made the world wake up to the fragility of the supply chain. Without lasers, there would be no fiber-to-the-home devices manufactured. There would be no new servers in data centers. There would be no new small cell sites built or activated. Major industries could be brought to their knees within weeks.

It’s not hard to understand why I say the supply chain is fragile. Consider smartphones. There are probably a dozen components in a smartphone that must be delivered on time to a smartphone factory to keep the manufacturing process going. If any one of those components can’t be delivered, smartphone manufacturing comes to a halt. The manufacturing floor can be crippled by a lack of screens just as much as it can suffer if the chips, antennas, or other key electronic components become unavailable.

It’s impossible to know if the coronavirus will cause any major disruption in the supply chain for fiber optics – but the point is that it could. If it’s not a virus today, disruptions could come from a wide range of natural disasters and manmade problems. I remember a fire that destroyed a fiber optic cable factory a few decades ago that created a major shortfall of optic cables for a year. Floods, fires, earthquakes, and other disasters can knock out key manufacturing sites.

Manmade disruptions to the supply chain are even easier to imagine. We saw the price of electronics components shoot up over the last year due to tariff battles between the US and China. The supply chain can be quickly cut if the country making devices goes to war, or even undergoes an ugly regime change. It’s also now possible to weaponize the supply chain and threaten to cut off key components when negotiating other issues.

I’m sure that very few Americans realized that the Wuhan region has a near-monopoly on the manufacture of lasers. A worldwide economy rewards the creation of monopolies because components are cheapest when an industry takes the most advantage of economies of scale. The companies in the Wuhan region can likely manufacture lasers cheaper than anybody else.

From a strategic position, countries like the US should foster their own industries to manufacture vital components. But that’s not easy or practical to achieve. A new US company trying to compete on the world stage by making lasers is likely to be more expensive and unable to compete when the supply chain is humming at normal capacity. It’s hard to picture creating a competitor to the Wuhan region that can manufacture lasers in the quantities, and at a price the market is willing to pay.

In the long run, the world always finds alternate solutions to any permanent changes in the supply chain. For example, if China is ever unable to export lasers, within a few years other countries would pick up the slack. But the fiber industry would be devastated during the lull needed to find a new source of components. Bank of America reported last year that 3,000 major manufacturing companies were already reconsidering their supply chains due to tariffs and other concerns. Some of these companies, particularly electronics companies, have been considering bringing production back to the US now that factories can be heavily robotized. I’m sure the coronavirus has accelerated these decisions.


Will Costly Alternatives Slow Cord Cutting?

The primary reason that households give for cutting the cord is price. Surveys have shown that most households regularly watch around a dozen cable channels, and cord cutters still want to see their favorite channels. Not all cord cutters are willing to go cold turkey on the traditional cable networks, so they seek out an online alternative that carries the networks they want to watch.

For the last few years, there have been online alternatives that carry the most popular cable networks for prices between $35 and $45 per month. However, during the last year, the cost of these alternatives has risen significantly. I doubt that the price increases will drive people back to the cable companies, where they had to pay hidden fees and rent a set-top box, but the higher prices might make more households hesitate to make the switch. Following are the current prices of the major online alternatives to traditional cable TV:

Hulu Live TV. This service is owned 2/3 by Disney and 1/3 by Comcast. They recently announced a price increase effective December 18 to move the package from $44.99 to $54.99. Customers can also select an ad-free version for $60.99. At the beginning of 2019, the service was priced at $39.99, so the price increased by 37.5% during the year.

AT&T TV Now (was called DirecTV Now) raised the price of the service earlier this year from $50 to $65. The company also raised the prices significantly for DirecTV over satellite and lost millions of customers between the two services.

YouTube TV raised prices in May from $40 to $50. This service is owned by Google. Along with the price increase, the service added the Discovery Channel.

Sling TV is owned by Dish Network. They still have the lowest prices for somebody looking for a true skinny package. They offer two line-ups, called Blue and Orange, that each cost $25 per month, or both for $40 per month. There are also add-on packages for $5 per month each for Kids (Nick channels, Disney Jr), Lifestyle (VH-1, BET, diy, Hallmark), Heartland (outdoor channels), and Hollywood (TCM, Sundance, Reelz), along with News, Spanish, and International packages. One of the big things missing from Sling TV is local network channels, so the company provides an HD antenna with a subscription instead. Sling TV has spread the most popular channels in such a way that customers can easily spend $50 to $60 monthly to get their favorite channels.

Fubo TV is independent and not associated with another big media company. They offer 179 channels, including local network channels for $54.99 per month. The network started with sports coverage including an emphasis on soccer.

TVision Home is owned by T-Mobile. This was formerly known as Layer3 TV. The company has never tried to make this a low-cost alternative and it’s the closest online service to mimic traditional cable TV. The service is only available today in a few major markets. Customers can get an introductory price of $90 per month (goes up to $100 after a year). They charge $10 per extra TV and also bill taxes that range from 4% to 20% depending upon the market. This is cable TV delivered over broadband.

PlayStation Vue. The service is owned by Sony, which has announced that it will cease operations at the end of January 2020. The service is no longer taking new customers. The price of the core packages is $55 per month, which increased by $5 in July. The service carries more sports channels than most of the other services.

The channels offered by each service differ, so customers need to shop carefully and compare lineups. For example, I’m a sports fan, and Sling TV and Fubo TV don’t carry the Big Ten Network. There are similar gaps throughout the lineups of all of the providers.

All of these alternatives, except perhaps TVision Home, are still less expensive than most traditional cable TV packages. However, it looks like all of these services are going to routinely increase rates to cover increased programming fees. Couple that with the fact that customers dropping cable TV probably lose their bundling discounts, and a lot of households are probably still on the fence about cord cutting.

A Peek at AT&T’s Fixed LTE Broadband

Newspaper articles and customer reviews provide a glimpse into the AT&T wireless LTE product being used to satisfy the original CAF II obligations. This article from the Monroe County Reporter reports on AT&T wireless broadband in Monroe County, Georgia. This is a county where AT&T accepted over $2.6 million from the original CAF II program to bring broadband to 1,562 rural households in the County.

Monroe is a rural county southeast of Atlanta with Forsyth as the county seat. As you can see by the county map accompanying this blog, AT&T was required to cover a significant portion of the county (the areas shown in green) with broadband of at least 10/1 Mbps. In much of the US, AT&T elected to use the CAF II money to provide faster broadband from cellular towers using LTE technology.

The customer cited in the article is happy with the AT&T broadband product and is getting 30/20 Mbps service. AT&T is cited in the article saying that the technology works best when serving customers within 2 miles of a cell tower, but that the coverage can sometimes extend to 3 miles. Unfortunately, 2 miles or even 3 miles isn’t very far in rural America and there are going to be a lot of homes in the CAF II service area that will be too far from an AT&T cell tower to get broadband.

From the AT&T website, the pricing for the LTE broadband is as follows. The standalone data product is $70 per month. Customers can get the product for $50 per month with a 1-year contract if they subscribe to DirecTV or an AT&T cellular plan that includes at least 1 GB of cellular broadband allowance. The LTE data product has a tiny data cap of 215 GB of download per month. Customers that exceed the data cap pay $10 for each additional 50 GB of data, up to a maximum fee of $200 per month.

The average household broadband usage was recently reported by OpenVault as 275 GB per month. A household using that average amount of broadband would exceed the cap and pay an additional $20 monthly. OpenVault also reported recently that the average cord cutter uses over 520 GB per month. A customer using a cord-cutter level of data would pay an additional $70 per month. The product is only affordably priced if a household doesn’t use much broadband.
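The overage billing described above is easy to model. The following sketch is my own, based only on the plan terms as quoted (a 215 GB cap, $10 per additional 50 GB block, and a $200 monthly ceiling); the function name and structure are not AT&T's:

```python
import math

# Plan terms as quoted from the AT&T website (assumptions for this sketch)
DATA_CAP_GB = 215        # monthly data cap
OVERAGE_BLOCK_GB = 50    # overage is billed in 50 GB increments
OVERAGE_FEE = 10         # dollars per 50 GB block
OVERAGE_MAX = 200        # maximum monthly overage fee

def overage_charge(usage_gb: float) -> int:
    """Return the monthly overage fee in dollars for a given usage."""
    over = usage_gb - DATA_CAP_GB
    if over <= 0:
        return 0
    blocks = math.ceil(over / OVERAGE_BLOCK_GB)  # partial blocks round up
    return min(blocks * OVERAGE_FEE, OVERAGE_MAX)

# OpenVault's reported averages:
print(overage_charge(275))  # average household: 60 GB over -> $20
print(overage_charge(520))  # average cord cutter: 305 GB over -> $70
```

Note how quickly the cap bites: even the average household pays extra, and the $200 ceiling means a heavy user could nearly quadruple the $70 standalone price.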

The article raises a few questions. First, this customer had to call AT&T to get the service, which apparently was not being advertised in the area. He said it took a while to find somebody at AT&T who knew about the LTE broadband product. The customer also said that the installer for the service came from Bainbridge, Georgia – which is a 3-hour drive south from the AT&T cell site mentioned in the article.

This highlights one of the major problems of rural broadband that doesn’t get talked about enough. The big telcos have all had massive layoffs over the last decade, particularly in the workforces supporting copper and rural networks. Even should one of these big telcos offer a rural broadband product, how good is that product without technician support? As I travel the country, I hear routine stories of rural folks who wait weeks to get broadband problems fixed.

When I heard that AT&T was going to use LTE to satisfy its CAF II requirements, my first thought was that the primary benefit for the company was to use the federal funding to beef up its rural cellular networks rather than to start caring about rural broadband customers. In Monroe County, AT&T received almost $1,700 per CAF household, and I wonder if all of those households will ever see the benefits of this upgrade.

I’ve always suspected that AT&T wouldn’t aggressively market the LTE broadband product. If they were heavily marketing this by now, at the end of the fifth year of the CAF II buildout, there would be rural customers all over the country buying upgraded broadband. However, news about upgraded broadband is sparse for AT&T, and also for CenturyLink and Frontier. I work with numerous rural counties where the local government has never heard of CAF II since the telcos have done so little marketing of improved rural broadband.

The article highlights a major aspect of the plight of rural broadband. We not only need to build new rural broadband infrastructure, but we need to replenish the rural workforce of technicians needed to take care of the broadband networks. The FCC needs to stop giving broadband money to the big telcos and instead distribute it to companies willing to staff up to support rural customers.

Counting Gigabit Households

I ran across a website called the Gigabit Monitor that is tracking the population worldwide that has access to gigabit broadband. The website is sponsored by VIAVI Solutions, a manufacturer of network test equipment.

The website claims that in the US over 68.5 million people have access to gigabit broadband, or 21% of the population. That number gets sketchy when you look at the details. The claimed 68.5 million people includes 40.3 million served by fiber, 27.2 million served by cable company HFC networks, 822,000 served by cellular and 233,000 served by WiFi.

Each of those numbers is highly suspect. For example, the fiber numbers don’t include Verizon FiOS or the FiOS properties sold to Frontier. Technically that’s correct since most FiOS customers can buy maximum broadband speeds in the range of 800-900 Mbps. But there can’t be 40 million other people outside of FiOS who can buy gigabit broadband from fiber providers. I’m also puzzled by the cellular and WiFi categories and can’t imagine there is anybody who can buy gigabit products of either type.

VIAVI makes similar odd claims for the rest of the world. For example, they say that China has 61.5 million people that can get gigabit service. But that number includes 12.3 million on cellular and 6.2 million on WiFi.

Finally, the website lists the carriers that they believe offer gigabit speeds. I have numerous clients that own FTTH networks that are not listed, and I stopped counting when I counted 15 of my clients that are not on the list.

It’s clear this website is flawed and doesn’t accurately count gigabit-capable people. However, it raises the question of how to count the number of people who have access to gigabit service. Unfortunately, the only way to do that today is by accepting claims by ISPs. We’ve already seen with the FCC broadband maps how unreliable the ISPs are when reporting broadband capabilities.

As I think about each broadband technology there are challenges in defining gigabit-capable customers. The Verizon situation is a great example. It’s not a gigabit product if an ISP caps broadband speeds at something lower than a gigabit – even if the technology can support a gigabit.

There are challenges in counting gigabit-capable customers on cable company networks as well. The cable companies are smart to market all of their products as ‘up to’ speeds because of the shared nature of their networks. The customers in a given neighborhood node share bandwidth and the speeds can drop when the network gets busy. Can you count a household as gigabit-capable if they can only get gigabit speeds at 4:00 AM but get something slower during the evening hours?
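The shared-node math can be sketched in a few lines. The numbers here are invented for illustration, not any actual cable plant figures; real DOCSIS scheduling is far more nuanced than this even split:

```python
# Hypothetical neighborhood node (illustrative numbers only)
NODE_CAPACITY_MBPS = 5000   # downstream capacity shared by all homes on the node

def per_household_speed(active_households: int, demand_mbps: float = 1000) -> float:
    """Speed each active household sees, assuming an even split when busy."""
    if active_households * demand_mbps <= NODE_CAPACITY_MBPS:
        return demand_mbps                            # off-peak: full speed
    return NODE_CAPACITY_MBPS / active_households     # peak: capacity is divided

print(per_household_speed(4))    # 4:00 AM, few users active -> 1000.0 (a gigabit)
print(per_household_speed(40))   # busy evening hours -> 125.0 (far below a gigabit)
```

The same marketed "up to a gigabit" product delivers very different speeds depending on how many neighbors are online, which is exactly why counting these homes as gigabit-capable is slippery.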

It’s going to get even harder to count gigabit capability when there are reliable cellular networks using millimeter wave spectrum. That spectrum is only going to be able to achieve gigabit speeds outdoors when in direct line-of-sight from a nearby cell site. Can you count a technology as gigabit-capable when the service only works outdoors and drops when walking into a building or walking a few hundred feet away from a cell site?

It’s also hard to know how to count apartment buildings. There are a few technologies being used today in the US that bring gigabit speeds to the front of an apartment building. However, by the time that the broadband suffers packet losses due to inside wiring and is diluted by sharing among multiple apartments, nobody gets a true gigabit product. But ISPs routinely count them as gigabit customers.

There is also the issue of how to not double-count households that can get gigabit speeds from multiple ISPs. There are urban markets with fiber providers like Google Fiber, Sonic, US Internet, EPB Chattanooga, and others where customers can buy gigabit broadband on fiber and also from the cable company. There are even a few lucky customers in places like Austin, Texas and the Research Triangle in North Carolina where some homes have three choices of gigabit networks after the telco (AT&T) also built fiber.
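The double-counting problem is a simple set-union exercise: summing each ISP's coverage claim counts an overlapping home once per provider, while a union counts it once. This sketch uses made-up household IDs and hypothetical ISP footprints purely for illustration:

```python
# Hypothetical gigabit footprints, keyed by household ID (invented data)
fiber_isp_a = {"h1", "h2", "h3"}                # e.g., a fiber overbuilder
cable_co    = {"h2", "h3", "h4", "h5"}          # the incumbent cable company
fiber_isp_b = {"h3", "h5"}                      # a second fiber provider

# Naive approach: add up each ISP's claimed count
naive_total = len(fiber_isp_a) + len(cable_co) + len(fiber_isp_b)

# Correct approach: union the footprints so each home is counted once
unique_households = fiber_isp_a | cable_co | fiber_isp_b

print(naive_total)              # 9 -> overlapping homes counted multiple times
print(len(unique_households))   # 5 -> each home counted exactly once
```

Any honest national count would need household-level data from every ISP to perform this union, which is precisely what self-reported ISP claims don't provide.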

I’m not sure we need to put much energy into accurately counting gigabit-capable customers. I think everybody would agree an 850 to 950 Mbps connection on Verizon FiOS is blazingly fast. Certainly, a customer getting over 800 Mbps from a cable company has tremendous broadband capability. Technically such connections are not gigabit connections, but the difference between a gigabit connection and a near-gigabit connection for a household is so negligible as to not practically matter.