Charter Asks the FCC to Allow Data Caps

In a move that was probably inevitable, Charter has petitioned the FCC to allow the company to begin implementing broadband data caps. Charter has been prohibited from imposing data caps under an agreement made with the FCC when the agency approved the merger with Time Warner Cable in 2016. Charter is also asking the FCC to lift another provision of the merger agreement that prohibits the company from charging interconnection fees to Netflix and other companies that generate large amounts of web traffic.

The FCC already modified one other requirement of the original merger agreement in 2017. Charter had voluntarily agreed to pass 2 million new homes within five years of the merger. The original agreement required Charter to overbuild and compete against other cable companies, but in 2017 that was changed to instead require Charter to pass 2 million new homes that it doesn’t already serve.

The merger agreement between the FCC and Charter is in effect until May 2023, but the original deal allowed Charter to ask to be relieved of the obligations after four years, which is the genesis of this request. If granted, the two changes would occur in May 2021.

Charter is asking to lift these restrictions now because the original order allows it to do so this year. There seems to be a decent likelihood that the FCC will grant the requests, since both Chairman Ajit Pai and Commissioner Michael O’Rielly voted against these merger conditions in 2016 and said the restrictions were too harsh.

What I find interesting is that Charter has been bragging to customers for the last four years about being the large ISP that doesn’t impose burdensome data caps. This has likely given the company a marketing edge in markets where it competes against AT&T, which aggressively enforces data caps.

Charter has to be jealous of the huge dollars that Comcast and AT&T are receiving from data caps. Back in 2016, there were not many homes that used more data than the 1 terabyte cap that AT&T and Comcast place on customers. However, home broadband usage has exploded, even before the COVID-19 pandemic.

OpenVault reported in early 2018 that the average home used 215 gigabytes of data per month. By the end of 2019, the average home usage had grown to 344 GB monthly. During the pandemic, by the end of March 2020, the average home used 402 GB.

What’s more telling is the percentage of homes that now use a terabyte of data per month. According to OpenVault, that’s now more than 10% of homes – including nearly 2% of homes that use more than 2 terabytes. Just a few years ago, only a tiny percentage of homes used a terabyte of data per month.

Charter has undoubtedly been measuring customer usage and knows the revenue potential of imposing data caps similar to those of Comcast or AT&T. If Charter can charge $25 for exceeding the data cap, then with its 27 million customers the caps would increase revenues by over $800 million annually – for usage the company is already carrying on its network. Charter, like all of the big ISPs, crowed loudly that its network easily handled the increase in traffic due to the pandemic. But that’s not going to stop them from milking more money out of their biggest data users.
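As a back-of-the-envelope check on that number, here’s a rough sketch of the arithmetic. The 10% share over the cap and the $25 monthly overage fee are illustrative assumptions drawn from the figures above, not Charter’s actual plans:

    # Rough estimate of new data cap revenue (illustrative assumptions only)
    customers = 27_000_000        # Charter broadband customers
    share_over_cap = 0.10         # OpenVault: ~10% of homes now exceed 1 TB per month
    monthly_overage_fee = 25      # assumed $25/month charge for exceeding the cap

    annual_revenue = customers * share_over_cap * monthly_overage_fee * 12
    print(f"Estimated new annual revenue: ${annual_revenue:,.0f}")
    # Estimated new annual revenue: $810,000,000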

The US already has some of the most expensive broadband in the world. US landline broadband rates are twice the rates in Europe and the Far East. US cellular data rates rival the rates in the most expensive and remote countries in the world. Data caps imposed by landline and cellular ISPs add huge amounts of margin straight to the bottom lines of the big ISPs and wireless carriers.

What’s saddest about all of this is that there is no regulation of ISPs, and they are free to charge whatever they want for broadband. Even in markets where we see a cable company facing fiber competition from one of the telcos, there is seemingly no competition on price. Verizon, AT&T, and CenturyLink fiber cost roughly the same in most markets as broadband from the cable companies, and the duopoly players in such markets gladly split the customers and the profits for the benefit of both companies.

I’ve written several blogs arguing against data caps and I won’t repeat the whole argument. The bottom line is that it doesn’t cost a big ISP more than a few pennies extra to provide service to a customer that uses a terabyte per month at home compared to a home that uses half that. Data cap revenue goes straight to the bottom line of the big ISPs. For anybody that doesn’t believe that, watch the profits at Charter before and after the day when they introduce data caps.

Verizon Restarts Wireless Gigabit Broadband Roll-out

After a two-year pause, Verizon has launched a new version of its fixed wireless access (FWA) broadband, starting with Detroit. Two years ago, the company launched a trial version of the product in Sacramento and a few other cities and then went quiet about it. The company is still touting this as a 5G product, but it’s not – it uses millimeter-wave radios to replace the fiber drop in a fiber network. For some reason, Verizon is not touting this as fiber-to-the-curb, meaning the marketing folks at the company are electing to stress 5G rather than the fiber aspect of the technology.

Verizon has obviously been doing research and development work, and the new wireless product looks and works differently than the first-generation product. The first product involved mounting an antenna on the outside of the home and then drilling a hole to bring the wiring into the home. The new product has a receiver mounted inside a window that faces the street. This receiver connects wirelessly to a home router that looks a lot like an Amazon Echo and comes enabled with Alexa. Verizon is touting that the new product can be self-installed, as is demonstrated on the Verizon web page for the product.

Verizon says the FWA service delivers speeds up to a gigabit. Unlike with fiber, that speed is not guaranteed and will vary by home depending upon factors like distance from the transmitter, foliage, and other local conditions. Verizon is still pricing this the same as two years ago – $50 per month for customers who buy Verizon wireless products and $70 per month for those who don’t. It doesn’t look like there are any additional or hidden fees, which is part of the new billing philosophy that Verizon announced in late 2019.

The new product eliminates one of the controversial aspects of the first-generation product. Verizon was asking customers to sign an agreement that they could not remove the external antenna even if they dropped the Verizon service. The company was using external antennas to bounce signals to reach additional homes that might have been out of sight of the transmitters on poles. With units mounted inside of homes that kind of secondary transmission path is not going to be possible. This should mean that the network won’t reach out to as many homes.

Verizon is using introductory pricing to push the product. Right now, the website is offering three months of free service, along with a free year of Disney+, free Stream TV, and a free month of YouTube TV.

The router connects to everything in the home wirelessly. The router comes with WiFi 6, which is not much of a selling point yet since there are practically no devices in homes that can use the new standard – but over time this will become the standard WiFi deployment. Customers can buy additional WiFi extenders for $200 if needed. It’s hard to tell from the pictures if the router unit has an Ethernet jack.

From a network perspective, this product still requires Verizon to build fiber in neighborhoods and install pole-mounted transmitters to beam the signal into homes. The wireless path to the home is going to require a good line-of-sight, but a customer only needs to find one window where this will work.

From a cost perspective, it’s hard to see how this network will cost less than a standard fiber-to-the-home network. Fiber is required on the street, and then a series of transmitters must be installed on poles. Over the long run, it seems likely that the pole-mounted and home units will have to be periodically replaced, meaning perhaps a higher long-term operational cost than FTTH.

Interestingly, Verizon is not mentioning upload speeds. The pandemic has taught a lot of homes how important upload speeds are. Upload speed is currently one of the biggest vulnerabilities of cable broadband, and I’m surprised not to see Verizon capitalize on this advantage for the product – that’s probably coming later.

Verizon says they still intend to use the technology to pass 30 million homes – the same goal they announced two years ago. Assuming they succeed, they will put a lot of pressure on the cable companies – particularly with pricing. The gigabit-range broadband products from Comcast and Charter cost $100 or more while the Verizon FWA product rivals the prices of the basic broadband products from the cable companies.

An Update on ATSC 3.0

This is the year when we’ll finally start seeing the introduction of ATSC 3.0. This is the newest upgrade to broadcast television and the first big upgrade since TV converted to all-digital over a decade ago. ATSC 3.0 is the latest standard released by the Advanced Television Systems Committee, the body that creates the standards used by over-the-air broadcasters.

ATSC 3.0 will bring several upgrades to broadcast television that should make it more competitive with cable company video and Internet-based programming. For example, the new standard will make it possible to broadcast over-the-air in 4K quality. That’s four times as many pixels as 1080i TV and rivals the best quality available from Netflix and other online content providers.
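The ‘four times’ figure is simple pixel arithmetic, assuming the consumer 4K/UHD resolution of 3840 x 2160 versus HD at 1920 x 1080:

    # Pixel counts for consumer 4K (UHD) vs. 1080-line HD
    uhd_pixels = 3840 * 2160    # 8,294,400 pixels
    hd_pixels = 1920 * 1080     # 2,073,600 pixels
    print(uhd_pixels / hd_pixels)
    # 4.0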

ATSC 3.0 will also support the HDR (high dynamic range) protocol that enhances picture quality by creating better contrast between the light and dark parts of a TV screen. The standard also adds more sound channels to allow for state-of-the-art surround sound.

Earlier this year, Cord Cutters News reported that the new standard was to be introduced in 61 US markets by the end of 2020 – however, that has slowed a bit due to the COVID-19 pandemic. But the new standard should appear in most major markets by sometime in 2021. Homes will either have to buy ATSC 3.0-enabled TVs, which are just now hitting the market, or buy an external ATSC 3.0 tuner to get the enhanced signals.

One intriguing aspect of the new standard is that a separate data path is created with TV transmissions. This opens up some interesting new features for broadcast TV. For example, a city could selectively send safety alerts and messages to homes in just certain parts of a city. This also could lead to targeted advertising that is not the same in every part of a market. Local advertisers have often hesitated to advertise on broadcast TV because of the cost and waste of advertising to an entire market instead of just the parts where they sell service.

While still in the early stages of exploration, it’s conceivable that ATSC 3.0 could be used to create a 25 Mbps data transmission path. This might require several stations joining together to create that much bandwidth. While a 25 Mbps data path is no longer a serious competitor of much faster cable broadband speeds, it opens up a lot of interesting possibilities. For example, this bandwidth could offer a competitive alternative for providing data to cellphones and could present a major challenge to cellular carriers and their stingy data caps.

ATSC 3.0 data could also be used to bring broadband into the home of every urban school student. If this broadband was paired with computers for every student, this could go a long way towards solving the homework gap in urban areas. Unfortunately, like most other new technologies, we’re not likely to see the technology in rural markets any time soon, and perhaps never. The broadband signals from tall TV towers will not carry far into rural America.

The FCC voted on June 16 on a few issues related to the ATSC 3.0 standard. In a blow to broadcasters, the FCC decided that TV stations could not use close-by vacant channels to expand ATSC 3.0 capabilities. The FCC instead decided to maintain vacant broadcast channels to be used for white space wireless broadband technology.

The FCC also took a position that isn’t going to sit well with the public. As homeowners have continued to cut the cord, there have been record sales in the last few years of indoor antennas for receiving over-the-air TV. Over-the-air broadcasters are going to be allowed to sunset the older ATSC 1.0 standard in 2023. That means homes will have to replace TVs or install an external ATSC 3.0 tuner if they want to continue to watch over-the-air broadcasts.

Where is Net Neutrality When We Need It?

Just in the last two weeks, two stories hit the press that highlight behavior from ISPs that would likely have violated the net neutrality rules that were killed by Ajit Pai’s FCC. The big ISPs have been surprisingly quiet and have not loudly violated those rules, even though they are no longer in effect. The industry speculation is that the big ISPs are treading lightly because they don’t want to trigger a regulatory overreaction should there be a change of party in the administration or Congress.

The first headline says that AT&T is excluding HBO Max from the calculation of any data caps. This is a big deal for AT&T cellular customers and not insignificant for AT&T landline broadband customers who face data caps.

AT&T defends this by referring to other ‘sponsored data’ plans in the industry, like the one offered by T-Mobile that lets premium customers exclude usage from YouTube, Netflix, Hulu, HBO, Sling TV, ESPN, Showtime, Starz, and other sources of video.

I don’t know enough to know whether T-Mobile is violating the old net neutrality rules. Those rules would have allowed an ISP to exempt all video from data caps, because the ISP wouldn’t be discriminating against any particular source of video. However, if T-Mobile is being paid by those companies to exclude their data from data caps, then T-Mobile would be violating the spirit of net neutrality. AT&T’s exclusion of HBO Max from data caps is more blatant since AT&T owns HBO – the policy is clearly being made to benefit HBO over Disney, Netflix, and other HBO competitors.

It was easy to predict that sponsored data is something carriers would push the envelope on, even if net neutrality were still in effect. It’s something that customers like, and so it’s hard to fire the public up over the idea that sponsored data is bad for the industry. But it is. AT&T is clearly disadvantaging other video services in favor of its own. If T-Mobile doesn’t exclude all video from data caps, the company is doing the same thing – just not to advantage its own video product. The original FCC net neutrality order pointed out that sponsored data can make it hard for a new market entrant, and that could be right – we don’t see a lot of new names among companies that stream video.

The second headline is one that broadband customers everywhere will hate. Jon Brodkin at Ars Technica describes a situation where Cox is slowing down the upload path of a customer for using too much broadband – and even worse, Cox is openly admitting to capping the upload speeds for an entire neighborhood.

I won’t recount all of the details of the story. In a nutshell, there is a customer who backs up huge amounts of data each night from midnight until 8:00 am. It takes that long to complete the backup because the upload speed available to the customer is only 35 Mbps. If this customer was on symmetrical fiber, the backup could be done quickly. Apparently, this customer has been doing the same thing for years but was recently notified by Cox that they need to stop the practice or be kicked off the network. Cox also threatened to cut the upload bandwidth available to the whole neighborhood.
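Some rough math shows why the backup fills the whole overnight window at 35 Mbps, and why symmetrical fiber would make this a non-issue. The gigabit fiber comparison is just an illustrative assumption:

    # How much data fits through a 35 Mbps upload path in an 8-hour overnight window?
    upload_mbps = 35
    seconds = 8 * 3600
    gb_per_night = upload_mbps * seconds / 8 / 1000    # megabits -> gigabytes (decimal)
    print(f"{gb_per_night:.0f} GB per night")           # 126 GB per night

    # The same transfer on an assumed symmetrical gigabit fiber connection
    fiber_gbps = 1
    seconds_on_fiber = gb_per_night * 8 / fiber_gbps    # GB -> gigabits, then divide by Gbps
    print(f"about {seconds_on_fiber / 60:.0f} minutes on gigabit fiber")   # about 17 minutes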

This particular customer uses over 8 terabytes of data per month, which is an extraordinary amount of usage on a home broadband line. But if the usage is all really late at night, it’s unlikely that this is very disruptive to the neighborhood.

What’s extraordinary about this is that the customer doesn’t seem to be violating the Cox terms of service. The customer is already paying extra to avoid the data cap and get unlimited data. Cox is basically saying to the customer that there is some secret usage threshold that the company associates with ‘unlimited’ data – yet it won’t give the customer a specific usage threshold to stay under.

Where Cox really crosses the line is when they threaten to penalize an entire neighborhood for using too much data. According to Brodkin this one customer is not the only example of this same behavior by Cox.

If we had an FCC that regulated broadband, it would likely slap Cox for this behavior. What’s odd is that Cox doesn’t have to be so arbitrary. The company could easily have established rules in its terms of service and its products that could have legally handled this situation. Instead, Cox sold unlimited data and decided afterward that there really is a limit on the amount of data it is willing to provide. The fault for this situation seems to lie mostly with the legal department at Cox rather than with the customer, who has had the same usage for years.

ISPs ought to realize that the regulatory pendulum always swings the other way. Ajit Pai has completely deregulated one of the largest industries in the country, one that touches almost everybody. That pushes the regulatory pendulum as far as it can go toward the ‘unregulated’ side, and it’s inevitable that a future Congress or FCC will bring back regulation at some point. When they do, all of the bad behavior by ISPs during this time of deregulation will be used as examples of why regulation is necessary. If the ISPs push the envelope too far, the regulatory pendulum will swing a lot further in the regulated direction than they are going to like.

Will Congress Fund Rural Broadband?

Members of Congress seem to be competing to sponsor bills that will fund rural broadband. There are so many competing bills that it’s getting hard to keep track of them all. Hopefully, some effort will be made to consolidate the bills together into one coherent broadband funding bill.

The latest bill is the Accessible, Affordable Internet for All Act, introduced in the House of Representatives. This is part of a plan to provide $1.5 trillion of infrastructure funding that would include $100 billion for rural broadband. $80 billion of the funding would be used to directly construct rural broadband. It’s worth looking at the details of this bill since it’s similar to some of the other ideas floating around Congress.

The bill focuses on affordability. In addition to building broadband it would:

  • Require ISPs to offer an affordable service plan to every consumer.
  • Provide a $50 monthly discount on internet plans for low-income households and $75 for those on tribal lands.
  • Give preference to networks that will offer open access, providing more choice to consumers.
  • Direct the FCC to collect data on broadband prices and to make that data widely available to other federal agencies, researchers, and public interest groups.
  • Direct the Office of Internet Connectivity and Growth to conduct a biennial study to measure the extent to which cost remains a barrier to broadband adoption.
  • Provide over $1 billion to establish two new grant programs: the State Digital Equity Capacity Program, an annual grant program for states to create and implement comprehensive digital equity plans to help close gaps in broadband adoption and digital skills, and the Digital Equity Competitive Grant Program, which will promote digital inclusion projects undertaken by individual organizations and local communities.
  • Provide $5 billion for the rapid deployment of home internet service or mobile hotspots for students without a home internet connection.

This bill also guarantees the right of local governments, public-private partnerships, and cooperatives to deliver broadband service – which would seemingly override the barriers in place today in the 21 states that block municipal broadband and in the states that don’t allow electric cooperatives to be ISPs.

This and the other bills have some downsides. The biggest downside is the use of a reverse auction. There are two big problems with reverse auctions that the FCC doesn’t seem to want to acknowledge. First, a reverse auction requires the FCC to predetermine the areas that are eligible for grants – and that means relying on its lousy data. Just this month I was working with three different rural counties where the FCC records show the entire county has good broadband because of over-reporting of speeds by a wireless ISP. In one county, a WISP claimed countywide availability of 300 Mbps broadband. In another county, a WISP claimed countywide coverage of 100 Mbps symmetrical broadband when its closest transmitter was a county and several mountain ranges away. Until these kinds of mapping issues are fixed, any FCC auction is going to leave out a lot of areas that should be eligible for grants. The people living in these areas should not suffer due to poor FCC data collection.

Second, there are not enough shovel-ready projects to chase $80 billion in grant funding. If there is no decent ISP ready to build in a predetermined area, the funding is likely to revert to a satellite provider, as happened when Viasat became one of the largest winners in the CAF II reverse auction. The FCC also recently opened the door to allowing rural DSL into the upcoming RDOF grant – a likely giveaway to the big incumbent telcos.

This particular bill has a lot of focus on affordability, and I am a huge fan of getting broadband to everybody. But policymakers have to know that this comes at a cost. If a grant recipient is going to offer affordable prices and even lower prices for low-income households then the amount of grant funding for a given project has to be higher than what we saw with RDOF. There also has to be some kind of permanent funding in place if ISPs are to provide discounts of $50 to $75 for low-income households – that’s not sustainable out of an ISP revenue stream.

The idea of creating huge numbers of rural open-access networks is also an interesting one. The big problem with this concept is that there are many places in the country where there are few, or even no, local ISPs. Is it an open-access network if only one ISP, or even no ISP, shows up to compete on a rural network?

Another problem with awarding this much money all at once is that there are not enough good construction companies to build this many rural broadband networks in a hurry. In today’s environment, that kind of construction spending would superheat the market and drive up the cost of construction labor by 30-50%. It would be just as hard to find good engineers and good construction managers in an overheated market – $80 billion is a lot of construction projects.

Don’t take my negative comments to mean I am against massive funding for rural broadband. But if we do it poorly, a lot of the money might as well just be poured into a ditch. This much money, used wisely, could solve a giant portion of the rural broadband problem. Done poorly, many rural communities with poor broadband probably won’t get a solution. Congress has the right idea, but I hope they don’t dictate how to disperse the money without first talking to rural industry experts, or this will be another federal program with huge amounts of wasted and poorly spent money.

Many Libraries Still Have Slow Broadband

During the recent pandemic, a lot of homes came face-to-face with the realization that their home broadband connection is inadequate. Many students trying to finish the school year and people trying to work from home found that their broadband connection would not allow them to connect and maintain connections to school and work servers. Even families who thought they had good broadband found that they were unable to maintain multiple connections for these purposes.

The first thing that many people did when they found that their home broadband wasn’t adequate was to search for some source of public broadband that would enable them to handle their school or office work. Even in urban areas this wasn’t easy, since most of the places with free broadband, such as coffee shops, were closed – leaving at best a meager broadband connection for those willing to sit outside.

School officials scrambled and were able in many cases to quickly activate broadband from schools, which in most places have robust broadband. Local government supplemented this with ideas like putting cellular hot spots on school buses and parking them in areas with poor broadband.

I’m sure that one of the first places that those without broadband tried was the local small-town library. Unfortunately, a lot of libraries in rural areas suffer from the same poor broadband as everybody else in the area.

The FCC established goals for library broadband in the 2014 E-Rate Modernization Order: at least 100 Mbps broadband to every library serving a community of fewer than 50,000 people, and a gigabit for libraries serving larger communities. Unfortunately, many libraries still don’t have good broadband.

In just the last few months, I’ve been working with rural communities where rural libraries get their broadband from cellular hot spots or slow rural DSL connections. It’s hard to imagine being a broadband hub for a community if a library has a 3 to 5 Mbps broadband connection. Libraries with these slow connections gamely try to share the bandwidth with the public – but it obviously barely works. To rub salt in the wounds, some of these slow connections are incredibly expensive. I talked to a library just a few weeks ago that was spending over $500 per month for a dedicated 5 Mbps broadband connection using a cellular hotspot.

The shame of all of this is that the federal funding is available through the E-Rate and a few other programs to try to get better broadband for libraries. Some communities haven’t gotten this funding because nobody was willing to slog through the bureaucracy and paperwork to make it happen.  But in most cases, rural libraries don’t have good broadband because it’s not available in many small rural towns. It would require herculean funding to bring fast broadband to a library in a town where nobody else has broadband.

This is not to say that no rural libraries have good broadband. Some are connected by fiber and have gigabit connections. In many cases, these connections are made as part of fiber networks that connect schools or government buildings. These ‘anchor institution’ networks solve the problem of poor broadband in the schools and libraries, but they are almost always prohibited from sharing that bandwidth with the homes and businesses in the community.

Of course, there are rural libraries that have good broadband because somebody built a fiber network to connect the whole community. In most cases that means a rural telephone company or telephone cooperative. More recently that might mean an electric cooperative. These organizations bring good broadband to everybody in the community – not just to anchor institutions. Even in these communities the libraries serve a vital role since they can provide WiFi for those that can’t afford to buy the subscription to fiber broadband. Most schools and libraries have found ways to turn the WiFi towards parking lots, and all over rural America there have been daily swarms of cars parked all day where there is public WiFi.

Ultimately, the problems with library broadband are a metaphor for the need for good rural broadband for everybody. Society is not served well when people park all day in a parking lot just to get a meager broadband connection to do school or office work. Folks in rural communities who have suffered through this pandemic are not going to forget it, and local and state politicians better listen to them and help find better broadband solutions.

Our Uneven Regulatory Environment

I think everybody would agree that broadband is a far more important part of the American economy than landline telephone service. While something in the range of 35% of homes still have a landline, almost every home has or wants a broadband connection. If you knew nothing about our regulatory history in the U.S., you would guess that the FCC would be far more involved with broadband issues than landline telephone issues – but they’re not. Consider some of the recent regulatory actions at the FCC as evidence of how regulation is now unbalanced and mostly looks at voice issues.

Recently the FCC took action against MagicJack VocalTec Ltd. The FCC reached a settlement with MagicJack requiring it to pay $5 million in contributions to the Universal Service Fund. MagicJack also agreed to implement a compliance plan to keep the company within FCC rules going forward.

The contributions to the Universal Service Fund come from a whopping 26.5% tax on the interstate portion of telephone service, and MagicJack has refused for years to make these payments. MagicJack has been skirting FCC rules for years – which is what allows them to offer low-price telephone service.

The FCC also recently came down hard on telcos that have been making a lot of money by billing excessive access charges for calls to services like FreeConferenceCall.com and chat lines. These services made arrangements with remote LECs that bill access charges on a lot of miles of fiber transport. The FCC ruled that these LECs were ‘access stimulators’ and that the long-distance companies and their customers were unfairly subsidizing the free conference calling. In one of the fastest FCC reactions I can recall, just a few months after the initial ruling the FCC also published orders denying appeals of that order.

From a regulatory perspective, these kinds of actions are exactly the sort of activity one would expect out of a regulatory agency. These two examples are just a few out of a few dozen actions the FCC has taken in the last few years in their regulation of landline telephone service. The agency has been a little less busy, but also looked at cable TV issues over the last year.

Contrast this with broadband, which any person on the street would assume is the FCC’s primary area of regulation. After all, broadband is by far the most important communications service and affects far more homes and businesses than telephone or cable TV service. But the regulatory record shows a real dearth of action in the area of broadband regulation.

In December 2019, Congress passed the Television Viewer Protection Act, which prohibits ISPs and cable companies from billing customers for devices that the customer owns. It’s odd that a law would even be needed for something so commonsense, but Frontier and some cable companies have been billing customers for devices that were previously sold to those customers. In one example that has gotten a lot of press, Frontier has been billing customers a $10 fee for a router that customers purchased from Verizon before Frontier bought the property.

Frontier appealed the immediate implementation of the new law to the FCC. The telco said that due to COVID-19 the company is too busy to change its practices and asked to be able to continue the overbilling until the end of this year. In a brave regulatory move in April, the FCC agreed with Frontier and will allow them to continue to overbill customers for such devices until the end of 2020.

I was puzzled by this ruling for several reasons. From a practical perspective, the regulators in the U.S. have normally corrected carrier wrongs by ordering refunds. It’s impossible to believe that Frontier couldn’t make this billing change, with or without COVID. But even if it takes them a long time to implement it, the normal regulatory remedy is to give customers back money that was billed incorrectly. Instead, the FCC told Frontier and cable companies that they could continue to rip off customers until the end of the year, in violation of the intent of the law written by Congress.

A more puzzling concern is why the FCC even ruled on this issue. When the agency killed Title II regulation, they also openly announced that they have no regulatory authority over broadband. My first thought when reading this order was to wonder if the FCC even has jurisdiction any longer to rule on issues like data modems. However, in this case, the Congress gave them the narrow authority to rule on issues related to this specific law. As hard as the FCC tries, these little nagging broadband issues keep landing in their lap – because there is no other place for them to go.

In this case, the FCC dipped briefly into a broadband issue and got it 100% wrong. Rather than ruling for the customers who were being billed fraudulent charges, the FCC went against the intent of the Congress that passed the law clarifying the issue and bought into the story that Frontier couldn’t fix its billing systems until a year after the law was passed. And for some reason, even after buying the story, the FCC didn’t order a full refund of past overbilling.

If we actually had light-touch broadband regulation, then the FCC would be able to weigh in when industry actors act badly, like happened in the two telephone dockets listed above. But our light-touch regulation is really no-touch regulation and the FCC has no jurisdiction over broadband except in snippets where Congress gives them a specific task. The FCC ruling is puzzling. We know they favor the big ISPs, but siding with Frontier’s decision to openly rip off customers seems like an odd place to make a pro-ISP stand. As much as I’ve complained about this FCC giving up their broadband regulatory authority – perhaps we don’t want this to be fixed until we get regulators who will apply the same standards to broadband as they are applying to telephone service.

How Will Cable Companies Cope with COVID-19?

A majority of households today buy broadband from cable companies that operate hybrid fiber-coaxial (HFC) networks using some version of DOCSIS technology to control the network. The largest cable companies have upgraded most of their networks to DOCSIS 3.1, which allows for gigabit download speeds.

The biggest weakness in cable networks is the upload data path. The DOCSIS standard limits the upload path to no more than 1/8th of the total bandwidth – but it’s not unusual for the cable companies to make this path even smaller and offer products like 100/10 Mbps, where the upload is 1/11th of the total bandwidth provided to customers.
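A quick calculation shows where the 1/11th figure comes from, using the 100/10 Mbps product mentioned above as the example:

    # Share of total bandwidth devoted to the upload path on a 100/10 Mbps product
    download_mbps, upload_mbps = 100, 10
    upload_share = upload_mbps / (download_mbps + upload_mbps)
    print(f"Upload is {upload_share:.1%} of the total, or about 1/{round(1 / upload_share)}")
    # Upload is 9.1% of the total, or about 1/11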

This is not a new concern for the cable companies, and the engineering folks at Comcast and the other big cable companies have been discussing ways to improve upload bandwidth for much of the last decade. They understood that the need for uploading would someday overwhelm the bandwidth path provided – they just didn’t expect to get there as explosively as has happened in reaction to the COVID-19 crisis.

Every student and employee trying to work from home is carving out upload bandwidth with a VPN connection to a school or work server. Customers also use significant upload bandwidth when they join a video call on Zoom or other platforms. While carriers report 30–40% overall increases in traffic due to COVID-19, they are not disclosing that a lot of that increase is demand for uploading.

Cable companies are now faced with solving the upload crisis. Practically every prognosticator in the country is predicting that we’re not going to return to pre-COVID behavior. A lot of people are likely to continue working from home. While students will return to the classroom eventually, this grand experiment has shown that it’s feasible to involve students in the classroom remotely, and so school systems are likely to continue this practice for students with long-term illnesses or other reasons they can’t always be in the classroom. Finally, we’ve taught a whole generation of people that video meetings can work, so there is going to be a whole lot more of that. The days of traveling to attend a few-hour meeting might be over.

There is one other interesting fact to consider when looking at a cable company’s upload data path. Cable companies have generally devalued upload quality and have assigned the upload path to the low frequencies of the cable network spectrum. Historically, upload data has been provisioned in the 5-42 MHz range of spectrum. This is the spectrum in a cable system that experiences the most interference from things like microwave ovens, vacuum cleaners, and passing large trucks. Cable companies could get away with this because historically most people didn’t care if it took longer to upload a file or if packets had to be retransmitted due to interference. But people connecting to WANs and video conferences care about upload quality as well as speed.

One solution, which some cable providers have already implemented, is what is called a mid-split upgrade that extends the upload spectrum to the 5-85 MHz band. This still includes a patch of the worst spectrum inside the cable system, but it is a significant boost in the amount of upload bandwidth available. Depending upon the set-top boxes being used, this upgrade can require some new customer boxes.
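Here’s a rough sketch of the scale of that gain. The spectral-efficiency number is an assumed round figure for illustration only, and the sketch ignores guard bands and the fact that the noisy low end of the band carries less data:

    # Rough comparison of upstream spectrum before and after a mid-split upgrade
    sub_split_mhz = 42 - 5    # traditional 5-42 MHz upstream band = 37 MHz
    mid_split_mhz = 85 - 5    # mid-split 5-85 MHz upstream band = 80 MHz
    print(f"Mid-split widens the upstream band by about {mid_split_mhz / sub_split_mhz:.1f}x")

    # Very rough shared-capacity estimate, assuming ~5 bits/sec/Hz effective efficiency
    bits_per_hz = 5
    for label, mhz in [("sub-split", sub_split_mhz), ("mid-split", mid_split_mhz)]:
        print(f"{label}: roughly {mhz * bits_per_hz} Mbps of shared upstream capacity")
    # sub-split: roughly 185 Mbps; mid-split: roughly 400 Mbps (shared by the whole node)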

Another idea is to do more traditional node splits, meaning to reduce the number of customers included in a neighborhood node. Traditionally, node splits were done to improve the performance of download speeds – this was the fastest way to relieve network congestion when a local neighborhood network bogged down unduly in the evening. It’s an interesting idea to consider splitting nodes to relieve pressure on the upload data path.

After those two ideas, the upgrades get expensive. Migrating to switched digital video could free up a mountain of system bandwidth, which would allow for a larger data path, including an enlarged upload path. The downside of this kind of upgrade is that it moves outside of DOCSIS technology and starts to look more like providing Ethernet over fiber. This is not just a forklift upgrade – it changes the basic way the network operates.

The final way to get more upload speed would be an upgrade to the upcoming DOCSIS 4.0 standard. Everything I read about this makes it sound expensive. But the new standard would allow for nearly symmetrical data services and would let cable broadband compete head-on with fiber networks. It will be interesting to see if the cable companies view the upload crisis as bad enough to warrant spending huge amounts of money to fix the problem.

The FCC Muddles the RDOF Grants

Last week the FCC ‘clarified’ the RDOF rules in a way that left most of the industry feeling less sure about how the auction will work.  The FCC is now supposedly taking a technologically neutral position on the auction. That means that the FCC has reopened the door for low-earth orbit satellites. Strangely, Chairman Ajit Pai said that the rules would even allow DSL or fixed wireless providers to participate in the gigabit speed tier.

Technologically neutral may sound like a fair idea, but in this case it’s absurd. The idea that DSL or fixed wireless could deliver gigabit speeds is so far outside the realm of physics as to be laughable. It’s more likely that these changes are aimed at allowing satellite, DSL, and fixed wireless providers to enter the auction at speeds faster than they can deliver.

For example, by saying that DSL can enter the auction at a gigabit, it might go more unnoticed if telcos enter the auction at the 100/10 Mbps tier. There is zero chance for rural DSL to reach those speeds – the CAF II awards six years ago didn’t result in a lot of rural DSL that delivers even 10/1 Mbps. It’s worth remembering that the RDOF funding is going to some of the most remote Census blocks in the country, where homes are likely many miles from a DSL hub and not concentrated in pockets – two factors that account for why rural DSL often has speeds that are not a lot faster than dial-up.

Any decision to allow low-orbit satellites into the auction has to be political. There are members of Congress now pushing for satellite broadband. In my state of North Carolina, there is even a bill in the Senate (SB 1228) that would provide $2.5 million to satellite broadband as a preferred solution for rural broadband.

The politics behind low-orbit satellite broadband are crazy because there is not yet any such technology that can deliver broadband to people. Elon Musk’s satellite company currently has 362 satellites in orbit. That may sound impressive, but a functional constellation is going to require thousands of satellites – the company’s plan filed with the FCC calls for 4,000 satellites as the first phase of deployment.

I’ve seen a lot of speculation in the financial and space press that Starlink will have a hard time raising the money needed to finish the constellation of satellites. A lot of the companies that were going to invest are now reluctant due to COVID-19. The other current competitor to Starlink is OneWeb, which went bankrupt a few months ago and may never come out of receivership. Jeff Bezos has been rumored to be launching a satellite business but still has not launched a single satellite.

The danger of letting these various technologies into the RDOF process is that a lot of rural households might again get screwed by the FCC and not get broadband after a giant FCC grant. That’s what happened with CAF II where over $9 billion was handed to the big telcos and was effectively washed down the drain in terms of any lasting benefits to rural broadband.

It’s not hard to envision Elon Musk and Starlink winning a lot of money in the RDOF auction and then failing to complete the business plan. The company has an automatic advantage over any company it is bidding against, since Starlink can bid lower than any other bidder and still be ahead of the game. It’s not an implausible scenario to foresee Starlink winning every contested Census block.

Allowing DSL and fixed wireless providers to overstate their technical capacity will be just as damaging. Does anybody think that if Frontier wins money in this auction, the company will do much more than pocket it straight to the bottom line? Rural America is badly harmed if a carrier wins RDOF money and doesn’t deliver the technology that was promised – particularly if that grant winner unfairly beat out somebody that would have delivered a faster technology. One only has to look back at the awards made to Viasat in the CAF II reverse auction to see how absurd it is when inferior technologies are allowed in the auction.

Probably the worst thing about the RDOF rules is that somebody who doesn’t deliver doesn’t have to give back all of the grant money. Even if no customer is ever served, or no customer ever receives the promised speeds, the grant winner gets to keep a substantial percentage of the grant funding.

As usual, this FCC is hiding its real intentions behind the technology-neutral stance. This auction doesn’t need the FCC to be ‘technology neutral’, and technologies that don’t exist yet, like LEO satellites, or technologies that can’t deliver the required speed tiers should not be allowed into the auction. I’m already cringing at the vision of a lot of grant winners that have no business getting a government subsidy at a time when COVID-19 has magnified the need for better rural broadband.