States Fight Back Against CAF II

Jon Brodkin of Ars Technica wrote a recent article about the Mississippi Public Service Commission (PSC) notifying the FCC that AT&T had failed to meet its CAF II requirements in the state. AT&T had taken over $49 million per year for six years, ending this December, and was supposed to use that money to upgrade almost 144,000 residents in the state to broadband of at least 10/1 Mbps.

The PSC notification tells the FCC that it doesn't believe the upgrades have been done or that many of those homes can actually get faster broadband. AT&T has certified to the FCC that the CAF II work was completed on schedule, but the company has stonewalled the PSC on data requests aimed at finding out how many homes can successfully access faster broadband.

The FCC is supposed to begin testing CAF II homes in 2021 and is supposed to fine the big telcos like AT&T if homes in the CAF II area aren't getting the faster speeds. However, that testing program is badly flawed in that the telcos are going to have some say about which homes get tested, and they'll certainly funnel the testing into places that can pass the speed test.

AT&T elected to use the CAF II funding to upgrade speeds by offering fixed cellular service to customers who formerly had slow DSL service. From what I can see, AT&T has not widely advertised the new wireless product, and it's unlikely that the company has moved many people onto the cellular technology in Mississippi or anywhere else. The company is refusing to tell the state how many homes are on the new product.

Unfortunately, what AT&T is doing in Mississippi is not unusual. AT&T took $2.57 billion nationwide for CAF II, and it's likely the company hasn't made many upgrades in other states either. I've seen a lot of evidence that Frontier ($1.7 billion) and CenturyLink ($3.03 billion) have also failed to upgrade rural customers. Those two companies elected to mostly upgrade rural DSL to the faster speeds. We've recently had engineers in counties where Frontier and CenturyLink were supposed to make CAF II upgrades, and we could find no evidence of upgraded DSL anywhere in the rural parts of these counties. We've also helped counties solicit speed tests from citizens, and we've studied a number of counties where no rural DSL service tested even close to the 10/1 Mbps goal of CAF II.

To make matters even worse, the FCC recently decided to award these big telcos a seventh year of subsidy. That means AT&T will get $428 million in 2021, Frontier will get $283 million, and CenturyLink will get $506 million. The companies have no obligations tied to this additional funding and don't have to use it to improve rural broadband.

While 10/1 Mbps broadband isn’t great, it’s a lot better than the DSL that was in these rural areas in 2015 when the CAF II payments began. The CAF II areas are remote and most customers who could even get DSL saw speeds under 1 or 2 Mbps download.

The impact of AT&T's failure to make the upgrades became apparent this year when millions of students were sent home during the pandemic. A student might be able to squeak out a school connection on a 10/1 Mbps broadband connection, but students cannot function on the slower DSL that is still in place due to lack of upgrades. The actions of the FCC and the greed of the big telcos robbed millions of rural homes of better broadband.

The failure of CAF II rests entirely on the FCC. The last FCC under Chairman Wheeler awarded the funding to upgrade to 10/1 speeds, even though the definition of broadband at the time was 25/3 Mbps. The current FCC under Chairman Pai has turned a blind eye to the non-performance of the big telcos and absurdly is awarding them an additional year of CAF II funding. The overall CAF II program handed out over $10 billion in funding for improving rural broadband that might as well have been flushed down the drain. The FCC could have awarded this money instead to broadband grants that could have brought better broadband to the CAF II rural areas.

I hope the Mississippi PSC does more than just write a letter. I'd like to see the commission ask AT&T to refund the CAF II money to the state to use for broadband grants. And I'd love to see other states do the same and take back the billions of CAF II broadband funding that was wasted.

Can the FCC Regulate Facebook?

At the urging of FCC Chairman Ajit Pai, FCC General Counsel Tom Johnson announced in a recent blog that he believes the FCC has the authority to redefine the immunity shield provided by Section 230 of the FCC's rules, which derives from the Communications Decency Act of 1996.

Section 230 of the FCC rules is one of the clearest and simplest rules in the FCC code: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

In non-legalese, this means that a web company is not liable for third-party content posted on its platform. It is this rule that enables public comments on the web. All social media consists of third-party content. Sites like Yelp and Amazon thrive because the public posts reviews of restaurants and products. Third-party comments appear in many more places on the web, such as the comment section of your local newspaper, or even here on my blog.

Section 230 is essential if we are going to give the public a voice on the web. Without Section 230 protections, Facebook could be sued by somebody who doesn’t like specific content posted on the platform. That’s dangerous because there is somebody who hates every possible political position.  If Facebook can be sued for content posted by its billions of users, then the platform will have to quickly fold – there is no viable business model that can sustain the defense of huge volumes of lawsuits.

Section 230 was created when web platforms started to allow comments from the general public. The biggest early legal challenge to web content came in 1995 when Wall Street firm Stratton Oakmont sued Prodigy over a posting on the platform by a user that accused the president of Stratton Oakmont of fraud. Stratton Oakmont won the case when the New York Supreme Court ruled that Prodigy was a publisher because the platform exercised some editorial control by moderating content and because Prodigy had a clearly stated set of rules about what was allowable content on the Prodigy platform. As might be imagined, this court case had a chilling impact on the burgeoning web industry, and fledgling web platforms worried about getting sued over content posted by the public. This prompted Representatives Ron Wyden and Chris Cox to sponsor the bill that became the current Section 230 protections.

Tom Johnson believes the FCC has the authority to interpret Section 230 due to Section 201(b) of the Communications Act of 1934, which confers on the FCC the power to issue rules necessary to carry out the provisions of the Act. He says that when Congress instructed that Section 230 rules be added to FCC code, that implicitly means the FCC has the authority to interpret the rules.

But then Mr. Johnson does an interesting tap dance. He distinguishes between interpreting the Section 230 rules and regulating companies that are protected by these rules. If the FCC ever acts to somehow modify Section 230, the legal arguments will concentrate on this nuance.

The FCC has basically been authorized by Congress to regulate common carriers of telecommunications services as well as a few other responsibilities specifically assigned to the agency.

There is no possible way that the FCC could ever claim that companies like Facebook or Google are common carriers. If they can’t make that argument, then the agency likely has no authority to impose any obligations on these companies, even should it have the authority to ‘interpret’ Section 230. Any such interpretation would be meaningless if the FCC has no authority to impose such interpretations on the companies that rely on Section 230 protections.

What is ironic about this effort by the FCC is that the current FCC spent a great deal of effort to declassify ISPs from being common carriers. The agency has gone as far as possible to wash its hands of any responsibility for regulating broadband provided by companies like AT&T and Comcast. It will require an amazing set of verbal gymnastics to somehow claim the ability to extend FCC authority to companies like Facebook and Twitter, which clearly have none of the characteristics of a common carrier, while at the same time claiming that ISPs are not common carriers.

FCC Expands Rural Use of White Space Spectrum

At the October monthly meeting, the FCC modified its Part 15 rules to allow for better utilization of white space spectrum in rural America – a move that should provide a boon to fixed wireless technology. The term 'white space' refers to spectrum that has been assigned for over-the-air television broadcasting but that sits empty and is not being used by a television station. In any given market there are channels of television spectrum that are not being used, and today's ruling describes new ways that wireless ISPs, school systems, and others can better use the unused spectrum.

The FCC action follows a long-standing petition from Microsoft asking for better use of unused white space spectrum. The FCC asked Microsoft and the National Association of Broadcasters to negotiate a reasonable plan for using idle spectrum, and the actions taken by the agency reflect the cooperation of the parties. The FCC further plans to issue a Notice for Proposed Rulemaking to investigate other questions related to white space spectrum.

First, the FCC is allowing for increased height for white space transmitters. The transmitters were previously limited to being no more than 250 meters above the average terrain in an area, and that has been boosted to 500 meters. In case somebody is envisioning 1,500-foot towers, wireless companies achieve this height when placing towers on hilltops. The extra height is important for two reasons. Fixed wireless technology requires line-of-sight between the tower and a customer location, and the higher the tower the better chance of being able to ‘see’ some portion of a customer premise. Using higher towers also means that wireless signal can travel farther – white space spectrum is unique compared to many other spectrum bands in that it can deliver some broadband at significant distances from a tower.
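The payoff from the extra height can be sketched with the standard radio-horizon approximation. This is a back-of-envelope calculation using the common 4/3-earth-radius model (d ≈ 4.12√h), not a coverage prediction for any real deployment – real reach depends on terrain, foliage, and the customer antenna height:

```python
import math

def radio_horizon_km(height_m: float) -> float:
    """Approximate radio horizon using the 4/3-earth-radius model:
    d (km) ~= 4.12 * sqrt(h), with the transmitter height h in meters."""
    return 4.12 * math.sqrt(height_m)

# Compare the old 250-meter limit to the new 500-meter limit
for h in (250, 500):
    print(f"{h} m transmitter: ~{radio_horizon_km(h):.0f} km to the horizon")
# 250 m -> ~65 km; 500 m -> ~92 km
```

Doubling the allowed height doesn't double the reach, but a roughly 40% longer horizon is significant for a band that can already deliver some broadband at long distances.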

The FCC order also allows increased power, raising the maximum effective radiated power from 10 watts to 16 watts. Power levels are important because the strength of the signal matters at the customer location – higher power means a better chance of delivering full broadband speeds.
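For context, the jump from 10 to 16 watts works out to roughly a 2 dB increase – modest, but meaningful at the edge of a coverage area. This is just the standard decibel formula, nothing specific to the FCC order:

```python
import math

# A power ratio expressed in decibels: dB = 10 * log10(P_new / P_old)
gain_db = 10 * math.log10(16 / 10)
print(f"10 W -> 16 W is a {gain_db:.1f} dB increase")  # about 2.0 dB
```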

The order builds in some additional protection for existing television stations. The FCC order increases the separation between an ISP wireless signal and existing television station frequencies. Transmissions with white space spectrum tend to stray out of band, and allowing broadband signals too close to television signals would mean degraded performance for both the television station and the ISP. One of the questions to be asked in the NPRM is whether there is a way to utilize the bands closer to existing television signals.

The FCC’s order also authorized the use of narrowband devices that use white space. This opens up the door to using white space spectrum to communicate with Internet of Things devices. In rural areas, this might be a great way to communicate with agricultural sensors since the white space spectrum can travel to the horizon.

Finally, the order allows for higher power applications in isolated geographic areas that can be ‘geo-fenced’, meaning that the transmissions can be done in such a way as to keep the signals isolated to a defined area. The envisioned uses for this kind of application would be to provide broadband along school bus routes or to provide coverage of defined farm fields.

These changes were a long time in coming, with Microsoft asking for some of these changes since 2008. The issues have been bouncing around the FCC for years and it finally took the compromise between the parties to make this work. Maybe some of the other parties arguing over spectrum allocation could learn from this example that cooperation beats years of regulatory opposition.

The Upload Speed Lie

In the 2020 Broadband Deployment Report, the FCC made the following claim. "The vast majority of Americans – surpassing 85% – now have access to fixed terrestrial broadband service at 250/25 Mbps". The FCC makes this claim based upon the data provided to it by the country's ISPs on Form 477. We know the data reported by the ISPs is badly flawed in the over-reporting of download speeds, but we've paid little attention to the second number the FCC cites – the 25 Mbps upload speeds that are supposedly available to those homes. I think the FCC claim that 85% of homes have access to 25 Mbps upload speeds is massively overstated.

The vast majority of the customers covered by the FCC statement are served by cable companies using hybrid fiber-coaxial technology. I don't believe that cable companies are widely delivering upload speeds greater than 25 Mbps. I think the FCC has the story partly right. I think cable companies tell customers that the broadband products they buy have upload speeds of 25 Mbps, and the cable companies largely report these marketing speeds on Form 477.

But do cable companies really deliver 25 Mbps upload speeds? One of the services my consulting firm provides is helping communities conduct speed tests. We’ve done speed tests in cities recently where only a tiny fraction of customers measured upload speeds greater than 25 Mbps on a cable HFC network.

It's fairly easy to understand the upload speed capacity of a cable system. The first thing to understand is the upload capacity based upon the way the technology is deployed. Most cable systems deploy upload broadband using the frequencies on the cable system between 5 MHz and 42 MHz. This is a relatively small amount of bandwidth that sits in the noisiest part of the cable spectrum. I remember back to the days of analog broadcast TV and analog cable systems when somebody running a blender or a microwave would disrupt the signals on channels 2 through 5 – the cable companies are now using these same frequencies for uploading broadband. The DOCSIS 3.0 specification assigned upload broadband to the worst part of the spectrum because before the pandemic almost nobody cared about upload broadband speeds.
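The cramped upstream band translates directly into limited capacity. Here's a rough sketch of the arithmetic – the channel width, symbol rate, and modulation below are illustrative assumptions about a typical DOCSIS 3.0 low-split plant, not figures from any specific cable system, and real systems often run fewer channels because the bottom of the band is too noisy to use:

```python
# Back-of-envelope upstream capacity for a DOCSIS 3.0 low-split cable plant.
upstream_mhz = 42 - 5        # the 5-42 MHz upstream band described above
channel_mhz = 6.4            # widest DOCSIS 3.0 upstream channel
symbol_rate = 5.12e6         # symbols/sec for a 6.4 MHz ATDMA channel
bits_per_symbol = 6          # 64-QAM, often the best the noisy band supports

channels = int(upstream_mhz // channel_mhz)  # at most 5 fit; noise often allows fewer
raw_mbps_per_channel = symbol_rate * bits_per_symbol / 1e6
total_raw_mbps = channels * raw_mbps_per_channel
print(f"{channels} channels x {raw_mbps_per_channel:.1f} Mbps "
      f"= {total_raw_mbps:.0f} Mbps raw upstream, shared by the whole node")
```

And that total, before protocol overhead, is shared by every home on the neighborhood node – which is why individual homes see so much less.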

The second factor affecting upload speeds is the nature of the upload requests from customers. Before the pandemic, the upload link was mostly used to send out attachments to emails or backup data on a computer into the cloud. These are largely temporary uses of the upload link and are also considered non-critical – it didn’t matter to most folks if a file was uploaded in ten seconds or five minutes. However, during the pandemic, all of the new uses for uploading require a steady and dedicated upload data stream. People now are using the upload link to connect to school servers, to connect to work servers, to take college classes online, and to sit on video call services like Zoom. These are critical applications – if the upload broadband is not steady and sufficient the user loses the connection. The new upload applications can’t tolerate best effort – a connection to a school server either works or it doesn’t.

The final big factor that affects the bandwidth on a cable network is demand. Before the pandemic, a user had a better chance than today of hitting 25 Mbps upload because they might have been one of only a few people trying to upload at any given time. But today a lot of homes are trying to make upload connections at the same time. This matters because a cable system shares bandwidth both in the home and in the neighborhood.

The upload link from a home can get overloaded if more than one person tries to connect to the upload link at the same time. Homes with a poor upload connection will find that a second or a third user cannot establish a connection. The same thing happens at the neighborhood level – if too many homes in a given neighborhood are trying to connect to upload links, then the bandwidth for the whole neighborhood becomes overloaded and starts to fail. Remember a decade ago that it was common for downloaded video streams to freeze or pixelate in the evening when a lot of homes were using broadband? The cable companies have largely solved the download problem, but now we're seeing neighborhoods overloading on upload speeds. This results in people unable to establish a connection to a work server or being booted off a Zoom call.
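The neighborhood contention effect is easy to see with a toy model: a node's usable upstream capacity gets divided among however many homes are transmitting at once. The 100 Mbps capacity figure here is an illustrative assumption, not a measurement from any cable system:

```python
def per_home_mbps(node_capacity_mbps: float, active_homes: int) -> float:
    """Evenly split a node's upstream among simultaneously active homes."""
    return node_capacity_mbps / active_homes

# Illustrative: a node with 100 Mbps of usable upstream capacity
for homes in (2, 10, 40):
    rate = per_home_mbps(100, homes)
    note = "meets" if rate >= 25 else "falls short of"
    print(f"{homes:>2} active homes -> {rate:5.1f} Mbps each ({note} the 25 Mbps claim)")
```

Real cable schedulers are far more sophisticated than an even split, but the underlying constraint is the same: once enough homes are on Zoom calls at once, nobody on the node can get 25 Mbps upstream.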

The net result of the overloaded upload links is that the cable companies cannot deliver 25 Mbps to most homes during the times when people are busy on the upload links. The cable companies have ways to fix this – but most fixes mean expensive upgrades. I bet that the cable companies are hoping this problem will magically go away at the end of the pandemic. But I’m guessing that people are going to continue to use upload speeds at levels far higher than before the pandemic. Meanwhile, if the cable companies were being honest, they would not be reporting 25 Mbps upload speeds to the FCC. (Just typing that made me chuckle because it’s not going to happen.)

What’s the Best Way to Help Precision Agriculture?

The FCC is going to take a fresh look at the $9 billion 5G fund this month, and it sounds like the grant program will get delayed again while the FCC figures out where to deploy the money. The fund idea has been mired in controversy since the beginning, when it became clear that the big cellular companies were providing false data about existing cellular coverage.

Buried inside this fund is $1 billion in grants intended to help precision farming. Precision farming needs bandwidth, and apparently, the FCC has decided that the bandwidth should be cellular. I was frankly surprised to see such a specific earmark. The current FCC and administration have clearly climbed on the 5G bandwagon, but it seems premature to me to assume that cellular will be the winning technology for precision agriculture.

This funding means that the cellular companies will get a free or highly subsidized network and will then be able to bill farmers for providing the bandwidth needed for smart tractors and for the millions of field sensors that the industry predicts will be deployed to monitor crops and livestock.

This all sounds great and shows that the government is working to help solve one of our biggest broadband needs. But it also means that the FCC hopes to hand the agribusiness revenue stream to cellular companies. This feels to me like another victory for the cellular lobbyists – their companies get free government handouts that will lead to lucrative long-term monopoly revenue streams.

If the FCC was doing its job right, we’d be seeing a far different approach. There are multiple wireless technologies that can be leveraged for smart agriculture.

  • Cellular technology is an option, but it’s not necessarily the best technology to cover big swaths of farmland. The coverage area around a cell tower is only a few miles and it requires a huge number of rural cell sites to provide universal cellular broadband coverage in farming areas.
  • Another option is LoRaWAN, a technology that is perfect for providing small bandwidth to huge numbers of sensors over a large area. This technology was discussed in a recent blog talking about the deployment of a LoRaWAN blimp in Indiana.
  • By default, early farm sensors are using WiFi, which is something farms can implement locally, at least in barns and close to farm buildings.

All these technologies require broadband backhaul, and this could be provided by fiber or satellites. If the 5G grants and the current RDOF grants are spent wisely there will be fiber built deeply into farming counties. Satellite broadband could fill in for the most remote farms.

Ideally, the FCC would be considering the above technologies and any others that could help agribusiness. Agriculture is our largest industry, and it seems callous to bury the money to solve the problem inside an FCC grant program that might not even be awarded for several years, and that will then allow six more years to build the networks – pushing solutions out at least a decade into the future.

Instead, the FCC should be establishing a smart farming grant program to see what could be done now for this vital sector of our economy. The FCC should be funding experimental test trials to understand the pros and cons of using cellular, WiFi, satellite, or LoRaWAN bandwidth to talk to farm devices. The results of such trials would then be used to fund a farming broadband grant program that would deploy farm broadband in an expeditious manner – a lot sooner than a decade from now.

The FCC should not be automatically awarding money to cellular companies to control the budding smart farming industry. If we took the time to look at this scientifically, we’d find out which technology is the most suitable and sustainable. For example, one of the driving factors in creating smart farming is going to be the power needs for sensors using the different wireless technologies. It may turn out that the best solution is cellular – but we don’t know that. But that’s not going to stop the FCC from marching forward with $1 billion in grants without ever having looked hard at the issue. This sounds like just another giveaway to the big carriers to me.

The Race to Bury Net Neutrality

The Internet is currently full of news articles describing how the FCC will soon be putting to bed the last vestiges of its order a few years ago to eliminate net neutrality rules. The order that is widely being called the net neutrality ruling was a far-reaching change at the FCC that essentially wrote the FCC out of any role in regulating broadband.

Eliminating net neutrality rules was only a small part of that order. Net neutrality is a set of principles that direct ISPs and network owners not to discriminate between bits carried over the Internet. Most of the largest ISPs said that they could live with the net neutrality principles, and eliminating net neutrality was not a high priority for companies like AT&T and Comcast. The real priority for the big ISPs was to take advantage of a friendly FCC that was open and willing to deregulate broadband – particularly willing to eliminate any threat of broadband rate regulation.

So when you read the flood of articles this month talking about net neutrality, you need to substitute the term 'net neutrality' with 'regulating broadband' as you read articles on the topic. The FCC chose to disguise its attempt to kill regulation under the moniker of net neutrality and was successful, since the average American probably has no idea that the FCC no longer regulates ISPs and broadband.

The FCC is holding a vote on October 27, just before the presidential election, to cement the last open pieces from the FCC's order to eliminate broadband regulation. The FCC's order to write the agency out of broadband regulation was challenged in federal court. The court basically said that the FCC had the regulatory authority to either change the rules (or not change the rules) to walk away from broadband regulation.

However, the court said that the FCC needs to demonstrate that eliminating regulatory authority over broadband didn’t impact three areas negatively. The FCC was asked to clarify:

  • How eliminating broadband regulation impacts public safety;
  • How the FCC can still regulate pole attachments if it doesn’t regulate broadband;
  • If walking away from regulation negatively impacts the FCC's ability to offer the Lifeline program that benefits low-income Americans.

On October 27 the FCC is going to take a vote to say that its earlier order doesn't negatively impact any of these issues. It's clear that the FCC wants to finish the elimination of broadband regulation before the election on the chance that a new Democratic president will mean a new head of the FCC. The FCC has openly said that it changed the rules on broadband regulation in such a way that will make it hard for a future FCC to overturn its order.

A new FCC can obviously undo anything that was done by a previous FCC. However, the net neutrality order was done in such a way that a new FCC would have to go through the full cycle of the FCC's processes, including various rounds of notices of proposed rulemaking, a final rulemaking, and then the inevitable court challenges to any attempt to reregulate broadband – all done with vigorous opposition from the big ISPs. The process of reversing the deregulation of broadband would likely stretch over many years.

However, there is a much shorter and quicker path for reversing the FCC's order. Congress is free to reset the FCC rules in any way it sees fit, and Congress could finally pass a new telecom act. There hasn't been any major telecom legislation out of Congress since 1996 – during the heyday of dial-up Internet. In today's political environment it would take a Democratic sweep of the White House and both houses of Congress to get new telecom legislation passed.

Even should that happen with the election, the new Democratic majority would have to agree on what is contained in a new telecom act. I can’t foresee that being an easy or quick process. There is an accumulation of topics in addition to broadband regulation that would benefit from Congressional clarification including privacy, regulation of web companies, solving the digital divide, elimination of outdated cable TV and telephone regulations, a national policy on spectrum, regulation of low orbit satellites, and a host of smaller issues.

If the Democrats don't make a clean sweep of Congress and the White House, then the current FCC will largely have succeeded, and it might be many years until a determined FCC could reestablish any regulatory authority over broadband. What is clear to somebody who closely watches industry regulation: it's going to be an interesting few years ahead of us in this industry regardless of what happens at the polls in November.

AT&T Stops DSL Sales

USA Today reported last week that AT&T stopped selling new DSL to customers on October 1. This is an event that will transform the broadband landscape in a negative way across the country. There are a number of immediate consequences of this action by the company.

Probably the most dramatic impact will be that many rural customers will no longer have an option for landline broadband. While rural DSL broadband is slow, a DSL connection at speeds between 1 Mbps and 6 Mbps beats the alternatives – satellite broadband or cellular hotspots. Since there are a lot of rural homes where those two technologies don't work, this means some homes will suddenly have no broadband option. Expect to soon see stories of folks who buy rural homes and then find they have no option to buy broadband.

In cities where AT&T DSL is the only alternative to a cable company broadband service, this move bestows total monopoly power on the cable company. Our firm does broadband surveys, and we still find markets where AT&T DSL represents as much as a 30% market share. Many homes buy DSL because it costs less, and that option just got taken off the table in AT&T markets. And just like in rural markets, every city has customers whose only choice is DSL. For various reasons, there are streets in most cities where the cable companies never constructed a network. Any customer moving into one of these broadband deserts will find themselves with no broadband alternative.

According to an article just published by Ars Technica, only 28% of AT&T broadband customers have access to AT&T fiber – anybody living in the neighborhoods without fiber will no longer be able to buy broadband from AT&T. That has to equate to tens of millions of households that just lost a broadband option. The FCC proudly measures the number of homes with multiple broadband options, and I'll be curious to see if the agency recognizes this sea change in the market.

This change will stop the practice of customers hopping back and forth between DSL and cable company broadband to save money. I just talked to a customer the other day who has bounced between DSL and cable company broadband for almost twenty years. Both the cable company and the telco offer introductory prices each time for swapping, and this customer has gone back and forth between the ISPs regularly every few years. In neighborhoods where AT&T is the telco DSL provider, this might mean the end of introductory special prices from the cable company – it now has zero incentive to compete for customers.

I would have to think that Verizon will eye this announcement closely. Verizon has openly said that it wants to do away with copper network technology. This might be all of the push needed for Verizon to follow suit. This announcement might be cited in telco history as the beginning of the end of copper wires. AT&T says it won't be tossing folks off DSL service, but will no longer connect new customers to the DSL technology. Over time this is going to mean fewer and fewer customers on copper, and I suspect AT&T already has a date in mind when it walks away from the technology completely.

Ironically, AT&T just recently announced that they were going to claim a seventh year of CAF II support in 2021 and will collect over $427 million in subsidies next year to supposedly support rural DSL. Hopefully, the FCC will view this announcement as grounds for stopping such payments. It would be absolutely insane to give millions to AT&T to support a technology that the company will no longer sell or install.

The timing of the announcement is also curious, coming while the pandemic is still raging. A home that needs to buy broadband to support students or adults working from home will no longer have that option if the only wired connection is AT&T DSL.

This announcement also creates an interesting dilemma for the FCC. Will the FCC pretend that the huge AT&T DSL footprint still exists? It's impossible to pretend that areas have a broadband option when the only provider of landline service refuses to connect new customers. I'm sure the FCC will act as if this announcement never happened – because recognizing it means counting millions of homes as having no broadband option.

This day has been inevitably coming for decades. Regulators have long pretended that they could demand that the big telcos keep supporting an obsolete technology. AT&T and Verizon have been telling regulators for years that they are going to walk away from copper, and now one of the big telcos is doing so. It’s just a matter of time until AT&T begins decommissioning DSLAMs and starts tearing down copper wires for the salvage value – and I can’t see any way that regulators can stop them.

FCC Kills CableCards

The FCC Commissioners recently unanimously voted to eliminate the rules that require cable companies to support devices that use CableCard technology for connecting to video services. The largest user of the technology is TiVo, but consumers have also been able to buy settop boxes using the technology rather than paying monthly to lease a box from the cable company.

The requirement for CableCards came from the Telecommunications Act of 1996. The congressional authors of that act thought that consumers ought to have an alternative to leasing a mandatory settop box from a cable company. After some industry wrangling, the FCC ordered that cable companies be ready to allow devices with CableCards by July 2000.

The big cable companies hated the CableCard rule and refused to share network security keys with CableCard manufacturers, making it a major challenge for a customer to install a CableCard device. In 2005 the FCC clarified the original order and told cable companies that software had to be separate from settop box devices so that CableCards could connect to cable company networks.

Over time, the software on cable networks has grown increasingly complex, and CableCard technology never became plug and play. Anybody who has ever installed a TiVo box knows the challenge of getting the CableCard software to talk to a specific local cable system. Because of this, and because of ongoing resistance by the cable companies to making it easy for CableCards to work, no major market for consumer-owned settop boxes ever emerged. Even so, there have still been sales of roughly half a million CableCard devices per quarter in recent years.

TiVo remains the biggest user of the technology, with a CableCard in every DVR it sells. The FCC order doesn’t force cable companies to continue supporting CableCard technology, but they likely will: any cable company settop box built before 2015 uses CableCard technology, since that was the easiest way for the cable companies to make CableCards work.

However, the FCC eliminated the last vestige of regulation on CableCards, so there is nothing to stop a cable company from cutting off CableCard devices, other than perhaps a desire to not push more households to cut the cord. Cable companies are also free to charge extra to consumers for connecting with a CableCard device.

It’s more likely that CableCard devices will just become technically obsolete over time. Without the FCC’s rules in place, the cable companies might not worry about the impact on CableCards as they update settop box software. This likely spells the end of the traditional TiVo box that could record many hours of video to watch later. Most cable companies offer an alternative that lets customers record and store programming in the cloud rather than on a device in the home. TiVo has already started that transition itself and introduced a cloud DVR service in 2018 aimed at cord-cutters, allowing recording of video content from any source, such as over-the-air broadcasts or an online service.

Consumers who have used CableCard devices face having to eventually pay the monthly fee for a settop box if they want to keep traditional cable TV service. Ironically, there might be a bigger need for a settop box alternative today than there was in 2000. Largely freed from regulation, the cable companies have raised fees on settop boxes, and I’ve seen monthly rental rates as high as $15 per month.

In the end, the CableCard regulation was largely a bust. It provided an alternative to renting settop boxes, but the cable companies never stopped fighting the idea and never made it easy for consumers to connect and use a CableCard device.

Georgia’s New Broadband Maps

A year ago, the state of Georgia undertook an effort to accurately map broadband availability in the state. Like many states, Georgia understood that the FCC’s broadband maps badly overstate broadband coverage. The goal of the state mapping effort was to define areas that don’t have good broadband to stimulate broadband investment where it’s needed most.

The results from the mapping effort are stunning. The state found that over 507,000 homes and businesses, representing 1 million people, don’t have access to 25/3 Mbps broadband. That is double the 252,000 homes identified by the FCC as lacking access to 25/3 Mbps broadband.

I work with the FCC data every week and I’ve always known it is terrible. Our firm and many others look at the data in individual markets, but I’ve never found a way to grasp the extent of the problems with the FCC data at a larger scale. It’s unconscionable for the FCC to overstate broadband coverage by 100%, as the Georgia analysis shows.

The FCC data comes from ISPs that report broadband speeds and coverage to the agency. ISPs have different incentives to overstate coverage, and it’s obvious that many of them do so. The main ISP benefit of overstating broadband coverage is to dissuade competition. The FCC also uses the faulty data reported by ISPs to determine areas that are eligible for FCC broadband grants. For example, the FCC’s maps were used to determine areas that are covered by the $16.4 billion in grants that will be awarded in October.

People in areas where the FCC broadband data is overstated should be livid about the issue. The Georgia mapping effort identified 255,000 homes and businesses that should have been considered for FCC grants but that weren’t included in the RDOF grant areas. Those grants are going to fund a lot of new fiber networks.

Georgia took a different approach to mapping. The state created the Georgia Broadband Deployment Initiative (GBDI) in 2018. The purpose of the initiative is to “coordinate and establish broadband programs to increase economic, education, and social opportunities for Georgia citizens and businesses.” The GBDI is an inter-agency effort supported by the Department of Community Affairs (DCA), Georgia Technology Authority (GTA), Department of Economic Development (DEcD), State Properties Commission (SPC), and Georgia Department of Transportation (GDOT).

The GBDI contacted ISPs to discuss broadband coverage. More importantly, the agency solicited speed tests to find out the real speeds being delivered across the state. While any individual speed test has accuracy issues, when the tests are taken en masse a true picture of broadband speeds emerges.
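The aggregation idea can be sketched in a few lines: group speed tests by census block and compare the median result to the FCC’s 25/3 Mbps definition. This is only an illustration of the general technique; the block IDs and speeds below are hypothetical, and Georgia’s actual methodology is not documented at this level of detail.

```python
from statistics import median

# Hypothetical speed-test records: (census_block, download_mbps, upload_mbps).
tests = [
    ("130599501001", 4.2, 0.8),
    ("130599501001", 5.1, 0.9),
    ("130599501001", 3.7, 0.7),
    ("131219800002", 94.0, 11.5),
    ("131219800002", 87.3, 10.8),
]

def classify_blocks(tests, min_down=25.0, min_up=3.0):
    """Group tests by census block and flag blocks whose median
    speeds fall below the FCC's 25/3 Mbps broadband definition."""
    by_block = {}
    for block, down, up in tests:
        by_block.setdefault(block, []).append((down, up))
    results = {}
    for block, samples in by_block.items():
        med_down = median(d for d, _ in samples)
        med_up = median(u for _, u in samples)
        results[block] = {
            "median_down": med_down,
            "median_up": med_up,
            "served": med_down >= min_down and med_up >= min_up,
        }
    return results

summary = classify_blocks(tests)
# The first block's median of 4.2/0.8 Mbps falls well below 25/3,
# so it would be flagged as unserved despite any ISP-reported coverage.
```

Using the median rather than the mean keeps a handful of outlier tests, such as one run over a cellular hotspot, from distorting a block’s result.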

The best thing the GBDI has done is to create a map that shows the side-by-side difference between the state’s data and the FCC’s mapping data. Differences pop out immediately. I wish every state would do this, since it lets anybody in the state understand the broadband speed issue in their neighborhood. The differences between the two maps are amazing. There are entire counties that the FCC believes largely have access to 25/3 Mbps or faster broadband but that show only limited coverage on the state version of the map.

The GBDI website also lets people search for their address and see the details of their local Census block.

A lot of states are undertaking mapping efforts and speed tests, and this might be the only way to strong-arm the FCC into fixing its own mapping. The FCC plans to implement a new mapping regime, but unfortunately, if the agency doesn’t punish ISPs for reporting false data, the new maps might not be any better than the old ones.

A Huge FCC Giveaway

Is there a way to take the worst broadband subsidy program ever and make it even worse? The FCC just answered that question by extending the CAF II program for a seventh year.

The CAF II program paid the large price-cap telcos to supposedly upgrade rural broadband to speeds of at least 10/1 Mbps. Over $11 billion was paid out over six years starting in 2015 and completing this year. This money went to the big telcos like AT&T, CenturyLink, Frontier, Windstream, Consolidated, and a few others. Buried in the original awards was a provision that the carriers could elect to extend payments for a seventh year – and of course, they are doing so.

Why do I call this subsidy plan a failure? Even in 2015, it was ludicrous to spend money to build 10/1 Mbps broadband. 2015 is the same year that the FCC increased the definition of broadband to 25/3 Mbps, meaning the agency was investing in new Internet infrastructure that didn’t qualify as broadband at the time the funding was awarded. Worse, the FCC gave the big telcos six years to complete the construction of the upgraded 10/1 Mbps networks, ending this year. The FCC is still paying money in 2020 to upgrade rural customers to speeds of just 10/1 Mbps.

But that’s not the worst of it because it doesn’t look like a lot of the upgrades were ever done. Our company helps rural counties assess the condition of broadband, and it’s rare in many rural places that were covered by CAF II to find even a single customer getting broadband speeds of at least 10/1 Mbps. We’ve done speed tests in counties this year where the average download speeds are 4 to 5 Mbps, with a significant number of customers getting speeds under 1 Mbps. The big telcos have been cheerily reporting progress to the FCC on implementing CAF II, but in the real world, it’s hard to find any evidence that many upgrades have been made.

I have seen the DSL in rural county seats get faster, and I suppose this was done with CAF II money, even though the funding was supposed to be used for rural customers. When DSL is upgraded in a county seat, the only rural customers that see any benefit are those within a mile or so of the town.

To improve rural DSL to 10/1 Mbps requires building a significant amount of rural fiber so that customers are within four or five miles of a fiber node equipped with DSL gear. We’ve driven whole counties looking for evidence of such upgrades and have rarely found the needed new fiber construction or electronics huts. There is no need to take my word for this – states like Georgia and Minnesota have created broadband maps that are showing no evidence for most of the CAF II upgrades.
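The field check described above amounts to a distance test: for each home, is there any fiber-fed DSL node within the four-to-five-mile reach the technology requires? A minimal sketch using great-circle distance is below; the coordinates, node locations, and the five-mile cutoff are all illustrative assumptions, not actual survey data.

```python
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # Earth radius ~3958.8 miles

def beyond_dsl_reach(homes, nodes, max_miles=5.0):
    """Return the homes farther than max_miles from every fiber-fed DSL node."""
    return [
        h for h in homes
        if all(miles_between(h[0], h[1], n[0], n[1]) > max_miles for n in nodes)
    ]

# Hypothetical coordinates for illustration only.
nodes = [(33.00, -84.00)]                    # one DSLAM location
homes = [(33.01, -84.01), (33.40, -84.50)]   # ~1 mile away vs. far out of reach
unreached = beyond_dsl_reach(homes, nodes)
```

Driving distance along the copper route is always longer than the straight-line figure, so a home flagged as out of reach here would be even worse off in practice.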

And now the FCC is going to pay a seventh year of funding to these same telcos, only this time the companies don’t have to spend the funds to improve broadband. Instead, this money is characterized as ‘support’ for the telcos. This is a straight giveaway that sends $503 million to CenturyLink, $427 million to AT&T, and $313 million to Frontier, straight to the bottom line. It’s the most blatant handout of federal broadband funds I’ve ever seen, because these funds won’t improve broadband for a single rural customer. The money will just help AT&T make its dividend payments and ease Frontier’s way out of bankruptcy.

The original plan in 2015 included the provision for the seventh year of payout, but the FCC could have changed that rule at any time in the last six years. This is over a billion dollars being wasted that could instead be added to the RDOF fund to build rural fiber or put into some other worthwhile broadband grant program. The FCC would benefit rural communities more if it just walked around handing out this cash to rural folks during the pandemic.

This FCC has been pro-big carrier from the start – but adding a seventh year of CAF II is hard to see as anything other than federal waste being done openly. The companies getting this money didn’t meet the obligations of the original CAF II funding and are now perversely getting rewarded for their failure. This kind of waste makes me ill when I do the math and realize that this money could instead be used to build fiber for everybody living in the poorest 40 counties in the country. I guess it’s more important to ‘support’ AT&T instead of rural households with no broadband.