FCC Further Defines Speed Tests

The FCC recently voted to tweak the rules for speed testing for ISPs that accept federal funding from the Universal Service Fund or from other federal funding sources. This includes all rate-of-return carriers, including those taking ACAM funding, carriers that won the CAF II reverse auctions, recipients of Rural Broadband Experiment (RBE) grants, Alaska Plan carriers, and likely the carriers that took funding in the New York version of the CAF II award process. The new testing rules will also apply to carriers accepting the upcoming RDOF grants.

The FCC originally released testing rules in July 2018 in Docket DA 18-710. Those rules applied to the carriers listed above as well as to all price cap carriers and recipients of the CAF II program. The big telcos will start testing in January 2020, and the FCC should soon release a testing schedule for everybody else – the dates for testing were delayed until this revised order was issued.

The FCC made the following changes to the testing program:

  • Modifies the schedule for commencing testing by basing it on the deployment obligations specific to each Connect America Fund support mechanism;
  • Implements a new pre-testing period that will allow carriers to become familiar with testing procedures without facing a loss of support for failing to meet the requirements;
  • Allows carriers greater flexibility in identifying which customer locations should be tested and in selecting the endpoints for testing broadband connections. This last change sounds to me like the FCC is letting the CAF II recipients off the hook by allowing them to test only customers they know meet the 10/1 Mbps speeds.

The final order should be released soon and will hopefully answer carrier questions. One of the areas of concern is that the FCC seems to want to test the maximum speeds that a carrier is obligated to deliver. That might mean having to give customers the fastest connection during the time of the tests even if they have subscribed to slower speeds.

Here are some of the key provisions of the testing program that were not changed by the recent order:

  • ISPs can choose among three methods for testing. First, they may elect what the FCC calls the MBA program, which uses an external vendor, approved by the FCC, to perform the testing. This vendor has been testing speeds on the big telcos’ networks for many years. ISPs can also use existing network tools if they are built into the customer CPE and allow test pinging and other testing methodologies. Finally, an ISP can install ‘white boxes’ that provide the ability to perform the tests.
  • Testing, at least for now, is perpetual, and carriers need to recognize that this is a new cost they have to bear for taking federal funding.
  • The number of tests to be conducted varies with the number of customers for which a recipient is getting support: with 50 or fewer supported households, the ISP must test 5 customers; for 51-500 households, 10% of households; and for more than 500 households, 50 households. ISPs declaring high latency must test more locations, with the maximum being 370. (A sketch of these rules follows this list.)
  • Tests for a given customer run for one solid week, including a weekend, in each quarter. Tests must be conducted in the evenings between 6:00 PM and 12:00 AM. Latency tests must be run every minute during the six-hour testing window. Speed tests – run separately for upload and download speeds – must be run once per hour during the six-hour testing window.
  • ISPs are expected to meet the latency standard 95% of the time. Speed tests must achieve 80% of the expected upload and download speeds 80% of the time. For example, a carrier guaranteeing a gigabit of speed must achieve 800 Mbps in 80% of the tests. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
  • There are financial penalties for ISPs that don’t meet these tests.
  • ISPs that have between 85% and 100% of households that meet the test standards lose 5% of their FCC support.
  • ISPs that have between 70% and 85% of households that meet the test standards lose 10% of their FCC support.
  • ISPs that have between 55% and 70% of households that meet the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.
  • The penalties only apply to funds that haven’t yet been collected by an ISP.
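
To make those mechanics concrete, here is a minimal sketch in Python of the sample-size rule, the 80/80 speed standard, and the penalty tiers as I read them. The function names are my own invention and the tier boundaries are taken from the summary above – this is an illustration, not an official FCC compliance tool, and the exact boundaries in the final order may differ.

def required_test_locations(supported_households):
    """Locations to test, per the tiers above. ISPs declaring high
    latency must test more locations (up to 370); that logic is omitted."""
    if supported_households <= 50:
        return min(5, supported_households)
    if supported_households <= 500:
        return round(supported_households * 0.10)
    return 50

def meets_80_80_standard(measured_mbps, expected_mbps):
    """True if at least 80% of tests achieved 80% of the expected speed."""
    passing = sum(1 for m in measured_mbps if m >= 0.8 * expected_mbps)
    return passing >= 0.8 * len(measured_mbps)

def support_withheld(percent_compliant):
    """Fraction of federal support lost, per the penalty tiers above."""
    if percent_compliant >= 100:
        return 0.00
    if percent_compliant >= 85:
        return 0.05
    if percent_compliant >= 70:
        return 0.10
    if percent_compliant >= 55:
        return 0.15
    return 0.25

# Example: an ISP supported at 1,200 households must test 50 locations;
# if only 82% of households meet the standards, it loses 10% of support.
print(required_test_locations(1200))   # 50
print(support_withheld(82))            # 0.1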

Is OTT Service Effective Competition for Cable TV?

The FCC made an interesting ruling recently that signals the end of regulation of basic cable TV. Charter Communications had petitioned the FCC claiming that its properties in Massachusetts face ‘effective competition’ for cable TV from OTT providers – in this case, from AT&T’s DirecTV Now, a service that offers a full range of local and traditional cable channels.

The term effective competition is a very specific regulatory term, and once a market reaches that status a cable company can change rates at will for basic cable – the tiers that include local network stations.

The FCC agreed with Charter, said the markets are competitive, and granted Charter the deregulated status. In the past, this designation has been granted in markets that have a high concentration of satellite TV or that have a lot of alternative TV offered by a fiber or DSL overbuilder that has gained a significant share of the market.

In making this ruling the FCC effectively deregulated cable everywhere, since there is no market today that doesn’t have a substantial amount of OTT content competing with cable companies. Cable providers will still have to go through the process of asking to deregulate specific markets, but it’s hard to think that after this ruling the FCC can say no to any other petition.

From a regulatory perspective, this is probably the right ruling. Traditional cable is getting clobbered, and it looks like the industry as a whole might lose 5-6 full percentage points of market share this year and end up under a 65% national penetration rate. While this is only the third year in which cord cutting has been a measurable trend, the cable industry’s customer losses are nearly identical to the losses for landline telephone at the peak of that market’s decline.

There are two consequences for consumers in a market that is declared to be effectively competitive. First, it frees cable companies from the last vestiges of basic cable rate regulation. This is not a huge benefit because cable companies have been free for years to raise rates in higher tiers of service. In a competitive market, a cable provider is also no longer required to carry local network channels in the basic tier – although very few cable systems have elected this option.

I’ve seen several articles discussing this ruling that assume it will result in an instant rate increase in these markets – and they might be right. It’s a headscratcher watching cable companies raise rates lately when higher rates are driving households to become cord cutters. But cable executives don’t seem able to resist raising rates, and each time they do, the overall revenue of a cable system increases locally, even with customer defections.

It’s possible that this ruling represents nothing more than the current FCC’s desire to deregulate as many things as possible. One interesting aspect of this ruling is that the FCC has never declared OTT services like SlingTV or DirecTV Now to be MVPDs (multichannel video programming distributors) – a ruling that would pull these services into the cable TV regulatory regime. From a purely regulatory viewpoint, it’s hard to see how a non-MVPD service can meet the technical requirements of effective competition. However, from a practical perspective, it’s not hard to perceive the competition.

Interestingly, customers leaving traditional cable TV are not flocking to the OTT services that emulate regular cable service. Those services have grown expensive, and most households seem happy cobbling together packages of content from OTT providers like Netflix and Amazon Prime that don’t carry a full range of traditional channels. From that market perspective, one has to wonder how much of a competitor DirecTV Now was in these specific markets, or even how Charter was able to quantify the level of competition from a specific OTT service.

Mapping Cellular Data Speeds

AT&T recently filed comments in Docket 19-195, the docket that is looking to change broadband mapping, outlining the company’s proposal for reporting wireless data speeds to the FCC. I think a few of their recommendations are worth noting.

4G Reporting. Both AT&T and Verizon support reporting on 4G cellular speeds using a 5 Mbps download and 1 Mbps upload test with a cell edge probability of 90% and a loading of 50%. Let me dissect that recommendation a bit. First, this means that a customer has a 90% chance of being able to make a data connection at the defined edge of a cell tower’s coverage range.

The more interesting reporting requirement is the 50% loading factor. This means the reported coverage area would meet the 5/1 Mbps speed requirement only when a cell site is 50% busy with customer connections. Loading is something you rarely see the cellular companies talk about. Cellular technology is like most other shared bandwidth technologies in that a given cell site shares bandwidth among all users. A cell site that barely meets the 5/1 Mbps speed threshold when it’s 50% busy is going to deliver significantly slower speeds as the cell site gets busier. We’ve all experienced degraded cellular performance at rush hour – the normal peak time for many cell sites. This reporting requirement is a good reminder that cellular data speeds vary during the day according to how many people are using a cell site – something the cellular companies never bother to mention in their many ads talking about their speeds and coverage.
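
To illustrate why loading matters, here is a toy model – my own simplification with made-up numbers, not AT&T’s methodology – showing how per-user throughput shrinks as a shared cell site fills up:

# Toy model of shared cell-site bandwidth. Capacity and user counts are
# hypothetical; the point is that per-user speed falls as loading grows.
SITE_CAPACITY_MBPS = 150     # assumed total usable cell-site capacity
USERS_AT_FULL_LOAD = 30      # assumed active users when 100% busy

for loading in (0.25, 0.50, 0.75, 1.00):
    active_users = max(1, int(USERS_AT_FULL_LOAD * loading))
    per_user_mbps = SITE_CAPACITY_MBPS / active_users
    print(f"{loading:.0%} loading: ~{per_user_mbps:.1f} Mbps per user")

In this toy model the per-user speed halves between 50% and 100% loading, which is exactly why a site that barely clears 5/1 Mbps when half busy will fall short at rush hour.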

The recommended AT&T maps would show areas that meet the 5/1 Mbps speed threshold, with no requirement to report faster speeds. I find this recommendation surprising because Opensignal reports the average 4G LTE speeds across America as follows:

Carrier     2017         2018
AT&T        12.9 Mbps    17.87 Mbps
Sprint      9.8 Mbps     13.9 Mbps
T-Mobile    17.5 Mbps    21.1 Mbps
Verizon     14.9 Mbps    20.9 Mbps

I guess that AT&T favors the lowly 5/1 Mbps threshold because it shows the largest possible coverage area for wireless broadband. While many AT&T cell sites provide much faster speeds, my guess is that most of the faster cell sites are in urban areas, and AT&T doesn’t want to provide maps showing faster tiers such as 15 Mbps because that would expose how slow their speeds are in most of the country. If AT&T offered faster speeds in most places, they would be begging to show multiple tiers of cellular broadband speeds.

Unfortunately, maps using the 5/1 Mbps criteria won’t distinguish between urban places with fast 4G LTE and more rural places that barely meet the 5 Mbps threshold – all AT&T data coverage will be homogenized into one big coverage map.

About the only good thing I can say about the new cellular coverage maps is that if the cellular companies report honestly, we’re going to see the lack of rural cellular broadband for the first time.

5G Broadband Coverage. I don’t think anybody will be shocked that AT&T (and the other big cellular companies) don’t want to report 5G coverage. Although they are spending scads of money touting their roll-out of 5G, they think it’s too early to tell the public where they have coverage.

AT&T says that requiring 5G reporting at this early stage of the new technology would reveal sensitive information about cell site locations. I think customers who pony up extra for 5G want to know where they can use their new expensive handsets.

AT&T wants 5G coverage to fall under the same 5/1 Mbps coverage maps, even though the company is touting vastly faster speeds using new 5G phones.

It’s no industry secret that most of the announced 5G deployments are done mostly for public relations purposes. For example, AT&T is loudly proclaiming the number of major cities that now have 5G, but this filing shows that the company doesn’t want the public to know the small areas that can participate in these early market trials.

If 5G is a reasonable substitute for landline broadband, then the technology should not fall under the cellular reporting requirements. Instead, the cellular carriers should be forced to show where they offer speeds exceeding 10/1 Mbps, 25/3 Mbps, 100/10 Mbps, and 1 Gbps. I’m guessing a 5G map using these criteria would largely show a country that has no 5G coverage – but we’ll never know unless the FCC forces the wireless companies to tell the truth. I think that people should be cautious about spending extra for 5G-capable phones until the cellular carriers are honest with them about 5G coverage.
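
For what it’s worth, the tiered reporting I’m suggesting would be trivial to implement. Here is a minimal sketch; the tier thresholds come from the paragraph above, except the 100 Mbps upload figure attached to the 1 Gbps tier, which is my own assumption since no upload number is named for it:

# Classify a measured connection into the fastest reporting tier it meets.
# Tiers are (label, download Mbps, upload Mbps), fastest first.
TIERS = [
    ("1 Gbps",      1000, 100),   # upload threshold assumed for illustration
    ("100/10 Mbps",  100,  10),
    ("25/3 Mbps",     25,   3),
    ("10/1 Mbps",     10,   1),
]

def coverage_tier(down_mbps, up_mbps):
    """Return the fastest tier this connection qualifies for."""
    for label, down_req, up_req in TIERS:
        if down_mbps >= down_req and up_mbps >= up_req:
            return label
    return "below 10/1 Mbps"

print(coverage_tier(30, 5))   # 25/3 Mbps
print(coverage_tier(6, 1))    # below 10/1 Mbps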

The FCC’s 15th Annual Broadband Deployment Report

The FCC just released its most recent annual report on the state of US broadband. This report is mandated by Section 706 of the Telecommunications Act of 1996 which requires the FCC to “determine whether advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion”. The FCC concludes in this latest report that broadband deployment is reasonable and that actions taken by this Commission are helping to close the broadband gap.

I take exception to several findings in this latest report. First, everybody in the country now understands that the FCC’s conclusions are based upon dreadfully inaccurate 477 data reported by ISPs. There have been numerous studies undertaken at the state and local levels showing that the FCC maps undercount households without broadband. Even USTelecom, the group mostly representing the largest telcos, showed that the FCC maps in Missouri and Virginia classified 38% of rural homes as served when in fact they were unserved. Microsoft has been gathering credible data showing that well over 150 million people aren’t connecting at the FCC’s defined broadband speed of 25/3 Mbps.

For the FCC to draw any conclusions based upon inaccurate 477 data is ridiculous. A few years ago the FCC could have claimed not to understand the extent to which its data is flawed, but the agency has been shown extensive evidence that the 477 data is incredibly bad, and yet it still plowed forward in this report pretending that statistics based upon 477 data have any meaning. There is not one number in this report that has even the slightest amount of credibility, and the FCC knows this.

With the knowledge that the FCC now has about the inaccuracy of their data, this FCC should have humbly admitted that they don’t know the number of households that don’t have broadband. The report could have discussed ways that the Commission is trying to fix the bad data and described steps they have taken to improve rural broadband. But for this report to lead off with a claim that the number of homes without broadband fell by 18% in 2018 is a joke – there is zero chance that’s an accurate statistic. This report should have stated that external analysis has shown that the state of broadband is a lot worse than what they’ve reported in prior annual reports.

I also take exception to the opening statement of the report where the FCC claims that its top goal is “closing the digital divide and bringing the educational, healthcare, social, and civic benefits of connectivity to all Americans seeking broadband access.” This FCC’s top goal is clearly to eliminate regulatory rules that create any obligations for the largest carriers. This FCC already completely deregulated broadband – something an agency would never do if their goal was to improve broadband access. Most of the major dockets that have been approved by this FCC have made it easier for the big carriers to deploy 5G or to otherwise avoid any regulatory burdens.

It’s insulting to the American people for the agency to state that their top goal is improving broadband when their actions show that their priorities are elsewhere. Regulatory agencies are not supposed to engage in propaganda, and this document reeks of self-promotion.

Finally, this report trots out the oft-repeated message that broadband is improving because of this FCC’s effort to remove barriers to broadband investment. I don’t think Chairman Pai makes a speech or writes an opinion that doesn’t bring up this disproved argument. We know by now that those without broadband fall into two categories – rural homes that don’t have access to a broadband connection and urban households that can’t afford broadband. The big telcos aren’t spending any of their cash to solve these two problems.

There has been a lot of fiber built in recent years. AT&T built fiber to pass 12 million homes as a condition of its merger with DirecTV – an effort the company announced was completed this past summer. Verizon has been building fiber to bolster its cellular network, including an expansion of small cell sites – largely as a way to reduce its reliance on paying others for transport. These fiber efforts have nothing to do with the repeal of net neutrality or the ending of broadband regulation. Chairman Pai probably ought to cut back on making this claim, because his real legacy is that he’s emboldened the big cable companies to begin regularly increasing broadband rates since there’s no threat of regulatory oversight. Chairman Pai and his light-touch regulation will get the credit for why broadband costs $100 per month a few years from now.

A New National Broadband Plan?

Christopher Terry recently published an article for the Benton Institute that details how the National Broadband Plan has failed. Congress initiated the plan in 2009, instructing the FCC to develop a plan to make sure that every American had access to broadband within a decade. The article details the many spectacular ways the plan has failed.

In my opinion, the National Broadband Plan never had the slightest chance of success because it didn’t have any teeth. Congress authorized the creation of the plan as a way for politicians to show that they were pro-broadband. The plan wasn’t much more than a big showy public relations stunt. Congress makes symbolic votes all of the time and this was just another gesture that demonstrated that Congress cared about broadband and that also served to quiet broadband proponents for a few years. If Congress cared about broadband they would have followed up the plan with a vote to force the FCC to implement at least some aspects of the plan.

Those who worked to develop the plan are likely offended by my post-mortem of the effort. I know several people who worked on the plan and still prominently display that fact in their resumes a decade later. I’m sure that working on the plan was an exhilarating process, but at the end of the day the effort must be measured in terms of success. The folks that created the plan, and the rest of the country, were duped by the FCC.

The FCC never had the slightest interest in adopting the big recommendations of the plan. There is probably no better evidence of this than when the Tom Wheeler FCC awarded $11 billion to the big telcos in the CAF II process – an award that couldn’t have been more antithetical to the National Broadband Plan. For those who follow FCC dockets, there are dozens of examples over the last decade where the FCC sided with the big carriers instead of siding with better rural broadband.

The fact is that the US government doesn’t do well with grandiose plans and lofty long-term goals. Government agencies like the FCC mostly implement things that are mandated by Congress – and even then they often do the bare minimum. Even without the National Broadband Plan, the FCC already has a Congressional mandate to make certain that rural broadband is equivalent to urban broadband – and we annually see them do a song and dance to show how they are complying with this mandate while they instead largely ignore it.

This is not to say that broadband plans are generically bad. For example, the state of Minnesota developed its own set of broadband goals, with the most prominent goal of defining broadband in the state as connections of at least 100 Mbps. The state has implemented that goal when awarding broadband grants, and unlike the FCC, the state has awarded grant funding to build real rural broadband solutions. They’ve refused to spend money on technologies that deliver speeds that the state doesn’t consider as broadband.

I fully expect to hear a plea to develop a new plan and I hope that most of the folks who are working for better broadband ignore any such effort. Compared to ten years ago there are now a lot of organizations working for better broadband. Hundreds of rural communities have created citizen broadband committees looking for a local solution. There are county governments all over the country making grants to help lure ISPs to serve their county. Statewide groups are working to solve the digital divide and the homework gap. There are a lot of people actively advocating for real broadband solutions.

These advocates don’t need a national goal document to tell them what they want. By now, communities understand good broadband in the simplest form – it’s something their community either has or doesn’t have. Communities now understand the digital divide and the homework gap. Wasting federal dollars to create a new National Broadband Plan wouldn’t move any community one inch closer to better broadband, and I hope we resist the temptation to go down that path.

Perverting the FCC Comment Process

In a recent article, BuzzFeed dug into the issue of the FCC receiving millions of bogus comments in the last two rounds of the net neutrality docket. During the 2015 net neutrality comment period, the agency received over 4 million comments. Many of these were legitimate, such as the many driven by HBO’s John Oliver, who prompted viewers to comment in favor of net neutrality.

When the new FCC wanted to reverse the original net neutrality order they had to again open up the docket for public comment. This second time the FCC got over 20 million comments. The comments were so voluminous that the FCC website crashed in May 2017.

There were fake comments filed on both sides of the issue. On the pro-net neutrality side were 8 million nearly identical comments tied to email addresses from FakeMailGenerator.com. There were another million comments from people with @pornhub.com email addresses. On the anti-net neutrality side, BuzzFeed identified several organizations that uploaded millions of comments using names, addresses, and email addresses that came from a major data breach. These fake comments were generated on behalf of real people who had no idea their names were being used in the FCC proceeding. The fake filings included comments from some people who had died, and even anti-net neutrality comments from a few Democrats in the House of Representatives who clearly were pro-net neutrality.

While the FCC’s net neutrality dockets received the largest number of fake comments, fake comments are being filed in other FCC dockets, and similar fake comments are being made on legislation at state legislatures.

As somebody who often comments in FCC dockets, the fake comments give me heartburn. Flooding a docket with fake comments makes it likely that legitimate comments are not read or considered. What might be the most interesting thing about the net neutrality docket is that in both cases it was clear the FCC Commissioners had already decided how they were going to vote – so the fake comments had no real impact. But most FCC dockets are not partisan. For example, there were a lot of fake comments filed in the docket considering changes to the rules for cable cards – the devices that let people avoid paying for the cable company’s set-top boxes. That kind of docket is not partisan and is more typical of the kinds of issues the FCC has to wrangle with.

Hopefully, legal action will be taken against the bad actors identified in the net neutrality filings. There are several companies that have been formed for the express purpose of generating large volumes of comments in government dockets. There is nothing wrong with working with organizations to generate comments to politicians – it’s almost the definition of the First Amendment if AARP galvanizes members to comment against changes in Social Security. But it’s a perversion of democracy when fake comments are generated to try to influence the political process.

Fighting this issue was not made any easier when the current FCC under Ajit Pai ignored public records requests in 2017 from those wanting to look deeper at the underlying fake comments. After a lawsuit was filed, the FCC eventually responded to the requests, which led to investigations like the one described in the BuzzFeed article.

There are probably ways for the FCC and other agencies to restrict the volume of fake comments. For example, the FCC might end the practice of allowing large quantities of comments to be filed in bulk. But federal agencies have to be careful not to kill legitimate comments. It’s not unusual for an organization to encourage members to file, and they often do so using the same language in multiple filings.

This is another example of how technology can be used for negative purposes – in essence, the FCC was hacked in these dockets. As long as there is a portal for citizens to make comments, it’s likely there will be fake comments. Fake comments are also made outside the government process – fake reviews are a big problem for websites like Amazon and Yelp. We need to find a way to stop the fake comments from overwhelming the real ones.

The Fight Over Retransmission Consent

There has been a quiet legislative battle brewing in Congress all year concerning the renewal of STELAR (the Satellite Television Extension and Localism Act Reauthorization), a bill that comes up for renewal every five years. The original bill in 1988 was intended to make sure that rural customers got access to major network television. We lived in a different world in 1988, and it was a technical challenge for satellite providers to get access to local network affiliates (ABC, CBS, FOX, NBC, and PBS) across the country and to broadcast those signals into the appropriate local markets. The original STELA legislation allowed the satellite companies to import network channels from other markets. Without the law, numerous rural markets in 1988 would have lost the ability to watch the major networks.

Of course, Congress loves to tack riders onto legislation, and the STELA legislation became the primary vehicle for updating the retransmission rules for all broadcasters and cable companies. The fight over the renewal of the legislation has been fierce this year, since retransmission consent is the biggest issue of contention between broadcasters and cable TV providers.

Retransmission fees have exploded over the last decade. The average local network station now typically charges cable companies a monthly fee of $3 to $4 per subscriber for the right to retransmit its content on cable networks. Not too many years ago this was done for free.

It’s not hard to understand the motivation for the broadcast industry. Advertising revenues are in freefall due to cord-cutting and due to the proliferation of web advertising using Google and Facebook. Retransmission fees are a way for broadcasters to fill the coffers and replace that lost revenue. Interestingly, though, most of the retransmission revenue ends up at the big corporations that own the network channels. I’ve talked to local network station owners who say that their corporate parents suck away most of the retransmission revenues in the form of fees to continue with the network franchise. At the end of the day, most of the retransmission revenues end up with the parent companies of ABC (Disney), CBS (CBS Corporation), FOX (FOX Corporation), and NBC (Comcast).

There is no question that retransmission fees are hurting the public, because they have been one of the primary drivers (along with sports programming) of ongoing substantial rate increases. The average cable subscriber is now paying between $12 and $15 per month for the right to view network channels on their cable system. These are the fees that many cable companies have been hiding in something like a ‘broadcast fee’ so they can still advertise a low price for basic cable.

As with many of the most contentious issues, the fight is largely about money. With the current number of cable customers around 85 million, retransmission fees are generating $12 to $14 billion per year. However, if you read the comments from the two sides of the issue, you would think the argument was entirely about protecting the consumer. Many of the arguments being made are about stopping blackouts – which occur when broadcasters and cable companies can’t agree on the fees and conditions for buying programming. If the issue were really about the consumer, then Congress would be talking about capping retransmission fees or at least limiting their annual increases.
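
The revenue math is easy to check. A quick back-of-the-envelope calculation (mine, not from either side’s filings), using the subscriber count and monthly fees cited above:

# Back-of-the-envelope check on industry-wide retransmission revenue,
# using ~85M cable subscribers and roughly $12-$14 per month in fees.
subscribers = 85_000_000
for monthly_fee in (12, 14):
    annual_billion = subscribers * monthly_fee * 12 / 1e9
    print(f"${monthly_fee}/month -> ${annual_billion:.1f} billion per year")
# $12/month -> $12.2 billion per year
# $14/month -> $14.3 billion per year

The $12 to $14 billion figure falls right out of the lower end of the $12 to $15 monthly fee range.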

To some degree, the issue transcends cord-cutting. Anybody buying an online service that includes the major networks is also paying some version of these same fees. That’s one of the primary reasons why the prices of the online TV-equivalent services have been rising. Online services have more flexibility because they are not required to carry any specific programming. However, once a service decides to carry the four major networks, it is somewhat at the mercy of the broadcasters. As an example, I currently subscribe to PlayStation Vue, which carries the same local network affiliates that I would get from Charter. One has to imagine that the fees charged to PlayStation Vue are similar to what is being charged to the local cable company.

One way to know this is a huge fight is that the industry has created organizations focused on this single issue. For example, most of the cable companies other than Comcast (which is on the opposite side of this issue) have created the American Television Alliance to lobby against certain provisions of the bill. Their website looks consumer-friendly, but the members are largely the big cable companies – it’s a bogus consumer-facing organization created solely to lobby on this legislation.

The legislation was introduced in July as the Modern Television Act of 2019. The five-year clock on the last legislation expires soon, but in 2014 the renewal passed many months after the expiration date. It doesn’t look likely that the latest legislation will pass on time.

Is Telephony a Natural Monopoly?

For my entire career, I’ve heard it said that telecommunications is a natural monopoly. That was the justification for creating monopoly exchange boundaries for telcos and for issuing exclusive franchise agreements for cable companies. This historic reasoning is why the majority of Americans in urban areas are still stuck with duopoly competition that is trending towards a cable monopoly.

I worked for Southwestern Bell pre-divestiture, and they were proud of their monopoly. Folks at Ma Bell thought the telephone monopoly was the best possible deal for the public, and they constantly bragged about the low rates for a residential telephone line, usually something less than $15 per month. But when you looked closer, the monopoly was not benefitting the average household. Long distance was selling for 12 to 25 cents per minute, and a major percentage of households had phone bills over $100 per month.

I’ve been doing some reading on the history of the telephone industry and found some history I never knew – history that is different from what Ma Bell told employees for 100 years.

Alexander Graham Bell was granted the foundational patents for telephone service in 1876. During the 18-year life of the original patents, Bell Telephone held a monopoly on telephone service. Bell Telephone mostly built to large businesses and rich neighborhoods, and the country still predominantly communicated via telegraph. Bell Telephone was not considered much of a success. By 1894 there were still fewer than 5 telephones in the country per 1,000 population, and only 37 calls per day on average per 1,000 people.

As soon as the patents expired, numerous competitors entered the market. They built to towns that Bell Telephone had ignored, but also built competing networks in many Bell Telephone markets. By the end of 1896, 80 competitors had grabbed 5% of the total telephone market. By 1900 there were 3,000 competitive telephone companies.

By 1907 the competitors had grabbed 51% of the national market and had also driven down urban telephone rates. The annual returns of AT&T (which had officially become the name of Bell Telephone) dropped from 46% in the late 1800s to 8% by 1906. After 17 years of monopoly, the country had only 270,000 telephones; after 13 years of competition there were over 6 million.

The death of telephone competition started when Theodore Vail became president of AT&T in 1907. By 1910 the company was buying competitors and lobbying for a sanctioned monopoly. Federal regulators stepped in to slow AT&T’s purchase of telephone companies after Vail tried to buy Western Union.

In a compromise reached with the federal government, AT&T agreed to stop buying telcos and to interconnect with independent telephone companies to create one nationwide network. That compromise was known as the Kingsbury Commitment. Vail used the compromise to carve out monopoly service areas by agreeing to interconnect only with companies that would create exchange boundaries and further agree not to compete in AT&T exchanges. With almost the opposite of the result federal regulators had hoped for, the Kingsbury Commitment produced a country carved into AT&T monopoly telephone service areas.

From that time forward, federal regulators supported the new monopoly borders, cementing the arrangement with the Communications Act of 1934. State regulators liked the monopolies because they were easier to regulate – state regulation turned into rate-making procedures that raised rates on businesses to keep residential rates low. AT&T thrived in this environment because it was guaranteed a rate of return, regardless of performance.

The history of telephone service shows that the industry is not a natural monopoly. A natural monopoly is one where a single provider can produce lower rates than could be achieved by allowing competition. Competing networks forced telephone rates lower at the turn of the last century. After the establishment of the AT&T monopoly, we saw monopoly abuse through high long distance rates that didn’t drop until MCI challenged the status quo. Today we have a world full of multiple wires and networks, and the idea of a natural monopoly is no longer considered valid. Unfortunately, many of the vestiges of the regulations that protect the big telcos are still in place and still create hurdles to unfettered competition.

Shame on the Regulators

It’s clear that even before the turn of this century the big telcos largely walked away from maintaining and improving residential service. The evidence for this is the huge number of neighborhoods stuck with older copper technologies that haven’t been upgraded. The telcos made huge profits over the decades in these neighborhoods and ideally should not have been allowed to walk away from their customers.

In the Cities. Many neighborhoods in urban areas still have first or second-generation DSL over copper, with top speeds of 3 Mbps or 6 Mbps. That technology had a shelf life of perhaps seven years and is now at least fifteen years old.

The companies that deployed the most DSL are AT&T and CenturyLink (formerly Qwest). The DSL technology should have been upgraded over time by plowing profits back into the networks. This happened in some neighborhoods, but as detailed studies in cities like Cleveland and Dallas have shown, the faster DSL was brought to more affluent neighborhoods, leaving poorer neighborhoods, even today, with the oldest DSL technology.

The neighborhoods that saw upgrades got DSL speeds between 15 Mbps and 25 Mbps. Many of these neighborhoods eventually saw speeds as fast as 50 Mbps using a technology that bonded two 25 Mbps DSL circuits. There are numerous examples of neighborhoods with 50 Mbps DSL sitting next to ones with 3 Mbps DSL.

Verizon used a different tactic and upgraded neighborhoods to FiOS fiber. But this was also done selectively. Verizon doesn’t seem to have redlined as much as AT&T, but instead built FiOS only where the construction cost was lowest.

In Europe, the telcos decided to compete with the cable companies and have upgraded DSL over time, with the fastest DSL today offering speeds as fast as 300 Mbps. DSL vendors are talking about ways to goose DSL up to gigabit speeds (but only over short distances). The telcos here basically stopped looking at better DSL technology after the introduction of VDSL2, at least fifteen years ago.

By now the telcos should have been using profits to build fiber. AT&T has done some of this by building little pockets of fiber in every community near existing fiber splice points. However, the vast majority of rural households served by AT&T are not being offered fiber, and AT&T said recently that it has no plans to build more fiber. CenturyLink built fiber past nearly 1 million homes a few years ago, but that also seems like a dead venture going forward. By now, in 2019, each of these telcos should have been deep into bringing fiber to urban neighborhoods across their whole service areas. Had they done so, they would not be getting clobbered so badly by the cable companies, which are taking away millions of DSL customers every year.

Rural America. The big telcos started abandoning rural America as much as thirty years ago. They’ve stopped maintaining copper and have not voluntarily made any investments in rural America for a long time. There was a burst of rural construction recently when the FCC gave them $11 billion to improve rural broadband to 10/1 Mbps – but that doesn’t seem to be drawing many rural subscribers.

It’s always been a massive challenge to bring the same speeds to rural America that can be provided in urban America. This is particularly so with DSL since the speeds drop drastically with distance. DSL upgrades that could benefit urban neighborhoods don’t work well in farmland. But the telcos should have been expanding fiber deeper into the network over time to shorten loop lengths. Many independent telephone companies did this the right way and they were able over time to goose rural DSL speeds up to 25 Mbps.

The big telcos should have been engaging in a long-term plan to continually shorten rural copper loop lengths. That meant building fiber, and while shortening loop lengths they should have served households close to fiber routes with fiber. By now all of the small towns in rural America should have gotten fiber.

This is what regulated telcos are supposed to do. The big telcos made vast fortunes in serving residential customers for many decades. Regulated entities are supposed to roll profits back into improving the networks as technology improves – that’s the whole point of regulating the carrier of last resort.

Unfortunately, the industry got sidetracked by competition from CLECs. This competition first manifested in the market for large business customers. The big telcos used that competition to convince regulators they should be deregulated. Over time the cable companies provided real residential competition in cities, which led to the de facto total deregulation of the telcos.

In Europe, the telcos never stopped competing in cities because regulators didn’t let them quit. They have upgraded copper to speeds that customers still find attractive, though the telcos all admit that the next upgrade needs to be fiber. In the US, the big telcos exerted political pressure to gain deregulation at the first hint of competition. The US telcos folded and walked away from their customers rather than fighting to maintain revenues.

Rural America should never have been deregulated. Shame on every regulator in every state that voted to deregulate the big telcos in rural America. Shame on every regulator that allowed companies like Verizon to palm off their rural copper onto companies like Frontier – a company that cannot succeed, almost by definition.

In rural America the telcos have a physical network monopoly and the regulators should have found ways to support rural copper rather than letting the telcos walk away from it. We know this can be done by looking at the different approaches taken by the smaller independent telephone companies. These small companies took care of their copper and most have now taken the next step to upgrade to fiber to be ready for the next century.

Court Upholds Repeal of Net Neutrality

The DC Circuit Court of Appeals ruled on the last day of September that the FCC had the authority to kill Title II regulation and to repeal net neutrality. However, the ruling wasn’t entirely in the FCC’s favor. The agency was ordered to look again at how the repeal of Title II regulation affects public safety. In a more important ruling, the court said that the FCC didn’t have the authority to stop states and municipalities from establishing their own rules for net neutrality.

The court was ruling on the appeal of the FCC’s net neutrality order filed by Mozilla and joined by 22 states and a few other web companies like Reddit and Etsy. Those appeals centered on the FCC’s authority to kill Title II regulation and to hand broadband regulation to the Federal Trade Commission.

Net neutrality has been a roller coaster of an issue. Tom Wheeler’s FCC put the net neutrality rules in place in 2015. An appeal of that case got a court ruling that the FCC was within its power to implement net neutrality. After a change in administration, the Ajit Pai FCC killed net neutrality in 2017 by also killing Title II regulation. Now the courts have said that the FCC also has the authority to not regulate net neutrality.

The latest court order will set off another round of fighting over net neutrality. The FCC had quashed a California law that introduced the state’s own version of net neutrality, and this order effectively allows those California rules to go into effect. That battle is far from over, and there will likely be new appeals against the California rules and similar rules enacted in Washington. It wouldn’t be surprising to see other states enact rules in the coming year, since the net neutrality issue is overwhelmingly popular with voters. It’s possibly the worst of all worlds for the big ISPs if they have to follow different net neutrality rules in different states. I think they’d much prefer federal net neutrality rules to different rules in a dozen states.

The reversal of the net neutrality rules only went into effect in June of 2018, and there have been no major violations of the old rules since then. The ISPs were likely waiting for the results of this court ruling and are also wary of a political and regulatory backlash if they start breaking the net neutrality principles. The closest thing we had to a big issue was mentioned in this ruling: Verizon had throttled broadband for firemen in California who were working on wildfires after the firemen exceeded their monthly data caps. It turns out that wasn’t a net neutrality violation, but rather an enforcement issue on a corporate cellular account. But the press on that case was bad enough to prompt the court to require the FCC to take another look at how ISPs treat public safety.

This issue is also far from over politically. Most of the Democratic presidential candidates have come out in favor of net neutrality, and if Democrats win the White House you can expect a pro-net neutrality chairman at the FCC. Chairman Pai believes that by killing Title II regulation he has made it harder for a future FCC to put the rules back in place. But the two court appeals have shown that the courts largely believe the FCC has the authority to implement or not implement net neutrality as it sees fit.

While net neutrality is getting all of the press, the larger issue is that the FCC has washed its hands of broadband regulation. The US is the only major economy in the world that doesn’t regulate its broadband industry. This makes little sense in a country where a large part of the market is still controlled by the cable/telco duopoly, which many argue is quickly becoming a cable monopoly. It’s easy to foresee bad behavior from the big ISPs if they aren’t regulated. We’ve seen the big ISPs increase broadband rates in the last few years, and there is no regulatory authority in the country that can apply any brakes to the industry. The big ISPs are also likely to demand more money out of Google, Facebook, and the other big web companies.

The FCC handed off the authority to regulate broadband to the Federal Trade Commission. That means practically no regulation because the FTC tackles a single corporation for bad behavior but does not establish permanent rules that apply to other similar businesses. The FTC might slam AT&T or Comcast from time to time, but that’s not likely to change the behavior of the rest of the industry very much.

There is only one clear path for dealing with net neutrality. Congress can stop future FCC actions and the ensuing lawsuits by passing a clear set of laws that either implements net neutrality or forbids it. However, until there is a Congress and a White House willing to work together to implement such a law, this issue is going to continue to bounce around.

The big ISPs and Chairman Pai argued that net neutrality was holding back broadband investment in the country – a claim that has no basis when you look at the numbers. However, there is definitely an impact on the industry from regulatory uncertainty, and nobody benefits from an environment where subsequent administrations alternately pass and repeal net neutrality. We need to resolve this one way or the other.