Data Caps Again?

My prediction is that we are going to see more stringent data caps in our future. Some of the bigger ISPs have data caps today, but for the most part the caps are not onerous. But I foresee data caps being reintroduced as another way for big ISPs to improve revenues.

You might recall that Comcast tried to introduce a monthly 300 GB data cap in 2015. When customers hit that mark Comcast was going to charge $10 for every additional 50 GB of download, or $30 extra for unlimited downloading.

There was a lot of public outcry about those data caps, and Comcast backed down from the plan due to pressure from the Tom Wheeler FCC. At the time the FCC probably didn’t have the authority to force Comcast to kill the data caps, but the nature of regulation is that big companies don’t go out of their way to antagonize regulators who can cause them trouble in other areas.

To put that Comcast data cap into perspective, in September of 2017 Cisco predicted that home downloading of video would increase 31% per year through 2021. They estimated that the average household data download in 2017 was already around 130 GB per month. You might think that means most people wouldn’t need to worry about the data caps. But it’s easy to underestimate the impact of compound growth, and at a 31% growth rate the average household download of 130 GB would grow to 383 GB by 2021 – considerably over Comcast’s proposed data cap.
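That compound growth is easy to verify with a few lines of arithmetic. Here’s a quick sketch using Cisco’s 130 GB starting point and 31% growth rate (the four-year horizon is just my reading of the 2017–2021 window):

```python
# Project average household usage under Cisco's forecast: 130 GB/month in 2017,
# growing 31% per year through 2021.
usage_gb = 130.0
for year in range(2017, 2022):
    print(f"{year}: {usage_gb:.0f} GB/month")
    usage_gb *= 1.31
```

Run that and 2021 comes out at 383 GB per month – well past a 300 GB cap.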

Even now there are a lot of households that would be over that cap. It’s likely that most cord cutters use more than 300 GB per month – and it can be argued that Comcast’s data cap would punish those who drop their video. My daughter is off to college now and our usage has dropped, but when she was a senior we got a report from Comcast showing we used over 600 GB per month.

So what are the data caps for the largest ISPs today?

  • Charter, Altice, Verizon and Frontier have no data caps.
  • Comcast moved their data cap to 1 terabyte, with $10 for each extra 50 GB or $50 monthly for unlimited download.
  • AT&T has perhaps the stingiest data caps. The cap on DSL is 150 GB, on U-verse it’s 250 GB, on 300 Mbps FTTH it’s 1 TB, and gigabit service is unlimited. They charge $10 per extra 50 GB.
  • CenturyLink has a 1 TB cap on DSL and no cap on fiber.
  • Cox has a 1 TB cap with $30 for an extra 500 GB or $50 unlimited.
  • Cable One has no overage charge but largely forces customers who go over the caps to upgrade to more expensive data plans. Their caps are stingy – the cap on a 15 Mbps DSL connection is 50 GB.
  • Mediacom has perhaps the most expensive data caps – the cap on 60 Mbps service is 150 GB and on 100 Mbps it’s 1 TB. The charge for violating the cap is $10 per GB, or $50 for unlimited.

Other than those from AT&T, Mediacom and Cable One, none of the caps sound too restrictive.
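To make the differences concrete, here’s a rough sketch of what a household using 600 GB per month (roughly what my own household used) would pay under a few of the fee structures listed above. The helper function, and the assumption that AT&T offers no unlimited buy-out, are my own simplifications:

```python
import math

def overage_charge(usage_gb, cap_gb, block_fee, block_gb, unlimited_fee):
    """Monthly overage cost: per-block fees, capped by the unlimited option if offered."""
    if usage_gb <= cap_gb:
        return 0
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)
    return min(blocks * block_fee, unlimited_fee)

usage = 600  # GB/month – roughly my household's usage when my daughter was home

# Fee structures as listed above; AT&T's lack of an unlimited option is my assumption
print("AT&T DSL:     $", overage_charge(usage, 150, 10, 50, float("inf")))
print("Mediacom 60M: $", overage_charge(usage, 150, 10, 1, 50))
print("Comcast:      $", overage_charge(usage, 1000, 10, 50, 50))
```

A household like mine would pay nothing extra at Comcast, $50 for unlimited at Mediacom, and $90 per month on capped AT&T DSL.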

Why do I think we’ll see data caps again? All of the ISPs are looking forward just a few years and wondering where they will find the revenues to meet Wall Street’s demand for ever-increasing earnings. The biggest cable companies are still growing broadband customers, mostly by taking customers from DSL. But they understand that the US broadband market is approaching saturation – much as has happened with cellphones. Once every home that wants broadband has it, these companies are in trouble, because bottom-line growth for the last decade has been fueled by the growth of broadband customers and revenues.

A few big ISPs are hoping for new revenues from other sources. For instance, Comcast has already launched a cellular product and is also seeing good success with security and smart home services. But even they will be impacted when broadband sales inevitably stall – other ISPs will feel the pinch before Comcast.

ISPs only have a few ways to make more money once customer growth has stalled, with the primary one being higher rates. We saw some modest increases earlier this year in broadband rates – something that was noticeable because rates have been the same for many years. I fully expect we’ll start seeing sizable annual increases in broadband rates – which go straight to the bottom line for ISPs. The impact from broadband rate increases is major for these companies – Comcast and Charter, for example, make an extra $250 million per year from a $1 increase in broadband rates.
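The arithmetic behind that number is simple – a $1 monthly increase multiplied across a subscriber base in the low twenty millions. The subscriber count below is an illustrative round number, not a reported figure:

```python
# A $1/month broadband rate increase across a big cable company's subscriber base.
# The subscriber count is an illustrative round number, not a reported figure.
subscribers = 21_000_000
increase_per_month = 1.00  # dollars

annual_revenue = subscribers * increase_per_month * 12
print(f"${annual_revenue:,.0f} per year")  # on the order of a quarter-billion dollars
```

And unlike revenue from new products, nearly all of that drops straight to the bottom line.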

Imposing stricter data caps can be as good as a rate increase for an ISP. They can justify it by saying that they are charging more only to those who use the network the most. As we see earnings pressure on these companies, I can’t see them passing up such an easy way to increase earnings. In most markets the big cable companies are a near monopoly, and consumers who need decent speeds have fewer alternatives with each passing year. And since the FCC has now walked away from broadband regulation, there will be no regulatory hindrance to the return of stricter data caps.

Is AT&T Violating Net Neutrality?

I got a text on my AT&T cellphone last month that told me that my wireless plan now includes sponsored data. Specifically they told me that I could now stream movies and other content from DirecTV or U-Verse TV without the video counting against my monthly data cap. This has been available to AT&T post-paid customers for a while, but now is apparently available to all customers. What I found most interesting about the message was that it coincided with the official end of net neutrality.

AT&T is not the first cellular company to do this. Verizon tried this a few years ago, although that attempt was largely unsuccessful because they didn’t offer much content that people wanted to watch. T-Mobile does something similar with their Binge On program, but since most of their data plans are unlimited, customers can watch anything on their phones, not just the Binge On video.

The sponsored data from AT&T would be a direct violation of net neutrality if it were still in effect, and is a textbook example of paid prioritization. By excusing the DirecTV content from cellular data caps they have created an advantage for DirecTV over its competitors. It doesn’t really matter that AT&T also happens to own DirecTV, and I imagine that AT&T is now shopping this same idea around to other video providers.

So what is wrong with what AT&T is doing? Certainly their many customers that buy both AT&T cellphones and DirecTV will like the plan. Cellular data in the US is still some of the most expensive data in the world and letting customers watch unlimited video from a sponsored video provider is a huge benefit to customers. Most people are careful to not go over monthly data limits, and that means they carefully curtail watching video on cellphones. But customers taking advantage of sponsored video are going to watch video that would likely have exceeded their monthly data cap – it doesn’t take more than a handful of movies to do that.

AT&T has huge market power, with almost 140 million cellphone users on their network at the end of last year. Any video provider they sponsor is going to gain a significant advantage over other video providers. AT&T customers who like watching video on their cellphones are likely to pick DirecTV over Comcast or any other video provider.

It’s also going to be extremely tempting for AT&T to give prioritized routing to DirecTV video – which means implementing an Internet fast lane. AT&T is going to want their cellular customers to have a quality experience, and they can do that by making sure that DirecTV video has the best connections throughout their network. They don’t necessarily have to throttle other video to make DirecTV better – they can just make sure that DirecTV video gets the best possible routing.

I know to many people the AT&T plan is going to feel somewhat harmless. After all, they are bundling together their own cellular and video products. But it’s a short step from here for AT&T to start giving priority to content from others who are willing to pay for it. It’s not too hard to imagine them offering the same plan to Netflix, YouTube or Facebook.

If this plan expands beyond AT&T’s own video, we’ll start seeing the negative impacts of paid prioritization:

  • Only the biggest companies like Netflix, Facebook or Google can afford to pay AT&T for this kind of priority. This is going to shut out smaller video providers and start-ups. Already in the short history of the web we’ve seen a big turnover in popular platforms – gone or greatly diminished are earlier platforms like AOL, CompuServe and Prodigy. But with the boost given by paid prioritization, the big companies of today will get a step up toward remaining the predominant players on the web. Innovation is going to be severely hampered.
  • This is also the beginning of a curated web where many people only see the world through the filter of the predominant web services. We already see that phenomenon a lot today, but when people are funneled to only using the big web services this will grow and magnify.
  • It’s not hard to imagine the next step where we see reduced price data plans that are ‘sponsored’ by somebody like Facebook. Such platforms will likely make it a challenge for customers to step outside their platform. And that will lead to a segmentation and slow death of the web as we know it.

Interestingly, the Tom Wheeler FCC told AT&T that this practice was unacceptable. But through the change of administration AT&T never stopped the practice and is now expanding it. It’s likely that courts are going to stay some or all of the net neutrality order until the various lawsuits on the issue get resolved. But AT&T clearly feels emboldened to move forward with this, probably since they know the current FCC won’t address the issue even if net neutrality stays in effect.

Progress of the CAF II Program

If readers recall, the CAF II program is providing funds to the largest telcos to upgrade rural facilities in their incumbent operating territories to broadband speeds of at least 10 Mbps down and 1 Mbps up. The CAF II deployment began in the fall of 2015 and lasts for 6 years, so we are now almost 2.5 years into the deployment period. I was curious about how the bigger telcos are doing in meeting their CAF II build-out requirements. The FCC hasn’t published any progress reports on CAF II deployments, so I found the following from web searches:

AT&T. The company took $427 million annually for the six years ($2.56 billion) to bring broadband to 2.2 million rural customers. The company has said they are going to use a combination of improved DSL and fixed wireless broadband using their cellular frequencies to meet their build-out requirements. From their various press releases it seems like they are planning on more wireless than wireline connections (and in many rural places they have plans to tear down the copper).

The only big public announcement of a wireless build-out from AT&T is a test in Georgia initiated last year. On their website the company says their goal for the end of 2018 is to offer improved broadband to 440,000 homes, which would mean 17% CAF II coverage at just over the midpoint of their 6-year build-out commitment.

On a side note, AT&T had also promised the FCC, as a condition of the DirecTV merger, that they would pass 12.5 million homes and businesses with fiber by mid-2019. They reported reaching only 4 million by the end of 2017.

CenturyLink. CenturyLink accepted $500 million annually ($3 billion) in CAF II funding to reach 1.2 million rural homes. In case you’re wondering why CenturyLink is covering only about half as many homes as AT&T for roughly the same funding – the CAF II funding varies by Census block according to density. The CenturyLink coverage area is obviously less densely populated than the areas being covered by AT&T.

FierceTelecom reported in January that CenturyLink had upgraded 600,000 CAF II homes by the end of last year, or 37% of their CAF II commitment. The company says their goal is to reach 60% coverage by the end of this year. CenturyLink is primarily upgrading rural DSL, although they’ve said they are considering point-to-multipoint wireless for the most rural parts of the coverage areas. The company reports that 70% of the homes passed in the upgrades so far can get 20 Mbps download or faster.

Frontier. The last major recipient of CAF II funding is Frontier. The company originally accepted $283 million per year to upgrade 650,000 passings. They subsequently acquired some Verizon properties that had accepted $49 million per year to upgrade 37,000 passings. That’s just under $2 billion in total funding.

FierceTelecom reported in January that Frontier had reached 45% of the CAF II area with broadband speeds of at least 10/1 Mbps by the end of 2017. The company also notes that in making the upgrades for rural customers they’ve also upgraded broadband in the towns near the CAF II areas, increasing the broadband speeds of over 900,000 passings nationwide.

Frontier is also largely upgrading DSL, although they are also considering point-to-multipoint wireless for the more rural customers.

Other telcos also took major CAF II funding, but I couldn’t find any reliable progress reports on their deployments. This includes Windstream ($175 million per year), Verizon ($83 million per year), Consolidated ($51 million per year), and Hawaiian Telcom ($26 million per year).

The upcoming reverse auction this summer will provide up to another $2 billion in funding to reach nearly 1 million additional rural homes. In many cases these are the most remote customers, often found in the same areas where the CAF II upgrades are being made. It will be interesting to see if the same telcos will take the funding to finish the upgrades. There is a lot of speculation that the cellular carriers will pursue a lot of the reverse auction funding.

But the real question to be asked for these properties is what comes next. The CAF II funding lasts until 2021. The speeds being deployed with these upgrades are already significantly lower than the speeds available in urban America. A household today with a 10 Mbps download speed cannot use broadband in the ways that are enjoyed by urban homes. My guess is that there will be continued political pressure to continue to upgrade rural speeds and that we haven’t seen the end of the use of the Universal Service Fund to upgrade rural broadband.

States and Net Neutrality

We now know how states are going to react to the end of net neutrality. There are several different responses so far. First, a coalition of 23 states filed a lawsuit challenging the FCC’s ability to eliminate net neutrality and Title II regulation of broadband. The lawsuit is mostly driven by blue states, but there are red states included like Mississippi and Kentucky.

The lawsuit argues that the FCC has exceeded its authority in eliminating net neutrality. The lawsuit makes several claims:

  • The suit claims that under the Administrative Procedure Act (APA) the FCC can’t make “arbitrary and capricious” changes to existing policies. The FCC defended net neutrality for over a decade, and the claim is that the FCC’s ruling fails to provide enough justification for abandoning the existing policy.
  • The suit claims that the FCC ignored the significant public record filed in the case that details the potential harm to consumers from ending net neutrality.
  • The suit claims that the FCC exceeded its authority by reclassifying broadband service as a Title I information service rather than as a Title II telecommunications service.
  • Finally, the suit claims that the FCC ruling improperly preempts state and local laws.

As with past challenges of major FCC rulings, one would expect this suit to go through at least several levels of courts, perhaps even to the Supreme Court. It’s likely that the loser of the first ruling will appeal, and this process is likely to take a year or longer. Generally, the first court to hear the case will quickly determine if some or all of the FCC’s net neutrality order will be stayed until resolution of the lawsuit.

I lamented in a recent blog how partisan this and other FCCs have gotten. It would be a positive thing for FCC regulation in general if the courts put some check on the ability of the FCC to create new policy without considering existing policies and the public record about the harm that can be caused by a shift in policy. Otherwise we face having this and future FCCs changing the rules every time we get a new administration – and that’s not healthy for the industry.

A second tactic being used by states is to implement a state law that effectively implements net neutrality at the state level. The states of New York, New Jersey and Montana have passed laws that basically mimic the old FCC net neutrality rules at the state level. It’s an interesting tactic and will trigger a lawsuit over states’ rights if challenged (and I have to imagine that somebody will challenge these laws). I’ve read a few lawyers who opine that this tactic has some legs, since the FCC largely walked away from regulating broadband and in doing so might have accidentally opened the door for the states to regulate the issue. If these laws hold up, we’d have a hodgepodge of net neutrality rules that vary by state – something that benefits nobody.

Another tactic being taken is for states, and even a few cities, to pass laws that change their purchasing rules so that any carrier that bids for government telecom business must adhere to net neutrality. This is an interesting tactic, and I haven’t seen anybody who thinks it isn’t allowed. Governments have wide latitude in deciding the rules for purchasing goods and services, and states already put many similar restrictions onto purchasing. The only problem with this tactic comes if eventually all of the major carriers violate the old net neutrality rules. That could leave a state with few or no good choices of telecom providers.

As usual, California is taking a slightly different tactic. They want to require that carriers must adhere to net neutrality if they use state-owned telecom facilities or facilities that were funded by the state. Over the years California has built fiber of its own and also given out grants for carriers to build broadband networks. This includes a recently announced grant program that is likely to go largely to Frontier and CenturyLink. If this law is upheld it could cause major problems for carriers that have taken state money in the past.

It’s likely that there are going to be numerous lawsuits challenging different aspects of the various attempts by states to protect net neutrality. And there are likely to also be new tactics tried by states during the coming year to further muddy the picture. It’s not unusual for the courts to finally decide the legitimacy of major FCC decisions. But there are so many different tactics being used here that we are likely to get conflicting rulings from different courts. It’s clearly going to take some time for this to all settle out.

One interesting aspect of all of this is how the FCC will react if their cancellation of net neutrality is put on hold by the courts. If that happens it means that some or all of net neutrality will still be the law of the land. The FCC always has the option to enforce or not enforce the rules, so you’d suspect that they wouldn’t do much about ISPs that violate the spirit of the rules. But more importantly, the FCC is walking away from regulating broadband as part of killing Title II regulation. They are actively shifting some regulatory authority to the FTC for issues like privacy. It seems to me that this shift couldn’t be completed until the end of the various lawsuits. I think the one thing we can count on is that this is going to be a messy regulatory year for broadband.

Broadband Regulation in Limbo

The ruling earlier this week by the US Court of Appeals for the 9th Circuit highlights the current weak state of regulation over broadband. The case is one that’s been around for years and stems from AT&T’s attempt to drive customers off of their original unlimited cellphone data plans. AT&T began throttling unlimited customers when they reached some unpublished threshold of data use, in some cases as small as 2 GB in a month. AT&T then lied to the FCC about the practice when the agency inquired. This ruling allows the FTC suit against AT&T to continue.

The ruling demonstrates that the FTC has some limited jurisdiction over common carriers like AT&T. The clincher, however, is that the court ruled the FTC only has jurisdiction over issues where carriers aren’t engaging in common-carrier services. This particular case involves AT&T not delivering a product they promised to customers and thus falls under FTC jurisdiction. But the court made it clear that future cases involving direct common carrier functions, such as abuse of net neutrality, would not fall under the FTC.

This case clarifies the FTC’s limited jurisdiction over ISPs and contradicts the FCC’s statements that the FTC is going to be able to step in and take its place on most matters involving broadband. The court has made it clear that is not the case. FCC Chairman Ajit Pai praised this court ruling and cited it as a good example of how the transition of jurisdiction to the FTC is going to work as promised. But in looking at the details of the ruling, that is not true.

This court ruling makes it clear that there is no regulatory body now in charge of direct common carrier issues. For instance, if Netflix and one of the ISPs get into a big fight about paid prioritization there would be nowhere for Netflix to turn. The FCC would refuse to hear the case. The FTC wouldn’t be able to take the case since it involves a common carrier issue. And while a court might take the case, it would have no basis on which to make a ruling. As long as the ISP didn’t break any other kind of law, such as reneging on a contract, a court would have no legal basis on which to rule for or against the ISP’s behavior.

That means not only that broadband is now unregulated, but also that there is nowhere for anybody to complain about abuse by ISPs until that abuse violates some existing law. That is the purest definition of limbo that I can think of for the industry.

To make matters worse, even this jumbled state of regulation is likely to be muddled further soon by the courts involved in the various net neutrality suits. Numerous states have sued the FCC for various reasons, and if past practice holds, the courts are liable to put some or all of the FCC’s net neutrality decision on hold.

It’s hard to fathom what that might mean. For example, if the courts were to put the FCC’s decision to cancel Title II regulation on hold, then that would mean that Title II regulation would still be the law of the land until the net neutrality lawsuits are finally settled. But this FCC has made it clear that they don’t want to regulate broadband and they would likely ignore such a ruling in practice. The Commission has always had the authority to pick and choose cases it will accept and I’m picturing that they would refuse to accept cases that relied on their Title II regulation authority.

That would be even muddier for the industry than today’s situation. Back to the Netflix example, if Title II regulation was back in effect and yet the FCC refused to pursue a complaint from Netflix, then Netflix would likely be precluded from trying to take the issue to court. The Netflix complaint would just sit unanswered at the FCC, giving Netflix no possible remedy, or even a hearing about their issues.

The real issue that is gumming up broadband regulation is not the end of Title II regulation. Title II regulation only went into effect in 2015, and the FCCs before that had no problem tackling broadband issues. The real problem is that this FCC is washing its hands of broadband regulation and has supposedly tossed that authority to the FTC – something the court just made clear can’t work in the majority of cases.

This FCC has shown that there is a flaw in their mandate from Congress in that they feel they are not obligated to regulate broadband. So I guess the only fix will be if Congress makes the FCC’s jurisdiction, or lack of jurisdiction clear. Otherwise, we couldn’t even trust a future FCC to reverse course, because it’s now clear that the decision to regulate or not regulate broadband is up to the FCC and nobody else. The absolute worst long-term outcome would be future FCCs regulating or not regulating depending upon changes in the administration.

My guess is that AT&T and the other big ISPs are going to eventually come to regret where they have pushed this FCC. There are going to be future disputes between carriers, and the ISPs are going to find that the FCC cannot help them, just like it can’t help anybody complaining against them. That’s a void that is going to serve this industry poorly.

Another FCC Giveaway

The FCC just voted to implement a plan to give up to $4.53 billion to the cellular carriers over the next ten years to bring LTE cellular voice and data to the most remote parts of America. While this sounds like a laudable goal, this FCC seems determined to hand out huge sums of money to the biggest telecom companies in the country. This program is labeled Mobility Fund II and will be awarded through an auction among the cellular companies.

As somebody who travels frequently in rural America there certainly are still a lot of places with poor or no cellphone coverage. My guess is that the number of people that have poor cellphone coverage is greater than what the FCC is claiming. This fund is aimed at providing coverage to 1.4 million people with no LTE cellphone coverage and another 1.7 million people where the LTE coverage is subsidized.

Recall that the FCC’s knowledge of cellphone coverage comes from the cellphone companies who claim better coverage than actually exists. Cellphone coverage is similar to DSL where the quality of signal to a given customer depends upon distance from a cellphone tower. Rural America has homes around almost every tower that have crappy coverage and that are probably not counted in these figures.

My main issue with the program is not the goal – in today’s world we need cellphone coverage extended even to the most remote places in the country. My problem is that the perceived solution is to hand yet more billions to the cellular carriers – money that could instead fund real broadband in rural America. Additionally, the ten-year implementation period is far too long. That’s an eternity to wait for an area with no cellular coverage.

I think the FCC had a number of options other than shelling out billions to the cellular companies:

  • The FCC could require the cellular companies to build these areas out of their own pockets as a result of having taken the licensed spectrum. Other than Sprint, these companies are extremely profitable right now and just got a lot more profitable because of the recent corporate tax-rate reductions. The FCC has always had build-out requirements for spectrum and the FCC could make it mandatory to build the rural areas as a condition for retaining the spectrum licenses in the lucrative urban areas.
  • The FCC could instead give unused spectrum to somebody else who is willing to use it. The truth is that the vast majority of licensed spectrum sits unused in rural America. There is no reason that spectrum can’t come with a use-it-or-lose-it provision so that unused spectrum reverts back to the FCC to give to somebody else. There are great existing wireless technologies that work best with licensed spectrum, and it’s aggravating to see the spectrum sit unused but still unavailable to those who might use it.
  • Finally, the FCC could force the cellular carriers to use towers built by somebody else. I work with a number of rural counties that would gladly build towers and the necessary fiber to provide better cellphone coverage. It would cost the cellular carriers nothing more than the cell site electronics if others were to build the needed core infrastructure.

This idea of handing billions to the big telecom companies is a relatively new one, and it’s not just this FCC that favors the big companies. Obviously the lobbyists of the big companies have gained influence at the FCC. Originally the CAF II program was going to be open to everybody using reverse-auction rules. But before the program was implemented, the Tom Wheeler FCC decided to instead just give the money to the big telcos if they wanted it. The telcos even got to pick and choose, rejecting funding for the most remote places – funding that will now be auctioned this summer.

That same CAF II funding could have been used to build a lot of rural fiber or other technologies that would have provided robust broadband networks. But instead the telcos got off the hook by having to only upgrade to 10/1 Mbps – a speed that was already obsolete at the time of the FCC order.

Now we have yet another federal program that is going to shovel more billions of dollars to big companies to provide broadband that will supposedly meet a 10/1 Mbps speed. But as with CAF II, the carriers will get to report the results of the program to the FCC, and I have no doubt that they will claim success even if coverage remains poor. Honestly, there are days as an advocate for rural broadband when you just want to bang your head against a wall – it’s hard to see billions and billions wasted that could have brought real broadband to numerous rural communities.

Setting the FCC Definition of Broadband

In the recently released 2018 Broadband Progress Report the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record for either eliminating the standard altogether or else reverting back to the older definition of 10/1 Mbps.

I’m guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted cellular data to count the same for a household as landline broadband – and that desire was a big factor behind the push to lower the definition, since cellphones rarely meet the 25/3 speed standard.

The deliberation on the topic this year raises the question if there is some way to create a rule that would better define the speed of needed broadband. It’s worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to want to use broadband. In doing so, they added together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC’s method was too simple and assumed that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four, which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that’s not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP and your home router try to receive and untangle multiple simultaneous streams, packets collide, get lost and have to be retransmitted. This is described as adding ‘overhead’ to the transmission process. Depending on the nature of the data streams, the amount of collision overhead can be significant.

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to the various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I’ve covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your home – it also slows down when it pauses to recognize demands for connection from your neighbors’ WiFi networks.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to download 25 Mbps of usage from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.
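To make the point concrete, here is a rough sketch of the gap between the FCC-style additive model and one that accounts for overhead. The per-task speeds and the overhead percentages are illustrative assumptions on my part, not FCC or measured figures:

```python
# Sketch: additive bandwidth model vs. one that accounts for overhead.
# The per-task speeds and overhead factors below are illustrative
# assumptions, not FCC or measured values.

streams_mbps = {
    "netflix_hd_1": 5.0,
    "netflix_hd_2": 5.0,
    "netflix_hd_3": 5.0,
    "video_call": 2.5,
    "web_browsing": 2.0,
    "cellphone_offload": 1.5,
}

# The FCC-style approach: just add the tasks together.
additive_total = sum(streams_mbps.values())

# Assumed overheads: packet collisions and retransmissions grow with the
# number of simultaneous streams, and the WiFi network takes its own cut.
collision_overhead = 0.05 * (len(streams_mbps) - 1)  # 5% per extra stream
wifi_overhead = 0.20                                  # flat 20% WiFi penalty

effective_need = additive_total * (1 + collision_overhead) * (1 + wifi_overhead)

print(f"Additive total: {additive_total:.1f} Mbps")
print(f"Effective need: {effective_need:.1f} Mbps")
```

Under these assumed overheads the same household that sums to about 21 Mbps actually needs over 30 Mbps of capacity – which is the heart of the argument that a simple additive sum understates real demand.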

The FCC’s definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it’s available. That means a stream of 15-20 Mbps download. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

The FCC’s speed definition finally needs to consider the busy hour of the day – the time when a household uses the most broadband. That’s the broadband speed that the home needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that measures actual download speeds at a number of homes over time to understand what homes are really using for bandwidth. There are ways this could be done. For example, the FCC could do something for broadband similar to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage, such as Akamai, to sample a large number of US homes. Volunteer homes meeting specific demographics could agree to have their bandwidth usage monitored. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC’s definition of broadband. Rather than changing the official speed periodically, the FCC could update the definition as dictated by the real-world data.
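One way such a program could turn sampled data into a definition is to set the standard at a high percentile of observed busy-hour demand, so that the definition covers most households. This is only a sketch of the idea – the busy-hour measurements below are made-up sample numbers, and the 80th-percentile rule is my own assumption, not anything the FCC has proposed:

```python
# Sketch: deriving a broadband definition from sampled busy-hour usage.
# The measurements are invented sample data; a real program would collect
# them from monitored volunteer homes.
import statistics

busy_hour_mbps = [8, 12, 15, 18, 22, 25, 27, 30, 34, 41, 48, 55]

# One plausible rule: set the definition at the 80th percentile of
# observed busy-hour demand, so most sampled households fall under it.
deciles = statistics.quantiles(busy_hour_mbps, n=10)
definition = deciles[7]  # the 8th decile boundary = 80th percentile

print(f"Suggested broadband definition: {definition:.1f} Mbps")
```

The appeal of an approach like this is that the definition would move automatically as the sampled data moves, rather than waiting on a periodic FCC proceeding.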

The FCC does some spot checking today of the broadband speeds reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn’t provide the same kind of feedback that a formal ongoing measurement program would. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors used in setting the official definition of broadband, and so perhaps the FCC doesn’t want real facts to get in the way.

Partisanship and the FCC

The current FCC has a clear bias towards the big cable companies, telcos and cellular companies. There is nothing particularly wrong with that since this FCC represents an administration that is also big-business oriented. Past FCCs have clearly favored policies that reflected the administration in charge. For instance, the prior FCC under Tom Wheeler was pro-consumer in many ways and pushed things like net neutrality and privacy – issues that had the support of the administration but not of the giant ISPs.

However, in the last few decades the FCC has gotten a lot more partisan. It’s becoming rare to see a policy vote that doesn’t follow party lines. This isn’t true of everything, and we see unanimous Commissioner support for things like providing more spectrum for broadband. But FCC voting on any topic that has political overtones now seems to follow party lines.

The most recent example of this increased partisanship came with the release of this year’s 2018 Broadband Deployment Report to Congress. In that report Chairman Pai decided to take the stance that the state of broadband in the country is fine and needs no FCC intervention. The FCC is required to assess the state of broadband annually and report the statistics and its conclusions to Congress. More importantly, Section 706 of the Telecommunications Act of 1996 requires the FCC to take proactive steps to close any perceived gaps in broadband coverage.

In order to declare that the state of broadband in the country doesn’t require any further FCC action, Chairman Pai needed to come up with a narrative to support his conclusion. The argument he’s chosen to defend his position is a bit startling because by definition it can’t be true.

The new broadband report, released on February 9, concludes that “advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion. . . This finding does not mean that all Americans now have broadband access. Rather, it means that we are back on the right track when it comes to deployment.” The kicker comes when the report says that the FCC’s data from ISPs “does not yet reflect the beneficial effects of the Commission’s actions in 2017,” such as “ending the adverse impact on investment caused by the [2015 Net Neutrality] Order. . . For instance, several companies, including AT&T, Verizon, Frontier, and Alaska Communications either commenced or announced new deployments in 2017.”

In effect, the FCC now says that broadband deployment is back on track due to its December 2017 net neutrality repeal. But the ‘facts’ it cites don’t support that argument. First, any broadband improvements made by the cited telcos in 2017 would have been authorized and started in 2016, before this new FCC was even in place. Further, a big percentage of the recent broadband deployments by these particular telcos are due to FCC decisions made prior to the Pai FCC. For example, AT&T was required, as a condition of its purchase of DirecTV, to pass 12 million new residences and businesses with fiber. A lot of the broadband spending by AT&T, Frontier and Alaska Communications uses CAF II funds given to them by the Wheeler FCC, which the companies are required to spend. None of those expenditures have anything to do with the repeal of net neutrality. And since net neutrality was only reversed a few months ago, it’s impossible to believe that any of the broadband spending in 2017 was due to this FCC. It’s far too early to see if that order will have any impact on rural broadband expenditures (something no industry expert expects).

This FCC Broadband Report concludes that deployment of broadband in the country is on track and reasonable. Yet the numbers in the report show that there are still 19 million Americans in rural America without access to adequate broadband. There are 12 million school-aged children who are suffering from the homework gap because they don’t have broadband at home.

By declaring that broadband deployment is adequate, Chairman Pai has let his FCC off the hook for having to take any actions to address the issue. But his stated reasons are based upon an argument that is clearly not supported by any facts. This would seem to put the Chairman in violation of his Section 706 obligations, although that’s probably something only Congress can determine.

I’m saddened to see the FCC become so partisan. This is not a new phenomenon and we saw partisan voting under the last several FCCs. Before that we had pro-big business FCCs such as the one under Chairman Michael Powell. But that FCC was far less partisan and still basically used the facts at its disposal in making decisions and setting policy.

The FCC has a mandate to balance what’s good for both the public and the telecom companies. In an ideal world the FCC would be a neutral arbiter that listens to the facts and sides with the best arguments. The trend towards a partisan FCC is bad for the industry because it means that major policies will flip-flop when administrations change – and that’s not good for ISPs or the public. And partisanship does not excuse this FCC for abrogating its responsibility and making specious arguments unsupported by facts. This FCC has taken partisanship too far.

Regulating From Broadband Maps

One of the more bizarre things we do in the US is regulate broadband based upon broadband maps. There are numerous federal grant and subsidy programs that rely upon these maps (and the underlying databases that support them) as well as various state programs. The FCC also uses this same data when reporting broadband penetration in the country to Congress each year, as just occurred on February 9.

The maps are intended to show how many households can purchase broadband of various speeds. Currently the arbitrary speed thresholds tested are download speeds of 10 Mbps, 25 Mbps and 100 Mbps. These speeds are measured due to past decisions by the FCC. For example, the FCC chose a 10/1 Mbps speed goal for any company that accepted CAF II money to upgrade rural broadband. The FCC’s current definition of broadband is still set at 25/3 Mbps.

Anybody that understands broadband networks knows that much of the data included in the databases and the mapping is incorrect, and sometimes pure fantasy. That makes sense when you understand that the speeds in this mapping process are all self-reported by ISPs.

There are numerous reasons why the speeds in these databases are not an accurate reflection of the real world:

  • There are still ISPs that report advertised speeds rather than actual speeds received by customers.
  • Any speeds represented for a whole DSL network are inaccurate by definition. DSL speeds vary according to the size of the copper wires, the condition of the copper cable and the distance from the source of the DSL broadband signal. That means that in a DSL network the speeds available to customers vary street by street, and even house by house. We’ve always known that DSL reported in the mapping databases is overstated and that most telcos that report DSL speeds report theoretical speeds. I’m not sure I blame them, but the idea of any one speed being used to represent the performance of a DSL network is ludicrous.
  • The speeds in the database don’t recognize network congestion. There are still many broadband networks around that bog down under heavy usage, which means evenings in a residential neighborhood. Nobody wants to be told that their network is performing at 10 Mbps if the best speed they can ever get when they want to use it is a fraction of that.
  • The speeds don’t reflect that ISPs give some customers faster speeds. In networks where bandwidth is shared among all users on a neighborhood node, if a few customers are sold a faster-than-normal speed, then everybody else will suffer corresponding slower speeds. Network owners are able to force extra speed to customers that pay a premium for the service, but to the detriment of everybody else.
  • The maps don’t reflect the way networks were built. In most towns you will find homes and businesses that were somehow left out of the initial network construction. For example, when cable networks were first built they largely ignored business districts that didn’t want to buy cable TV. There are lots of cases of apartment and subdivision owners that didn’t allow in the incumbent telco or cable company. And there are a lot of homes that just got missed by the network. I was just talking to somebody in downtown Asheville, where I live, who is not connected to the cable network for some reason.
  • Not all ISPs care about updating the databases. There are many wireless and other small ISPs that don’t update the databases every time they make some network change that affects speeds. In fact, there are still some small ISPs that just ignore the FCC mapping requirement. At the other extreme there are small ISPs that overstate the speeds in the databases, hoping that it might drive customer requests to buy service.
  • One of the most insidious speed issues in networks is the data burst that many ISPs frontload into their broadband products. They send a fast burst of speed for the first minute or two of any demand for bandwidth. This improves the customer experience, since a large percentage of requests to use bandwidth are for web searches or other short-term uses. Any customer with this feature will get much faster results from a speed test than their actual long-use data speeds, since they are really testing only the burst speed. A rural customer with burst might see 4 Mbps on a speed test and still find themselves unable to maintain a connection to Netflix.
  • Sometimes there are equipment issues. The best-known case of this is a widespread area of upstate New York where Charter has kept old DOCSIS 1.0 cable modems in homes that are not capable of receiving the faster data speeds the company is selling. It’s likely that the faster network speed is what is included in the database, not the speed that is choked by the old modems.
  • And finally, speed isn’t everything. Poor latency can ruin the utility of any broadband connection, to the point where the speed is not that important.
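The burst-speed distortion in the list above is easy to show with arithmetic. The sketch below assumes hypothetical numbers – a 10 Mbps burst for the first 60 seconds on top of a 2 Mbps sustained speed – since ISPs don’t publish their actual burst parameters:

```python
# Sketch: why a short speed test overstates a burst-boosted connection.
# The burst duration and speeds are illustrative assumptions.

BURST_MBPS = 10.0      # boosted speed for the first part of any download
SUSTAINED_MBPS = 2.0   # actual long-run speed
BURST_SECONDS = 60     # assumed length of the boost

def average_speed(duration_s: float) -> float:
    """Average throughput in Mbps over a download of the given length."""
    burst_time = min(duration_s, BURST_SECONDS)
    steady_time = duration_s - burst_time
    megabits = burst_time * BURST_MBPS + steady_time * SUSTAINED_MBPS
    return megabits / duration_s

print(f"30-second speed test:  {average_speed(30):.1f} Mbps")
print(f"2-hour video stream:   {average_speed(2 * 3600):.2f} Mbps")
```

A 30-second speed test never outlasts the burst, so it reports the full 10 Mbps, while a two-hour stream averages barely above the 2 Mbps sustained rate – and it’s the sustained rate that determines whether Netflix keeps playing.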

Unfortunately, most of the errors in the broadband databases and maps overstate broadband speeds rather than under-report them. I’ve worked with numerous communities and talked to numerous people who are not able to get the broadband speeds suggested by the FCC databases for their neighborhoods. Many times the specific issue can be pinned down to one of the above causes. But that’s no consolation for somebody who is told by the FCC that they have broadband when they don’t.

No New Telecom Act

For years it’s been obvious that we need a new telecom act. The Telecommunications Act of 1996 was largely aimed at promoting telephone competition and is now quaintly outdated. Today, carriers that want to provide traditional voice services still have to run a gauntlet of regulatory requirements while ISPs providing VoIP or no voice product face almost no regulation.

The 1996 Act is dated and some of its provisions cause unneeded problems within the industry. A good example is Google Fiber’s struggle getting onto poles in various cities. Google has shunned taking the regulated path, but in doing so they have not been afforded the protections of the 1996 Act that provide access to poles, conduits and ducts. Since most new fiber builders are not offering traditional voice, the distinction between regulated and unregulated carriers is out of date. But unless Congress changes the rules established by the 1996 Act, the FCC and the courts have no choice but to enforce any explicit regulations required by that Act.

It’s also easy to overlook that the 1996 Act rewrote many of the rules for the cable industry. For example, some of the rules covered by the Act still require traditional cable providers to provide several specific tiers of cable service. It’s obvious that these rules no longer make sense and are hindering traditional cable companies from offering competitive small packages and the a la carte programming that customers clearly want.

At the 2018 State of the Net conference held last week Rep. Greg Walden, the chair of the House Energy & Commerce Committee, said that he did not foresee any major telecom legislation this year, but rather piecemeal tweaks of telecom law to fix obvious problems. This same sentiment has been expressed by Sen. John Thune, who holds the same role on the Senate Committee on Commerce, Science and Transportation. These committees are where telecom legislation begins.

We see this piecemeal approach in Congress right now, with nearly a dozen proposed bills floating around that have an impact on telecom. For example:

  • There are several bills that would simplify the paperwork to get funding from the Universal Service fund and would make it easier to fix telecom infrastructure after a natural disaster.
  • There are also several bills that would loosen or exempt telecom projects that get federal funding from having to undertake environmental and historic preservation reviews if facilities are placed in existing rights-of-ways.
  • There is a bill to streamline the application for placing telecom facilities on federal land, including a one-year shot-clock forcing a yes or no answer to an application.
  • There is a proposed bill that would require the FCC to monitor and improve broadband availability in ‘urban broadband deserts’.

There is nothing wrong with any of these bills and they propose changes that make sense. For example, the requirement to undertake environmental and historic preservation studies when using federal grant money probably added 15% to the cost of projects funded by the BTOP program a few years ago. It makes no sense to do these studies when new telecom facilities are to be placed on existing poles or within the existing shoulders of roads. Tweaking the rules will save unneeded expense for future fiber projects.

But these bills are all small in scope and ignore the big issues. The time has probably come to eliminate telephone regulations, other than perhaps the few rules that directly protect consumers. It’s also time to open up access to poles and conduits to everybody without making them jump through the hurdles created by the 1996 Act. It’s time to eliminate any federal rules that dictate how cable networks must package their programming. There are a number of big issues like these that cannot be easily fixed by small piecemeal bills.

There is an even bigger issue looming over the creation of a new telecom act. The FCC has basically written itself out of the picture for regulating broadband. There are some aspects of broadband that need to be regulated and Congress would have to drag the FCC back into this role.

A new telecom act could create a fresh start for the industry and the FCC. All of the drama concerning Title II regulation of broadband was due to the fact that Congress failed to provide any guidance for regulating broadband. The FCC struggled over the last decade trying to find a backdoor way to justify governing some aspects of broadband – something Congress could have fixed at any time by giving explicit authority to the FCC.

Regulating broadband one small inch at a time is not good policy. Any ISP can rattle off a list of a dozen things that don’t work as well as they would like. The only way to get the fresh start we need is with a new telecom act aimed at the new world we really live in. We are no longer a world that needs heavy telephone regulations or that should tell cable TV providers what to put on the air. What we need is a new framework that would empower the FCC to make sure that we can affordably build the fiber and wireless networks that are vital to our future. We need rules that require that broadband stay within affordable reach of most households. We need rules that prohibit ISPs from spying on customers. We really need Congress to do their jobs and restart the industry on a regulatory path that fits our times.