Abandoned Telecom Infrastructure

I saw an article about Merrill, Oregon, where the city was wrestling with what to do with an abandoned cable TV network hanging on poles in the City. It’s actually a fairly common occurrence to have abandoned telecom property on poles, and I’ve been contacted by a number of cities over the years wondering how to deal with the situation.

In this particular case the historic cable system in the city was operated by Rapid Communications out of Texas. That company sold cable properties to a number of companies in 2008 and the Merrill system went to Almega Cable Company, which stopped offering service in the City and went out of business in 2011.

There are all sorts of telecom assets that have been abandoned on poles, and defunct cable companies are only one example. I saw a lot of WiFi mesh networks abandoned fifteen years ago as operators folded and never retrieved their equipment. There are numerous CLECs that folded in the late 1990s and walked away from fiber networks on poles.

Having an abandoned set of wires on poles complicates the lives of any other pole users in the market. The unused wires take up space on poles and make it hard for anybody else to add additional wires onto the pole.

Abandoned networks also create havoc for the normal pole attachment process. This process requires buy-in from existing utilities to move or rearrange cables to make room for a new attacher. A new attacher can be paralyzed if they are unable to create the required clearance from existing wires.

In the end I’ve almost always seen the responsibility for getting rid of the network fall to the local government. Somebody has to go through the process of making certain there is no remaining active owner of the network before it can be condemned. Generally the pole owner is not willing to take on that role unless they have a need of their own to add wires to the poles.

Merrill is now undertaking the task of condemning the network. They have to follow the law and post public notices to make sure that nobody claims rights to the cables. In the case of a cable company the City not only has to deal with the wires on poles, but also with customer drops and pedestals scattered throughout the community.

Merrill is hoping that some new carrier will want to use the cable network for overlashing fiber. Overlashing is the process of tying the fiber to existing wires and is generally the lowest cost method of fiber construction. But even if they find a taker for the offer my guess is that the new fiber provider is not going to want to assume ownership of the coaxial cables since that would give them liability for any issues or problems with the old wiring. So the City might end up owning the cables in perpetuity. If they don’t find a buyer, the City will have to pay to have the cables removed – although in today’s market there might be enough value in the copper inside the coaxial cables to offset the cost of removal.

We are going to see a lot more abandoned assets on poles in the future. We are just now entering a period when numerous companies are going to want to hang wireless devices of all types on poles. Some of these devices are tiny and I’ve seen others that are the size of a dorm refrigerator. It’s inevitable that some of the wireless deployments will fail, or that the wireless companies will lose the customers served by a given device.

Over time a significant inventory of abandoned wireless devices will likely grow in most cities. And unlike an abandoned cable network, my guess is that it’s often going to be hard to know which wireless devices have been abandoned or even who owns many of them. Cities ought to be considering ordinances today that require the companies that deploy wireless devices to somehow notify them of what they are doing and to also clearly label the ownership of each device.

But there is a movement at the FCC, in Congress and in state legislatures to institute rules for wireless carriers that would override any local rules. Such global rules are going to hinder cities in the coming decades when they try to deal with abandoned assets clogging their pole lines. Most of the proposed new rules I’ve seen don’t address this issue, which will make it messy to deal with later.

Is AT&T Violating Net Neutrality?

I got a text on my AT&T cellphone last month that told me that my wireless plan now includes sponsored data. Specifically they told me that I could now stream movies and other content from DirecTV or U-Verse TV without the video counting against my monthly data cap. This has been available to AT&T post-paid customers for a while, but now is apparently available to all customers. What I found most interesting about the message was that it coincided with the official end of net neutrality.

AT&T is not the first cellular company to do this. Verizon tried this a few years ago, although that attempt was largely unsuccessful because they didn’t offer much content that people wanted to watch. T-Mobile does something similar with their Binge-on program, but since most of their data plans are unlimited, customers can watch anything on their phones, not just the Binge-on video.

The sponsored data from AT&T would be a direct violation of net neutrality if it were still in effect and is a textbook example of paid prioritization. By excusing the DirecTV content from cellular data caps they have created an advantage for DirecTV compared to competitors. It doesn’t really matter that AT&T also happens to own DirecTV, and I imagine that AT&T is now shopping this same idea around to other video providers.

So what is wrong with what AT&T is doing? Certainly their many customers that buy both AT&T cellphones and DirecTV will like the plan. Cellular data in the US is still some of the most expensive data in the world and letting customers watch unlimited video from a sponsored video provider is a huge benefit to customers. Most people are careful to not go over monthly data limits, and that means they carefully curtail watching video on cellphones. But customers taking advantage of sponsored video are going to watch video that would likely have exceeded their monthly data cap – it doesn’t take more than a handful of movies to do that.
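To put rough numbers on that claim, here’s a quick back-of-the-envelope calculation. The bitrate and cap size below are illustrative assumptions, not AT&T’s actual figures:

```python
# Back-of-the-envelope: how fast video streaming consumes a cellular
# data cap. The 5 Mbps HD bitrate and 6 GB cap are assumptions for
# illustration, not AT&T's actual numbers.

def gb_per_hour(mbps: float) -> float:
    """Convert a streaming bitrate (megabits/second) to GB per hour."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> Mb/hour -> MB/hour -> GB/hour

hd_stream_mbps = 5   # assumed HD video bitrate
movie_hours = 2      # typical movie length
cap_gb = 6           # assumed monthly data cap

per_movie_gb = gb_per_hour(hd_stream_mbps) * movie_hours
print(f"One movie: {per_movie_gb:.1f} GB")                # 4.5 GB
print(f"Movies to hit the cap: {cap_gb / per_movie_gb:.1f}")  # ~1.3
```

With these assumptions a single two-hour HD movie uses 4.5 GB, so even a couple of movies blows past a typical cap – which is why exempting sponsored video from the cap is such a strong lure for customers.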

AT&T has huge market power with almost 140 million cellphone users on their network at the end of last year. Any video provider they sponsor is going to gain a significant advantage over other video providers. AT&T customers that like watching video on their cellphones are likely to pick DirecTV over Comcast or any other video provider.

It’s also going to be extremely tempting for AT&T to give prioritized routing to DirecTV video – which means implementing the Internet fast lane. AT&T is going to want their cellular customers to have a quality experience, and they can do that by making sure that DirecTV video has the best connections throughout their network. They don’t necessarily have to throttle other video to make DirecTV better – they can just make sure that DirecTV video gets the best possible routing.

I know to many people the AT&T plan is going to feel somewhat harmless. After all, they are bundling together their own cellular and video products. But it’s a short step from here for AT&T to start giving priority to content from others who are willing to pay for it. It’s not too hard to imagine them offering the same plan to Netflix, YouTube or Facebook.

If this plan expands beyond AT&T’s own video, we’ll start seeing the negative impacts of paid prioritization:

  • Only the biggest companies like Netflix, Facebook or Google can afford to pay AT&T for the practice. This is going to shut out smaller video providers and start-ups. Already in the short history of the web we’ve seen a big turnover in the popular platforms on the web – gone or greatly diminished are earlier platforms like AOL, CompuServe and Prodigy. But with the boost given by paid prioritization the big companies today will get a step-up to remain as predominant players on the web. Innovation is going to be severely hampered.
  • This is also the beginning of a curated web where many people only see the world through the filter of the predominant web services. We already see that phenomenon a lot today, but when people are funneled to only using the big web services this will grow and magnify.
  • It’s not hard to imagine the next step where we see reduced price data plans that are ‘sponsored’ by somebody like Facebook. Such platforms will likely make it a challenge for customers to step outside their platform. And that will lead to a segmentation and slow death of the web as we know it.

Interestingly, the Tom Wheeler FCC told AT&T that this practice was unacceptable. But through the change of administration AT&T never stopped the practice and is now expanding it. It’s likely that courts are going to stay some or all of the net neutrality order until the various lawsuits on the issue get resolved. But AT&T clearly feels emboldened to move forward with this, probably since they know the current FCC won’t address the issue even if net neutrality stays in effect.

Progress of the CAF II Program

If readers recall, the CAF II program is providing funds to the largest telcos to upgrade rural facilities in their incumbent operating territories to broadband speeds of at least 10 Mbps down and 1 Mbps up. The CAF II deployment began in the fall of 2015 and lasts for 6 years, so we are now almost 2.5 years into the deployment period. I was curious about how the bigger telcos are doing in meeting their CAF II build-out requirements. The FCC hasn’t published any progress reports on CAF II deployments, so I found the following from web searches:

AT&T. The company took $427 million annually for the six years ($2.56 billion) to bring broadband to 2.2 million rural customers. The company has said they are going to use a combination of improved DSL and fixed wireless broadband using their cellular frequencies to meet their build-out requirements. From their various press releases it seems like they are planning on more wireless than wireline connections (and they have plans in many rural places of tearing down the copper).

The only big public announcement of a wireless buildout for AT&T is a test in Georgia initiated last year. On their website the company says their goal at the end of 2018 is to offer improved broadband to 440,000 homes, which would mean 20% CAF II coverage at just over the mid-point of their 6-year build-out commitment.

On a side note, AT&T had also promised the FCC, as a condition of the DirecTV merger, that they would pass 12.5 million homes and businesses with fiber by mid-2019. They reported reaching only 4 million by the end of 2017.

CenturyLink. CenturyLink accepted $500 million annually ($3 billion) in CAF II funding to reach 1.2 million rural homes. In case you’re wondering why CenturyLink is covering only half as many homes as AT&T for roughly the same funding – the funding for CAF II varies by Census block according to density. The CenturyLink coverage area is obviously less densely populated than the areas being covered by AT&T.

FierceTelecom reported in January that CenturyLink had upgraded 600,000 CAF II homes by the end of last year, or 50% of their CAF II commitment. The company says that their goal is to have 60% coverage by the end of this year. CenturyLink is primarily upgrading rural DSL, although they’ve said that they are considering using point-to-multipoint wireless for the most rural parts of the coverage areas. The company reports that 70% of the homes upgraded so far can get 20 Mbps download or faster.

Frontier. The last major recipient of CAF II funding is Frontier. The company originally accepted $283 million per year to upgrade 650,000 passings. They subsequently acquired some Verizon properties that had accepted $49 million per year to upgrade 37,000 passings. That’s just under $2 billion in total funding.

FierceTelecom reported in January that Frontier reached 45% of the CAF II area with broadband speeds of at least 10/1 Mbps by the end of 2017. The company also notes that in making the upgrades for rural customers they’ve also upgraded the broadband in the towns near the CAF II areas and have increased the broadband speeds of over 900,000 passings nationwide.

Frontier is also largely upgrading DSL, although they are also considering point-to-multipoint wireless for the more rural customers.
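The funding figures quoted above can be sanity-checked with simple arithmetic – annual support times the six-year program length, and the implied cost per passing (all inputs taken from the totals cited above):

```python
# Sanity-check the CAF II totals quoted above: annual support times the
# six-year program length, and the implied funding per passing.

carriers = {
    # name: (annual support in $ millions, passings committed)
    "AT&T": (427, 2_200_000),
    "CenturyLink": (500, 1_200_000),
    "Frontier": (283 + 49, 650_000 + 37_000),
}

totals = {}
for name, (annual_m, passings) in carriers.items():
    total_b = annual_m * 6 / 1000                       # total funding, $B
    per_passing = annual_m * 6 * 1_000_000 / passings   # $ per passing
    totals[name] = (round(total_b, 2), round(per_passing))
    print(f"{name}: ${total_b:.2f}B total, roughly ${per_passing:,.0f} per passing")
```

The per-passing amount roughly doubles from AT&T to CenturyLink, which is consistent with the point above that CAF II support varies with Census-block density.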

Other telcos also took major CAF II funding, but I couldn’t find any reliable progress reports on their deployments. This includes Windstream ($175 million per year), Verizon ($83 million per year), Consolidated ($51 million per year), and Hawaiian Telcom ($26 million per year).

The upcoming reverse auction this summer will provide up to another $2 billion in funding to reach nearly 1 million additional rural homes. In many cases these are the most remote customers, and many are in the same areas where the CAF II upgrades are being made. It will be interesting to see if the same telcos will take the funding to finish the upgrades. There is a lot of speculation that the cellular carriers will pursue a lot of the reverse auction upgrades.

But the real question to be asked for these properties is what comes next. The CAF II funding lasts until 2021. The speeds being deployed with these upgrades are already significantly lower than the speeds available in urban America. A household today with a 10 Mbps download speed cannot use broadband in the ways that are enjoyed by urban homes. My guess is that there will be continued political pressure to continue to upgrade rural speeds and that we haven’t seen the end of the use of the Universal Service Fund to upgrade rural broadband.

States and Net Neutrality

We now know how states are going to react to the end of net neutrality. There are several different responses so far. First, a coalition of 23 states filed a lawsuit challenging the FCC’s ability to eliminate net neutrality and Title II regulation of broadband. The lawsuit is mostly driven by blue states, but there are red states included like Mississippi and Kentucky.

The lawsuit argues that the FCC has exceeded its authority in eliminating net neutrality. The lawsuit makes several claims:

  • The suit claims that under the Administrative Procedure Act (APA) the FCC can’t make “arbitrary and capricious” changes to existing policies. The FCC has defended net neutrality for over a decade and the claim is that the FCC’s ruling fails to provide enough justification for abandoning the existing policy.
  • The suit claims that the FCC ignored the significant public record filed in the case that details the potential harm to consumers from ending net neutrality.
  • The suit claims that the FCC exceeded its authority by reclassifying broadband service as a Title I information service rather than as a Title II telecommunications service.
  • Finally, the suit claims that the FCC ruling improperly preempts state and local laws.

Like with past challenges of major FCC rulings, one would expect this suit to go through at least several levels of courts, perhaps even to the Supreme Court. It’s likely that the loser of the first ruling will appeal. This process is likely to take a year or longer. Generally, the first court to hear the case will determine quickly if some or all of the FCC’s net neutrality order will be stayed until resolution of the lawsuit.

I lamented in a recent blog how partisan this and other FCCs have gotten. It would be a positive thing for FCC regulation in general if the courts put some cap on the ability of the FCC to create new policy without considering existing policies and the public record about the harm that can be caused by a shift in policy. Otherwise we face having this and future FCCs constantly changing the rules every time we get a new administration – and that’s not healthy for the industry.

A second tactic being used by states is to implement a state law that effectively implements net neutrality at the state level. The states of New York, New Jersey and Montana have passed laws that basically mimic the old FCC net neutrality rules at the state level. It’s an interesting tactic and will trigger a lawsuit about states’ rights if challenged (and I have to imagine that somebody will challenge these laws). I’ve read a few lawyers who opine that this tactic has some legs since the FCC largely walked away from regulating broadband, and in doing so might have accidentally opened up the door for the states to regulate the issue. If these laws hold up that would mean a hodgepodge of net neutrality rules by state – something that benefits nobody.

Another tactic being taken is for states, and even a few cities, to pass laws that change the purchasing rules so that any carrier that bids for government telecom business must adhere to net neutrality. This is an interesting tactic and I haven’t seen anybody who thinks this is not allowed. Governments have wide latitude in deciding the rules for purchasing goods and services and there are already many similar restrictions that states put onto purchasing. The only problem with this tactic will come if eventually all of the major carriers violate the old net neutrality rules. That could leave a state with poor choices, or no good choice, of telecom providers.

As usual, California is taking a slightly different tactic. They want to require that carriers must adhere to net neutrality if they use state-owned telecom facilities or facilities that were funded by the state. Over the years California has built fiber of its own and also given out grants for carriers to build broadband networks. This includes a recently announced grant program that is likely to go largely to Frontier and CenturyLink. If this law is upheld it could cause major problems for carriers that have taken state money in the past.

It’s likely that there are going to be numerous lawsuits challenging different aspects of the various attempts by states to protect net neutrality. And there are likely to also be new tactics tried by states during the coming year to further muddy the picture. It’s not unusual for the courts to finally decide the legitimacy of major FCC decisions. But there are so many different tactics being used here that we are likely to get conflicting rulings from different courts. It’s clearly going to take some time for this to all settle out.

One interesting aspect of all of this is how the FCC will react if their cancellation of net neutrality is put on hold by the courts. If that happens it means that some or all of net neutrality will still be the law of the land. The FCC always has the option to enforce or not enforce the rules, so you’d suspect that they wouldn’t do much about ISPs that violate the spirit of the rules. But more importantly, the FCC is walking away from regulating broadband as part of killing Title II regulation. They are actively shifting some regulatory authority to the FTC for issues like privacy. It seems to me that this wouldn’t be allowed until the end of the various lawsuits. I think the one thing we can count on is that this is going to be a messy regulatory year for broadband.

Broadband Regulation in Limbo

The ruling earlier this week by the US Court of Appeals for the 9th Circuit highlights the current weak state of regulations over broadband. The case is one that’s been around for years and stems from AT&T’s attempt to drive customers off of their original unlimited cellphone data plans. AT&T began throttling unlimited customers when they reached some unpublished threshold of data use, in some cases as small as 2 GB in a month. AT&T then lied to the FCC about the practice when the agency inquired. This ruling allows the FTC suit against AT&T to continue.

The ruling demonstrates that the FTC has some limited jurisdiction over common carriers like AT&T. However, the clincher came when the court ruled that the FTC only has jurisdiction over issues where the carriers aren’t engaging in common-carrier services. This particular case involves AT&T not delivering a product they promised to customers and thus falls under FTC jurisdiction. But the court made it clear that future cases that involve direct common carrier functions, such as abuse of net neutrality, would not fall under the FTC.

This case clarifies the FTC’s limited jurisdiction over ISPs and contradicts the FCC’s statements that the FTC is going to be able to step in and take its place on most matters involving broadband. The court has made it clear that is not the case. FCC Chairman Ajit Pai praised this court ruling and cited it as a good example of how the transition of jurisdiction to the FTC is going to work as promised. But in looking at the details of the ruling, that is not true.

This court ruling makes it clear that there is no regulatory body now in charge of direct common carrier issues. For instance, if Netflix and one of the ISPs get into a big fight about paid prioritization there would be nowhere for Netflix to turn. The FCC would refuse to hear the case. The FTC wouldn’t be able to take the case since it involves a common carrier issue. And while a court might take the case, they would have no basis on which to make a ruling. As long as the ISP didn’t break any other kinds of laws, such as reneging on a contract, a court would have no legal basis on which to rule for or against the ISP’s behavior.

That means not only that broadband is now unregulated, but also that there is no place for anybody to complain about abuse by ISPs until that abuse violates some existing law. That is the purest definition of limbo that I can think of for the industry.

To make matters worse, even this jumbled state of regulation is likely to be made more muddled soon by the courts involved in the various net neutrality suits. Numerous states have sued the FCC for various reasons, and if past practice holds, the courts are liable to put some or all of the FCC’s net neutrality decision on hold.

It’s hard to fathom what that might mean. For example, if the courts were to put the FCC’s decision to cancel Title II regulation on hold, then that would mean that Title II regulation would still be the law of the land until the net neutrality lawsuits are finally settled. But this FCC has made it clear that they don’t want to regulate broadband and they would likely ignore such a ruling in practice. The Commission has always had the authority to pick and choose cases it will accept and I’m picturing that they would refuse to accept cases that relied on their Title II regulation authority.

That would be even muddier for the industry than today’s situation. Back to the Netflix example, if Title II regulation was back in effect and yet the FCC refused to pursue a complaint from Netflix, then Netflix would likely be precluded from trying to take the issue to court. The Netflix complaint would just sit unanswered at the FCC, giving Netflix no possible remedy, or even a hearing about their issues.

The real issue that is gumming up broadband regulation is not the end of Title II regulation. Title II regulation only went into effect with the 2015 net neutrality decision, and the FCCs before that had no problem tackling broadband issues. The real problem is that this FCC is washing its hands of broadband regulation and supposedly tossed that authority to the FTC – something the court just made clear can’t work in the majority of cases.

This FCC has shown that there is a flaw in their mandate from Congress in that they feel they are not obligated to regulate broadband. So I guess the only fix will be if Congress makes the FCC’s jurisdiction, or lack of jurisdiction, clear. Otherwise, we couldn’t even trust a future FCC to reverse course, because it’s now clear that the decision to regulate or not regulate broadband is up to the FCC and nobody else. The absolute worst long-term outcome would be future FCCs regulating or not regulating depending upon changes in the administration.

My guess is that AT&T and the other big ISPs are going to eventually come to regret where they have pushed this FCC. There are going to be future disputes between carriers, and the ISPs are going to find that the FCC cannot help them, just like it can’t help anybody complaining against them. That’s a void that is going to serve this industry poorly.

Another FCC Giveaway

The FCC just voted to implement a plan to give up to $4.53 billion to the cellular carriers over the next ten years to bring LTE cellular and data to the most remote parts of America. While this sounds like a laudable goal, this FCC seems determined to hand out huge sums of money to the biggest telecom companies in the country. This program is labeled Mobility Fund Phase II and will be awarded through an auction among the cellular companies.

As somebody who travels frequently in rural America there certainly are still a lot of places with poor or no cellphone coverage. My guess is that the number of people that have poor cellphone coverage is greater than what the FCC is claiming. This fund is aimed at providing coverage to 1.4 million people with no LTE cellphone coverage and another 1.7 million people where the LTE coverage is subsidized.

Recall that the FCC’s knowledge of cellphone coverage comes from the cellphone companies who claim better coverage than actually exists. Cellphone coverage is similar to DSL where the quality of signal to a given customer depends upon distance from a cellphone tower. Rural America has homes around almost every tower that have crappy coverage and that are probably not counted in these figures.

My main issue with the program is not the goal – in today’s world we need cellphone coverage even in the most remote places in the country. My problem is that the perceived solution is to hand yet more billions to the cellular carriers – money that could instead fund real broadband in rural America. Additionally, the ten-year implementation is far too long. That’s an eternity to wait for an area with no cellular coverage.

I think the FCC had a number of options other than shelling out billions to the cellular companies:

  • The FCC could require the cellular companies to build these areas out of their own pockets as a result of having taken the licensed spectrum. Other than Sprint, these companies are extremely profitable right now and just got a lot more profitable because of the recent corporate tax-rate reductions. The FCC has always had build-out requirements for spectrum and the FCC could make it mandatory to build the rural areas as a condition for retaining the spectrum licenses in the lucrative urban areas.
  • The FCC could instead give unused spectrum to somebody else that is willing to use it. The truth is that the vast majority of licensed spectrum sits unused in rural America. There is no reason that spectrum can’t come with a use-it-or-lose-it provision so that unused spectrum reverts back to the FCC to give to somebody else. There are great existing wireless technologies that work best with licensed spectrum and it’s aggravating to see the spectrum sit unused but still unavailable to those who might use it.
  • Finally, the FCC could force the cellular carriers to use towers built by somebody else. I work with a number of rural counties that would gladly build towers and the necessary fiber to provide better cellphone coverage. It would cost the cellular carriers nothing more than the cell site electronics if others were to build the needed core infrastructure.

This idea of handing billions to the big telecom companies is a relatively new one. Obviously the lobbyists of the big companies have gained influence at the FCC. It’s not just this FCC that is favoring the big companies. Originally the CAF II program was going to be available to everybody using reverse auction rules. But before that program was implemented the Tom Wheeler FCC decided to instead just give the money to the big telcos if they wanted it. The telcos even got to pick and choose and reject taking funding for remote places which will now be auctioned this summer.

That same CAF II funding could have been used to build a lot of rural fiber or other technologies that would have provided robust broadband networks. But instead the telcos got off the hook by only having to upgrade to 10/1 Mbps – a speed that was already obsolete at the time of the FCC order.

Now we have yet another federal program that is going to shovel more billions of dollars to big companies to provide broadband that will supposedly meet a 10/1 Mbps speed. But like with CAF II, the carriers will get to report the results of the program to the FCC. I have no doubt that they will claim success even if coverage remains poor. Honestly, there are days as an advocate for rural broadband when you just want to bang your head against a wall. It’s hard to see billions and billions wasted that could have brought real broadband to numerous rural communities.

Setting the FCC Definition of Broadband

In the recently released 2018 Broadband Progress Report the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record for either eliminating the standard altogether or else reverting back to the older definition of 10/1 Mbps.

I’m guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted to allow cellular data to count the same for a household as landline broadband – and that desire was a big factor in the push to lower the definition, since cellphones rarely meet the 25/3 speed standard.

The deliberation on the topic this year raises the question of whether there is some way to create a rule that would better define the speed of needed broadband. It’s worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to want to use broadband. In doing so, they added together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC’s method was too simple and assumed that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four, which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that’s not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP and your home router try to receive and untangle multiple simultaneous streams there are collisions of packets that get lost and have to be retransmitted. This is described as adding ‘overhead’ to the transmission process. Depending on the nature of the data streams the amount of collision overhead can be significant.

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to the various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I’ve covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your home – it also slows when it pauses to recognize demands for connection from your neighbor’s WiFi network.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to download 25 Mbps of usage from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.
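As a toy model of why simultaneous streams need more capacity than their simple sum, consider the sketch below. The overhead percentages are illustrative assumptions, not measured values:

```python
# Toy model: raw connection speed needed for simultaneous streams,
# inflating the naive sum for packet collisions/retransmissions and
# WiFi protocol overhead. Both overhead factors are assumptions.

def effective_demand(streams_mbps, collision_overhead=0.10, wifi_overhead=0.15):
    """Estimate required raw bandwidth (Mbps) for concurrent streams."""
    naive_sum = sum(streams_mbps)
    return naive_sum * (1 + collision_overhead) * (1 + wifi_overhead)

# Three HD Netflix streams at 5 Mbps each: the naive sum is 15 Mbps,
# but the modeled requirement is closer to 19 Mbps.
print(round(effective_demand([5, 5, 5]), 1))
```

The exact percentages matter less than the shape of the result: any realistic accounting for collisions and WiFi overhead pushes the real requirement well above the simple additive total the FCC used.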

The FCC’s definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it’s available. That means a stream of 15-20 Mbps download. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

The FCC’s speed definition finally needs to consider the busy hour of the day – the time when a household uses the most broadband. That’s the broadband speed that the home needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that would measure actual download speeds at a number of homes over time to understand what homes are really using for bandwidth. There are ways this could be done. For example, the FCC could do something for broadband similar to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage, such as Akamai, to sample a large number of US homes. A voluntary sample of homes meeting specific demographics would allow monitoring of their bandwidth usage. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC’s definition of broadband. Rather than changing the official speed periodically, the FCC could change the definition as needed, as dictated by the real-world data.
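To make the idea concrete, here is one hypothetical way such a program could turn sampled busy-hour measurements into a speed definition. The sample numbers and the 90th-percentile cutoff are purely illustrative assumptions, not anything the FCC has proposed:

```python
# Sketch: deriving a broadband definition from measured busy-hour usage.
# The sample data and the 90th-percentile choice are illustrative assumptions.

# Hypothetical busy-hour demand (Mbps) measured at ten volunteer sample homes:
busy_hour_mbps = [12, 18, 22, 25, 31, 35, 38, 42, 47, 55]

def percentile(data, pct):
    """Simple nearest-rank percentile (pct between 0 and 100)."""
    data = sorted(data)
    k = max(0, min(len(data) - 1, round(pct / 100 * len(data)) - 1))
    return data[k]

# A definition set to cover 90% of sampled homes' busy-hour demand:
print(percentile(busy_hour_mbps, 90))  # -> 47 (Mbps)
```

Re-running this calculation on fresh measurements each year would let the definition track real usage instead of being reset by periodic rulemaking.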

The FCC does some spot checking today of the broadband speeds reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn’t provide the same kind of feedback that a formal ongoing measuring program would show. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors used in setting the official definition of broadband, and so perhaps the FCC doesn’t want real facts to get in the way.

Partisanship and the FCC

The current FCC has a clear bias towards the big cable companies, telcos and cellular companies. There is nothing particularly wrong with that since this FCC represents an administration that is also big-business oriented. Past FCCs have clearly favored policies that reflected the administration in charge. For instance, the prior FCC under Tom Wheeler was pro-consumer in many ways and pushed things like net neutrality and privacy – issues that had the support of the administration but not of the giant ISPs.

However, in the last few decades the FCC has gotten a lot more partisan. It’s becoming rare to see a policy vote that doesn’t follow party lines. This isn’t true of everything, and we see unanimous FCC Commissioner support for things like providing more spectrum for broadband. But FCC voting on any topic that has political overtones now seems to follow party lines.

The most recent example of this increased partisanship is the release of this year’s 2018 Broadband Deployment Report to Congress. In that report Chairman Pai decided to take the stance that the state of broadband in the country is fine and needs no FCC intervention. The FCC is required to determine the state of broadband annually and report the statistics and its conclusions to Congress. More importantly, Section 706 of the Telecommunications Act of 1996 requires that the FCC must take proactive steps to close any perceived gaps in broadband coverage.

In order to declare that the state of broadband in the country doesn’t require any further FCC action, Chairman Pai needed to come up with a narrative to support his conclusion. The argument he’s chosen to defend his position is a bit startling because by definition it can’t be true.

The new broadband report, released on February 9, concludes that “advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion. . . This finding does not mean that all Americans now have broadband access. Rather, it means that we are back on the right track when it comes to deployment.” The kicker comes when the report says that the FCC’s data from ISPs “does not yet reflect the beneficial effects of the Commission’s actions in 2017,” such as “ending the adverse impact on investment caused by the [2015 Net Neutrality] Order. . . For instance, several companies, including AT&T, Verizon, Frontier, and Alaska Communications either commenced or announced new deployments in 2017.”

In effect, the FCC now says that broadband deployment is back on track due to its December 2017 net neutrality repeal. But the ‘facts’ it cites don’t support its argument. First, any broadband improvements made by the cited telcos in 2017 would have been authorized and started in 2016, before this new FCC was even in place. Further, a big percentage of the recent broadband deployments by these particular telcos are due to FCC decisions made prior to the Pai FCC. For example, as a condition of its purchase of DirecTV, AT&T was required to pass 12 million new residences and businesses with fiber. A lot of the broadband spending by AT&T, Frontier and Alaska Communications is funded by CAF II money given to them by the Wheeler FCC, which the companies are required to spend. None of those expenditures have anything to do with the repeal of net neutrality. And since net neutrality was only reversed a few months ago, it’s impossible to believe that any of the broadband spending in 2017 was due to this FCC. It’s far too early to see if that order will have any impact on rural broadband expenditures (something no industry expert expects).

This FCC Broadband Report concludes that deployment of broadband in the country is on track and reasonable. Yet the numbers in the report show that there are still 19 million Americans in rural America without access to adequate broadband. There are 12 million school-aged children who are suffering from the homework gap because they don’t have broadband at home.

By declaring that broadband deployment is adequate, Chairman Pai has let his FCC off the hook for having to take any actions to address the issue. But his stated reasons are based upon an argument that is clearly not supported by any facts. This would seem to put the Chairman in violation of his Section 706 obligations, although that’s probably something only Congress can determine.

I’m saddened to see the FCC become so partisan. This is not a new phenomenon and we saw partisan voting under the last several FCCs. Before that we had pro-big business FCCs such as the one under Chairman Michael Powell. But that FCC was far less partisan and still basically used the facts at its disposal in making decisions and setting policy.

The FCC has a mandate to balance what’s good for both the public and the telecom companies. In an ideal world the FCC would be a neutral arbiter that listens to the facts and sides with the best arguments. This trend towards a partisan FCC is bad for the industry because it means that major policies will flip-flop when we change administrations – and that’s not good for ISPs or the public. Partisanship does not excuse this FCC from abrogating its responsibility and making specious arguments not supported by facts. This FCC has taken partisanship too far.

AT&T and Net Neutrality

The big ISPs know that the public is massively in favor of net neutrality. It’s one of those rare topics that polls positively across demographics and party lines. Largely through the lobbying efforts of the big ISPs, the FCC not only killed net neutrality regulation but surprised most of the industry by walking away from regulating broadband at all.

We now see states and cities trying to bring back net neutrality in some manner. A few states like California are creating state laws that mimic the old net neutrality rules. Many more states are limiting state telecom purchasing to ISPs that don’t violate net neutrality. Federal Democratic politicians are introducing bills that would reinstate net neutrality and force it back under FCC jurisdiction.

This all has the big ISPs nervous. We certainly see this in the way that the big ISPs are talking about net neutrality. Practically all of them have released statements talking about how much they support the open Internet. These big companies already all have terrible customer service ratings and they don’t want to now be painted as the villains who are trying to kill the web.

A great example is AT&T. The company’s blog posted a letter from Chairman Randall Stephenson that makes it sound like AT&T is pro net neutrality. It fails to mention how the company went to court to overturn the FCC’s net neutrality decision or how much they spent lobbying to get the ruling overturned.

AT&T also took out full-page ads in many major newspapers making the same points. In those ads the company added a new talking point that net neutrality ought to also apply to big web companies like Facebook and Twitter. That is a red herring because web companies, by definition, can’t violate net neutrality since they don’t control the pipe to the customers. Many would love to see privacy rules that stop the web companies from abusing customer data – but that is a separate issue from net neutrality. AT&T seems to be making this point to confuse the public and deflect the blame away from themselves.

Stephenson says that AT&T is in favor of federal legislation that would ensure net neutrality. But what he doesn’t say is that AT&T favors a bill the big companies are pushing that would implement a feel-good, watered-down version of net neutrality. Missing from that proposed law (and from all of AT&T’s positions) is any talk of paid prioritization – one of the three net neutrality principles. AT&T has always wanted paid prioritization. They want to be able to charge Netflix or Google extra to access their networks since those two companies are the largest drivers of web traffic.

In my mind, abuse of paid prioritization can break the web. ISPs already charge their customers enough money to fully cover the cost of the network needed to support broadband. Customers with unlimited data plans, like most landline connections, have the right to download as much content as they want. The idea of AT&T then also charging the content providers for the privilege of getting to customers is a terrible idea for a number of reasons.

Consider Netflix. It’s likely that they would pass any fees paid to AT&T on to customers. And in doing so, AT&T has violated the principle of non-discrimination of traffic, albeit indirectly, by making it more expensive for people to use Netflix. AT&T will always say they are not the cause of a Netflix rate increase – but AT&T is able to influence the market price of web services and, in doing so, discriminate against web traffic.

The other problem with paid prioritization is that it is a barrier to the next Netflix. New companies without Netflix’s huge customer base could not afford the fees to connect to AT&T and other large ISPs. And that barrier will stop the next big web company from launching.

I’ve been predicting for a while that the ISPs are not going to do anything that drastically violates net neutrality. They are going to be cautious about riling up the public and legislators since they understand that Congress could reinstate both net neutrality and broadband regulation at any time. The ISPs are enjoying the most big-company-friendly FCC there has ever been, and they are getting everything they want out of it.

But big ISPs like AT&T know that the political and regulatory pendulum can and will likely swing the other way. Their tactic for now seems to be to say they are for net neutrality while still working to make sure it doesn’t actually come back. So we will see more blogs and newspaper ads and support for watered-down legislation. They are clearly hoping the issue loses steam so that the FCC and administration don’t reinstate rules they don’t want. But they realistically know that they are likely to be judged by their actions rather than their words, so I expect them to ease into practices that violate net neutrality in subtle ways that they hope won’t be noticed.

Regulating From Broadband Maps

One of the more bizarre things we do in the US is regulate broadband based upon broadband maps. There are numerous federal grant and subsidy programs that rely upon these maps (and the underlying databases that support them) as well as various state programs. The FCC also uses this same data when reporting broadband penetration in the country to Congress each year, as just occurred on February 9.

The maps are intended to show how many households can purchase broadband of various speeds. Currently the arbitrary speed thresholds tested are download speeds of 10 Mbps, 25 Mbps and 100 Mbps. These speeds are measured due to past decisions by the FCC. For example, the FCC chose a 10/1 Mbps speed goal for any company that accepted CAF II money to upgrade rural broadband. The FCC’s current definition of broadband is still set at 25/3 Mbps.

Anybody who understands broadband networks knows that much of the data included in the databases and the mapping is incorrect, and sometimes pure fantasy. That makes sense when you understand that the speeds in this mapping process are all self-reported by ISPs.

There are numerous reasons why the speeds in these databases are not an accurate reflection of the real world:

  • There are still ISPs that report advertised speeds rather than actual speeds received by customers.
  • Any speeds represented for a whole DSL network are inaccurate by definition. DSL speeds vary according to the size of the copper wires, the condition of the copper cable and the distance from the source of the DSL broadband signal. That means that in a DSL network the speeds available to customers vary street by street, and even house by house. We’ve always known that DSL reported in the mapping databases is overstated and that most telcos that report DSL speeds report theoretical speeds. I’m not sure I blame them, but the idea of any one speed being used to represent the performance of a DSL network is ludicrous.
  • The speeds in the database don’t recognize network congestion. There are still many broadband networks around that bog down under heavy usage, which means evenings in a residential neighborhood. Nobody wants to be told that their network is performing at 10 Mbps if the best speed they can ever get when they want to use it is a fraction of that.
  • The speeds don’t reflect that ISPs give some customers faster speeds. In networks where bandwidth is shared among all users on a neighborhood node, if a few customers are sold a faster-than-normal speed, then everybody else will suffer corresponding slower speeds. Network owners are able to force extra speed to customers that pay a premium for the service, but to the detriment of everybody else.
  • The maps don’t reflect the way networks were built. In most towns you will find homes and businesses that were somehow left out of the initial network construction. For example, when cable networks were first built they largely ignored business districts that didn’t want to buy cable TV. There are lots of cases of apartment and subdivision owners that didn’t allow in the incumbent telco or cable company. And there are a lot of homes that just got missed by the network. I was just talking to somebody in downtown Asheville, where I live, who is not connected to the cable network for some reason.
  • Not all ISPs care about updating the databases. There are many wireless and other small ISPs that don’t update the databases every time they make some network change that affects speeds. In fact, there are still some small ISPs that just ignore the FCC mapping requirement. At the other extreme there are small ISPs that overstate the speeds in the databases, hoping that it might drive customer requests to buy service.
  • One of the most insidious speed issues in networks is the data burst that many ISPs frontload into their broadband products. They will send a fast burst of speed for the first minute or two of any demand for bandwidth. This improves the customer experience since a large percentage of requests to use bandwidth are for web searches or other short-term uses of bandwidth. Any customer on such a product will obtain much faster results from a speed test than their actual long-use data speeds, since they are really testing only the burst speed. A rural customer using burst might see 4 Mbps on a speed test and still find themselves unable to maintain a connection to Netflix.
  • Sometimes there are equipment issues. The best-known case of this is a widespread area of upstate New York where Charter has kept old DOCSIS 1.0 cable modems in homes that are not capable of receiving the faster data speeds the company is selling. It’s likely that the faster network speed is what is included in the database, not the speed that is choked by the old modems.
  • And finally, speed isn’t everything. Poor latency can ruin the utility of any broadband connection, to the point where the speed is not that important.
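The burst-speed problem in particular is easy to see with a little arithmetic. This sketch uses hypothetical speeds and a hypothetical one-minute burst window, not any specific ISP’s actual figures:

```python
# Sketch: why a short speed test overstates a "burst" product's real speed.
# The speeds and the 60-second burst window are hypothetical assumptions.

def measured_speed(test_seconds, burst_mbps, sustained_mbps, burst_window=60):
    """Average speed a download test would report over test_seconds."""
    burst_time = min(test_seconds, burst_window)       # time spent at burst speed
    steady_time = max(0, test_seconds - burst_window)  # time at the sustained rate
    total_megabits = burst_time * burst_mbps + steady_time * sustained_mbps
    return total_megabits / test_seconds

# A 20-second speed test sees only the burst:
print(measured_speed(20, burst_mbps=10, sustained_mbps=3))   # -> 10.0 Mbps
# A 10-minute video session averages much closer to the sustained rate:
print(measured_speed(600, burst_mbps=10, sustained_mbps=3))  # -> 3.7 Mbps
```

A mapping database fed by short speed tests, or by the burst speed itself, would record the 10 Mbps figure even though a sustained video stream only ever sees something near 3 Mbps.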

Unfortunately, most of the errors in the broadband databases and maps overstate broadband speeds rather than under-report them. I’ve worked with numerous communities and talked to numerous people who are not able to get the broadband speeds suggested by the FCC databases for their neighborhoods. Many times the specific issue can be pinned down to one of the above causes. But that’s no consolation for somebody who is told by the FCC that they have broadband when they don’t.