FCC Speed Tests for ISPs

ISPs awarded CAF II funding in the recent auction need to be aware that they will be subject to compliance testing for both latency and speed on their new broadband networks, with financial penalties for those that fail the tests. The FCC revised the testing standards in July in Docket DA 18-710, and the new standards become effective with testing starting in the third quarter of 2019. They replace the standards already in place for ISPs that received funding from earlier rounds of the CAF program as well as for ISPs getting A-CAM or other rate-of-return USF funding.

ISPs can choose between three methods for testing. First, they may elect what the FCC calls the MBA program, which uses an external vendor, approved by the FCC, to perform the testing. This firm has been testing speeds for the networks built by the large telcos for many years. Second, ISPs can use testing tools built into the customer CPE that allow test pinging and other testing methodologies. Finally, an ISP can install ‘white boxes’ that provide the ability to perform the tests.

The households to be tested are chosen at random by the ISP every two years. The FCC doesn’t prescribe a specific method for ensuring that the selections are truly random, but the ISP must describe to the FCC how the selection is done. It wouldn’t be hard for an ISP to skew the results by making sure that customers in the slow parts of its network are kept out of the testing sample.

The number of tests to be conducted varies with the number of customers for which a recipient is getting CAF support: if there are 50 or fewer CAF households they must test 5 customers; if there are 51-500 CAF households they must test 10% of households; and for more than 500 CAF households they must test 50. ISPs that declare a high latency must test more locations, with the maximum being 370.
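
As a minimal sketch of how those tiers work out (my own illustration in Python, not anything published by the FCC):

```python
import math

def caf_test_sample_size(caf_households: int) -> int:
    """Number of locations an ISP must test under the tiers described
    above. This covers only the base schedule; ISPs declaring a high
    latency must test more locations, up to a maximum of 370, on a
    schedule not detailed here."""
    if caf_households <= 50:
        return 5
    if caf_households <= 500:
        return math.ceil(caf_households * 0.10)  # 10% of CAF households
    return 50

# Example: an ISP with 320 CAF-supported households must test 32 of them.
print(caf_test_sample_size(320))  # 32
```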

ISPs must conduct the tests for a solid week, including weekends, in every quarter to eliminate seasonality. Tests must be conducted in the evenings between 6:00 PM and 12:00 AM. Latency tests must be run every minute during that six-hour testing window. Speed tests – run separately for upload speeds and download speeds – must be run once per hour during the six-hour window.
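
To make the testing burden concrete, here's the quick arithmetic for one location during one quarterly test week (my own back-of-the-envelope math based on the cadence above):

```python
# One quarterly test week: 7 days, with a 6:00 PM to midnight window.
DAYS_PER_TEST_WEEK = 7
WINDOW_HOURS = 6

# Latency: one test per minute during the window.
latency_tests = DAYS_PER_TEST_WEEK * WINDOW_HOURS * 60   # 2,520 per week

# Speed: one upload test and one download test per hour during the window.
speed_tests = DAYS_PER_TEST_WEEK * WINDOW_HOURS * 2      # 84 per week

print(latency_tests, speed_tests)  # 2520 84
```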

The FCC has set expected standards for the speed tests. These standards are based upon the required speeds of a specific program – such as the first CAF II program, which required speeds of at least 10/1 Mbps. In the latest CAF program the testing will be based upon the speeds that the ISP declared it could meet when entering the auction – speeds that can be as fast as 1 Gbps.

ISPs are expected to meet the latency standard 95% of the time. Speed tests must achieve 80% of the expected upload and download speeds 80% of the time. This might surprise people living in the original CAF II areas, because it means the big telcos only need to achieve download speeds of 8 Mbps for 80% of customers to meet the CAF standard. The 10/1 Mbps standard was low enough, but this lets the ISPs off the hook for underperforming even against that incredibly slow target. At the other end of the scale, an ISP guaranteeing gigabit download speeds needs to achieve 800 Mbps 80% of the time. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
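
Here's a minimal sketch of how a single location's quarterly results might be scored against those thresholds. The 100 ms latency limit is my assumption for illustration – the order defines the actual latency standard:

```python
def location_passes(download_tests_mbps, latency_tests_ms,
                    committed_mbps, latency_limit_ms=100.0):
    """Score one location against the standards described above: 80% of
    speed tests must reach 80% of the committed speed, and 95% of
    latency tests must meet the latency standard (the 100 ms default
    here is an assumption for illustration)."""
    speed_floor = 0.8 * committed_mbps
    speed_ok = sum(s >= speed_floor for s in download_tests_mbps) \
        >= 0.8 * len(download_tests_mbps)
    latency_ok = sum(l <= latency_limit_ms for l in latency_tests_ms) \
        >= 0.95 * len(latency_tests_ms)
    return speed_ok and latency_ok

# A gigabit commitment passes only if at least 80% of tests hit 800 Mbps.
print(location_passes([850, 910, 790, 820, 880], [20] * 100, 1000))  # True
```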

There are financial penalties for ISPs that don’t meet these tests (see the sketch after this list):

  • ISPs with between 85% and 100% of households meeting the test standards lose 5% of their FCC support.
  • ISPs with between 70% and 85% of households meeting the test standards lose 10% of their FCC support.
  • ISPs with between 55% and 70% of households meeting the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.
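
Encoded as a simple lookup (my own rendering of the schedule; I'm assuming a fully compliant ISP loses nothing):

```python
def support_reduction(pct_compliant: float) -> float:
    """Fraction of FCC support withheld under the tiers listed above.
    Assumes an ISP at 100% compliance loses nothing."""
    if pct_compliant >= 100:
        return 0.00
    if pct_compliant >= 85:
        return 0.05
    if pct_compliant >= 70:
        return 0.10
    if pct_compliant >= 55:
        return 0.15
    return 0.25

print(support_reduction(72))  # 0.1 -> this ISP loses 10% of its support
```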

For CAF II auction winners these reductions in funding would only be applied to the time periods remaining after they fail the tests. This particular auction covers a 10-year period, and the testing will start once the new networks are operational – networks that must be completed between years 3 and 6 after funding.

This will have the biggest impact on ISPs that overstated their network capability. For instance, numerous ISPs claimed the ability in the CAF auction to deliver 100 Mbps, and an ISP that delivers speeds slower than 80 Mbps across its footprint will fall into the bottom compliance tier and lose 25% of its funding.

Winners of the CAF II Auction

The FCC CAF II reverse auction recently closed, awarding $1.488 billion to build broadband in rural America. This funding was awarded to 103 recipients that will collect the money over ten years. The funded projects must be 40% complete by the end of three years and 100% complete by the end of six years. Almost $2 billion was originally slated for the auction, but the reverse auction process reduced the size of the awards and some census blocks got no bidders.

The FCC claims that 713,176 rural homes will be getting better broadband, but the real number of homes seeing a benefit from the auction is closer to 523,000, since the auction funded Viasat to provide already-existing satellite broadband to more than 190,000 homes.

The FCC claims that 19% of the homes covered by the grants will be offered gigabit speeds, 53% will be offered speeds of at least 100 Mbps, and 99.75% will be offered speeds of at least 25 Mbps. These statistics have me scratching my head. The 19% of homes that will be offered gigabit speeds are obviously going to be getting fiber, and I know a number of the winners who will be using the funds to help pay for fiber expansion. But I can’t figure out what technology accounts for the rest of the 53% of homes that supposedly will be able to get 100 Mbps speeds.

As I look through the filings I note that many of the fixed wireless providers claim that they can deliver speeds over 100 Mbps. It’s true that fixed wireless can be used to deliver 100 Mbps. But to achieve that speed customers either need to be close to the tower or else the wireless carrier has to dedicate extra resources to that customer – meaning less of the tower’s capacity can be used to serve other customers. I’m not aware of any WISPs that offer ubiquitous 100 Mbps speeds, because doing so means serving a relatively small number of customers from a given tower. To be fair to the WISPs, their CAF II filings also say they will be offering slower speeds like 25 Mbps and 50 Mbps. The FCC exaggerated the results of the auction by assuming that any recipient capable of delivering 100 Mbps to a few customers will be delivering it to all customers – something that isn’t true. The fact is that not many of the households beyond the 19% getting fiber will ever be able to buy 100 Mbps broadband. I know the FCC wants to get credit for improving rural broadband, but there is no reason to hype the results to be better than they are.

I also scratch my head wondering why Viasat was awarded $122 million in the auction. The company won funding for 190,595 households, or 26.7% of the households covered by the entire auction. Satellite broadband is every rural customer’s last choice for broadband. The latency is so poor on satellite broadband that it can’t be used for any real-time applications like watching live video, making a Skype call, connecting to school networks to do homework, or connecting to a corporate WAN to work from home. Why does satellite broadband even qualify for CAF II funding? Viasat had to fight to get into the auction and their entry was opposed by groups like the American Cable Association. The Viasat satellites are already available to all of the households in the awarded footprint, so this seems like a huge government giveaway that won’t bring any new broadband option to the 190,000 homes.

Overall the outcome of the auction was positive. Over 135,000 rural households will be getting fiber. Another 387,000 homes will be getting broadband of at least 25 Mbps, mostly using fixed wireless, with the remaining 190,000 homes getting the same satellite option they already have today.

It’s easy to compare this to the original CAF II program that gave billions to the big telcos and only required speeds of 10/1 Mbps. That program was originally intended to be a reverse auction open to anybody, but at the last minute the FCC gave all of the money to the big telcos. One has to imagine there was a huge amount of lobbying behind that giant giveaway.

Most of the areas covered by the first CAF II program had higher household density than this auction pool, and a reverse auction would have attracted a lot of ISPs willing to invest in faster technologies than the telcos deployed. The results of this auction show that most of those millions of homes would have gotten broadband of at least 25 Mbps instead of the beefed-up DSL or cellular broadband they are getting from the big telcos.

Upgrading Broadband Speeds

A few weeks ago Charter increased my home broadband speed from 60 Mbps to 130 Mbps with no change in price. My upload speed seems to be unchanged at 10 Mbps. Comcast is in the process of similar upgrades and is increasing base download speeds to between 100 Mbps and 200 Mbps in various markets.

I find it interesting that while the FCC is having discussions about keeping the definition of broadband at 25 Mbps, the big cable companies – these two alone have over 55 million broadband customers – are unilaterally increasing broadband speeds.

These companies aren’t doing this out of the goodness of their hearts, but for business reasons. First, I imagine this is a push to sharpen the contrast with DSL. There are a number of urban markets where customers can buy 50 Mbps DSL from AT&T and others, and this upgrade opens up a clear speed difference between cable broadband and DSL.

However, I think the main reason they are increasing speeds is to keep customers happy. This change was done quietly, so I suspect that most people had no idea that the change was coming. I also suspect that most people don’t regularly do speed tests and won’t know about the speed increase – but many of them will notice better performance.

One of the biggest home broadband issues is inadequate WiFi, with out-of-date routers or poor router placement degrading broadband performance. Pushing faster speeds into the house can overcome some of these WiFi issues.

This should be a wake-up call to everybody else in the industry to raise their speeds. There are ISPs and overbuilders all across the country competing against the giant cable companies, and they need to immediately upgrade speeds or lose the public relations battle in the marketplace. Even those who are not competing against these companies need to take heed, because any web search is going to show consumers that 100 Mbps broadband or faster is now the new standard.

These unilateral changes make a mockery of the FCC. It’s ridiculous to be debating a definition of broadband at 25 Mbps when the two biggest ISPs in the country have base products 5 to 8 times faster than that. States with broadband grant programs are having the same speed conversation, and this should alert them that the new goal for broadband needs to be at least 100 Mbps.

These speed increases were inevitable. We’ve known for decades that home demand for broadband has been doubling roughly every three years. When the FCC first started talking about 25 Mbps as the definition of acceptable broadband, the math said that within six years – two doublings – we’d be having the same discussion about 100 Mbps broadband, and here we are having that discussion.
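
The underlying math is simple compound doubling; a quick sketch (my own arithmetic):

```python
def projected_demand_mbps(base_mbps: float, years: float,
                          doubling_years: float = 3.0) -> float:
    """Project household broadband demand assuming it doubles every
    three years, per the trend described above."""
    return base_mbps * 2 ** (years / doubling_years)

# 25 Mbps today implies ~100 Mbps six years out (two doublings).
print(projected_demand_mbps(25, 6))   # 100.0
print(projected_demand_mbps(100, 6))  # 400.0 -> the next debate
```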

The FCC doesn’t want to recognize the speed realities in the world because they are required by law to try to bring rural speeds up to par with urban speeds. But this can’t be ignored, because these speed increases are not just for bragging rights. We know that consumers find ways to fill faster data pipes. Just two years ago I saw articles wondering if there would be any market for 4K video. Today, that’s the first thing offered to me on both Amazon Prime and Netflix – they shoot all new programming in 4K and offer it at the top of their menus. It’s been reported that at the next CES electronics show several companies will be pushing commercially available 8K televisions. That technology is going to require a broadband connection of between 60 Mbps and 100 Mbps depending upon the level of screen action. People are going to buy these sets and then demand programming to use them – and somebody will create the programming.

8K video is not the end game. Numerous companies are working on virtual presence, where we will finally be able to converse with a hologram of somebody as if they were in the same room. Early versions of this technology, which ought to be available soon, will probably use the same range of bandwidth as 8K video, but I’ve been reading about near-future technologies that would produce realistic holograms and might require as much as a 700 Mbps connection – perhaps the first real need for gigabit broadband.

While improving urban data speeds is great, every increase in urban broadband speeds highlights the poor condition of rural broadband. While urban homes are getting 130 – 200 Mbps for decent prices there are still millions of homes with either no broadband or with broadband at speeds of 10 Mbps or less. The gap between urban and rural broadband is growing wider every year.

If you’ve been reading this blog you know I don’t say a lot of good things about the big cable companies. But kudos to Comcast and Charter for unilaterally increasing broadband speeds. Their actions speak louder than anything that we can expect out of the FCC.

Modernizing CPNI Rules

I think we badly need new CPNI rules for the industry. CPNI stands for ‘Customer Proprietary Network Information’, and the CPNI rules govern the use of the data that telcos and ISPs gather on their customers. CPNI rules are administered by the FCC, and I think it’s fully within their current mandate to update the rules to fit the modern world.

While CPNI is related to privacy issues, it’s not exactly the same. CPNI rules cover how ISPs use the customer data that they must gather in order to make the network operate. Originally CPNI rules involved telephone call details – who we called, who called us, etc. CPNI rules have prohibited telcos from using this kind of data without the express consent of a consumer (or in response to a valid subpoena from law enforcement).

Today the telcos and ISPs gather a lot more information about us than just telephone calling information. For instance, a cellular company not only knows all of your call details, but they know where you are whenever you call, text or make a data connection from your cellphone. Every ISP knows every web search you make since they are the ones routing those requests to the Internet. If you buy newer ISP products like home automation they know all sorts of details that they can gather from monitoring motion detectors and other devices that are part of their service.

Such CPNI data is valuable because it can be used by an ISP to assemble a profile of each customer, particularly when CPNI data is matched with data gathered from other sources. Every large ISP has purchased a business arm aimed at helping it monetize customer data. The ISPs are all envious of the huge advertising revenues generated by Facebook and Google and want to climb into the advertising game.

The FCC’s authority to limit how carriers use customer proprietary data comes from Section 222 of the Communications Act of 1934, as amended by the Telecommunications Act of 1996. The statute specifically prohibits carriers from using CPNI data for marketing purposes. Over the years the FCC developed more specific CPNI rules that governed the telcos. However, the FCC has not updated those rules to cover the wide range of data that ISPs gather on us today. Telcos still ask customers for permission to use their telephone records, but they are not required to get customer permission to track the websites we visit or our location when using a cellphone.

The FCC could invoke CPNI protections for companies that they regulate. It gets dicier for the FCC to expand CPNI rules past traditional carriers. All sorts of web companies also gather information on users. Google makes most of their money through their search engine. They not only charge companies to get higher ranking for Google searches, but they monetize customer data by building profiles of each user that they can market to advertisers. These profiles are supposedly very specific – they can direct advertisers to users who have searched for any specific topic, be it people searching for information about diabetes or those looking to buy a new truck.

There are many who argue that companies like Google should be brought under the same umbrella of rules as ISPs. The ISPs rightfully claim that companies like Google have a major market advantage. But the ISPs clearly prefer the regulatory world where no company is subject to CPNI rules.

There are other web applications that are harder to tie to CPNI. For example, a social network like Facebook gathers huge amounts of private data about its users – but those users voluntarily build profiles and share that data freely.

There are more complicated cases such as Amazon, which has been accused of using customer shopping data to develop its own product lines that directly compete with vendors selling on the Amazon platform. The company clearly uses customer data for its own marketing purposes – but Amazon is not a carrier, and it would be a huge stretch to pull it under the CPNI rules.

It’s likely that platforms like Facebook or Amazon would have to be regulated with new privacy rules rather than with CPNI rules. That would require an act of Congress, and it’s likely that any new privacy rules would apply to a wide range of companies that use the web – the approach taken by the European Union.

Killing FTC Regulation?

NCTA, the lobbying group for the big cable companies, filed a pleading with the Federal Trade Commission (FTC) asking the agency to not get involved with regulating the broadband industry. When the FCC killed net neutrality, Chairman Ajit Pai promised that it was okay for the FCC to step away from broadband regulation since the FTC was going to take over much of the regulatory role. Now, a month after the net neutrality repeal went into effect, we have the big cable ISPs arguing that the FTC should have a limited role in regulating broadband. The NCTA comments were filed in a docket that asks how the FTC should handle the regulatory role handed to it by the FCC.

Pai’s claim was weak from the outset because of the nature of the way the FTC regulates. The agency basically pursues corporations of all kinds that violate federal trade rules or abuse the general public. For example, the FTC went after AT&T for throttling customers who had purchased unlimited data plans. However, FTC rulings don’t carry the same weight as FCC orders. FTC rulings are specific to the company under investigation; they might lead other companies to modify their behavior, but they don’t create a legal precedent that automatically applies to all carriers. In contrast, FCC rulings can be made to apply to the whole industry and can change the regulations for every ISP.

The NCTA petition asks the FTC not to pursue regulatory complaints against ISPs. For example, NCTA argues that the agency shouldn’t single out ISPs for unique regulatory burdens, but should instead pursue the large Internet companies like Facebook and Google. The NCTA claims that market forces will prevent bad behavior by ISPs and will punish a carrier that abuses its customers. They claim there is sufficient competition for cable broadband, such as from DSL, that customers will leave an ISP that is behaving poorly. In a world where the cable companies have demolished DSL and where cable is a virtual monopoly in most markets, they really made that argument! We have a long history in the industry that says otherwise – even when regulated by the FCC there is a long laundry list of ways that carriers have mistreated their customers.

One of the more interesting requests is that the ISPs want the FTC to preempt state and local rules that try to regulate them. I am sure this is due to the vigorous activity currently underway at the state level to create net neutrality and privacy rules. They want the FTC to issue guidelines to state attorneys general and state consumer protection agencies to remind them that broadband is regulated only at the federal level. It’s an interesting argument to make after the FCC has punted on regulating broadband and in a filing that asks the FTC to do the same. The ISPs want the FTC to leave them alone while acting as the watchdog that stops others from trying to regulate the industry.

I think this pleading was inevitable since the big ISPs are trying to take full advantage of the FCC walking away from broadband regulation. The ISPs view this as an opportunity to kill regulation everywhere. At best the FTC would be a weak regulator of broadband, but the ISPs don’t want any scrutiny of the way they treat their customers.

The history of telecom regulation has always moved in what I call waves. Over time the amount of regulation builds up to a point where companies can make a valid claim of being over-regulated. Over-regulation then gets relieved either by Congress or by a business-friendly FCC that loosens regulatory constraints. But when regulations get too lax the big carriers inevitably break enough rules to attract a new round of regulation.

We are certainly at the bottom of the trough of a regulatory wave as regulations are being eliminated or ignored. Over time the large monopolies in the industry will do what monopolies always do – they will take advantage of this period of light regulation, abuse customers in various ways, and invite new regulations. My bet is that customer privacy will be the issue that starts the climb back to the top of the regulatory wave. The ISPs’ argument that market forces will force good behavior is pretty laughable to anybody who has watched the big carriers over the years.

Regulating Digital Platforms

It seems like one of the big digital platforms is in the news almost daily – and not in a positive way. Yet there has been almost no talk in the US of trying to regulate digital platforms like Facebook and Google. Europe has taken some tiny steps, but regulation there is still in its infancy. In this country the only existing regulations that apply to the big digital platforms are antitrust laws, some weak privacy rules, and general corporate oversight from the Federal Trade Commission that protects against consumer fraud.

Any time there is the slightest suggestion of regulating these companies we instantly hear the cry that the Internet must remain free and unfettered. This argument harkens back to the early days when the Internet was a budding industry, and it seems irrelevant now that these are some of the biggest corporations in the world, holding huge power in our daily lives.

For example, small businesses can thrive or die due to a change in an algorithm on the Google search engine. Search results are so important to businesses that the billion-dollar SEO industry has grown to help companies manipulate their search results. We’ve recently witnessed the damage that can be done by nefarious parties on platforms like Facebook to influence voting or to shape public opinion around almost any issue.

Our existing weak regulations are of little use in trying to control the behavior of these big companies. For example, Europe has levied numerous penalties against Google for monopoly practices, but the fines haven’t been very effective in changing Google’s behavior. In this country our primary antitrust tool is to break up monopolies – an extreme remedy that doesn’t make much sense for the Google search engine or Facebook.

Regulating digital platforms would not be easy, because one of the key prerequisites of regulation is understanding a business well enough to craft sensible rules that can throttle abuses. We generally regulate monopolies, and the regulatory rules are intended to protect the public from the worst consequences of monopoly power. It’s not hard to make the case that both Facebook and Google are near-monopolies – but it’s not easy to figure out how to regulate them in any sensible way.

For example, the primary regulation of electric companies controls the profits of the monopolies to keep rates affordable. In the airline industry we regulate safety to force the airlines to do the needed maintenance on planes. It’s hard to imagine regulating a search engine in the same manner, when a slight change in a search algorithm can have big economic consequences across a wide range of industries. It doesn’t seem possible to somehow regulate the fairness of a web search.

Regulating social media platforms would be even harder. The FCC has occasionally in the past been required by Congress to try to regulate morality issues – such as monitoring bad language or nudity on the public airwaves. Most of the attempts by the FCC to follow these congressional mandates were ineffective and often embarrassing for the agency. Social platforms like Facebook are already struggling to define ways to remove bad actors from their platform and it’s hard to think that government intervention in that process can do much more than to inject politics into an already volatile situation.

One of the problems with trying to regulate digital platforms is defining who they are. The FCC today has separate rules that can be used to regulate telecommunications carriers and media companies. How do you define a digital platform? Facebook, LinkedIn and Snapchat are all social media – they share some characteristics but also have wide differences. Just defining what needs to be regulated is difficult, if not impossible. For example, all of the social media platforms gain much of their value from user-generated content. Would that mean that a site like WordPress that houses this blog is a social media company?

Any regulations would have to start in Congress because there is no other way for a federal agency to be given the authority to regulate the digital platforms. It’s not hard to imagine that any effort out of Congress would concentrate on the wrong issues, much like the rules that made the FCC the monitor of bad language. I know as a user of the digital platforms that I would like to see some regulation in the areas of privacy and use of user data – but beyond that, regulating these companies is a huge challenge.

One Touch Make Ready

Earlier this month, in WC Docket No. 17-84 and WT Docket No. 17-79, the FCC released new rules for one touch make ready (OTMR) for connecting wires to poles. The new rules allow a new attacher to a pole to use a single contractor to perform simple make-ready work, which the order defines as work where “existing attachments in the communications space of a pole could be transferred without any reasonable expectation of a service outage or facility damage and does not require splicing of any existing communication attachment or relocation of an existing wireless attachment.” The new rules go into effect on February 1, 2019, or sooner – 30 days after a notice announcing approval by the Office of Management and Budget is published in the Federal Register, if that comes first.

The OTMR rules don’t apply to more complex make-ready work where poles need to be replaced or where existing cables must be cut and spliced to accomplish the needed changes. The new rules don’t cover wireless attachments, so this is not an order that lets wireless companies place devices anywhere on poles at their choice (something the wireless companies are lobbying for). These rules also don’t apply to any work done above the power space at the top of poles.

For those not familiar with make-ready, a new attacher must pay to rearrange existing wires if there is not enough space on the poles for the new wire to meet safety standards. In most cases this can be accomplished by shifting existing wires higher or lower on the pole to create the needed clearance.

Possibly the most interesting part of the new order is that the FCC says that a new attacher is not responsible for the cost of fixing problems that are due to past attachers being out of compliance with safety codes. The reality is that most make-ready work is due to past attachers not spacing their wires according to code. This FCC language opens the door for new attachers to argue that some of the cost of make-ready should be charged to past attachers. Anybody who wants to make such claims needs to photograph and document existing violations before doing the work. I can foresee big fights over this issue after the make-ready work is completed.

These rules end some of the practices that have made it time-consuming and costly to put a new wire on a pole. The existing rules allowed for sequential make-ready, where each existing utility could send out its own crew to do the work – adding time as each separate crew coordinated the work, and adding cost since the new attacher had to pay for multiple crews.

The new rules don’t apply everywhere and to all pole owners. There is still an exception for poles owned by municipalities and by electric cooperatives. The rules also don’t automatically apply to any state that has its own set of pole attachment rules. There are currently 22 states that have adopted at least some of their own pole attachment rules and the states still have the option to modify the new FCC rules. Expect delays in many states past the February 1 effective date as states deliberate on the issue. Interestingly, there are also two cities, Louisville, KY and Nashville, TN, that have already adopted their own version of OTMR and the order does not say if local governments have this right.

The order considerably shortens the time required to perform simple make-ready. There are many nuances in the new timeline that make it hard to condense into a paragraph, but the deadlines are considerably shorter than under the previous FCC rules. The FCC also shortened the timeline for some of the steps of complex make-ready. Unfortunately, in many cases it’s the complex make-ready timelines that will still govern a project, because a few poles needing complex make-ready can delay implementation of an entire new fiber route.

The order encourages pole owners to publish a list of contractors that are qualified to do make-ready work. The new rules also define the criteria for selecting a contractor in cases where the pole owner doesn’t specify one. Pole owners can veto a contractor suggested by the new attacher, but in doing so they must suggest a qualified contractor they find acceptable. Not addressed in the order is the situation where a utility insists on doing all of the work itself.

As a side note, this order also prohibits state and local governments from imposing moratoria on new wireless pole attachments. The ruling doesn’t stop states from imposing new rules, but it prohibits them from blocking wireless carriers from getting access to poles.

Overall this is a positive order for anybody that wants to add fiber to existing poles. It simplifies and speeds up the pole attachment process, at least for simple attachments. It should also hold down pole attachment costs significantly by allowing one contractor to do all of the needed work rather than letting each utility bill for moving its own wires. There are still flaws in the order – for instance, although the time frames have been reduced, the pole attachment process can still take a long time when complex make-ready work is needed. But overall this is a much-needed improvement to the process that has caused most of the delays in deploying new fiber.

The Definition of Broadband

The FCC recently issued a Notice of Inquiry (NOI) seeking input on next year’s broadband progress report. As usual, and perhaps every year into the future, this annual exercise stirs up the industry as we fight over the regulatory definition of broadband speed. That definition matters because Congress has tasked the FCC with making sure that everybody in the country has access to broadband. Today broadband is defined as 25 Mbps downstream and 3 Mbps upstream; households that can’t buy that speed are considered underserved if they can get some broadband, and unserved if they have no broadband options.

The NOI proposes keeping the 25/3 Mbps definition of broadband for another year. The FCC knows that if it raises the definition, millions of homes will suddenly be considered underserved. However, the agency is bowing to pressure and this year will gather data to see how many households have access to 50/5 Mbps broadband.

It was only a year ago that this FCC set off a firestorm by suggesting a reversion to the old definition of 10/1 Mbps. That change would have instantly classified millions of rural homes as having adequate broadband. The public outcry was immediate, and the FCC dropped the idea. For last year’s report the FCC also considered counting mobile broadband as a substitute for landline broadband – another move that would have reclassified millions into the served category. The FCC is not making that same recommendation this year – but it is gathering data on the number of people who have access to cellular data speeds of 5/1 Mbps and 10/3 Mbps.

The FCC has also been tasked by Congress with getting faster broadband to schools. This year’s NOI recommends keeping the current FCC goal of all schools immediately having access to 100 Mbps per 1,000 students, with a longer-term goal of 1 Gbps per 1,000 students.

Commissioner Jessica Rosenworcel has suggested in the current NOI that the official definition of broadband be increased to 100 Mbps download. She argues that our low target for defining broadband is why “the United States is not even close to leading the world” in broadband.

I think Commissioner Rosenworcel is on to something. The gap between the fastest and slowest broadband speeds is widening. This year both Comcast and Charter are unilaterally raising broadband speeds to customers. Charter kicked up the speed at my house from 60 Mbps to 130 Mbps a few weeks ago. AT&T is building fiber to millions of customers. Other fiber overbuilders continue to invest in new fiber construction.

The cable companies decided a decade ago that their best strategy was to stay ahead of the speed curve. This is at least the third round of unilateral speed increases that I can remember. A customer who purchased and kept a 20 Mbps connection a decade ago is probably now receiving over 100 Mbps for that same connection. One way to interpret Commissioner Rosenworcel’s suggestion is that the definition of broadband should grow over time to meet the market reality. If Charter and Comcast both think that their 50 million urban customers need speeds of at least 100 Mbps, then that ought to become the definition of broadband.

However, a definition of broadband at 100 Mbps creates a major dilemma for the FCC. The only two widely deployed technologies that can achieve that kind of speed today are fiber and the cable companies’ hybrid fiber-coaxial networks. As I wrote just a few days ago, there are new DSL upgrades that can deliver up to 300 Mbps at distances of 3,000 to 4,000 feet from a DSL hub – but none of the US telcos are pursuing the technology. Fixed wireless technology can deliver 100 Mbps – but only to customers living close to a wireless tower.

If the FCC were to adopt a definition of broadband at 100 Mbps, it would finally be recognizing that the fixes for rural broadband it has been funding are totally inadequate. The agency spent billions in the CAF II program to bring rural broadband up to 10/1 Mbps. It is getting ready to give out a few billion more in the CAF II reverse auction, which will do much the same, except for the few grant recipients that use the money to help fund fiber.

By law, the FCC would have to undertake programs to bring rural broadband up to a newly adopted 100 Mbps standard. That would mean finding many billions of dollars somewhere. I don’t see this FCC being bold enough to do that – they seem determined to ignore the issue hoping it will go away.

This issue can only be delayed for a few more years. The country is still on the curve where the need for broadband at households doubles every three or so years. As the broadband usage in urban homes grows to fill the faster pipes being supplied by the cable companies it will become more apparent each year that the definition of broadband is a lot faster than the FCC wants to acknowledge.

Looking Closer at CAF II Broadband

AT&T is making the rounds in rural Kentucky, not too far from where I live, announcing the introduction of the residential wireless broadband product that is the result of the FCC’s CAF II program. Today I’m looking in more detail at that product.

AT&T was required under the CAF II rules to deliver broadband speeds of at least 10 Mbps download and 1 Mbps upload. AT&T’s Kentucky announcement says the company will deliver products with at least that much speed, so it’s possible that customers might see something a little faster. Or the company could cap speeds at 10 Mbps, and we’ll have to wait for reports from customers about actual speeds.

AT&T accepted nearly $186 million in FCC funds to bring CAF II broadband capabilities to 84,333 households in the state, or $2,203 per household. They say all of those homes will have the broadband available by the end of 2020 (although there is no penalty if some of the homes don’t get covered – which one would expect since many homes are likely to be too far from a cell tower).

AT&T will be delivering the broadband in Kentucky using LTE broadband from cellphone towers. This is delivered to homes by placing a small antenna box (not a dish) on the exterior of a home. They say that they will be using a different set of frequencies for CAF II broadband than what is used for cellular service, meaning there should be no degradation of normal cellular service.

I saw a news article in Kentucky that says the price will be $50 per month, but that’s a special one-year price offer for customers also willing to sign up for DirecTV. Following are more specific details of the normal product and pricing:

  • Customers can get a price of $60 per month for the first year by signing a 12-month contract. After that year the price increases to $70 per month, which is also the price for those not willing to agree to a contract.
  • Customers signing a contract see no installation charge, but otherwise there is a $99 one-time fee to connect.
  • Customers who break the one-year contract pay an early termination charge of $10 for each remaining month of the contract.
  • There is a $150 fee for customers who don’t return the antenna box.
  • There is a monthly data cap of 170 Gigabytes of downloaded data. Customers pay $10 for each additional 50 GB of download up to a maximum of $200 per month. AT&T is offering a 340 GB monthly data cap right now for customers who bundle with DirecTV – but that’s a temporary offer until October 1.
  • AT&T also will layer on a monthly $1.99 administrative fee that they pocket.

I think the pricing is far too high considering that the $186 million given to AT&T probably paid for all, or nearly all, of the cost of the upgrades needed to deliver the service. Some of that money probably went to bolster fiber to rural cell sites, and the rest would have been used to add the new electronics to those sites. AT&T used free federal money to create a $72 monthly broadband product, and even before considering the data cap this is a product with a huge margin, since AT&T doesn’t have to recover the cost of the underlying network.

The small data cap is going to generate a lot of additional revenue for AT&T. The monthly data cap of 170 GB is already too small. Comcast just reported in June that the average download for all of their 23 million broadband customers was 151 GB per month. That means there are already a significant number of homes that want to use more than AT&T’s monthly 170 GB cap. We know that monthly home demand for broadband keeps growing and the Comcast average just a year ago was 128 GB per month. With that growth, within a year the average customer will want more than AT&T’s cap.
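
A quick check of that claim using Comcast's own numbers (my extrapolation, which assumes the year-over-year growth rate holds):

```python
avg_this_year_gb = 151   # Comcast's reported average, June
avg_last_year_gb = 128   # Comcast's average one year earlier

growth_rate = avg_this_year_gb / avg_last_year_gb   # ~1.18, i.e. 18% per year
projected_avg_gb = avg_this_year_gb * growth_rate   # next year's average

print(round(projected_avg_gb))  # 178 -> already past AT&T's 170 GB cap
```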

A few years ago when I was on Comcast they measured my 3-person home as using nearly 700 GB per month. On the AT&T plan my monthly bill would be $180 per month. Within a few years most homes will want to use more data than AT&T’s cap. The FCC really screwed the public when they didn’t insist that carriers taking the funding should provide unlimited downloads, or at least some high data cap like 1 terabyte. That stingy data cap gives AT&T permission to print money in rural America.
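
That $180 figure follows directly from the price list above. Here's a small sketch of the overage math (my own encoding of the plan terms; I'm reading the $200 maximum as a cap on overage charges, and I'm ignoring the $1.99 administrative fee and taxes):

```python
import math

BASE_PRICE = 70.00        # monthly price without a contract
DATA_CAP_GB = 170
OVERAGE_BLOCK_GB = 50
OVERAGE_BLOCK_PRICE = 10.00
OVERAGE_MAX = 200.00      # assumed cap on total overage charges

def monthly_bill(usage_gb: float) -> float:
    """Estimated monthly bill under the plan terms described above."""
    overage_gb = max(0.0, usage_gb - DATA_CAP_GB)
    blocks = math.ceil(overage_gb / OVERAGE_BLOCK_GB)
    return BASE_PRICE + min(blocks * OVERAGE_BLOCK_PRICE, OVERAGE_MAX)

# A 700 GB household is 530 GB over the cap: 11 blocks of 50 GB = $110.
print(monthly_bill(700))  # 180.0
```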

The 10 Mbps speed is also a big problem. That speed is already inadequate for most households, which now want to run multiple simultaneous streams. I’ve written many times about the huge inefficiencies in home WiFi, and a 10 Mbps connection is just barely adequate for two video streams as long as there are no other broadband uses in the home at the same time. A typical home with kids these days is going to want to simultaneously watch video, do homework, play games, browse the web, download files and work from home. A home with 10 Mbps is nowhere close to equivalent to much faster urban broadband connections. You don’t have to look forward more than a few years to know that a 10 Mbps connection is soon going to feel glacially slow.

Finally, cellular data has a higher latency than landline broadband, with latency as high as 100 msec. Customers might have problems at times on this product maintaining video streams, making VoIP calls or staying connected to a school or work server.

I’m sure that a home that has never had broadband is going to welcome this product. But it’s not going to take them long to realize that this is not the same broadband available to most homes. They are also going to realize that it’s possibly the last speed upgrade they are going to see for a long time since AT&T and the FCC want to check off these homes as now having broadband.

Plummeting Franchise Fees

The City of Creve Coeur, Missouri recently filed a suit against Netflix and Hulu claiming that the companies should be paying the same local franchise fees as Charter Communications, the incumbent video provider in the community. The City claims that it is losing franchise tax revenues as people cut the cord, and it wants to tax the companies that are taking that business away from Charter. The City argues that Netflix and Charter ride the same wires and rights-of-way to deliver content, so both should be taxed the same.

My quick reaction is that the lawsuit will get little traction due to the numerous differences between Charter and Netflix. However, I’ve learned over the years that it’s hard to predict tax disputes and it’s certainly possible that a judge might agree that Netflix can be taxed. If the courts see this as a regulatory battle the case will likely get referred to the FCC, but there’s no telling what happens if it’s instead considered as a tax dispute.

Most cable franchise taxes around the country are levied against the cable TV revenues generated in a community. The nature of franchise agreements varies across the country, and some jurisdictions also tax telephone and broadband services.

There are some interesting differences between a cable provider like Charter and Netflix:

  • I’ve read a lot of franchise agreements, and one of their most common characteristics is that, while they assess the tax levy on cable revenues, the basis of the agreement is granting access to public rights-of-way to allow a cable provider to hang wires or bury cable in the community. Charter owns a wired network in the City, while a company like Netflix does not.
  • Franchise agreements almost always create an obligation for a cable provider to serve everywhere in the community, or at least to the parts of the community that have a certain level of home density. For instance, cable companies are often required to build wires to any parts of town that have at least 15 or 20 homes per linear mile. The same obligation can’t really be applied to Netflix – they can only sell to homes that have sufficient broadband to use their service.
  • There are often other requirements that come with a franchise. For instance, the franchise holder might be required to dedicate a channel for local government programming. Franchise holders are often required to provide fiber or bandwidth to the City. Netflix wouldn’t be able to meet any of these obligations.

I don’t know if the City ultimately wants Netflix and Hulu to sign a franchise agreement, but if it does, the City might not like the result. Current regulations say a City can’t demand concessions from one franchise holder that don’t apply to all franchise holders. I can picture a stripped-down franchise agreement for Netflix, which Charter would immediately demand to use if Netflix were excused from any obligations required of Charter.

The FCC does not want this issue handed to them because it opens the door to defining who is a cable company. The agency opened an investigation into this issue a few years ago and quietly let it drop, because it’s not a decision they want to make. The FCC is constrained on many issues related to cable by laws passed by Congress. I think the FCC decided early in the investigation that they did not want to tackle the sticky issues of declaring online programmers to be cable companies. Had the FCC done so then this suit might have good traction.

Even a few years ago at the early start of online content the FCC could see that the online content world would become messy. There are now companies like Sling TV and DirecTV Now which look a lot like a cable company in terms of programming. But there are far more online providers that don’t fit the mold. Is a company that only streams British comedy, or soccer, or mystery movies really a cable company? Is a web service that streams blogs a content provider? I think the FCC was right to let this issue quietly die. I’m sure the day will come when the FCC finally acts on the issue, but when they do it’s more likely that traditional cable companies will be freed from regulation instead of dragging OTT providers into regulation.

It’s hard to see how any city can justify the legal expense of pursuing this to the end – even winning might not give them the results they want. Without congressional action the City would have to tackle each of the hundreds of online video providers to somehow get them to also pay the tax. This feels a lot like tilting at windmills. However, many taxes we pay today started when one jurisdiction tackled the issue and others climbed aboard – so this is worth keeping an eye on.