More FCC Mapping Woes

The FCC has another new billion-dollar grant program, this one aimed at improving rural cellular coverage. Labeled the Mobility Fund II, the program will conduct a reverse auction sometime next year to award $4.53 billion to cellular carriers to extend wireless coverage to the most remote parts of the country. In exchange for the funding, a cellular carrier must bring 4G LTE coverage to the funded areas and achieve cellular download speeds of at least 10 Mbps. Funding will be distributed over 10 years, with buildout requirements due sooner than that.

Just like with the CAF II program, the areas eligible for funding are based upon the FCC’s broadband maps using data collected by the existing cellular carriers. As you might expect, the maps show that the parts of the country with the worst coverage – those eligible for funding – are mostly in the mountains and deserts of the west and in Appalachia.

The release of the Mobility Fund II maps instantly set off an uproar as citizens everywhere complained about the lack of cellular coverage and politicians from all over the country asked the FCC why there wasn't more funding coming to their states. The FCC received letters from senators in Mississippi, Missouri, Maine and a number of other states complaining that their states have areas with poor or non-existent cellular coverage that were not covered by the new fund.

If you've traveled anywhere in rural America you know that there are big cellular dead spots everywhere. I've been to dozens of rural counties all across America in the last few years and every one of them has areas without good cellular coverage. Everybody living in rural America can point to places where cellphones don't work.

The issue boils down to the FCC mapping used to define cellular and broadband coverage. The maps for this program were compiled from a one-time data request to the cellular carriers asking for their existing 4G coverage. It's obvious from the protests that the carriers claim cellular coverage where it doesn't exist.

In August the Rural Wireless Association (RWA), the association of smaller wireless companies (they still exist!), filed a complaint with the FCC claiming that Verizon lied about its cellular coverage by claiming coverage in many areas that don't have it. The RWA says that Verizon's exaggerated coverage claims will block funding to many areas that should be eligible.

The Mobility Fund II program allows carriers to challenge the FCC’s maps by conducting tests to identify areas that don’t have good cellular coverage. The smaller carriers in the RWA have been filing these challenges and the FCC just added 90 additional days for the challenge process. Those challenges will surely add new eligible coverage areas for this program.

But the challenge program isn't going to uncover many of these areas because there are large parts of the country that are not close to an RWA carrier and that won't be challenged. People with no cellular coverage who are not part of this grant program might never get good cellular coverage – something that's scary as the big telcos plan to tear down copper in rural America.

The extent of the challenges against the Verizon data is good evidence that Verizon overstated its 4G LTE coverage. The RWA members I know think Verizon did this purposefully, either to block others from expanding cellular networks into areas already served by Verizon or to direct more of this new fund to areas where Verizon might more easily claim some of the $4.5 billion.

To give Verizon a tiny amount of credit, knowing cellular coverage areas is hard. If you've ever seen a coverage map from a single cell tower you'll instantly notice that it looks like a many-armed starfish. There are parts of the coverage area where good signal extends outward for many miles, but there are other areas where the signal is blocked by a hill or other impediments. You can't draw circles on a map around a cell tower to show coverage because it only works that way on the Bonneville Salt Flats. There can be dead spots even near the cell tower.

The FCC fund is laudable in that it's trying to bring cellular coverage to those areas that clearly don't have it. But there are countless other holes in cellular coverage that this kind of fund cannot solve, and people living in the many smaller coverage holes won't get any relief from this funding mechanism. Oddly, the fund will bring cellular coverage to areas where almost nobody lives while not addressing cellular holes in more populated areas.

FCC Speed Tests for ISPs

ISPs awarded CAF II funding in the recent auction need to be aware that they will be subject to compliance testing for both latency and speed on their new broadband networks, with financial penalties for those that fail the tests. The FCC revised the testing standards in July in Docket DA 18-710. The new standards become effective with testing that starts in the third quarter of 2019, and they will replace the standards already in place for ISPs that received funding from earlier rounds of the CAF program as well as ISPs getting A-CAM or other rate-of-return USF funding.

ISPs can choose between three testing methods. First, they may elect what the FCC calls the MBA program (Measuring Broadband America), which uses an FCC-approved external vendor to perform the testing – the firm that has been testing speeds on the large telcos' networks for many years. ISPs can also use existing network tools if they are built into the customer CPE and allow test pinging and other testing methodologies. Finally, an ISP can install 'white boxes' that provide the ability to perform the tests.

The households to be tested are chosen at random by the ISP every two years. The FCC doesn't describe a specific method for ensuring that the selections are truly random, but the ISP must describe to the FCC how this is done. It wouldn't be hard for an ISP to fudge the results of the testing by making sure that customers from slow parts of the network are not in the testing sample.
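
To make the idea concrete, here is a minimal sketch (in Python, with hypothetical household IDs and function names of my own invention) of how an ISP could make its sample both random and auditable. Seeding the generator with a disclosed value would let the FCC or an auditor re-run the draw and confirm that slow neighborhoods weren't quietly excluded – the FCC doesn't require this, but it illustrates what a verifiable selection could look like:

```python
import random

def draw_test_sample(caf_household_ids, sample_size, audit_seed):
    """Draw a reproducible random sample of CAF-funded households to test.

    A fixed, disclosed seed makes the draw repeatable, so an auditor can
    verify that customers in slow parts of the network weren't excluded.
    """
    rng = random.Random(audit_seed)  # fixed seed -> repeatable draw
    return rng.sample(caf_household_ids, sample_size)

# Hypothetical example: 30 test locations drawn from 300 CAF households
households = [f"HH-{n:05d}" for n in range(1, 301)]
print(draw_test_sample(households, 30, audit_seed=2019))
```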

The number of tests to be conducted varies with the number of customers for which a recipient is getting CAF support: if there are 50 or fewer CAF households they must test 5 customers; for 51 to 500 CAF households they must test 10% of households; and for more than 500 CAF households they must test 50. ISPs that declare a high latency must test more locations, with a maximum of 370.
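
As a quick sanity check on those tiers, here's a small sketch (the function name is mine, and the separate high-latency schedule isn't detailed here, so only its 370-location cap is noted in the comment):

```python
def required_test_locations(caf_households):
    """Minimum test locations per the tiers described above (base case).

    ISPs that declare high latency must test more locations, up to a
    maximum of 370; that schedule isn't spelled out here, so it's omitted.
    """
    if caf_households <= 50:
        return 5
    if caf_households <= 500:
        return round(caf_households * 0.10)  # 10% of CAF households
    return 50

# Examples: 40 households -> 5 tests; 300 -> 30; 2,000 -> 50
for n in (40, 300, 2000):
    print(n, required_test_locations(n))
```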

ISPs must conduct the tests for a solid week each quarter, including weekends, to eliminate seasonality. Tests must be conducted in the evenings between 6:00 PM and 12:00 AM. Latency tests must be run every minute during the six-hour testing window. Speed tests – run separately for upload speeds and download speeds – must be run once per hour during the six-hour window.
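
Back-of-the-envelope, that schedule adds up to a lot of measurements per location in each quarterly test week (a quick sketch, assuming the six-hour window runs every evening of the test week):

```python
DAYS = 7          # one solid week per quarter, weekends included
WINDOW_HOURS = 6  # 6:00 PM to 12:00 AM

latency_tests = DAYS * WINDOW_HOURS * 60  # one per minute -> 2,520 per location
speed_tests = DAYS * WINDOW_HOURS         # one per hour -> 42 upload + 42 download

print(latency_tests, "latency tests;", speed_tests, "upload and", speed_tests, "download speed tests")
```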

The FCC has set expected standards for the speed tests. The standards are based upon the required speeds of the specific program – for example, the first CAF II program required speeds of at least 10/1 Mbps. In the latest CAF program the testing will be based upon the speeds the ISP declared it could meet when entering the auction – speeds that can be as fast as 1 Gbps.

ISPs are expected to meet the latency standards 95% of the time. Speed tests must achieve 80% of the expected upload and download speeds 80% of the time. This might surprise people living in the original CAF II areas, because it means the big telcos only need to achieve download speeds of 8 Mbps for 80% of customers to meet the CAF standard. The 10/1 Mbps standard was low enough, but this lets the ISPs off the hook for underperforming even against that incredibly slow target. The same requirement means that an ISP guaranteeing gigabit download speeds needs to achieve 800 Mbps 80% of the time. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
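
Here's a minimal sketch of that 80/80 speed test as I read it (my own function and variable names; the FCC's actual aggregation across locations is more involved than this single-location check):

```python
def location_meets_speed_standard(measured_mbps, committed_mbps):
    """True if at least 80% of a location's tests hit 80% of the committed speed."""
    threshold = 0.80 * committed_mbps
    passing = sum(1 for m in measured_mbps if m >= threshold)
    return passing / len(measured_mbps) >= 0.80

# A gigabit commitment must hit 800 Mbps in at least 80% of tests.
samples = [950, 870, 790, 920, 810, 840, 760, 900, 880, 830]
print(location_meets_speed_standard(samples, committed_mbps=1000))  # True (8 of 10 pass)
```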

There are financial penalties for ISPs that don't meet these tests (see the sketch after this list):

  • ISPs that have between 85% and 100% of households that meet the test standards lose 5% of their FCC support.
  • ISPs that have between 70% and 85% of households that meet the test standards lose 10% of their FCC support.
  • ISPs that have between 55% and 70% of households that meet the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.
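
Expressed as a simple lookup, the tiers work out as follows (a sketch; the boundaries follow the list above, and I've assumed fully compliant ISPs keep all support since the order excuses them from quarterly testing):

```python
def support_reduction(percent_compliant):
    """Fraction of FCC support withheld, per the compliance tiers above.

    Fully compliant ISPs (100%) keep all support and, per the order,
    drop back to annual rather than quarterly testing.
    """
    if percent_compliant >= 100:
        return 0.00
    if percent_compliant >= 85:
        return 0.05
    if percent_compliant >= 70:
        return 0.10
    if percent_compliant >= 55:
        return 0.15
    return 0.25
```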

For CAF II auction winners these reductions in funding would only apply to the time periods remaining after they fail the tests. This particular auction covers a 10-year period, and the testing starts once the new networks are operational, which is required between years 3 and 6 after funding.

This will have the biggest impact on ISPs that overstated their network capability. For instance, numerous ISPs claimed the ability in the CAF auction to deliver 100 Mbps, and they stand to lose up to 25% of their funding if they can't deliver speeds of at least 80 Mbps.

Winners of the CAF II Auction

The FCC's CAF II reverse auction recently closed with an award of $1.488 billion to build broadband in rural America. The funding was awarded to 103 recipients that will collect the money over ten years. The funded projects must be 40% complete by the end of year three and 100% complete by the end of year six. Almost $2 billion was originally slated for the auction, but the reverse-auction bidding lowered the awards and some census blocks got no bidders.

The FCC claims that 713,176 rural homes will be getting better broadband, but the real number of homes that benefit from the auction is closer to 523,000, since the auction funded Viasat to provide already-existing satellite broadband to roughly 190,000 homes.

The FCC claims that 19% of the homes covered by the grants will be offered gigabit speeds, 53% will be offered speeds of at least 100 Mbps and 99.75% will be offered speeds of at least 25 Mbps. These statistics have me scratching my head. The 19% of homes that will be offered gigabit speeds are obviously getting fiber, and I know a number of the winners who will be using the funds to help pay for fiber expansion. But I can't figure out what technology accounts for the remaining 34% of homes (the 53% less the 19% getting fiber) that supposedly will be able to get 100 Mbps speeds.

As I look through the filings I note that many of the fixed wireless providers claim they can deliver speeds over 100 Mbps. It's true that fixed wireless can deliver 100 Mbps, but to achieve that speed customers either need to be close to the tower or the carrier has to dedicate extra resources to them – meaning less of the tower's capacity is available to serve other customers. I'm not aware of any WISPs that offer ubiquitous 100 Mbps speeds, because doing so means serving a relatively small number of customers from a given tower.

To be fair to the WISPs, their CAF II filings also say they will offer slower speeds like 25 Mbps and 50 Mbps. The FCC exaggerated the results of the auction by assuming that any recipient capable of delivering 100 Mbps to a few customers will be delivering it to all customers – something that isn't true. The fact is that few of the households beyond the 19% getting fiber will ever buy 100 Mbps broadband. I know the FCC wants credit for improving rural broadband, but there is no reason to hype the results to be better than they are.

I also scratch my head wondering why Viasat was awarded $122 million in the auction. The company won funding for 190,595 households, or 26.7% of the households covered by the entire auction. Satellite broadband is every rural customer's last choice for broadband. The latency on satellite broadband is so poor that it can't be used for real-time applications like watching live video, making a Skype call, connecting to school networks to do homework or connecting to a corporate WAN to work from home. Why does satellite broadband even qualify for CAF II funding? Viasat had to fight to get into the auction and its entry was opposed by groups like the American Cable Association. The Viasat satellites are already available to all of the households in the awarded footprint, so this looks like a huge government giveaway that won't bring any new broadband option to those 190,000 homes.

Overall the outcome of the auction was positive. Over 135,000 rural households will be getting fiber. Another 387,000 homes will be getting broadband of at least 25 Mbps, mostly using fixed wireless, with the remaining 190,000 homes getting the same satellite option they already have today.

It's easy to compare this to the original CAF II program that gave billions to the big telcos and only required speeds of 10/1 Mbps. That program was originally intended to be a reverse auction open to anybody, but at the last minute the FCC gave all of the money to the big telcos. One has to imagine there was a huge amount of lobbying done to achieve that giant giveaway.

Most of the areas covered by the first CAF II program had higher household density than this auction pool, and a reverse auction would have attracted a lot of ISPs willing to invest in faster technologies than the telcos. The results of this auction show that most of those millions of homes would have gotten broadband of at least 25 Mbps instead of the beefed-up DSL or cellular broadband they are getting through the big telcos.

Modernizing CPNI Rules

I think we badly need new CPNI rules for the industry. CPNI stands for 'Customer Proprietary Network Information', and the CPNI rules govern the use of data that telcos and ISPs gather on their customers. The rules are administered by the FCC, and I think it's fully within the agency's current mandate to update them to fit the modern world.

While CPNI is related to privacy, it's not exactly the same issue. CPNI rules govern how carriers use the customer data they must gather in order to make the network operate. Originally CPNI rules covered telephone call details – who we called, who called us, etc. The rules have prohibited telcos from using this kind of data without the express consent of a consumer (or in response to a valid subpoena from law enforcement).

Today the telcos and ISPs gather a lot more information about us than just telephone calling information. For instance, a cellular company not only knows all of your call details, but they know where you are whenever you call, text or make a data connection from your cellphone. Every ISP knows every web search you make since they are the ones routing those requests to the Internet. If you buy newer ISP products like home automation they know all sorts of details that they can gather from monitoring motion detectors and other devices that are part of their service.

Such CPNI data is valuable because an ISP can use it to assemble a profile of each customer, particularly when the CPNI data is matched with data gathered from other sources. Every large ISP has purchased a business arm aimed at helping them monetize customer data. The ISPs are all envious of the huge advertising revenues generated by Facebook and Google and want to climb into the advertising game.

The FCC's authority to limit how carriers use customer proprietary data comes from Section 222 of the Communications Act of 1934, as amended by the Telecommunications Act of 1996. The statute specifically prohibits carriers from using CPNI data for marketing purposes. Over the years the FCC developed more specific CPNI rules that governed the telcos. However, the FCC has not updated those rules to cover the wide range of data that ISPs gather on us today. Telcos still ask customers for permission to use their telephone records, but they are not required to get customer permission to track the websites we visit or our location when using a cellphone.

The FCC could invoke CPNI protections for the companies they regulate. It gets dicier for the FCC to expand CPNI rules past traditional carriers. All sorts of web companies also gather information on users. Google makes most of its money through its search engine. It not only charges companies for sponsored placement above search results, but it monetizes customer data by building profiles of each user that it can market to advertisers. These profiles are supposedly very specific – they can direct advertisers to users who have searched for any given topic, be it people searching for information about diabetes or those looking to buy a new truck.

There are many who argue that companies like Google should be brought under the same umbrella of rules as ISPs. The ISPs rightfully claim that companies like Google have a major market advantage. But the ISPs clearly prefer the regulatory world where no company is subject to CPNI rules.

There are other web applications that are harder to justify as being related to CPNI. For example, a social network like Facebook gathers huge amounts of private data about its users – but those users voluntarily build profiles and share that data freely.

There are more complicated cases such as Amazon, which has been accused of using customer shopping data to develop its own product lines to compete directly with vendors selling on the Amazon platform. The company clearly uses customer data for its own marketing purposes – but Amazon is not a carrier and it would be a huge stretch to pull it under the CPNI rules.

It's likely that platforms like Facebook or Amazon would have to be regulated with new privacy rules rather than with CPNI rules. That requires an act of Congress, and it's likely that any new privacy rules would apply to a wide range of companies that use the web – the approach taken by the European Union.

Killing FTC Regulation?

NCTA, the lobbying group for the big cable companies, filed a pleading with the Federal Trade Commission (FTC) asking the agency not to get involved with regulating the broadband industry. When the FCC killed net neutrality, Chairman Ajit Pai promised that it was okay for the FCC to step away from broadband regulation since the FTC would take over much of the regulatory role. Now, a month after the net neutrality repeal went into effect, we have the big cable ISPs arguing that the FTC should have a limited role in regulating broadband. The NCTA comments were filed in a docket that asks how the FTC should handle the regulatory role handed to it by the FCC.

Pai's claim was weak from the outset because of the way the FTC regulates. The agency basically pursues corporations of all kinds that violate federal trade rules or abuse the general public. For example, the FTC went after AT&T for throttling customers who had purchased unlimited data plans. However, FTC rulings don't carry the same weight as FCC orders. FTC rulings are specific to the company under investigation; they might lead other companies to modify their behavior, but an FTC order doesn't create a legal precedent that automatically applies to all carriers. In contrast, FCC rulings can be made to apply to the whole industry and can change the regulations for every ISP.

The NCTA petition asks the FTC not to pursue regulatory complaints against ISPs. For example, it argues that the agency shouldn't single out ISPs for unique regulatory burdens, but should instead pursue the large Internet platforms like Facebook and Google. The NCTA claims that market forces will prevent bad behavior by ISPs and will punish a carrier that abuses its customers. It claims there is sufficient competition for cable broadband, such as from DSL, that customers will leave an ISP that behaves poorly. In a world where the cable companies have demolished DSL and cable is a virtual monopoly in most markets, they really made that argument! We have a long history in this industry that says otherwise – even when regulated by the FCC, the carriers compiled long laundry lists of ways they mistreated their customers.

One of the more interesting requests is that the ISPs want the FTC to preempt state and local rules that try to regulate them. I am sure this is due to the vigorous activity currently underway at the state level to create net neutrality and privacy regulations. The ISPs want the FTC to issue guidelines to state attorneys general and state consumer protection agencies reminding them that broadband is regulated only at the federal level. It's an interesting argument to make after the FCC has punted on regulating broadband and while this filing asks the FTC to do the same. The ISPs want the FTC to leave them alone while also acting as the watchdog that stops others from trying to regulate the industry.

I think this pleading was inevitable since the big ISPs are trying to take full advantage of the FCC walking away from broadband regulation. The ISPs view this as an opportunity to kill regulation everywhere. At best the FTC would be a weak regulator of broadband, but the ISPs don’t want any scrutiny of the way they treat their customers.

The history of telecom regulation has always come in what I call waves. Over time the amount of regulation builds up to a point where companies can make a valid claim of being over-regulated. Over-regulation then gets relieved either by Congress or by a business-friendly FCC that loosens regulatory constraints. But when regulations get too lax the big carriers inevitably break enough rules to attract a round of new regulation.

We are certainly hitting the bottom of the trough of a regulatory wave as regulations are being eliminated or ignored. Over time the large monopolies in the industry will do what monopolies always do: they will take advantage of this period of light regulation, will abuse customers in various ways, and will invite new regulations. My bet is that customer privacy will be the issue that starts the climb back to the top of the regulatory wave. The ISPs' argument that market forces will force good behavior on their part is pretty laughable to anybody who has watched the big carriers over the years.

Regulating Digital Platforms

It seems like one of the big digital platforms is in the news almost daily – and not in a positive way. Yet there has been almost no talk in the US of trying to regulate digital platforms like Facebook and Google. Europe has taken some tiny steps, but regulation there is still in its infancy. In this country the only existing regulations that apply to the big digital platforms are antitrust laws, some weak privacy rules, and the Federal Trade Commission's general protections against consumer fraud.

Any time there is the slightest suggestion of regulating these companies we instantly hear the cry that the Internet must be free and unfettered. This argument harkens back to the early days when the Internet was a budding industry, and it seems irrelevant now that these are some of the biggest corporations in the world, holding huge power in our daily lives.

For example, small businesses can thrive or die due to a change in an algorithm on the Google search engine. Search results are so important to businesses that the billion-dollar SEO industry has grown to help companies manipulate their search results. We’ve recently witnessed the damage that can be done by nefarious parties on platforms like Facebook to influence voting or to shape public opinion around almost any issue.

Our existing weak regulations are of little use in trying to control the behavior of these big companies. For example, in Europe there have been numerous penalties levied against Google for monopoly practices, but the fines haven't been very effective in controlling Google's behavior. In this country our primary antitrust tool is to break up monopolies – an extreme remedy that doesn't make much sense for the Google search engine or Facebook.

Regulating digital platforms would not be easy, because one of the key requirements of regulation is understanding a business well enough to craft sensible rules that can throttle abuses. We generally regulate monopolies, and the regulatory rules are intended to protect the public from the worst consequences of monopoly power. It's not hard to make a case that both Facebook and Google are near-monopolies – but it's not easy to figure out how we would regulate them in any sensible way.

For example, the primary regulation we have for electric companies controls the profits of the monopolies to keep rates affordable. In the airline industry we regulate safety to force the airlines to do needed maintenance on planes. It's hard to imagine regulating something like a search engine in the same manner, when a slight change in a search algorithm can have big economic consequences across a wide range of industries. It doesn't seem possible to somehow regulate the fairness of a web search.

Regulating social media platforms would be even harder. The FCC has occasionally in the past been required by Congress to try to regulate morality issues – such as monitoring bad language or nudity on the public airwaves. Most of the attempts by the FCC to follow these congressional mandates were ineffective and often embarrassing for the agency. Social platforms like Facebook are already struggling to define ways to remove bad actors from their platform and it’s hard to think that government intervention in that process can do much more than to inject politics into an already volatile situation.

One of the problems with trying to regulate digital platforms is defining who they are. The FCC today has separate rules that can be used to regulate telecommunications carriers and media companies. How do you define a digital platform? Facebook, LinkedIn and Snapchat are all social media – they share some characteristics but also have wide differences. Just defining what needs to be regulated is difficult, if not impossible. For example, all of the social media platforms gain much of their value from user-generated content. Would that mean that a site like WordPress that houses this blog is a social media company?

Any regulations would have to start in Congress because there is no other way for a federal agency to be given the authority to regulate the digital platforms. It’s not hard to imagine that any effort out of Congress would concentrate on the wrong issues, much like the rules that made the FCC the monitor of bad language. I know as a user of the digital platforms that I would like to see some regulation in the areas of privacy and use of user data – but beyond that, regulating these companies is a huge challenge.

One Touch Make Ready

Earlier this month, in WC Docket No. 17-84 and WT Docket No. 17-79, the FCC released new rules for one touch make ready (OTMR) for connecting wires to poles. The new rules allow a new attacher to a pole to use a single contractor to perform simple make-ready work, which the FCC defines as work where "existing attachments in the communications space of a pole could be transferred without any reasonable expectation of a service outage or facility damage and does not require splicing of any existing communication attachment or relocation of an existing wireless attachment." The rules will go into effect on February 1, 2019, or sooner – 30 days after a notice announcing approval by the Office of Management and Budget is published in the Federal Register.

The OTMR rules don’t apply to more complex make-ready work where poles need to be replaced or where existing cables must be cut and spliced to accomplish the needed changes. The new rules don’t cover wireless attachments, so this is not an order that lets wireless companies place devices anywhere on poles at their choice (something the wireless companies are lobbying for). These rules also don’t apply to any work done above the power space at the top of poles.

For those not familiar with make-ready, a new attacher must pay to rearrange existing wires if there is not enough space on the poles for the new wire to meet safety standards. In most cases this can be accomplished by shifting existing wires higher or lower on the pole to create the needed clearance.

Possibly the most interesting part of the new order is that the FCC says that a new attacher is not responsible for the cost of fixing problems that are due to past attachers being out of compliance with safety codes. The reality is that most make-ready work is due to past attachers not spacing their wires according to code. This FCC language opens the door for new attachers to argue that some of the cost of make-ready should be charged to past attachers. Anybody who wants to make such claims needs to photograph and document existing violations before doing the work. I can foresee big fights over this issue after the make-ready work is completed.

These rules end some of the practices that have made it time-consuming and costly to put a new wire on a pole. The existing rules allowed for sequential make-ready, where each existing utility sends out a crew to do its own work, adding extra time as each separate crew coordinates the work and adding cost since the new attacher has to pay for the multiple crews.

The new rules don't apply everywhere or to all pole owners. There is still an exception for poles owned by municipalities and by electric cooperatives. The rules also don't automatically apply in any state that has its own pole attachment rules. There are currently 22 states that have adopted at least some of their own pole attachment rules, and those states still have the option to modify the new FCC rules. Expect delays in many states past the February 1 effective date as they deliberate on the issue. Interestingly, two cities, Louisville, KY and Nashville, TN, have already adopted their own versions of OTMR, and the order does not say if local governments have this right.

The order considerably shortens the time required to perform simple make-ready. There are many nuances in the new timeline that make it hard to condense into a paragraph, but the deadlines are considerably shorter than under the previous FCC rules. The FCC also shortened the timeline for some of the steps of complex make-ready. Unfortunately, in many cases it's the complex make-ready timelines that will still impact a project, because a few poles needing complex make-ready can delay implementation of a new fiber route.

The order encourages pole owners to publish a list of contractors that are qualified to do the make-ready work. The new rules also define the criteria for selecting a contractor in cases where the pole owner doesn't specify one. Pole owners can veto a contractor suggested by the new attacher, but in doing so they must suggest a qualified contractor they find acceptable. Not addressed in the order is the situation where a utility insists on doing all of the work itself.

As a side note, this order also prohibits state and local governments from imposing moratoria on new wireless pole attachments. The ruling doesn’t stop states from imposing new rules, but it prohibits them from blocking wireless carriers from getting access to poles.

Overall this is a positive order for anybody that wants to add fiber to existing poles. It simplifies and speeds up the pole attachment process, at least for simple attachments. It should significantly hold down pole attachment costs by allowing one contractor to do all of the needed work rather than allowing each utility to bill for moving their own wires. There are still some flaws with the order. For instance, although the time frames have been reduced, the pole attachment process can still take a long time when complex pole attachment work is needed. But overall this is a much needed improvement in the process that has caused most of the delays in deploying new fiber.

The Definition of Broadband

The FCC recently issued a Notice of Inquiry (NOI) seeking input on next year's broadband progress report. As usual – and perhaps every year into the future – this annual exercise stirs up the industry as we fight over the regulatory definition of broadband. That definition matters because Congress has tasked the FCC with making sure that everybody in the country has access to broadband. Today broadband is defined as 25 Mbps downstream and 3 Mbps upstream; households that can't buy that speed are considered underserved if they can get some broadband and unserved if they have no broadband options.
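
In code form, the classification reduces to a simple threshold test (a sketch using my own function name; "some broadband" is left deliberately loose here because the FCC's own data on that point is, too):

```python
def classify_household(download_mbps, upload_mbps):
    """Classify a household against the FCC's 25/3 Mbps broadband definition."""
    if download_mbps >= 25 and upload_mbps >= 3:
        return "served"
    if download_mbps > 0:
        return "underserved"  # some broadband available, but below 25/3
    return "unserved"         # no broadband option at all
```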

The NOI proposes keeping the 25/3 Mbps definition of broadband for another year. The FCC knows that raising it would suddenly classify millions of homes as underserved. However, the agency is bowing to pressure and this year will gather data to see how many households have access to 50/5 Mbps broadband.

It was only a year ago that this FCC set off a firestorm by suggesting a reversion to the old definition of 10/1 Mbps. That change would have instantly classified millions of rural homes as having adequate broadband. The public outcry was immediate, and the FCC dropped the idea. For last year's report the FCC also considered counting mobile broadband as a substitute for landline broadband – another move that would have reclassified millions into the served category. The FCC is not making that same recommendation this year – but it is gathering data on the number of people who have access to cellular data speeds of 5/1 Mbps and 10/3 Mbps.

The FCC has also been tasked by Congress with getting faster broadband to schools. This year's NOI recommends keeping the current FCC goal that all schools immediately have access to 100 Mbps per 1,000 students, with a longer-term goal of 1 Gbps per 1,000 students.

Commissioner Jessica Rosenworcel has suggested in the current NOI that the official definition of broadband be increased to 100 Mbps download. She argues that our low target for defining broadband is why “the United States is not even close to leading the world” in broadband.

I think Commissioner Rosenworcel is on to something. The gap between the fastest and slowest broadband speeds is widening. This year both Comcast and Charter are unilaterally raising broadband speeds to customers. Charter kicked up the speed at my house from 60 Mbps to 130 Mbps a few weeks ago. AT&T is building fiber to millions of customers. Other fiber overbuilders continue to invest in new fiber construction.

The cable companies decided a decade ago that their best strategy was to stay ahead of the speed curve. This is at least the third round of unilateral speed increases that I can remember. A customer who purchased and kept a 20 Mbps connection a decade ago is probably now receiving over 100 Mbps for that same connection. One way to interpret Commissioner Rosenworcel’s suggestion is that the definition of broadband should grow over time to meet the market reality. If Charter and Comcast both think that their 50 million urban customers need speeds of at least 100 Mbps, then that ought to become the definition of broadband.

However, a definition of broadband at 100 Mbps creates a major dilemma for the FCC. The only two widely deployed technologies that can achieve that kind of speed today are fiber and the cable companies' hybrid fiber-coaxial networks. As I wrote just a few days ago, there are new DSL upgrades available that can deliver up to 300 Mbps to customers within 3,000 to 4,000 feet of a DSL hub – but none of the US telcos are pursuing the technology. Fixed wireless can deliver 100 Mbps – but only to customers living close to a wireless tower.

If the FCC were to adopt a definition of broadband at 100 Mbps, it would finally be recognizing that the rural broadband fixes it has been funding are totally inadequate. The agency spent billions in the CAF II program to bring rural broadband up to just 10/1 Mbps. It is getting ready to give out a few billion more in the CAF II reverse auction, which will do much the same, except for the few grant recipients that will use the money to help fund fiber.

By law, the FCC would have to undertake programs to bring rural broadband up to a newly adopted 100 Mbps standard. That would mean finding many billions of dollars somewhere. I don’t see this FCC being bold enough to do that – they seem determined to ignore the issue hoping it will go away.

This issue can only be delayed for a few more years. The country is still on the curve where household broadband demand doubles every three years or so. As broadband usage in urban homes grows to fill the faster pipes being supplied by the cable companies, it will become more apparent each year that the real definition of broadband is a lot faster than the FCC wants to acknowledge.
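
The compounding is worth spelling out: if demand really does double roughly every three years, today's 25 Mbps definition falls behind very quickly (a quick illustrative calculation, not a forecast):

```python
# Starting from a 25 Mbps household need and doubling every 3 years
need_mbps = 25.0
for year in range(0, 13, 3):
    print(f"year {year:2d}: ~{need_mbps:.0f} Mbps")
    need_mbps *= 2
# Prints roughly 25, 50, 100, 200 and 400 Mbps at years 0, 3, 6, 9 and 12.
```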

Killing Net Neutrality Again?

The current FCC repealed the net neutrality rules earlier this year, with the repeal going into effect last month. In a move that is a head-scratcher, the Department of Justice recently filed a petition asking the Supreme Court to quash a lower court ruling in favor of net neutrality.

The original net neutrality rules were implemented by the FCC in 2015. The FCC's order relied upon Title II regulation as the authority for those rules. AT&T and other opponents of the ruling immediately appealed the FCC's action, and in 2016 the DC Circuit Court of Appeals ruled that the FCC had the authority to invoke Title II. The same opponents appealed that decision to the full Circuit Court, and in 2017 the court refused to take the case, thus upholding the 2016 decision.

The Department of Justice is now asking the Supreme Court to overturn the 2016 order that upheld net neutrality. It’s an unusual request because net neutrality has already been repealed by the FCC, so it seems like the issue is moot.

Nobody is sure about the reason for this filing, because neither action nor inaction by the Supreme Court would realistically change anything. The Department of Justice argues that it is cleaning up loose legal ends. Proponents of net neutrality say the administration is trying to kill a precedent that a future FCC or administration could use to reinstate net neutrality.

What’s most interesting about the whole net neutrality fight is that both the past and current FCC have had to get creative to first pass the net neutrality rules, and then to repeal them. What’s been missing in this fight is a Congress willing to vote on the issue, because legislation would put the issue to rest. The messy court battles over net neutrality for the last decade are all due to a Congress that won’t weigh in on the issue.

The FCC is mandated to follow the direction of Congress. It seems unlikely that net neutrality is ever going to come up for a vote in Congress. Polls have shown huge public support for net neutrality with various polls over the last few years showing support between 76% and 85%. Nobody in the GOP wants to go on record as opposing the issue.

We are badly in need of a new Telecom Act. Many of the rules that govern the FCC are far out of date. We need to fix cable rules that are massively out of synch with a world of on-line content. We need updated privacy laws that deal with current technology. And we need to know definitively if Congress thinks that we should or shouldn’t regulate some aspects of broadband.

But we're not likely to see a new Telecom Act come to a vote since net neutrality is going to get dragged into any discussion of new regulations. That means we're likely to see the FCC continue to confront current issues for which it has no direction or basis for action. That can only result in an FCC that grows gradually weaker and less effective in its ability to tackle the communications issues we need to face.

What we don’t need is a government that is looking backwards and wasting legal resources to kill a court order for an issue that has already been decided by the current FCC. This is one of the dumbest and most wasteful court actions I’ve ever seen in the industry. We don’t need to fill the over-busy courts with frivolous lawsuits – we need a Congress and an FCC to together tackle the current pressing issues in the industry.

Defining Competition

One of the hardest things for regulators to do is to define when a given telecom market is competitive. It’s an important question because, by definition, regulators are largely obligated by law to regulate monopoly or oligopoly providers in any market that is considered to be non-competitive. That’s the basic reason that regulators exist.

The telephone industry provides a good story of an industry that went from non-competitive to competitive. For a century most people in the country got telephone service from AT&T, which had a monopoly franchise to provide service in defined geographic areas. Smaller telcos had the same monopoly power in smaller footprints. The FCC and state regulators heavily regulated the telephone industry to protect against monopoly abuses. The system worked, and we had low telephone rates and quality service.

Over time the monopoly broke down. Some of the pressure came from budding competitors like MCI. Eventually the government jumped into the fray: Judge Greene forced the divestiture of AT&T, and Congress passed the Telecommunications Act of 1996 to finish the job – at which point telephone service, at least in urban areas, was considered competitive.

We’ve seen the same thing happen with cable TV. Most markets in the country traditionally had one cable provider monopoly that was regulated under FCC rules and through local franchise agreements. Technology has allowed others to compete with the cable companies and there is a formal process for a cable company to ask the FCC to declare a given market to be competitive – the test generally being that a competitor has won some significant portion of the customers in that market.

AT&T just made a filing at the FCC arguing that the cellular market is competitive, and Verizon made a similar filing. The obvious reason for these filings is to get the FCC to relax or eliminate regulation of the cellular industry. A competitive industry doesn't need the same level of regulatory oversight, since it's presumed that the market will take care of monopoly or oligopoly pricing and protect consumers.

However, the smaller cellular carriers don't see the same market. The Competitive Carriers Association (CCA) represents over 100 of the smaller cellular carriers in the country, including T-Mobile and Sprint. The group made a filing arguing that the cellular industry is not competitive. It argues that AT&T and Verizon have grown to control 70% of the market and that regulatory barriers and the competitive practices of the two big providers have harmed the smaller carriers' ability to compete fairly.

It’s an intriguing read and is a good primer for many of the issues facing the industry. For example, the filing looks at the many different spectrum bands used by the cellular companies and shows how FCC spectrum policies award most of the good spectrum to the largest providers.

These filings were prompted by an FCC proceeding that is considering the creation of a new annual report on competition in the mobile broadband market. As cellular broadband grows in importance the FCC is interested in measuring progress of the industry and wants to get ahead of the curve before the introduction of 5G. AT&T and Verizon want to quash this effort by arguing that the industry is already competitive. Such a declaration by the FCC might not only eliminate this proposed new report, but it might eventually lead to a reduction in regulations of the cellular industry.

Like other broadband technologies, cellular gives urban America more options than rural America. The FCC seems to have pinned its hopes on cellular wireless to fix the huge rural broadband coverage gap. The proposed report would gather data about cellular customers and actual product speeds – something the big companies don't want to see published. Everybody in rural America knows that there are huge gaps in cellular coverage, just as there are with other broadband technologies. There are still huge areas with no 4G coverage and many places without even voice coverage.

I have not always been a fan of FCC annual reports, because in recent years they have sometimes become political, with the FCC emphasizing or de-emphasizing facts to suit its preferred narrative. But we can't even understand the broadband situation without facts – so I hope this particular report moves forward.