Auditing the Universal Service Fund

I recently heard FCC Commissioner Geoffrey Starks speak to the Broadband Communities meeting in Alexandria, Virginia. He expressed support for finding broadband solutions and cited several examples of communities that don’t have good broadband access today – both due to lack of connectivity and due to the lack of affordable broadband.

One of his more interesting comments is that he wants the FCC to undertake a ‘data-driven’ analysis of the effectiveness of the Universal Service Fund over the last ten years. He wants to understand where the fund has succeeded and where it has failed. Trying to somehow measure the effectiveness of the USF sounds challenging. I can think of numerous successes and failures of USF funding, but I also know of a lot of situations that I would have a hard time classifying as a success or failure.

Consider some of the challenges of looking backward. Over the last decade, the definition of broadband has changed from 4/1 Mbps to 25/3 Mbps. Any USF funds that supported the older speeds will look obsolete and inadequate today. Was using USF funding nine years ago to support slow broadband by today’s standards a success or a failure?

One of the biggest challenges of undertaking data-driven analysis is that the FCC didn’t gather the needed data over time. For example, there has only been a limited amount of speed testing done by the FCC looking at the performance of networks built with USF funding. A more rigorous set of testing starts over the next few years, but I think even the new testing won’t tell the FCC what they need to know. For example, the FCC just changed the rules to let the big telcos off the hook by deciding that USF recipients can help choose which customers to test. The big telcos aren’t going to test where they didn’t build upgrades or where they know they can’t meet the FCC speed requirements.

The FCC will find many successes from USF funding. I’m aware of many rural communities that have gotten fiber that was partially funded by the ACAM program. These communities will have world-class broadband for the rest of this century. But ACAM money was also used in other places to build 25/3 DSL. I’m sure the rural homes that got this DSL are thankful because it’s far better than what they had before. But will they be happy in a decade or two as their copper networks approach being a century old? Are the areas that got the DSL a success or a failure?

Unfortunately, there are obvious failures with USF funding. Many of the failures come from the inadequate mapping that influenced USF funding decisions. Carriers have been denied USF funding for millions of households because those homes were improperly classified as already having broadband when they do not. Commissioner Starks said he was worried about using these same maps for the upcoming RDOF grants – and he should be.

Possibly the biggest failures come from what I call lack of vision by the FCC. The biggest example of this is when they awarded $11 billion to fund the CAF II program for the big telcos, requiring 10/1 Mbps speeds at a time when the FCC had already declared broadband to be 25/3 Mbps. That program was such a failure that the CAF II areas will be eligible for overbuilding using the RDOF grants, barely after the upgrades are slated to be completed. The Universal Service Fund should only support building broadband to meet future speed needs and not today’s needs. This FCC is likely to repeat this mistake if they award the coming RDOF grants to provide 25/3 Mbps speeds – a speed that’s arguably inadequate today and that clearly will be inadequate by the time the RDOF networks are completed seven years from now.

I hope the data-driven analysis asks the right questions. Again, consider CAF II. I think there are huge numbers of homes in the CAF II service areas where the big telcos made no upgrades, or upgraded to speeds far below 10/1 Mbps. I know that some of the big telcos didn’t even spend much of their CAF II funding and pocketed it as revenue. Is the audit going to look deep at such failures and take an honest look at what went wrong?

Commissioner Starks also mentioned the Lifeline program as a failure due to massive fraud. I’ve followed the Lifeline topic closely for years and the fraud has been nowhere near the magnitude that is being claimed by some politicians. Much of the blame for problems with the program came from the FCC because there was never any easy way for telcos to check if customers remained eligible for the program. The FCC is in the process of launching such a database – something that should have been done twenty years ago. The real travesty of the Lifeline program is that the big telcos have walked away. For example, AT&T has stopped offering Lifeline in much of its footprint. The FCC has also decided to make it exceedingly difficult for ISPs to join the program, and I know of numerous ISPs that would love to participate.

I try not to be cynical, and I hope an ‘audit’ isn’t just another way to try to kill the Lifeline program but is instead an honest effort to understand what has worked and not worked in the past. An honest evaluation will assign the blame for many of the fund’s problems to the FCC, and ideally, that would stop the current FCC from repeating the mistakes of the past.

The Problem with FTC Regulation

As part of the decision to kill Title II regulation, the FCC largely ceded its regulatory authority over broadband to the Federal Trade Commission. FTC regulation is exceedingly weak, meaning that broadband is largely unregulated.

A great example of this is the recent $60 million fine levied on AT&T by the FTC. This case stretched back to 2014 when the company advertised and charged a premium price for an unlimited cellular data plan. It turns out the plan was far from unlimited and once a customer reached an arbitrary amount of monthly usage, AT&T throttled download speeds to the point where the broadband was largely unusable.

This is clearly an unfair consumer practice and the FTC should be applauded for fining AT&T. Unfortunately, the authority to levy fines for bad behavior is the practical extent of the FTC’s regulatory authority.

A strong regulator would not have taken five years to resolve this issue. In today’s world, five years is forever, and AT&T has moved far past the network, the products, and the practices they used in 2014. In 2014 most of the cellular network was still 3G, moving towards 4G. It didn’t take a lot of cellular data usage to stress the network. It was a real crisis for the cellular networks when people started watching video on their phones, and the cellular companies tamped down on usage by enforcing small monthly data caps, and apparently by capping unlimited users as well.

A strong regulator would have ordered AT&T to stop the bad practice in 2014. The FTC doesn’t have that authority. The regulatory process at the FTC is to bring suit against a corporation for bad behavior. Often companies will stop bad behavior immediately to soften the size of potential fines – but they are not required to do so. The FTC suit is like any other lawsuit with discovery and testimony. Once the FTC finds the corporation guilty of bad behavior, the parties often negotiate a settlement, and it’s routine for corporations to agree to never undertake the same bad practices again.

A strong regulator would have ordered the whole cellular industry to stop throttling unlimited data customers. The FTC fine applied strictly to AT&T and not to any other cellular carriers. T-Mobile has advertised unlimited data plans for years that get throttled at some point, but this FTC action and the fine against AT&T has no impact on T-Mobile and the other wireless carriers. AT&T got their wrist slapped, but the FTC doesn’t have the authority to tell other cellular companies to not engage in the same bad behavior. The FTC regulates by punishing bad corporate actors and hoping that similar companies will modify their behavior.

A strong regulator would develop forward-thinking policies to head off bad behavior before it happens. One of the bulwarks of regulation is establishing policies that prohibit bad behavior and that reward corporations for good behavior. The FTC has no authority to create policy – only to police bad behavior.

Even if they wanted to regulate broadband more, the FTC doesn’t have the staffing needed to monitor all broadband companies. The agency is responsible for policing bad corporate behavior across all industries, so they only tackle the worst cases of corporate abuse, and more often than not they go after the largest corporations.

At some point, Congress will have to re-regulate broadband. Unregulated corporations inevitably abuse the public. Without regulation, broadband prices are going to go sky-high. Without regulation there will be ISP policies that unfairly punish customers. Without regulation the big ISPs will eventually engage in all of the practices that net neutrality tried to stop. Having the FTC occasionally levy a big fine against a few big ISPs will not deter bad behavior across the whole ISP sector.

What we really need is an FCC that does what it’s supposed to do. If the FCC refuses to regulate broadband – the primary product under its umbrella – then the agency is reduced to babysitting spectrum auctions and not much else of consequence.

FCC Further Defines Speed Tests

The FCC recently voted to tweak the rules for speed testing for ISPs who accept federal funding from the Universal Service Fund or from other federal funding sources. This would include all rate-of-return carriers including those taking ACAM funding, carriers that won the CAF II reverse auctions, recipients of the Rural Broadband Experiment (RBE) grants, Alaska Plan carriers, and likely carriers that took funding in the New York version of the CAF II award process. These new testing rules will also apply to carriers accepting the upcoming RDOF grants.

The FCC had originally released testing rules in July 2018 in Docket DA 18-710. Those rules applied to the carriers listed above as well as to all price cap carriers and recipients of the CAF II program. The big telcos will start testing in January of 2020 and the FCC should soon release a testing schedule for everybody else – the dates for testing were delayed until this revised order was issued.

The FCC made the following changes to the testing program:

  • Modifies the schedule for commencing testing by basing it on the deployment obligations specific to each Connect America Fund support mechanism;
  • Implements a new pre-testing period that will allow carriers to become familiar with testing procedures without facing a loss of support for failure to meet the requirements;
  • Allows greater flexibility to carriers for identifying which customer locations should be tested and selecting the endpoints for testing broadband connections. This last change sounds to me like the FCC is letting the CAF II recipients off the hook by allowing them to only test customers they know meet the 10/1 Mbps speeds.

The final order should be released soon and will hopefully answer carrier questions. One of the areas of concern is that the FCC seems to want to test the maximum speeds that a carrier is obligated to deliver. That might mean having to give customers the fastest connection during the time of the tests even if they have subscribed to slower speeds.

Here are some of the key provisions of the testing program that were not changed by the recent order:

  • ISPs can choose between three methods for testing. First, they may elect what the FCC calls the MBA program, which uses an external vendor, approved by the FCC, to perform the testing. This firm has been testing speeds for the networks built by the large telcos for many years. ISPs can also use existing network tools built into the customer CPE that allow test pinging and other testing methodologies. Finally, an ISP can install ‘white boxes’ that provide the ability to perform the tests.
  • Testing, at least for now, is perpetual, and carriers need to recognize that this is a new cost they have to bear due to taking federal funding.
  • The number of tests to be conducted varies with the number of customers for which a recipient is getting support: for 50 or fewer households the test covers 5 customers; for 51-500 households, 10% of households; and for more than 500 households, 50 households. ISPs declaring high latency must test more locations, up to a maximum of 370.
  • Tests for a given customer are for one solid week, including a weekend, in each quarter. Tests must be conducted in the evenings between 6:00 PM and 12:00 AM. Latency tests must be done every minute during the six-hour testing window. Speed tests – run separately for upload speeds and download speeds – must be done once per hour during the six-hour testing window.
  • ISPs are expected to meet latency standards 95% of the time. Speed tests must achieve 80% of the expected upload and download speed 80% of the time. An example of this requirement is that a carrier guaranteeing a gigabit of speed must achieve 800 Mbps 80% of the time. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
  • There are financial penalties for ISPs that don’t meet these tests.
  • ISPs that have between 85% and 100% of households that meet the test standards lose 5% of their FCC support.
  • ISPs that have between 70% and 85% of households that meet the test standards lose 10% of their FCC support.
  • ISPs that have between 55% and 70% of households that meet the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.
  • The penalties only apply to funds that haven’t yet been collected by an ISP.
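
To make the compliance mechanics above concrete, here is a minimal sketch in Python of how the sample-size tiers, the 80% speed standard, and the support-reduction tiers would be applied. The function names, the rounding convention, and the sample numbers are my own illustrative assumptions rather than anything prescribed in the order:

```python
# Illustrative sketch of the testing and penalty math described above.
# Function names, rounding, and sample data are assumptions, not FCC specifications.

def required_sample_size(supported_households):
    """Number of locations to test, per the tiers described above."""
    if supported_households <= 50:
        return 5
    if supported_households <= 500:
        return round(0.10 * supported_households)  # rounding convention assumed
    return 50

def speed_test_compliant(measured_mbps, obligated_mbps):
    """80/80 standard: at least 80% of tests must reach 80% of the obligated speed."""
    threshold = 0.8 * obligated_mbps
    passing = sum(1 for m in measured_mbps if m >= threshold)
    return passing / len(measured_mbps) >= 0.8

def support_reduction(percent_compliant_households):
    """Share of USF support withheld, based on the percent of compliant households."""
    if percent_compliant_households >= 100:
        return 0.0   # fully compliant carriers only retest annually
    if percent_compliant_households >= 85:
        return 0.05
    if percent_compliant_households >= 70:
        return 0.10
    if percent_compliant_households >= 55:
        return 0.15
    return 0.25

# A carrier promising a gigabit (1,000 Mbps) must show at least 800 Mbps on 80% of tests.
tests = [950, 870, 810, 790, 905, 840, 860, 700, 920, 880]
print(speed_test_compliant(tests, 1000))   # True: 8 of 10 tests are at or above 800 Mbps

print(required_sample_size(300))           # 30 locations for 300 supported households
print(support_reduction(78))               # 0.10, meaning 10% of support is withheld
```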

Is OTT Service Effective Competition for Cable TV?

The FCC made an interesting ruling recently that signals the end of regulation of basic cable TV. Charter Communications had petitioned the FCC for properties in Massachusetts claiming that the properties have ‘effective competition’ for cable TV due to competition from OTT providers – in this case, due to AT&T DirecTV Now, a service that offers a full range of local and traditional cable channels.

The term effective competition is a very specific regulatory term, and once a market reaches that status a cable company can change rates at will for basic cable – the tier that includes local network stations.

The FCC agreed with Charter and said that the markets are competitive and granted Charter the deregulated status. This designation in the past has been granted in markets that have a high concentration of satellite TV or else that have a lot of alternative TV offered by a fiber or DSL overbuilder that has gained a significant share of the market.

In making this ruling the FCC effectively deregulated cable everywhere since there is no market today that doesn’t have a substantial amount of OTT content competing with cable companies. Cable providers will still have to go through the process of asking to deregulate specific markets, but it’s hard to think that after this ruling that the FCC can say no to any other petition.

From a regulatory perspective, this is probably the right ruling. Traditional cable is getting clobbered and it looks like the industry as a whole might lose 5-6 full percentage points of market share this year and end up under a 65% national penetration rate. Although we are only in the third year since cord cutting became a measurable trend, the cable industry’s customer losses are nearly identical to the losses for landline telephone at the peak of that market’s decline.

There are two consequences for consumers in a market that is declared to be effectively competitive. First, it frees cable companies from the last vestiges of basic cable rate regulation. This is not a huge benefit because cable companies have been free for years to raise rates in higher tiers of service. In a competitive market, a cable provider is also no longer required to carry local network channels in the basic tier – although very few cable systems have elected this option.

I’ve seen several articles discussing this ruling that assume that this will result in an instant rate increase in these markets – and they might be right. It’s a head-scratcher watching cable companies raise rates lately when higher rates are driving households to become cord cutters. But cable executives don’t seem to be able to resist the ability to raise rates, and each time they do, the overall revenue of a cable system increases locally, even with customer defections.

It’s possible that this ruling represents nothing more than the current FCC’s desire to deregulate as many things as possible. One interesting aspect of this ruling is that the FCC has never declared OTT services like SlingTV or DirecTV Now to be MVPDs (multichannel video programming distributors) – a ruling that would pull these services into the cable TV regulatory regime. From a purely regulatory viewpoint, it’s hard to see how a non-MVPD service can meet the technical requirements of effective competition. However, from a practical perspective, it’s not hard to perceive the competition.

Interestingly, customers are not leaving traditional cable TV and flocking to the OTT services that emulate regular cable TV service. Those services have recently grown to become expensive and most households seem to be happy cobbling together packages of content from OTT providers like Netflix and Amazon Prime that don’t carry a full range of traditional channels. From that market perspective, one has to wonder how much of a competitor DirecTV Now was in the specific markets, or even how Charter was able to quantify the level of competition from a specific OTT service.

Mapping Cellular Data Speeds

AT&T recently filed comments in Docket 19-195, the docket that is looking to change broadband mapping, outlining the company’s proposal for reporting wireless data speeds to the FCC. I think a few of their recommendations are worth noting.

4G Reporting. Both AT&T and Verizon support reporting on 4G cellular speeds using a 5 Mbps download and 1 Mbps upload test with a cell edge probability of 90% and a loading of 50%. Let me dissect that recommendation a bit. First, this means that a customer has a 90% chance of being able to make a data connection at the defined edge of a cell tower’s coverage area.

The more interesting reporting requirement is the 50% loading factor. This means the reported coverage area would meet the 5/1 Mbps speed requirement only when a cell site is 50% busy with customer connections. Loading is something you rarely see the cellular companies talk about. Cellular technology is like most other shared bandwidth technologies in that a given cell site shares bandwidth with all users. A cell site that barely meets the 5/1 Mbps data speed threshold when it’s 50% busy is going to deliver significantly slower speeds as the cell site gets busier. We’ve all experienced degraded cellular performance at rush hours – the normal peak times for many cell sites. This reporting requirement is a good reminder that cellular data speeds vary during the day according to how many people are using a cell site – something the cellular companies never bother to mention in their many ads talking about their speeds and coverage.
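
To illustrate why loading matters, here is a simplified sketch that treats a cell sector as a fixed pool of bandwidth shared equally among active users. It is a deliberate oversimplification, and the capacity and user counts are purely hypothetical:

```python
# Simplified equal-share model of a loaded cell sector (illustrative only;
# the sector capacity and user counts are hypothetical assumptions).

SECTOR_CAPACITY_MBPS = 250   # assumed total downlink capacity of one sector
USERS_AT_50_PERCENT = 50     # assumed active users when the site is 50% loaded

def per_user_speed(active_users, capacity=SECTOR_CAPACITY_MBPS):
    """Each active user gets an equal slice of the shared sector capacity."""
    return capacity / active_users

print(per_user_speed(USERS_AT_50_PERCENT))   # 5.0 Mbps at 50% loading
print(per_user_speed(75))                    # ~3.3 Mbps at 75% loading
print(per_user_speed(100))                   # 2.5 Mbps when fully loaded
```

Under this kind of model, a site that just meets the 5/1 Mbps threshold at 50% loading falls well below that threshold at peak times, which is exactly the degradation described above.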

The recommended AT&T maps would show areas that meet the 5/1 Mbps speed threshold, with no requirement to report faster speeds. I find this recommendation surprising because Opensignal reports that the average US 4G LTE speeds are as follows:

Carrier     2017         2018
AT&T        12.9 Mbps    17.87 Mbps
Sprint      9.8 Mbps     13.9 Mbps
T-Mobile    17.5 Mbps    21.1 Mbps
Verizon     14.9 Mbps    20.9 Mbps

I guess that AT&T favors the lowly 5/1 Mbps threshold since that will show the largest possible coverage area for wireless broadband. While many AT&T cell sites provide much faster speeds, my guess is that most faster cell sites are in urban areas and AT&T doesn’t want to provide maps showing faster speeds such as 15 Mbps because that would expose how slow their speeds are in most of the country. If AT&T offered faster speeds in most places, they would be begging to show multiple tiers of cellular broadband speeds.

Unfortunately, maps using the 5/1 Mbps criteria won’t distinguish between urban places with fast 4G LTE and more rural places that barely meet the 5 Mbps threshold – all AT&T data coverage will be homogenized into one big coverage map.

About the only good thing I can say about the new cellular coverage maps is that if the cellular companies report honestly, we’re going to see the lack of rural cellular broadband for the first time.

5G Broadband Coverage. I don’t think anybody will be shocked that AT&T (and the other big cellular companies) don’t want to report 5G coverage. Although they are spending scads of money touting their roll-out of 5G, they think it’s too early to tell the public where they have coverage.

AT&T says that requiring 5G reporting at this early stage of the new technology would reveal sensitive information about cell site location. I think customers who pony up extra for 5G want to know where they can use their new expensive handsets.

AT&T wants 5G coverage to fall under the same 5/1 Mbps coverage maps, even though the company is touting vastly faster speeds using new 5G phones.

It’s no industry secret that most of the announced 5G deployments are done largely for public relations purposes. For example, AT&T is loudly proclaiming the number of major cities that now have 5G, but this filing shows that they don’t want the public to know the small areas that can participate in these early market trials.

If 5G is a reasonable substitute for landline broadband, then the technology should not fall under the cellular reporting requirements. Instead, the cellular carriers should be forced to show where they offer speeds exceeding 10/1 Mbps, 25/3 Mbps, 100/10 Mbps, and 1 Gbps. I’m guessing a 5G map using these criteria would largely show a country that has no 5G coverage – but we’ll never know unless the FCC forces the wireless companies to tell the truth. I think that people should be cautious about spending extra for 5G-capable phones until the cellular carriers are honest with them about the 5G coverage.

The FCC’s 15th Annual Broadband Deployment Report

The FCC just released its most recent annual report on the state of US broadband. This report is mandated by Section 706 of the Telecommunications Act of 1996 which requires the FCC to “determine whether advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion”. The FCC concludes in this latest report that broadband deployment is reasonable and that actions taken by this Commission are helping to close the broadband gap.

I take exception to several findings in this latest report. First, everybody in the country now understands that the FCC’s conclusions are based upon dreadfully inaccurate 477 data reported by ISPs. There have been numerous studies undertaken at the state and local levels that show that the FCC maps undercount households without broadband. Even USTelecom, the group that mostly represents the largest telcos, showed that the FCC maps in Missouri and Virginia classified 38% of rural homes as being served when in fact they were unserved. Microsoft has been gathering credible data showing that well over 150 million people aren’t connecting at the FCC’s defined broadband speed of 25/3 Mbps.

For the FCC to draw any conclusions based upon inaccurate 477 data is ridiculous. A few years ago the FCC could have claimed to not understand the extent to which their data is flawed, but they’ve been shown extensive evidence that the 477 data is incredibly bad, and yet they still plowed forward in this report pretending that statistics based upon 477 data have any meaning. There is not one number in this report that has even the slightest amount of credibility and the FCC knows this.

With the knowledge that the FCC now has about the inaccuracy of their data, this FCC should have humbly admitted that they don’t know the number of households that don’t have broadband. The report could have discussed ways that the Commission is trying to fix the bad data and described steps they have taken to improve rural broadband. But for this report to lead off with a claim that the number of homes without broadband fell by 18% in 2018 is a joke – there is zero chance that’s an accurate statistic. This report should have stated that external analysis has shown that the state of broadband is a lot worse than what they’ve reported in prior annual reports.

I also take exception to the opening statement of the report where the FCC claims that its top goal is “closing the digital divide and bringing the educational, healthcare, social, and civic benefits of connectivity to all Americans seeking broadband access.” This FCC’s top goal is clearly to eliminate regulatory rules that create any obligations for the largest carriers. This FCC already completely deregulated broadband – something an agency would never do if their goal was to improve broadband access. Most of the major dockets that have been approved by this FCC have made it easier for the big carriers to deploy 5G or to otherwise avoid any regulatory burdens.

It’s insulting to the American people for the agency to state that their top goal is improving broadband when their actions show that their priorities are elsewhere. Regulatory agencies are not supposed to engage in propaganda, and this document reeks of self-promotion.

Finally, this report trots out the oft-repeated message that broadband is improving because of this FCC’s effort to remove barriers to broadband investment. I don’t think Chairman Pai makes a speech or writes an opinion that doesn’t bring up this disproved argument. We know by now that those without broadband fall into two categories – rural homes that don’t have access to a broadband connection and urban households that can’t afford broadband. The big telcos aren’t spending any of their cash to solve these two problems.

There has been a lot of fiber built in recent years. AT&T built fiber to pass 12 million homes as a condition of its merger with DirecTV – an effort the company announced was completed this past summer. Verizon has been building fiber to bolster their cellular network, including an expansion of small cell sites – largely as a way to reduce their reliance on paying transport to others. These fiber efforts have nothing to do with the repeal of net neutrality and the ending of broadband regulation. Chairman Pai probably ought to cut back on making this claim, because his real legacy is that he’s emboldened the big cable companies to begin regularly increasing broadband rates since there’s no threat of regulatory oversight. Chairman Pai and his light-touch regulation will get the credit for why broadband costs $100 per month a few years from now.

A New National Broadband Plan?

Christopher Terry recently published an article for the Benton Institute that details how the National Broadband Plan has failed. This plan was initiated by Congress in 2009, which instructed the FCC to develop a plan to make sure that every American had access to broadband within a decade. The article details the many spectacular ways that the plan has failed.

In my opinion, the National Broadband Plan never had the slightest chance of success because it didn’t have any teeth. Congress authorized the creation of the plan as a way for politicians to show that they were pro-broadband. The plan wasn’t much more than a big showy public relations stunt. Congress makes symbolic votes all of the time and this was just another gesture that demonstrated that Congress cared about broadband and that also served to quiet broadband proponents for a few years. If Congress cared about broadband they would have followed up the plan with a vote to force the FCC to implement at least some aspects of the plan.

I have no doubt that those who worked to develop the plan are likely offended by my post-mortem of the effort. I know that several people who worked on the plan still prominently display that fact in their resume a decade later. I’m sure that working on the plan was an exhilarating process, but at the end of the day, the effort must be measured in terms of success. The folks that created the plan and the rest of the country were duped by the FCC.

The FCC never had the slightest interest in adopting the big recommendations of the plan. There is probably no better evidence of this than when the Tom Wheeler FCC awarded $11 billion to the big telcos in the CAF II process – an award that couldn’t have been more antithetical to the National Broadband Plan. To those that follow FCC dockets, there are dozens of examples over the last decade where the FCC sided with big carriers instead of siding with better rural broadband.

The fact is that the US government doesn’t do well with grandiose plans and lofty long-term goals. Government agencies like the FCC mostly implement things that are mandated by Congress – and even then they often do the bare minimum. Even without the National Broadband Plan, the FCC already has a Congressional mandate to make certain that rural broadband is equivalent to urban broadband – and we annually see them do a song and dance to show how they are complying with this mandate while they instead largely ignore it.

This is not to say that broadband plans are generically bad. For example, the state of Minnesota developed its own set of broadband goals, with the most prominent goal of defining broadband in the state as connections of at least 100 Mbps. The state has implemented that goal when awarding broadband grants, and unlike the FCC, the state has awarded grant funding to build real rural broadband solutions. They’ve refused to spend money on technologies that deliver speeds that the state doesn’t consider as broadband.

I fully expect to hear a plea to develop a new plan and I hope that most of the folks who are working for better broadband ignore any such effort. Compared to ten years ago there are now a lot of organizations working for better broadband. Hundreds of rural communities have created citizen broadband committees looking for a local solution. There are county governments all over the country making grants to help lure ISPs to serve their county. Statewide groups are working to solve the digital divide and the homework gap. There are a lot of people actively advocating for real broadband solutions.

These advocates don’t need a national goal document to tell them what they want. By now, communities understand good broadband in the simplest form – it’s something their community either has or doesn’t have. Communities now understand the digital divide and the homework gap. Wasting federal dollars to create a new National Broadband Plan wouldn’t move any community one inch closer to better broadband, and I hope we resist the temptation to go down that path.

Perverting the FCC Comment Process

In a recent article, BuzzFeed dug into the issue of the FCC receiving millions of bogus comments in the last two rounds of the net neutrality docket. During the 2015 net neutrality comment period, the agency received over 4 million comments. Many of these were legitimate comments, such as the many driven by HBO’s John Oliver, who prompted people to comment in favor of net neutrality.

When the new FCC wanted to reverse the original net neutrality order they had to again open up the docket for public comment. This second time the FCC got over 20 million comments. The comments were so voluminous that the FCC website crashed in May 2017.

There were fake comments filed on both sides of the issue. On the pro-net neutrality side were 8 million nearly identical comments that were tied to email addresses from FakeMailGenerator.com. There were another million comments from people with @pornhub.com email addresses. On the anti-net neutrality side BuzzFeed identified several organizations that uploaded millions of comments using names, addresses, and email addresses that came from a major data breach. These fake comments were generated on behalf of real people who had no idea their names were being used in the FCC proceeding. The fake filings included comments from some people who had died and also some anti-net neutrality comments from a few Democrats in the House of Representatives who clearly were pro-net neutrality.

While the FCC’s net neutrality dockets received the largest number of fake comments, there are fake comments being filed in other FCC dockets and false comments are being made for legislation at state legislatures.

As somebody who often comments on FCC dockets, the fake comments give me heartburn. Flooding a docket with fake comments makes it likely that legitimate comments are not read or considered. What might be the most interesting thing about the net neutrality docket is that in both cases it was clear the FCC Commissioners had already decided how they were going to vote – so the fake comments had no real impact. But most FCC dockets are not partisan. For example, there were a lot of fake comments filed in the docket that was considering changing the rules for cable cards – the devices that allow people to avoid paying for the cable company set-top boxes. That kind of docket is not partisan and is more typical of the kinds of issues that the FCC has to wrangle with.

Hopefully, legal action will be taken against the bad actors that were identified in the net neutrality filings. There are several companies that have been formed for the express purpose of generating large volumes of comments in government dockets. There is nothing wrong with working with organizations to generate comments to politicians. It’s almost the definition of the First Amendment when AARP galvanizes members to comment against changes to Social Security. But it’s a perversion of democracy when fake comments are generated to try to influence the political process.

Fighting this issue was not made any easier when the current FCC under Ajit Pai ignored public records requests in 2017 that sought to look deeper at the underlying fake comments. After a lawsuit was filed, the FCC eventually responded to the public records requests, which led to investigations like the one described in the BuzzFeed article.

There are probably ways for the FCC and other agencies to restrict the volume of fake comments. For example, the FCC might end the process of allowing for large quantities of comments to be filed on a bulk basis. But federal agencies have to be careful to not kill legitimate comments. It’s not unusual for an organization to encourage members to file, and they often do so using the same language in multiple filings.

This is another example of how technology can be used for negative purposes – in essence, the FCC was hacked in these dockets. As long as there is a portal for citizens to make comments it’s likely that there will be fake comments made. Fake comments are often being made outside the government process and fake reviews are a big problem for web sites like Amazon and Yelp. We need to find a way to stop the fake comments from overwhelming the real comments.

Is Telephony a Natural Monopoly?

For my entire career, I’ve heard it said that telecommunications is a natural monopoly. That was the justification for creating monopoly exchange boundaries for telcos and for issuing exclusive franchise agreements for cable companies. This historic reasoning is why the majority of Americans in urban areas are still stuck with duopoly competition that is trending towards a cable monopoly.

I worked for Southwestern Bell pre-divestiture and they were proud of their monopoly. Folks at Ma Bell thought the telephone monopoly was the best possible deal for the public and they constantly bragged about the low rates for a residential telephone line, usually at something less than $15 per month. But when you looked closer, the monopoly was not benefitting the average household. Long distance was selling for 12 cents to 25 cents per minute and a major percentage of households had monthly phone bills over $100 per month.

I’ve been doing some reading on the history of the telephone industry and found some history I never knew about – and which is different than what Ma Bell told employees for 100 years.

Alexander Graham Bell was granted the original patents for the telephone in 1876. During the 18-year life of the original patents, Bell Telephone held a monopoly on telephone service. Bell Telephone mostly built to large businesses and to rich neighborhoods, and the country still predominantly communicated via telegraph. Bell Telephone was not considered much of a success. By 1894 there were still fewer than 5 telephones per 1,000 people in the country, and there were only 37 calls per day per 1,000 people on average.

As soon as the patents expired, numerous competitors entered the market. They built to towns that Bell Telephone had ignored but also built competing networks in many Bell Telephone markets. By the end of 1896, there were 80 competitors that had grabbed 5% of the total telephone market. By 1900 there were 3,000 competitive telephone companies.

By 1907 the competitors had grabbed 51% of the national market and had also driven down urban telephone rates. AT&T’s returns (AT&T had officially become the name of Bell Telephone) had dropped from 46% annually in the late 1800s to 8% by 1906. After 17 years of monopoly, the country had only 270,000 telephones. After 13 years of competition there were over 6 million phones in the country.

The death of telephone competition started when Theodore Vail became president of AT&T in 1907. By 1910 the company was buying competitors and lobbying for a monopoly scenario. Federal regulators stepped in to slow AT&T’s purchase of telephone companies after Vail tried to buy Western Union.

In a compromise reached with the federal government, AT&T agreed to stop buying telcos and to interconnect with independent telephone companies to create one nationwide network. That compromise was known as the Kingsbury Commitment. Vail used this compromise to carve out monopoly service areas by only agreeing to interconnect with companies that would create exchange boundaries and further agree not to compete in AT&T exchanges. With almost the opposite result that federal regulators had hoped for, the Kingsbury Commitment resulted in a country carved into AT&T monopoly telephone service areas.

From that time forward federal regulators supported the new monopoly borders, cementing the arrangement with the Communications Act of 1934. State regulators liked the monopolies because they were easier to regulate – state regulation turned into rate-making procedures that raised rates on businesses to keep residential rates low. AT&T thrived in this environment because they were guaranteed a rate of return, regardless of performance.

The history of telephone service shows that the industry is not a natural monopoly. A natural monopoly is one where one provider can produce lower rates than are achieved by allowing competition. Competing networks forced lower telephone rates at the turn of the last century. After the establishment of the AT&T monopoly we saw monopoly abuse through high long distance rates that didn’t drop until MCI challenged the monopoly status quo. Today we have a world full of multiple wires and networks and the idea of natural monopoly is no longer considered as valid. Unfortunately, many of the vestiges of the regulations that protect the big telcos are still in place and still create hurdles to unfettered competition.

CoBank Supports Telemedicine

For those who don’t know CoBank, it’s a bank that specializes in loans to telecom and electric cooperatives but has also funded numerous rural fiber projects for other borrowers over the years. In August CoBank filed comments in FCC Docket 18-213 in support of expanded use of the Universal Service Fund for rural telemedicine. CoBank is a huge supporter of telemedicine and has made substantial grants to telemedicine projects dealing with diabetes management, opioid abuse, prenatal maternity care, and veteran care.

As part of that filing, CoBank discussed a telemedicine trial they had sponsored in rural Georgia. The trial was conducted in conjunction with Perry Health, a software provider, and Navicent Health, a healthcare provider in Macon, Georgia. The trial covered 100 low-income patients with uncontrolled Type 2 diabetes. These patients were on a path towards kidney failure, amputation, loss of vision, and numerous other major related health problems. These are patients who would normally be making numerous emergency room visits and needing other costly medical procedures.

In the trial, the patients were provided with tablets containing Perry Health software that provided for daily interaction between patients and Navicent. Patients were asked to provide daily feedback on how they were sticking to the treatment regimen and provided information like the results of blood sugar tests, the food they ate each day, the amount of daily exercise, etc. The tablet portal also provided for communication from Navicent asking patients how they generally felt and providing recommendations when there was a perceived need.

The results of the trial were hugely positive. Of the 100 patients in the trial, 75% showed a marked improvement in their condition compared to the average diabetes patient. The improvements for these patients equated to reduced health care costs of $3,855 per patient per year through reduced doctor visits and reduced needs to make emergency room visits. The American Diabetes Association says that patients with Type 2 diabetes have 2-3 times the normally expected medical costs, which it estimates total $327 billion per year.

Patients in the trial liked the daily interaction which forced them to concentrate on following treatment plans. They felt like their health care provider cared about how they were doing, and that led them to do better. After the trial, Navicent Health expanded the telemedicine plan to hundreds of other patients with Type 2 diabetes, heart failure, and Chronic Obstructive Pulmonary Disease (COPD).

One of the interesting outcomes of the trial was that patients preferred to use cellphones rather than the special tablets. The trial also showed the need for better broadband. One of the challenges of the trial was the effort required by Navicent Health to make sure that a patient had the needed access to broadband. To some degree using cellphones gives patients easier access to broadband. However, there are plenty of rural areas with poor cellular data coverage, and even where patients can use cellular data, the cost of cellular data can be prohibitive if heavily used. Landline broadband is still the preferred connection because it allows unlimited WiFi connections to the healthcare portal.

One thing that struck me about this study is that this sounds like it would be equally useful in urban areas. I’ve read that a lot of healthcare costs are due to patients who don’t follow through on a treatment plan after they go home after a procedure. The Navicent Health process could be applied to patients anywhere since the biggest benefit of the trial looks to be due to the daily interface between patient and doctor.

The FCC has already pledged to increase funding for the rural medicine component of the Universal Service Fund. However, that funding is restricted. For example, funding can only be granted to rural non-profit health care providers.

Telemedicine has been picking up steam and is seeing exponential growth. But telemedicine still represents just a few percent of rural healthcare visits. The primary barrier seems to be acceptance of the process and the willingness of health care providers to tackle telemedicine.