Broadband Deserts

I recently spoke with a guy who lives 30 miles outside one of the larger cities in North Carolina. Where he lives there is zero broadband. It’s hilly and full of pine trees and they can’t even easily get satellite broadband. There is zero cellular coverage. Even though most of the people who live there work in the nearby city, from a broadband perspective it’s at the end of the earth. And there is very little prospect of anybody bringing broadband there without some sort of grant or other financial assistance to help pay for the construction.

This guy’s horror stories are familiar to the millions of others who live in similar situations. To send or receive an email he has to drive to the top of a hill 20 minutes from his house – a dangerous trip due to narrow road shoulders and heavy truck traffic. If schoolkids or others in his area want a real broadband connection they have to go even farther to find one of a handful of public hot spots.

Just recently two of the FCC Commissioners published a blog suggesting that the ‘wealthy’ people who live in rural areas ought to pay for their own broadband connections. But this guy’s neighborhood is not full of millionaires enjoying a mountain getaway retreat – they are everyday working people. The homes in his neighborhood were built decades ago, before there was broadband, so nobody can be faulted for moving to the “sticks” and then demanding city services.

I would describe where he lives as a “broadband desert.” Broadband deserts are areas of our country that for some reason are unlikely to get broadband in any form. Sometimes they are remote (often due to terrain), and sometimes these are just the leftover places that never even got good telephone copper wires. These broadband deserts are not just in the most remote parts of the northern Rocky Mountains as those in the federal government might imagine – there are pockets of broadband deserts around all of us. A few years ago I was working with one of the most populous counties in Minnesota and they were shocked when our research identified a number of such pockets scattered around their county.

I believe these broadband desert areas are on the verge of being abandoned and blowing away if we can’t find a way to get them broadband. Three other major events in US history have created ghost towns. First, many little towns that were bypassed by the railroads in the late 1800s disappeared as a result. Second, in the early 20th century, towns that didn’t get electricity faded away. Finally, later in the 20th century, a number of little towns bypassed by the then-new Interstate highway system languished, and many have now disappeared.

Broadband is next on this list of breakthrough technologies that will be vital for communities to continue to flourish. Towns without broadband are going to become irrelevant. People who live in these broadband deserts will soon be unable to sell their homes and will eventually walk away from them. And certainly nobody is going to build new homes or bring new businesses to places with no broadband. Families won’t be able to raise kids in the broadband deserts. People won’t be able to partake in the work-at-home economy that is proving to be a boon to a lot of rural America. Communities that don’t find a broadband solution are going to dry up and blow away just like the towns in the old west that were bypassed by railroads.

Probably the worst reality for this guy is that he knows what broadband can do for a town. He runs a gigabit broadband network that connects city buildings for a municipality. I met him at a meeting of CLIC NC, the local chapter of the Coalition for Local Internet Choice – a group that strongly advocates for gigabit fiber networks. He can see that many similar neighborhoods in North Carolina are getting fiber to their homes because they happen to be in an area where a cooperative or small telco is willing to invest. But the majority of these rural broadband deserts are not so lucky, and nobody is even thinking about bringing them broadband.

The clock is ticking for these broadband deserts. If there is no solution to help these folks within a decade it might be too late. People in these broadband deserts will reluctantly leave, and it doesn’t take a whole lot of out-migration to push a town or neighborhood onto the irreversible path toward becoming an abandoned ghost town.

It seems to me that a lot of regulators and policy people are okay with this. These neighborhoods are often not large (although in some places entire counties have no real broadband) and by being scattered they don’t have much political clout. Without broadband and cellphone coverage they can’t even call or email to complain. They are easily ignored and easily forgotten.

But it doesn’t have to be that way. The Universal Service Fund was originally created to fix the problem of bringing telephone service to the most remote parts of the country. The original genesis of the USF was the belief that we are stronger as a nation when we are all connected. And that is probably truer for broadband than it was for telephone service. As a country we have the money to get broadband to everybody; the question that most matters is whether we have the political will.

False Advertising of Broadband Speeds

There is another new and interesting regulatory battle happening now at the FCC. The lobbying groups that represent the telcos and cable companies – NCTA, USTelecom and the ACA – are asking the FCC to make it harder for states to sue ISPs for making misleading claims about broadband speeds.

This request was prompted by a lawsuit by the State of New York against Charter Spectrum. I wrote about that case in a March blog and won’t repeat all of the particulars. Probably the biggest reason for the suit was that Charter had not made the major upgrades and improvements it had promised to the State. But among the many complaints, the one that worried the other ISPs the most was that Charter was not delivering the speeds it was advertising.

This is an issue that affects many ISPs, except perhaps for some fiber networks that routinely deliver the speeds they advertise. Customer download speeds can vary for numerous reasons, with the primary reason being that networks bog down under heavy demand. But the New York complaint against Charter was not about data speeds that slowed down in the evenings; rather the complaint was that Charter advertised and sold data products that were not capable of ever reaching the advertised speeds.

Charter is perhaps the best poster child for this issue, and not just because of the New York case. On their national website they advertise only a speed of 60 Mbps download, with the caveat that this is an ‘up to’ speed. I happen to have Charter in Asheville, NC and much of the time I get decent speeds at or near the advertised speed. But I work all over the country and I am aware of a number of rural Charter markets where the delivered speeds are far below that advertised 60 Mbps. These markets appear to be like the situation in New York, where the State accused Charter of false advertising.

The filings to the FCC want a clarification that what Charter is doing is okay and that ISPs ought to be exempt from these kinds of suits. They argue that the ISP industry has always sold ‘up to’ speeds and that what they are doing fits under existing FCC regulations. And this is where it gets murky.

In its first attempt to introduce net neutrality the FCC ordered ISPs to disclose a lot of information to customers about their broadband products, including telling them the real speeds they could expect from their purchased broadband product. Much of that first net neutrality decision was struck down by a successful lawsuit from Verizon, which claimed that the FCC didn’t have the authority to regulate broadband in the manner laid out in that order.

But not all of that first order was reversed by the lawsuit, including the provision that ISPs had to disclose information about their network performance, fees, data caps, etc. Since most of the original net neutrality order was reversed, however, the FCC put the implementation of the remaining sections on hold. Last year the FCC finally implemented a watered-down version of the original rules, and in February of this year it excused smaller ISPs from having to make the customer disclosures. But the large ISPs are now required to report specific kinds of information to customers. The ISPs interpret the current FCC rules to mean that selling ‘up to’ data products is acceptable.

Where this really gets interesting from a regulatory perspective is that the FCC might not have this authority much longer. The bulk of the FCC’s authority to regulate broadband (and thus to potentially shield the ISPs in this case if they are complying with FCC regulations) comes from Title II regulation.

But the FCC seems likely to relinquish Title II authority and they have suggested that the authority to regulate ISP products should shift to the Federal Trade Commission. Unfortunately for the ISPs, the FTC has more often sided with consumers over big companies.

Since the FCC is in the process of eliminating Title II authority, I wonder if it will even respond to the ISPs. To clarify that advertising ‘up to’ products is acceptable under Title II would essentially mean creating a new broadband regulation, something this FCC seems loath to do. I’ve seen several other recent topics that fall into this same no-man’s land – issues that seem to require Title II authority for the FCC to have jurisdiction. As much as the big ISPs complained about Title II, one has to wonder if they really want it to go away. They mostly feared the FCC using Title II to address pricing issues, but there are a lot of other issues, like this request, where broadband regulation by the FCC might work in the ISPs’ favor.

Tackling Pole Attachment Issues

In January the new FCC Chairman Ajit Pai announced the formation of a new federal advisory committee – the Broadband Deployment Advisory Committee (BDAC). This new group has broken into sub-groups to examine various ways that the deployment of broadband could be made easier.

I spoke last week to the Sub-Committee for Competitive Access to Broadband Infrastructure, i.e. poles and conduits. This group might have the hardest task of all because getting access to poles has remained one of the most challenging tasks of launching a new broadband network. Most of the issues raised by a panel of experts at the latest meeting of this committee are nearly the same issues that have been discussed since the 1996 Telecommunications Act that gave telecom competitors access to this infrastructure.

Here are some of the issues that still make it difficult for anybody to get onto poles. Each is a short synopsis; pages could be written about the detailed specifics involved in each of these topics:

Paperwork and Processes. It can be excruciatingly slow for a fiber overbuilder to get onto poles, and time is money. A new attacher faces processes and paperwork that often seem to exist for no other reason than to slow things down. This can be further exacerbated when the pole owner (such as AT&T) is going to compete with the new attacher, giving the owner an incentive to slow-roll the process, as has been done in several cities with Google Fiber.

Cooperation Among Parties. Even if the paperwork needed to get onto poles isn’t a barrier, one of the biggest delays in the process of getting onto poles can be the requirement to coordinate with all of the existing attachers on a given pole. If the new work requires any changes to existing attachers they must be notified and they must then give permission for the work to be done. Attachers are not always responsive, particularly when the new attacher will be competing with them.

Who Does the Work? Pole owners or existing attachers often require that a new attacher use a contractor that they approve to make any changes to a pole. Getting into the schedule for these approved contractors can be another source of delay if they are already busy with other work. This process can get further delayed if the pole owner and the existing attachers don’t have the same list of approved contractors. There are also issues in many jurisdictions where the pole owner is bound by contract to only use union workers – not a negative thing, but one more twist that can sometimes slow down the process.

Access Everywhere. There are still a few groups of pole owners that are exempt from having to allow attachers onto their poles. The 1996 Act made an exception for municipalities and rural electric cooperatives for some reason. Most of these exempt pole owners voluntarily work with those that want access to their poles, but there are some that won’t let any telecom competitor on their poles. I know competitive overbuilders who were ready to bring fiber to rural communities only to be denied access by electric cooperatives. In a few cases the overbuilder decided to pay a higher price to bury new fiber, but in others the overbuilder gave up and moved on to other markets.

Equity. A new attacher will often find that much of the work needed to be performed to get onto poles is largely due to previous attachers not following the rules. Unfortunately, the new attacher is still generally on the hook for the full cost of rearranging or replacing poles even if that work is the result of poor construction practices in the past coupled with lax inspection of completed work by pole owners.

Enforcement. Perhaps one of the biggest flaws in the current situation is enforcement. While there are numerous federal and state laws governing the pole attachment process, in most cases there are no remedies other than a protracted lawsuit against a pole owner or against an existing attacher that refuses to cooperate with a new attacher. There is no reasonable and timely remedy to make a recalcitrant pole owner follow the rules.

And enforcement can go the other way. Many of my clients own poles and they often find that somebody has attached to their poles without notifying them or following any of the FCC or state rules, including paying for the attachments. There should be penalties, perhaps including the removal of maverick pole attachments.

Wireless Access. There is a whole new category of pole attachments for wireless devices that raises a whole new set of issues. The existing pole attachment rules were written for those who want to string wires from pole to pole, not for placing devices of various sizes and complexities on existing poles. Further, wireless attachers often want to attach to light poles or traffic signal poles, for which there are no existing rules.

Solutions. It’s easy to list all of the problems, and the Sub-Committee for Competitive Access to Broadband Infrastructure is tasked with suggesting solutions to them. Most of these problems have plagued the industry for decades and there are no easy fixes. Since many of the problems stem from pole or wire owners that won’t comply with the current attachment rules, there is no real fix unless there is a way to force them to comply. I’ll be interested to see what this group recommends to the FCC. Since the sub-committee contains many different factions from the industry, it will be interesting to see if they can come to a consensus on any issue.

Do We Need International Digital Laws?

German Chancellor Angela Merkel said a few weeks ago that the world needs international regulation of digital technology, much like we have international regulation for financial markets and banking.

She says that without some kind of regulation, isolated ‘islands’ of bad digital actors can emerge that are a threat to the rest of the world. I am sure her analogy is a reference to the handful of islands around the globe that play that same maverick role in the banking arena.

We now live in a world where a relatively small number of hackers can cause incredible harm. For instance, while never definitively proven, it appears that North Korean hackers pulled off the major hack of Sony a few years ago. There are accusations across western democracies that the Russians have been using hacking to interfere with elections.

Merkel certainly has a valid point. Small ‘islands’ of hackers are one of the major threats we face in the world today. They can cause incredible economic harm. They threaten basic infrastructure like electric grids. They make it risky for anybody to be on the Internet at a time when broadband access is becoming an integral part of the lives of billions.

There currently aren’t international laws that are aimed at fighting the nefarious practices of bad hackers or at punishing them for their crimes. Merkel wasn’t specific about the possible remedies. She said that the US and Germany have undertaken discussions on the topic but that it hasn’t gone very far. There are certainly a few things that would make sense at the international level:

  • Make certain kinds of hacking an international crime so that hacker criminals can more easily be pursued across borders.
  • Create a forum for governments to better coordinate monitoring hackers and devising solutions for blocking or stopping them.
  • Make laws to bring cryptocurrency under the same international auspices as other currencies.

But as somebody who follows US telecom regulation in this blog, I wonder how fruitful such regulations might be. We now live in a world where hackers always seem to be one step ahead of the security industry that works to block them. The cat-and-mouse game between hackers and security professionals is constantly changing, and I have to wonder how any set of rules could be nimble enough to make any difference.

This does not mean that we shouldn’t have an international effort to fight against the bad actors – but I wonder if that cooperation might best be technical cooperation rather than the creation of regulations that might easily be out of date as they are signed into law.

Any attempt to create security regulations also has to wrestle with the fact that a lot of what we think of as hacking is probably really government-sponsored cyberwarfare. How do we tell the difference between cyber-criminals and cyber-warriors? In a murky world where it’s always going to be hard to know who specifically wrote a given piece of code, I wonder how we tell the criminal bad guys from the government bad guys.

I also see a dilemma in that any agreed-upon international laws must, by definition, filter back into US laws. We now have an FCC that is trying to rid itself of having to regulate broadband. Assuming that Title II regulation will be reversed, I have to wonder whether the FCC would be able to require ISPs to comply with any international laws at a time when there might not even be many US laws that can be enforced on them.

It makes sense to me that there ought to be international cooperation in identifying and stopping criminal hackers and others who would harm the web. But I don’t know if there has ever been an issue where the governments of the world engage in many of the same practices as the bad actors – and that makes me wonder if there can ever be any real cooperation between governments to police or control bad practices on the web.

The Fastest ISPs

PC Magazine has been rating ISPs by speed for a number of years. They develop their rankings based upon speed tests taken at their own speed test site, and about 124,000 speed tests fed into this year’s rankings. The score for each ISP is a composite number based 80% on download speed and 20% on upload speed. To be included in the rankings an ISP needed 100 or more customers to take the speed test.
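The 80/20 weighting described above can be sketched in a few lines of code. The weights come from the article; the simple weighted-sum form is an assumption, since PC Magazine doesn’t publish its exact formula, so treat this as illustrative only.

```python
# Illustrative sketch of an 80/20 composite speed score. The 80% download /
# 20% upload weighting is from the article; the weighted-sum form itself is
# an assumption, since the exact PC Magazine formula isn't published.

def composite_score(download_mbps: float, upload_mbps: float) -> float:
    """Weight download speed 80% and upload speed 20%."""
    return 0.8 * download_mbps + 0.2 * upload_mbps

# A hypothetical ISP averaging 60 Mbps down and 10 Mbps up:
print(composite_score(60, 10))  # prints 50.0
```

One consequence of this kind of weighting is that upload speed barely moves the needle – a network could double its upload speed and see only a small bump in its composite score.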

You always have to take these kinds of rankings with a grain of salt for several reasons. For one, speed tests don’t only measure the ISP – they also measure the customer’s own setup. The time of day can affect a speed test, but the type of connection probably affects it the most. We know these days that a lot of people are using out-of-date or poorly located WiFi routers that reduce the speeds at their computer.

Measured speeds also vary between the different speed tests. In writing this blog I took four different speed tests just to see how they compare. The one at the PC Magazine site showed my speeds at 27.5 Mbps down / 5.8 Mbps up. Ookla showed 47.9 Mbps down / 5.8 Mbps up. The Speakeasy speed test showed 17.6 Mbps down / 5.8 Mbps up. Finally, the test from Charter Spectrum, my ISP, showed 31.8 Mbps down / 5.9 Mbps up. That’s a pretty startling range of speeds measured just minutes apart – and it demonstrates why speed test results are not a great measure of actual speeds. I look at these results and have no idea what speed I’m actually receiving. With that said, one would hope that any given speed test is at least consistent in measuring the differences between ISPs.
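A quick calculation on the four download numbers above shows just how scattered they are; nothing in this snippet is assumed beyond the figures quoted in the paragraph.

```python
# The four download results quoted above, in Mbps, measured minutes apart.
downloads = {
    "PC Magazine": 27.5,
    "Ookla": 47.9,
    "Speakeasy": 17.6,
    "Charter Spectrum": 31.8,
}

values = list(downloads.values())
mean = sum(values) / len(values)    # average of the four results
spread = max(values) - min(values)  # gap between fastest and slowest

print(f"mean {mean:.1f} Mbps, spread {spread:.1f} Mbps")
# prints: mean 31.2 Mbps, spread 30.3 Mbps
```

The spread is nearly as large as the average itself, and the fastest measurement is almost triple the slowest – which is exactly why no single speed test tells you much about the speed you are actually getting.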

The results of the speed test ‘contest’ are broken into different categories of ISPs. For years the winner of the annual speed test among the large incumbents has been Verizon FiOS, but in this year’s test it fell to third in its group. Leading that category now is Hotwire Communications, which largely provides broadband to multi-tenant buildings, with a score of 91.3. Second was Suddenlink at 49.1, with Verizon, Comcast and Cox close behind. The lowest in the top 10 was Wow! at a score of 26.7.

Another interesting category is the competitive overbuilders and ISPs. This group is led by Google Fiber with a score of 324.5. EPB Communications, the municipal network in Chattanooga, is second at 136.1. Also in the top 10 are companies like Grande Communications, RCN, and Comporium.

PC Magazine also ranks ISPs by region, and it’s interesting to see how the speeds for a company like Comcast vary in different parts of the country.

Results are also ranked by state, and I find some of the numbers on this list startling. For instance, Texas tops the list with a score of 100.3, followed by South Dakota at 80.3 and Vermont at 70.6. If anything this goes to show that the rankings are not any kind of random sample – it’s impossible to believe this represents the true composite speeds of all the people living in those states. The results of this contest also differ from results from others like Ookla, which looks at millions of actual connection speeds at Internet POPs. Consider Texas. Certainly there are fast broadband speeds in Austin due to Google Fiber, where all of the competitors have picked up their game, and there are rural parts of the state with fiber networks built by telcos and cooperatives. But a lot of the state looks much like anywhere else, with a lot of people on DSL or using something less than the top speeds from the cable companies.

But there is one thing this type of study shows very well: over the years the cable companies have gotten significantly faster. Verizon FiOS used to be far faster than the cable companies and now sits in the middle of the pack with many of them.

This test is clearly not a statistically valid sample, and as my own results from the various speed tests show, it’s not likely even very accurate. But ISPs care about these kinds of tests because being near the top of one of the charts gives them bragging rights. And, regardless of the flaws, one would think the shortcomings of this particular test are similar across the board, which means it does provide a decent comparison between ISPs. That is further validated by the fact that the results of this exercise are pretty consistent from year to year.

Industry Shorts, June 2017

Following are some topics I found of interest but which don’t justify a whole blog.

Amazon Bringing Alexa to Settop Boxes. Amazon has created a developer kit that would allow any settop box maker to integrate its voice service, Alexa. The Alexa voice platform currently powers the popular Echo home assistant device. It’s also being integrated into some new vehicles, and Amazon has made it available for integration into a whole range of home automation devices. The Alexa platform is currently ahead of the competitors at Apple, Google and Microsoft, mostly because Amazon opened the product to developers, who have already created over 10,000 applications that work on the platform. Adding Alexa to a settop box could make it a lot easier to use the settop box as the hub for a smart home.

Comcast Tried to Shut Down anti-Comcast Website. LookingGlass Cyber Security Center, a vendor for Comcast, sent a cease-and-desist letter to the advocacy group Fight for the Future over a website the group operates. The advocacy group claims that Comcast has used bots to generate over a half million fake filings to the FCC in the network neutrality docket. These comments were all in favor of killing net neutrality, and the group claims that Comcast used real people’s names to sign the filings without their permission. The website allows people to see if their name has been used. The cease-and-desist letter was withdrawn after news of it got a lot of coverage on social media.

Net Neutrality Wins in Court. Not that it probably makes much difference now that the FCC is trying to undo Title II regulation, but the challenge filed by Verizon and other large ISPs against the FCC’s net neutrality decision was rejected at appeal. This affirms the ability of the FCC to use Title II rules for regulating broadband. The full U.S. Court of Appeals for the D.C. Circuit upheld an earlier court ruling that affirmed the FCC had the needed authority to implement the net neutrality decision.

Altice Buys Ad-Tech Company. Altice joins the other big ISPs that want to take advantage of the repeal of the new FCC privacy rules, which allows ISPs to monetize customers’ private data. Altice, now the fourth largest US cable company after its acquisition of Cablevision, has added the expertise to slice and dice customer data by paying $300 million for Teads, a company specializing in targeted advertising based upon customer-specific data.

Other large ISPs are already poised to take advantage of the new opportunity. For example, Verizon’s purchases of AOL and Yahoo bring this same expertise in-house. It has been widely speculated that the ISPs have been gathering customer data for many years and are sitting on a huge treasure trove detailing customers’ web browsing, online purchasing habits, email and text information, and, for the wireless ISPs, the location data of cellphones.

Charter Rejects $100 Billion Offer from Verizon. The New York Post reported that Charter rejected a purchase offer from Verizon, reportedly because the offer wasn’t high enough; it also came with tax implications that would complicate the deal. Whether this particular offer is real or not, it points to the continuing consolidation among the industry’s ISPs, cable providers and cellular companies. The current administration is reportedly not against large mergers, so there’s no telling what other megadeals we might see over the next few years.

Top 7 Media CEOs made $343.8 Million in 2016. The CEOs of CBS, Comcast, Discovery Communications, Disney, Fox, Time Warner and Viacom collectively made a record salary last year, up 21.1% from 2015. It’s interesting that at a time when the viewership of specific cable networks is dropping rapidly, the industry would reward its leaders so handsomely. But all of these companies are compensating for customer losses with continuing rate hikes for programming, and most are having banner earnings.

Frontier Lays Off WV Senate President. Frontier just laid off Mitch Carmichael, the President of the Senate in West Virginia. This occurred right after the Senate passed a broadband infrastructure bill that was aimed at bringing more broadband competition to the state. The bill allows individuals or communities to create broadband cooperatives to build broadband infrastructure in areas with poor broadband coverage. Frontier is the predominant ISP in the state after its purchase of the Verizon property there. The West Virginia legislature is a part-time job that pays $20,000 per year and most legislators hold other jobs. West Virginia is at or near the bottom in most statistics concerning broadband speeds and customer penetration rates.

FCC to Investigate MDU Broadband

The FCC is launching an investigation into anticompetitive practices that are keeping broadband from coming to apartments and other multi-tenant buildings. It has issued a Notice of Inquiry in GN Docket 17-142 looking into the topic and is expected later this month to formally release it to the public. The docket specifically looks at barriers to competition in what the FCC calls MTEs – multiple tenant environments, which includes apartments, condominiums, shopping malls and cooperatively owned buildings.

This is not the first time that the FCC has tackled the topic. Back in 2008 the commission banned some contractual arrangements that gave incumbent ISPs unfair advantage over competitors. However, that order didn’t go far enough, and ISPs basically shifted to arrangements that were not banned by the FCC. The FCC is looking into the topic because it’s become obvious that exclusive arrangements are harming the introduction of faster broadband into a sizable portion of the market. There are cities where half or more of residents live in apartments and don’t have the same competitive choices as those living in single family homes.

The FCC has an interesting legal challenge in looking at this issue. This docket specifically looks at the potential for regulating broadband access in MTEs, something that the FCC has the authority to do under Title II regulation. But assuming that the FCC moves forward this year with plans to scrap Title II regulation they might also be eliminating their authority to regulate MTEs in the manner suggested by the NOI. If they decide to act on the issue it will be interesting to see how they define their authority to regulate anything that is broadband related. That might be our first glimpse at what a regulatory regime without Title II looks like.

Further, Chairman Ajit Pai has shown a strong preference for lightening the regulations on ISPs, and you have to wonder if he is willing to really tackle a new set of regulations. But he faces the dilemma confronting all regulators: sometimes the market will not automatically produce results that are beneficial to society, and sometimes regulation is the only way to get corporations and others to behave in a way that benefits everybody. It’s clear that residents in MTEs have little or no competition and choice, and new regulations might be the only way to get it for them.

The NOI looks at specific issues related to MTE broadband competition:

  • It asks if the FCC should consider overriding state and local regulations that inhibit the deployment of broadband in MTEs. Some jurisdictions have franchising and other rules that make it hard for a smaller competitor to try to serve only MTEs or parts of markets.
  • It asks if the FCC should prohibit exclusive marketing and bulk billing arrangements by ISPs.
  • It asks if the FCC should prohibit revenue sharing and exclusive wiring arrangements with ISPs.
  • It asks if there are other kinds of non-contractual practices that should be prohibited or regulated.

The NOI is interesting in that it tackles all of the topics that the FCC left untouched in 2008. When that order came out I remember thinking about all of the loopholes the FCC had left available to ISPs that wanted to maintain an exclusive arrangement with apartment owners. For example, in a bulk billing arrangement a landlord buys wholesale connections from an ISP and then includes broadband or cable TV as part of the rent, at a mark-up. Landlords under such arrangements are unlikely to allow in another competitor since they are profiting from the exclusive arrangement. The FCC at the time didn’t feel ready to tackle the issues associated with regulating landlord behavior.

The NOI asks for comments on the non-contractual practices that inhibit competition. I’ve seen many such practices in the marketplace. For instance, a landlord may tell tenants that they are pro-competition and that they allow access to multiple ISPs, but then charge competing ISPs exorbitant fees for gaining access to buildings, for collocating electronics, or for running wiring. I can think of dozens of different roadblocks that I’ve seen that effectively keep out competitors.

I am heartened a bit by this docket in that it’s the first thing this new FCC has done to solve a problem. Most of the work they’ve done so far is to dismantle old rules to reduce regulation. There is nothing wrong with that in general and I have my own long shopping list of regulations that are out of date or unnecessary. But there are industry issues like this one where regulation is the only way to provide a needed fix to a problem. It’s clear that large ISPs and many landlords have no interest in bringing competition to their buildings. And if that is a goal that the FCC wants to foster, then they are going to have to create the necessary regulations to make it happen – even if they would prefer not to regulate.

Means Testing for FCC Funding – Part II

Yesterday I wrote about the recent blog by FCC Commissioners Michael O’Rielly and Mignon Clyburn that suggests that there ought to be a means test for anybody accepting Universal Service Funds. Yesterday I looked at the idea of using reverse auctions for allocating funds – an idea that I think would only serve to shift broadband funds to slower technologies, most likely rural cellular service for broadband. Today I want to look at two other ideas suggested by the blog.

The blog suggests that rural customers ought to pay more for broadband since it costs more to provide broadband in sparsely populated areas. I think the FCC might want to do a little research and look at the actual prices charged today for broadband where commercial companies have built rural broadband networks. It’s something I look at all of the time and all over the country, and from what I can see the small telcos, cooperatives, WISPs and others that serve rural America today already charge more than what households pay for broadband in urban areas – sometimes considerably more. I am sure there are exceptions to this and perhaps the Commissioners have seen some low rural pricing from some providers. But I’ve looked at the prices of hundreds of rural ISPs and have never seen prices below urban rates.

The small rural ISPs have to make a commercial go of their broadband networks and they’ve realized for years that the only way to do that is to charge more. In most urban areas there is a decent broadband option starting around $40 per month and you rarely see a price close to that in rural America. If you see a low price in rural America it probably offers a very slow speed of perhaps a few Mbps, which certainly doesn’t compare to the 60 Mbps I get from Charter for $44.95 per month.

The issue of rural pricing does raise one policy issue. Historically the Universal Service Fund was used for precisely what this blog seems not to like – to hold telephone rates down in rural America so that everybody in the country could afford to be connected. That policy led to the country having telephone penetration rates north of 98% for decades. I’m not advocating that USF funds ought to be used to directly hold down rural broadband rates, but it’s worth a pause to remember that was the original reason that the Universal Service Fund was started and it worked incredibly well.

The second idea raised by the blog is that Universal Service Funds ought not be used to build broadband to wealthy customers. They suggest that perhaps federal funding ought not to be used to bring broadband to “very rich people who happen to live in the more rural portions of our nation.”  The blog worries that poor urban people will be subsidizing ‘some of the wealthiest communities in America.’  I am sure in making that statement that the Commissioners must have a few real-life communities in mind. But I work all over the country and there are not very many pockets of millionaires in rural America, except perhaps for farmers.

Farmers are an interesting case when it comes to broadband. By definition farmers are rural. But US agriculture is the largest industry in the country and the modern farmer needs broadband to be effective. We are headed soon towards a time when farm yields can increase dramatically by use of IoT sensors, farm robots and other high technology that is going to require broadband. I know that a lot of the rural communities that are clamoring for broadband are farming communities – because those farms are the economic engine that drives numerous counties and regions of the country. I don’t think it’s unreasonable if we are going to rethink policy to talk about bringing broadband to our largest industry.

The FCC blog suggests that perhaps wealthier individuals ought to pay for the cost of getting connected to a broadband network. It’s certainly an interesting idea, and there is precedent. Rural electric companies have always charged the cost of construction to connect customers that live too far from their grid. But with that said we also have to remember that rural electric grids were purposefully built to reach as many people as possible, often with the help of federal funding.

This idea isn’t practical for two reasons. First, it’s already incredibly hard today to finance a fiber network. I picture the practical problem of somehow trying to get commitments from farmers or other wealthy individuals as part of the process of funding and building a broadband network. As somebody who works mostly on financing fiber networks, I believe this requirement would largely kill the funding of new networks. Getting the primary borrower and all of the ‘rich’ people coordinated in order to close a major financing is something that would drive most lenders away – it’s too complicated to be practically applied. The FCC might want to consult with a few bankers before pushing this idea too far.

But there is a more fundamental issue, and the FCC blog touches upon it. I’m trying to imagine the FCC adopting a rule that would require people to disclose their income to some commercial company that wants to build a fiber network. I’m not a lawyer, but that sounds like it would bump against all sorts of constitutional issues, let alone practical ones. For example, can you really picture having to report your income to AT&T?  And then I go back to the farmers again. Farmers don’t make a steady income – they have boom years and bust years. Would we put them on or off the hook for contributing towards a fiber network based upon their most recent year of income?

I certainly applaud the Commissioners for thinking outside the box, and that is a good thing when it leads to discussions of ways to improve the funding process. I will be the first to tell you that the current USF distributions are not always sensible and equitable and there is always room for improvement. Some of the ideas suggested by the blog have been discussed in the past and it never hurts to revisit ideas. But what most amazes me about the suggestions made by this blog is that the proposed solutions would require a heavy regulatory hand – and this FCC, or at least its new Chairman, has the goal of reducing regulation. To impose a means test or income test would go in the opposite direction and would require a new layer of intrusive regulations.

Means Testing for FCC Funding – Part I

A recent blog by FCC Commissioners Michael O’Rielly and Mignon Clyburn asks if there should be a means test in federal high cost programs. This blog is something every telco, school, library or health care provider that gets any form of Universal Service funding needs to read.

There is already some means testing in the Universal Service Fund. For instance, the Lifeline program brings subsidized voice and broadband only to households that meet certain poverty tests. And the Schools and Libraries program uses a means test to make certain that subsidies go to schools with the most low-income students. The FCC blog talks about now applying a means test to the Universal Service Funds that are used to promote rural broadband. There are several of these programs, with the biggest dollar ones being the CAF II funding for large telcos and the ACAM program for small telcos to expand rural broadband networks.
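To make the idea of a means test concrete, here is a minimal sketch of an income test along the lines of the Lifeline standard, which qualifies households at or below 135% of the Federal Poverty Guidelines. The dollar figure used in the example is a hypothetical placeholder, not an actual HHS guideline value.

```python
def lifeline_eligible(household_income, poverty_guideline, threshold=1.35):
    """A simplified means test: eligible if household income is at or
    below threshold * the poverty guideline for that household size.
    (Real Lifeline eligibility also allows qualification through
    participation in programs like SNAP or Medicaid.)"""
    return household_income <= threshold * poverty_guideline

# Hypothetical guideline of $20,000 for the household size:
print(lifeline_eligible(25_000, 20_000))  # 25,000 <= 27,000 -> True
print(lifeline_eligible(30_000, 20_000))  # 30,000 >  27,000 -> False
```

The point of the sketch is that a means test is only as simple as the income data behind it – which is exactly the practical problem discussed later with extending such tests to infrastructure funding.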

The blog brings up the latest buzzword at the FCC – the reverse auction. The FCC embraces the concept that there should be a competition for federal money to expand broadband networks, with the money going to the carrier willing to accept the lowest subsidy to bring broadband to an area. On the surface that sounds like a reasonable suggestion in that it would give money to the company that is the most efficient.
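The selection mechanics of a reverse auction can be sketched in a few lines. The carrier names and dollar amounts below are hypothetical, and a real FCC auction involves far more rules than this – but the sketch makes the core problem visible: nothing in the selection weighs the quality of the technology, only the dollar amount.

```python
def reverse_auction_winner(bids):
    """bids: dict mapping carrier name -> subsidy the carrier is willing
    to accept (in dollars). The lowest bid wins. Note that the proposed
    technology never enters the comparison."""
    return min(bids.items(), key=lambda item: item[1])

# Hypothetical bids for the same rural area:
bids = {
    "Fiber Co-op": 9_000_000,       # fiber-to-the-home build
    "Cellular Carrier": 4_500_000,  # fixed cellular broadband
}
winner, subsidy = reverse_auction_winner(bids)
print(winner, subsidy)  # the cheaper technology wins on price alone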

But in real-life practice reverse auctions don’t work, at least for building rural broadband networks. Today these FCC infrastructure programs are aimed at bringing broadband to places that don’t have it. And the reason they don’t have it is because the areas are largely rural and sparsely populated, which makes broadband infrastructure costly to build. In most of these places nobody is willing to build without significant government subsidy because there is no reasonable business plan using commercial financing.

If there was a reverse auction between two companies willing to bring fiber to a given rural area, then in my experience there wouldn’t be much difference between them in terms of the cost to build the network. They have to deploy the same technology over the same roads to reach the same customers. One might be slightly lower in cost, but not enough to justify going through the reverse auction process.

And that is the big gotcha with the preference for reverse auctions. A reverse auction will always favor somebody using a cheaper technology. And in rural broadband, a cheaper technology means an inferior technology. It means using federal funding to expand DSL or cellular wireless as is being done with big telco CAF II money instead of building fiber, as is being done by the small telcos accepting ACAM money.

Whether intentional or not, the FCC’s penchant for favoring reverse auctions would shift money from fiber projects – mostly being done by small telcos – to the wireless carriers. It’s clear that building cellular technology in rural areas is far cheaper than building fiber. But to use federal money to build inferior technology means relegating rural areas to dreadfully inadequate broadband for decades to come.

Forget all of the hype about how 5G cellular is going to bring amazing broadband speeds – and I hope the FCC Commissioners have not bought into the cellular companies’ press releases. Because in rural areas fast 5G requires bringing fiber very close to customers – and that means constructing nearly the same fiber networks needed to provide fiber into homes. The big cellular companies are not going to invest in rural 5G any more than the big telcos have ever invested in rural fiber. So a reverse auction would divert federal funds to Verizon and AT&T to extend traditional cellular networks, not for super-fast wireless networks.

We already know what it looks like to expand rural cellular broadband. It means building networks that deliver perhaps 20 Mbps to those living close to cell towers and something slower as you move away from the towers. That is exactly what AT&T is building with their CAF II funding today. AT&T is taking $426 million per year for six years, or $2.5 billion in total, to expand cellular broadband in rural areas. As I’ve said many times in the past this is perhaps the worst use of federal telecom funding I have ever seen. Customers on these cellular networks are getting broadband on day one that is too slow and that doesn’t even meet the current FCC’s definition of broadband. And in the future these customers and rural communities are going to be light-years behind the rest of the country as household demand for broadband continues to grow at a torrid pace while these customers are stuck with an inadequate technology.
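The CAF II arithmetic cited above is straightforward to verify:

```python
# AT&T's annual CAF II support and term, as cited above
annual_support = 426_000_000  # dollars per year
years = 6
total = annual_support * years
print(f"${total:,}")  # $2,556,000,000 -- roughly the $2.5 billion cited
```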

The FCC blog also mentions the concept of possibly redirecting future USF payments, and if I am a small telco that scares me to death. This sounds like the FCC may consider redirecting the already-committed ACAM funding. Numerous small telcos just accepted a 10-year commitment to receive ACAM funding from the USF Fund to expand broadband in rural areas, and many are already borrowing matching funds from banks based upon that commitment. Should that funding be redirected into a reverse auction, these small companies will not be able to complete their planned expansion, and if they already borrowed money based upon the promise of that ACAM funding they could find themselves in deep financial trouble.