A New FCC Definition of Broadband?

Section 706 of the Telecommunications Act of 1996 requires that the FCC annually review broadband availability in the country. Further, that section of law requires the FCC to take immediate action if it finds that broadband is not being deployed fast enough. This is the law that in the past prompted the FCC to set a definition of broadband – first set at 4/1 Mbps a decade ago, then updated to 25/3 Mbps in 2015. The FCC felt it couldn’t measure broadband deployment without a benchmark.

In this year’s annual proceeding the FCC has suggested a change in the definition of broadband. They are suggesting there should be a minimum benchmark of 10/1 Mbps used to define cellular broadband. That doesn’t sound like a bad idea since almost everybody uses cellular broadband at times and it would be good to know that the cellular companies have a speed target to shoot for.

But I am alarmed at how the FCC wants to use the newly proposed cellular broadband standard. They are suggesting that cellular service that meets the 10/1 Mbps standard can be considered a substitute for a landline broadband connection that meets the 25/3 Mbps test. This would represent a huge policy shift at the FCC because using the cellular standard would allow them to claim that most Americans can get broadband. And that would relieve them of having to take any action to make broadband better in the country.

We can’t be particularly surprised by this shift in policy because now-Chairman Ajit Pai vociferously objected when the FCC increased the definition of broadband in January 2015 to 25/3 Mbps. He argued at the time that the speed definition of broadband should not be increased and that both satellite and cellular broadband ought to be considered as substitutes for landline broadband.

But as almost anybody with a broadband connection can tell you, speed is not the only parameter that matters. Speed matters for folks in a busy broadband home like mine when different family members are trying to make simultaneous broadband connections. But even homes with lower broadband needs care about more than speed. The limiting factor with cellular data is the stingy amount of total downloads allowed in a month. The new ‘unlimited’ cellular plans are capped at 20 to 25 gigabytes per month. And satellite data not only has stingy data caps but also suffers from latency issues that mean a satellite customer can’t take part in any real-time activity on the web such as VoIP, distance learning or live streaming video.

There are several possible motives for this policy shift. First, this could just be an attempt by the FCC to take the pressure off of having to promote faster broadband everywhere. If their annual Section 706 examination concludes that most people in the country have broadband then they don’t have to push expensive federal programs to expand broadband coverage. But there is also the potential motive that this has been prompted by the cellular companies that want even more federal money to expand their rural cellular networks. AT&T has already been given billions in the CAF II proceeding to largely improve rural cellular towers.

Regardless of the motivation this would be a terrible policy shift. It would directly harm two huge groups of people – rural America and the many urban pockets without good broadband. This ruling would immediately mean that all urban areas would be considered to have broadband today along with a lot of rural America.

I don’t think this FCC has any concept of what it’s like living in rural America. There are already millions of households that use cellular or satellite broadband. I’ve heard countless stories from households with schoolkids who spend upwards of $500 per month for cellular broadband – and even at that price these homes closely monitor and curtail broadband usage.

There are also huge swaths of rural America that barely have cellular voice service, let alone 10/1 Mbps cellular broadband. I was recently in north-central Washington state and drove for over an hour with zero AT&T cell coverage. But even where there is cellular voice service the quality of broadband diminishes with distance from a cell tower. People living close to a tower might get okay cellular data speeds, but those even just a few miles away get greatly diminished broadband.

I know that Chairman Pai has two kids at home in Arlington, Virginia. There he surely has fast broadband available from Comcast, and if he’s lucky he also has a second fast alternative from Verizon FiOS. Before the Chairman decides that cellular broadband ought to be a substitute for a landline connection I would challenge him to cut off his home broadband connection and use only cellular service for a few months. That would give him a taste of what it’s like living in rural America.

White Space Spectrum for Rural Broadband – Part II

Word travels fast in this industry, and in the last few days I’ve already heard from a few local initiatives that have been working to get rural broadband. They’re telling me that the naysayers in their communities are now pushing them to stop working on a broadband solution since Microsoft is going to bring broadband to rural America using white space spectrum. Microsoft is not going to be doing that, but some of the headlines could make you think they are.

Yesterday I talked about some of the issues that must be overcome in order to make white space spectrum viable. It certainly is no slam dunk that the spectrum is going to be usable for unlicensed deployment under the FCC spectrum plan. And as we’ve seen a few times in recent decades, it doesn’t take a lot of uncertainty for a spectrum launch to fall flat on its face.

With that in mind, let me discuss what Microsoft actually said in both their blog and whitepaper:

  • Microsoft will partner with telecom companies to bring broadband by 2022 to 2 million of the 23.4 million rural people that don’t have broadband today. I have to assume that these ‘partners’ are picking up a significant portion of the cost.
  • Microsoft hopes their effort will act as a catalyst for this to happen in the rest of the country. Microsoft is not themselves planning to fund or build to the remaining rural locations. They say that it’s going to take some combination of public grants and private money to make the numbers work. I just published a blog last Friday talking about the uncertainty of having a federal broadband grant program. Such funding may or may not ever materialize. I have to wonder where the commercial partners are going to be found who are willing to invest the $8 billion to $12 billion that Microsoft estimates this will cost.
  • Microsoft only thinks this is viable if the FCC follows their recommendation to allocate three channels of unlicensed white space spectrum in every rural market. The FCC has been favoring creating just one channel of unlicensed spectrum per market. The cellular companies that just bought this spectrum are screaming loudly to keep this at one channel per market. The skeptic in me says that Microsoft’s white paper and announcement are a clever way for Microsoft to put pressure on the FCC to free up more spectrum. I wonder if Microsoft will do anything if the FCC sticks with one channel per market.
  • Microsoft admits that for this idea to work, manufacturers must mass-produce the needed components. This is the classic chicken-and-egg dilemma that has killed other deployments of new spectrum. Manufacturers won’t commit to mass-producing the needed gear until they know there is a market, and carriers are going to be leery about using the technology until there are standardized mass market products available. This alone could kill this idea just as the FCC’s plans for the LMDS and MMDS spectrum died in the late 1990s.

I think it’s also important to discuss a few important points that this whitepaper doesn’t talk about:

  • Microsoft never mentions the broadband data speeds that can be delivered with this technology. The whitepaper does talk about being able to deliver broadband to about 10 miles from a given tower. One channel of white space spectrum can deliver about 30 Mbps up to 19 miles in a point-to-point radio shot. From what I know of the existing trials these radios can deliver speeds of around 40 Mbps at six miles in a point-to-multipoint network, and less speed as the distance increases. Microsoft wants multiple channels in a market because bonding multiple channels could greatly increase speeds, perhaps to 100 Mbps. Even with one channel this is great broadband for a rural home that’s never had broadband. But the laws of physics mean these radios will never get faster, and those will still be the speeds offered a decade or two from now, when they are going to feel like slow DSL does today. Too many broadband technology plans fail to recognize that our demand for broadband has been doubling every three years since 1980 (see the sketch after this list). What’s a pretty good speed today can become inadequate in a surprisingly short period of time.
  • Microsoft wants to be the company that operates the wireless databases behind this and other spectrum bands. That gives them a profit motive for seeing the spectrum put to use. There is nothing wrong with wanting to make money, but this is not a 100% altruistic offer on their part.
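To put that doubling claim in perspective, here is a quick back-of-the-envelope sketch. The 40 Mbps starting point is the point-to-multipoint speed mentioned above, the doubling-every-three-years growth rate is the one cited in the bullet, and the projection itself is just my own illustration, not a forecast:

```python
# Rough illustration of how a fixed-speed technology ages if household broadband
# demand keeps doubling every three years. The 40 Mbps starting point is the
# point-to-multipoint white space speed mentioned above; the projection is
# illustrative only.

def demand_multiple(years, doubling_period_years=3):
    """How many times larger demand is after the given number of years."""
    return 2 ** (years / doubling_period_years)

today_mbps = 40  # roughly what one white space channel delivers at six miles

for years in (3, 6, 9, 12):
    needed = today_mbps * demand_multiple(years)
    print(f"In {years:2d} years, a home that needs {today_mbps} Mbps today "
          f"would need roughly {needed:.0f} Mbps")
```

By that simple math, a connection that feels adequate today covers only about a tenth of what the same home is likely to want a decade from now.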

It’s hard to know what to conclude about this. Certainly Microsoft is not bringing broadband to all of rural America, but it sounds like they are willing to work towards making that happen. Still, we can’t ignore the huge hurdles that must be overcome to realize the vision painted by Microsoft in the white paper.

  • First, the technology has to work, and the interference issues I discussed in yesterday’s blog need to be solved before anybody will trust using this spectrum on an unlicensed basis. Nobody will use this spectrum if unlicensed users constantly get bumped off by licensed ones. The trials done for this spectrum to date were not done in a busy spectrum environment.
  • Second, somebody has to be willing to fund the $8B to $12B Microsoft estimates this will cost. There may or may not be any federal grants ever available for this technology, and there may never be commercial investors willing to spend that much on a new technology in rural America. The fact that Microsoft thinks this needs grant funding tells me that a business plan based upon this technology might not stand on its own.
  • Third, the chicken-and-egg issue of getting over the hurdle to have mass-produced gear for the spectrum must be overcome.
  • Finally, the FCC needs to adopt Microsoft’s view that there should be 3 unlicensed channels available everywhere – something that the license holders are strongly resisting. And from what I see of the current FCC, there is a good chance that they are going to side with the big cellular companies.

White Space Spectrum for Rural Broadband – Part I

Microsoft has announced that they want to use white space spectrum to bring broadband to rural America. In today’s and tomorrow’s blogs I’m going to discuss the latest thoughts on the white space spectrum. Today I’ll discuss the hurdles that must be overcome to use the spectrum, and tomorrow I will discuss in more detail what I think Microsoft is really proposing.

The spectrum being called white space has historically been used to transmit television over the air. In the recent incentive auction the FCC got a lot of TV stations to migrate their signals elsewhere to free up this spectrum for broadband uses. And in very rural America much of this spectrum has been unused for decades.

Before Microsoft or anybody can use this spectrum on a widespread basis the FCC needs to determine how much of the spectrum will be available for unlicensed use. The FCC has said for several years that they want to allocate at least one channel of the spectrum for unlicensed usage in every market. But Microsoft and others have been pushing the FCC to allocate at least three channels per market and argue that the white space spectrum, if used correctly, could become as valuable as WiFi. It’s certainly possible that the Microsoft announcement was aimed at putting pressure on the FCC to provide more than one channel of spectrum per market.

The biggest issue that the FCC is wrestling with is interference. One of the best characteristics of white space spectrum is that it can travel great distances. The spectrum passes easily through things that kill higher frequencies. I remember as a kid being able to watch UHF TV stations in our basement that were broadcast from 90 miles away from a tall tower in Baltimore. It is the ability to travel significant distances that makes the spectrum promising for rural broadband. Yet these great distances also exacerbate the interference issues.

Today the spectrum has numerous users. There are still some TV stations that did not abandon the spectrum. There are two bands used for wireless microphones. There was a huge swath of this spectrum just sold to various carriers in the incentive auction that will probably be used to provide cellular data. And the FCC wants to create the unlicensed bands. To confound things, the mix between the various users varies widely by market.

Perhaps the best way to understand white space interference issues is to compare them to WiFi. One of the best characteristics (and many would say one of the worst) of WiFi is that it allows multiple users to share the bandwidth at the same time. These multiple uses cause interference and so no user gets full use of the spectrum, but this sharing philosophy is what made WiFi so popular – except in the most crowded environments, anybody can create an application using WiFi and know that in most cases the bandwidth will be adequate.

But licensed spectrum doesn’t work that way and the FCC is obligated to protect all spectrum license holders. The FCC has proposed to solve the interference issues by requiring that radios be equipped so that unlicensed users first dynamically check to make sure there are no licensed uses of the spectrum in the area. If they sense a licensed use they cannot broadcast, or, if they are already broadcasting when a licensed use appears, they must abandon the channel.

This would all be done by using a database that identifies the licensed users in any given area along with radios that can search for licensed usage before making a connection. This sort of frequency scheme has never been tried before. Rather than sharing spectrum, like WiFi, the unlicensed user will only be allowed to use the spectrum when there is no interference. As you can imagine, the licensed cellular companies, which just spent billions for this spectrum, are worried about interference. But there are also concerns from churches, city halls and musicians who use wireless microphones.
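The proposed protection scheme is easier to picture as a sketch. The code below is purely conceptual – the database query and radio methods are hypothetical names I made up to illustrate the idea, not a real API – but it captures the check-before-transmit, vacate-on-detection behavior the FCC is proposing:

```python
# Conceptual sketch of the proposed white space access rules: an unlicensed radio
# must consult a database of licensed users before transmitting, and must vacate
# the channel if a licensed use shows up. Every object and method here
# (database.query_licensed_users, radio.sense_licensed_signal, etc.) is hypothetical.

def channel_is_clear(database, channel, location):
    """True only if the database shows no licensed user on this channel at this location."""
    return len(database.query_licensed_users(channel, location)) == 0

def run_unlicensed_link(radio, database, channel, location):
    if not channel_is_clear(database, channel, location):
        return  # a licensed user is registered here, so never start transmitting

    radio.start_transmitting(channel)
    while radio.is_active():
        # Once on the air, keep re-checking; the licensed user always wins.
        if radio.sense_licensed_signal(channel) or not channel_is_clear(database, channel, location):
            radio.stop_transmitting()  # abandon the channel immediately
            break
```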

It seems unlikely to me that unlicensed white space spectrum is going to be very attractive in an urban area with a lot of usage on the spectrum. If it’s hard to make or maintain an unlicensed connection then nobody is going to try to use the spectrum in a crowded-spectrum environment.

The question that has yet to be answered is whether this kind of frequency plan will work in rural environments. There have been a few trials of this spectrum over the past five years, but those tests really proved the viability of the spectrum for providing broadband and did not test the databases or the interference issue in a busy spectrum environment. We’ll have to see what happens in rural America once the cellular companies start using the spectrum they just purchased. Because of the great distances at which the spectrum is viable, I can imagine a scenario where the use of licensed white space in a county seat might make it hard to use the spectrum in adjoining rural areas.

And like any new spectrum, there is a chicken-and-egg situation with the wireless equipment manufacturers. They are not likely to commit to making huge amounts of equipment, which would make this affordable, until they know that this is really going to work in rural areas. And we might not know if this is going to work in rural areas until there have been mass deployments. This same dilemma largely sank the use of the LMDS and MMDS spectrum fifteen years ago.

The white space spectrum has huge potential. One channel can deliver 30 Mbps to the horizon on a point-to-point basis. But there is no guarantee that the unlicensed use of the spectrum is going to work well under the frequency plan the FCC is proposing.

AT&T’s CAF II Data Caps

AT&T recently launched its CAF II cellular data plan in a number of rural areas. This is being launched from the federal program that is giving AT&T $2.5 billion spread over 6 years to bring broadband to about 1.1 million homes. That works out to roughly $2,300 per home.

Customers are guaranteed speeds of at least 10 Mbps down and 1 Mbps up. The broadband product is priced at $60 per month with a contract or $70 per month with no contract. Installation is $99. The product comes with a WiFi router that also includes 4 Ethernet ports for wired connections.

For a rural household that has never had broadband this is finally going to get them connected to the web like everybody else. But the 10 Mbps speed of the product is already obsolete, and in the footnotes to the product AT&T warns that a customer may not be able to watch two HD video streams at the same time.

But the real killer is the data cap which is set at 160 gigabytes per month. Extra data above this limit will cost a household $10 for each 50 gigabytes (or fraction thereof). AT&T has obviously set the data cap this low because that was the cap suggested by the FCC in the CAF II order.

Let me throw out some numbers that shed light on how puny the 160 GB monthly cap is. Following are typical data usage figures for common functions in the home:

  • The average desktop or laptop uses about 3 GB per month for basic functions like email, upgrading software, etc.
  • Cisco says that the average smartphone uses about 8 GB per month on WiFi.
  • Web browsing uses about 150 MB per hour.
  • Streaming music uses 1 GB for 24 hours of streaming.
  • Facebook estimates that its average user uses the service for 20 hours per month, which consumes 2.5 GB.
  • Video is the real bandwidth eater. Netflix says that an SD video uses 0.7 GB per hour or 1.4 GB for a movie. They say HD video uses 3 GB per hour or 6 GB per movie.
  • The average online gamer uses at least 5 GB per month, and for some games much more than this.

So how does all of this stack up for an average family of three? It might look something like this:

  • 3 computers / laptops – 9 GB
  • 3 smartphones – 24 GB
  • 60 hours of web browsing – 9 GB
  • 3 social networks – 8 GB
  • 60 hours of streaming music – 3 GB
  • 1 gamer – 5 GB
  • Schoolwork – 10 GB

Subtotal – 68 GB

This leaves 92 GB for watching video each month. That will allow a home to watch 15 HD movies a month or 30 one-hour shows. That means one TV show per day for the whole household. Any more than that and you’d go over the data cap. The majority of video content on the web is now only available in HD and much of the content on Netflix and Amazon no longer comes in SD. To make matters worse, these services are now starting to offer 4K video which is 4 times more data intensive than HD video.
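Here is the arithmetic behind those numbers, using the 160 GB cap, the 68 GB non-video subtotal above and Netflix’s published per-hour figures:

```python
# Arithmetic behind the example above: how much HD video fits in what is left of
# a 160 GB cap after the family's 68 GB of non-video usage.

data_cap_gb = 160
non_video_gb = 68        # subtotal from the list above
hd_gb_per_hour = 3       # Netflix's figure for HD video
hd_gb_per_movie = 6      # roughly a two-hour HD movie

video_budget_gb = data_cap_gb - non_video_gb       # 92 GB left for video
hd_movies = video_budget_gb // hd_gb_per_movie     # about 15 movies
hd_show_hours = video_budget_gb // hd_gb_per_hour  # about 30 one-hour shows

print(f"{video_budget_gb} GB left for video: about {hd_movies} HD movies "
      f"or {hd_show_hours} one-hour HD shows per month")
```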

Also note that this subtotal doesn’t include other normal functions. Working from home can use a lot of bandwidth. Taking online courses is data intensive. IoT devices like home security cameras can use a lot of bandwidth. And we are starting to see smart home devices add up to a pile of data that goes on behind the scenes without our knowledge.

The fact is that within a few years the average home is going to likely exceed the AT&T data cap without watching any video. The bandwidth used for everything we do on the web keeps increasing over time.

To show how ridiculously low this cap is, compare it to AT&T’s ‘Access’ program which supplies broadband to low-income homes at speeds up to the same 10 Mbps for prices up to $10 per month. That low-income plan has a 1 terabyte data cap – over six times higher than the CAF II data cap. Since the company offers both products from the cellular network it’s impossible for the company to claim that the data caps are due to network constraints or any other technical issues. AT&T set the data cap at the low 160 GB because the FCC stupidly suggested that low amount in the CAF II order. The low data cap is clearly about money.

The last time we measured our home with 3 users we used over 700 GB per month. We are cord cutters and watch all video on the web. We work from home. And our daughter was taking online classes. Under the AT&T CAF II product our monthly bill would be $170 per month. And even then we would have a data product that would not allow us to do the things we want to do, because the 10 Mbps download speed would not allow all three of us to use the web at the same time. If you’ve been reading my blog you’ve heard me say often what a colossal waste of money the CAF II program is. The FCC gave AT&T $2.5 billion to foist this dreadful bandwidth product on rural America.
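For what it’s worth, here is how a month like ours prices out under the plan described above – the $60 contract price, the 160 GB cap and the $10 charge per 50 GB block (or fraction thereof) all come from the product terms discussed earlier:

```python
import math

# What a 700 GB month costs on the CAF II plan described above: $60 base price
# (with contract), 160 GB included, then $10 for each 50 GB block or fraction thereof.

def caf2_monthly_bill(usage_gb, base_price=60, cap_gb=160,
                      block_gb=50, block_price=10):
    overage_gb = max(0, usage_gb - cap_gb)
    blocks = math.ceil(overage_gb / block_gb)  # any fraction of a block is billed in full
    return base_price + blocks * block_price

print(caf2_monthly_bill(700))  # 540 GB over the cap = 11 blocks -> $170
```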

FCC Reverses 2016 Privacy Ruling

The FCC adopted an order that formally recognized that the privacy rules passed by the Tom Wheeler FCC are cancelled and that the FCC will revert to the privacy rules that were previously in effect. The action is mostly a clarification because Congress passed H.J. Res 34, a resolution under the Congressional Review Act that nullified the privacy rules adopted by the last FCC.

This FCC order means a number of things. For regulated telephone providers (both LECs and CLECs) it means that all of the previous rules generally referred to as the Customer Proprietary Network Information (CPNI) rules are back in effect. Those rules are codified in FCC Rules Section 64.2009(e) and (c). Those rules include:

  • An obligation to not disclose telephone customer data without permission from the customer.
  • An annual compliance certification to demonstrate compliance with the CPNI rules. This filing is due again next year, no later than March 1.
  • Compliance with various recordkeeping rules that would demonstrate compliance should a carrier ever be audited.

The FCC also reminded non-regulated ISPs that while they are not directly subject to the CPNI rules, they are still subject to Section 222 of the Communications Act, which says that all carriers must take reasonable and good faith steps to protect customer privacy.

The rules passed by the last FCC would have brought all ISPs into the same regulations as telcos. And in doing so the rules went further than in the past and required that any service provider get customer buy-in before using their data. Customers were to have been provided with the option to allow ISPs to use data for any purpose, to allow ISPs to use data just for marketing to the customer, or customers could have opted out and chosen full privacy.

One of the big public fears voiced in opposition to the congressional action that reversed the privacy rules is that ISPs are now free to use customer information in any manner and could even go so far as to ‘sell the browsing history’ of customers on the open market. If ISPs misuse customer broadband data in too egregious a manner I guess we’ll have to wait for a specific complaint under the Section 222 rules to see what level of protection customer data actually has.

All of the big ISPs have come out and said that they would never sell customer browsing data, and it’s probable that even under the older rules that are still in place that directly selling specific customer data might be illegal.

But we know that the big ISPs have all made plans to monetize customer data, and many of them have already been doing that for years. The most likely use of customer data will be for the biggest ISPs to engage in the same kind of advertising that is being done by Google and Facebook. The social media companies have built detailed profiles of their customers, something that advertisers find valuable. But the ISPs have a big advantage over the social media companies in that they know a lot more about customers, including all of the web searches they make and all of the web sites they visit. The big ISPs all have branches of their business that are focusing on this kind of advertising, and even a smaller ISP like Altice recently purchased a company that creates targeted advertising based upon customer profiles.

There was an article in Forbes earlier this year by Thomas Fox-Brewster that speculated that targeted advertising is what the ISPs really want. They look at the gigantic revenues being earned by Google and Facebook and want a piece of that action. He doesn’t believe that the ISPs will directly sell data, which might invite retaliation from future regulators. But he does speculate that over time customer information from the ISPs will leak into the public through the companies that use their data for targeted advertising. The web advertisers are not bound by any legal restrictions on using purchased data, and over time, as they run a series of ad campaigns, they could effectively build pretty detailed customer profiles.

Certainly this is of concern to many people. People are free to avoid services like Facebook or Google if they want to maintain privacy, but it takes a lot of effort to hide from their ISP. And while ISPs are probably never going to directly market a database that shows a given customer’s browsing history, as they use our data for advertising purposes they are going to be providing bits and pieces about each of us that over time can be reassembled to create incredibly detailed profiles. Folks who are savvy and concerned about this are going to thwart the ISPs as much as possible through the use of VPNs and other tools to hide their web activity. But it’s likely that most people won’t do this, and I would expect over the next few years to see the ISPs pop onto the radar in a big way as advertisers.

Broadband Deserts

I recently spoke with a guy who lives 30 miles outside one of the larger cities in North Carolina. Where he lives there is zero broadband. It’s hilly and full of pine trees and they can’t even easily get satellite broadband. There is zero cellular coverage. Even though most of the people who live there work in the nearby city, from a broadband perspective it’s at the end of the earth. And there is very little prospect of anybody bringing broadband there without some sort of grant or other financial assistance to help pay for the construction.

This guy’s horror stories are familiar to the millions of others who live in similar situations. In order to send or receive an email he has to drive to the top of a hill 20 minutes from his house, which is dangerous due to narrow road shoulders and heavy truck traffic. If schoolkids or others in his area want a real broadband connection they have to go even further to find one of a handful of public hot spots.

Just recently there was a blog by two of the FCC Commissioners that suggested that the ‘wealthy’ people that live in rural areas ought to pay for their own broadband connections. But this guy’s neighborhood is not full of millionaires enjoying a mountain get-away retreat – they are everyday working people. The homes in his neighborhood were built decades ago before there was broadband, so nobody can be faulted for moving to the “sticks” and then demanding city services.

I would describe where he lives as a “broadband desert.” Broadband deserts are areas of our country that for some reason are unlikely to get broadband in any form. Sometimes they are remote (often due to terrain), and sometimes these are just the leftover places that never even got good telephone copper wires. These broadband deserts are not just in the most remote parts of the northern Rocky Mountains as those in the federal government might imagine – there are pockets of broadband deserts around all of us. A few years ago I was working with one of the most populous counties in Minnesota and they were shocked when our research identified a number of such pockets scattered around their county.

I believe these broadband desert areas are on the verge of being abandoned and blowing away if we can’t find a way to get them broadband. We have had three other major events in US history that have created ghost towns. The first was when many little towns were bypassed by the railroads in the late 1800s and disappeared as a result. Second, in the early 20th century, towns that didn’t get electricity faded away. And finally, later in the 20th century, a number of little towns that were bypassed by the then-new Interstate highway system languished and many have now disappeared.

Broadband is next on this list of breakthrough technologies that will be vital for communities to continue to flourish. Towns without broadband are going to become irrelevant. People who live in these broadband deserts will soon be unable to sell their homes and will eventually walk away from them. And certainly nobody is going to build new homes or bring new businesses to places with no broadband. Families won’t be able to raise kids in the broadband deserts. People won’t be able to partake in the work-at-home economy that is proving to be a boon to a lot of rural America. Communities that don’t find a broadband solution are going to dry up and blow away just like the towns in the old west that were bypassed by railroads.

Probably the worst reality for this guy I was talking with was that he knows what broadband can do for a town. He runs a gigabit broadband network for a city that is used to connect city buildings. I met him at a meeting of CLIC NC, a local chapter of the Coalition for Local Internet Choice – a group that strongly advocates for gigabit fiber networks. He can see that many similar neighborhoods in North Carolina are getting fiber to their homes because they happen to be in an area where a cooperative or small telco is willing to invest. But the majority of these rural broadband deserts are not so lucky and there is nobody even thinking about bringing them broadband.

The clock is ticking for these broadband deserts. If there is not a solution to help these folks within a decade it might be too late. People in these broadband deserts will reluctantly leave, and it doesn’t take much of an exodus to push any town or neighborhood onto the irreversible path towards becoming an abandoned ghost town.

It seems to me that a lot of regulators and policy people are okay with this. These neighborhoods are often not large (although in some places entire counties have no real broadband) and by being scattered they don’t have much political clout. Without broadband and cellphone coverage they can’t even call or email to complain. They are easily ignored and easily forgotten.

But it doesn’t have to be that way. The Universal Service Fund was originally created to fix the problem of bringing telephone service to the most remote parts of the country. The original genesis of the USF was the belief that we are stronger as a nation when we are all connected. And that is probably truer for broadband than it was for telephone service. As a country we have the money to get broadband to everybody, but the question that most matters is if we have the political will.

False Advertising of Broadband Speeds

There is another new and interesting regulatory battle happening now at the FCC. The lobbying groups that represent the telcos and cable companies – NCTA, USTelecom and the ACA – are asking the FCC to make it harder for states to sue ISPs for making misleading claims about broadband speeds.

This request was brought about from a lawsuit by the State of New York against Charter Spectrum. I wrote about that case in a March blog and won’t repeat all of the particulars. Probably the biggest reason for the suit was that Charter had not made the major upgrades and improvements that they had promised to the State. But among the many complaints, the one that worried the other ISPs the most was that Charter was not delivering the speeds that it was advertising.

This is an issue that affects many ISPs, except perhaps for some fiber networks that routinely deliver the speeds they advertise. Customer download speeds can vary for numerous reasons, with the primary reason being that networks bog down under heavy demand. But the New York complaint against Charter was not about data speeds that slowed down in the evenings; rather the complaint was that Charter advertised and sold data products that were not capable of ever reaching the advertised speeds.

Charter is perhaps the best poster child for this issue, and not just because of the New York case. On their national website they only advertise a speed of 60 Mbps download, with the caveat that this is an ‘up to’ speed. I happen to have Charter in Asheville, NC and much of the time I am getting decent speeds at or near the advertised speed. But I work all over the country and I am aware of a number of rural Charter markets where the delivered speeds are far below that 60 Mbps advertised speed. These markets appear to be like the situation in New York, where the State accuses Charter of false advertising.

The filings to the FCC want a clarification that what Charter is doing is okay and that ISPs ought to be exempt from these kinds of suits. They argue that the ISP industry has always sold ‘up to’ speeds and that what they are doing fits under existing FCC regulations. And this is where it gets murky.

In the FCC’s first attempt to introduce net neutrality the FCC ordered ISPs to disclose a lot of information to customers about their broadband products, including telling them the real speeds they could expect from their purchased broadband product. Much of that first net neutrality decision was struck down due to a successful lawsuit by Verizon, which claimed that the FCC didn’t have the authority to regulate broadband in the manner laid out in that order.

But not all of that first order was reversed by the lawsuit, including the provision that ISPs had to disclose information about their network performance, fees, data caps, etc. But since most of the original net neutrality order was reversed the FCC put the implementation of the remaining sections on hold. Last year the FCC finally decided to implement a watered-down version of the original rules, and in February of this year the FCC excused smaller ISPs from having to make the customer disclosures. But the large ISPs are now required to report specific kinds of information to customers. The ISPs interpret the current FCC rules to mean that selling ‘up to’ data products is acceptable.

Where this really gets interesting from a regulatory perspective is that the FCC might not long have the authority to deal with these sorts of requests from the ISPs. The bulk of the FCC’s authority to regulate broadband (and thus to potentially shield the ISPs in this case if they are complying with FCC regulations) comes from Title II regulation.

But the FCC seems likely to relinquish Title II authority and they have suggested that the authority to regulate ISP products should shift to the Federal Trade Commission. Unfortunately for the ISPs, the FTC has more often sided with consumers over big companies.

Since the FCC is in the process of eliminating Title II authority I wonder if they will even respond to the ISPs. Because to clarify that advertising ‘up to’ products is acceptable under Title II would essentially mean creating a new broadband regulation, something this FCC seems loath to do. I’ve seen several other topics just recently that fall into this same no-man’s land – issues that seem to require Title II authority in order for the FCC to have jurisdiction. As much as the big ISPs complained about Title II, one has to wonder if they really want it to go away. They mostly feared the FCC using Title II to address pricing issues, but there are a lot of other issues, like this request, where broadband regulation by the FCC might be in the ISPs’ favor.

FCC to Investigate MDU Broadband

The FCC is launching an investigation into anticompetitive practices that are keeping broadband from coming to apartments and other multi-tenant buildings. They have issued a Notice of Inquiry in GN Docket 17-142 looking into the topic and are expected later this month to formally release it to the public. The docket specifically looks at barriers to competition in what the FCC is calling MTEs – multiple tenant environments, which includes apartments, condominiums, shopping malls and cooperatively owned buildings.

This is not the first time that the FCC has tackled the topic. Back in 2008 the commission banned some contractual arrangements that gave incumbent ISPs unfair advantage over competitors. However, that order didn’t go far enough, and ISPs basically shifted to arrangements that were not banned by the FCC. The FCC is looking into the topic because it’s become obvious that exclusive arrangements are harming the introduction of faster broadband into a sizable portion of the market. There are cities where half or more of residents live in apartments and don’t have the same competitive choices as those living in single family homes.

The FCC has an interesting legal challenge in looking at this issue. This docket specifically looks at the potential for regulating broadband access in MTEs, something that the FCC has the authority to do under Title II regulation. But assuming that the FCC moves forward this year with plans to scrap Title II regulation they might also be eliminating their authority to regulate MTEs in the manner suggested by the NOI. If they decide to act on the issue it will be interesting to see how they define their authority to regulate anything that is broadband related. That might be our first glimpse at what a regulatory regime without Title II looks like.

Further, Chairman Ajit Pai has shown a strong preference to lighten the regulations on ISPs and you have to wonder if he is willing to really tackle a new set of regulations. But he’s up against the dilemma faced by all regulators: sometimes the market will not automatically produce the results that are beneficial to society, and sometimes regulations are the only way to get corporations and others to behave in the way that benefits everybody. It’s clear that residents in MTEs have little or no competition and choice, and new regulations might be the only way to get it for them.

The NOI looks at specific issues related to MTE broadband competition:

  • It asks if the FCC should consider overriding state and local regulations that inhibit the deployment of broadband in MTEs. Some jurisdictions have franchising and other rules that make it hard for a smaller competitor to try to serve only MTEs or parts of markets.
  • It asks if the FCC should prohibit exclusive marketing and bulk billing arrangements by ISPs.
  • It asks if the FCC should prohibit revenue sharing and exclusive wiring arrangements with ISPs.
  • It asks if there are other kinds of non-contractual practices that should be prohibited or regulated.

The NOI is interesting in that it tackles all of the topics that the FCC left untouched in 2008. When that order came out I remember thinking about all of the loopholes the FCC had left available to ISPs that wanted to maintain an exclusive arrangement with apartment owners. For example, bulk billing arrangements are where a landlord buys wholesale connections from an ISP and then includes broadband or cable TV as part of the rent, at a mark-up. Landlords under such arrangements are unlikely to allow in another competitor since they are profiting from the exclusive arrangement. The FCC at the time didn’t feel ready to tackle the issues associated with regulating landlord behavior.

The NOI asks for comments on the non-contractual issues that prohibit competition. I’ve seen many such practices in the marketplace. For instance, a landlord may tell tenants that they are pro-competition and that they allow access to multiple ISPs, but then charge exorbitant fees to ISPs for gaining access to buildings or for wanting to collocate electronics or to run wiring. I can think of dozens of different roadblocks that I’ve seen that effectively keep out competitors.

I am heartened a bit by this docket in that it’s the first thing this new FCC has done to solve a problem. Most of the work they’ve done so far is to dismantle old rules to reduce regulation. There is nothing wrong with that in general and I have my own long shopping list of regulations that are out of date or unnecessary. But there are industry issues like this one where regulation is the only way to provide a needed fix to a problem. It’s clear that large ISPs and many landlords have no interest in bringing competition to their buildings. And if that is a goal that the FCC wants to foster, then they are going to have to create the necessary regulations to make it happen – even if they prefer to not regulate.

Means Testing for FCC Funding – Part II

Yesterday I wrote about the recent blog by FCC Commissioners Michael O’Rielly and Mignon Clyburn that suggests that there ought to be a means test for anybody accepting Universal Service Funds. Yesterday I looked at the idea of using reverse auctions for allocating funds – an idea that I think would only serve to shift broadband funds to slower technologies, most likely rural cellular service for broadband. Today I want to look at two other ideas suggested by the blog.

The blog suggests that rural customers ought to pay more for broadband since it costs more to provide broadband in sparsely populated areas. I think the FCC might want to do a little research and look at the actual prices charged today for broadband where commercial companies have built rural broadband networks. It’s something I look at all of the time and all over the country, and from what I can see the small telcos, cooperatives, WISPs and others that serve rural America today already charge more than what households pay for broadband in urban areas – sometimes considerably more. I am sure there are exceptions to this and perhaps the Commissioners have seen some low rural pricing from some providers. But I’ve looked at the prices of hundreds of rural ISPs and have never seen prices below urban rates.

The small rural ISPs have to make a commercial go of their broadband networks and they’ve realized for years that the only way to do that is to charge more. In most urban areas there is a decent broadband option starting around $40 per month and you rarely see a price close to that in rural America. If you see a low price in rural America it probably offers a very slow speed of perhaps a few Mbps, which certainly doesn’t compare to the 60 Mbps I get from Charter for $44.95 per month.

The issue of rural pricing does raise one policy issue. Historically the Universal Service Fund was used for precisely what this blog seems not to like – to hold telephone rates down in rural America so that everybody in the country could afford to be connected. That policy led to the country having telephone penetration rates for decades north of 98%. I’m not advocating that USF funds ought to be used to directly hold down rural broadband rates, but it’s worth a pause to remember that was the original reason that the Universal Service Fund was started and it worked incredibly well.

The second idea raised by the blog is that Universal Service Funds ought not be used to build broadband to wealthy customers. They suggest that perhaps federal funding ought not to be used to bring broadband to “very rich people who happen to live in the more rural portions of our nation.”  The blog worries that poor urban people will be subsidizing ‘some of the wealthiest communities in America.’  I am sure in making that statement that the Commissioners must have a few real-life communities in mind. But I work all over the country and there are not very many pockets of millionaires in rural America, except perhaps for farmers.

Farmers are an interesting case when it comes to broadband. By definition farmers are rural. But US agriculture is the largest industry in the country and the modern farmer needs broadband to be effective. We are headed soon towards a time when farm yields can increase dramatically by use of IoT sensors, farm robots and other high technology that is going to require broadband. I know that a lot of the rural communities that are clamoring for broadband are farming communities – because those farms are the economic engine that drives numerous counties and regions of the country. I don’t think it’s unreasonable if we are going to rethink policy to talk about bringing broadband to our largest industry.

The FCC blog suggests that perhaps wealthier individuals ought to pay for the cost of getting connected to a broadband network. It’s certainly an interesting idea, and there is precedent. Rural electric companies have always charged the cost of construction to connect customers that live too far from their grid. But with that said we also have to remember that rural electric grids were purposefully built to reach as many people as possible, often with the help of federal funding.

This idea isn’t practical for two reasons. It’s already incredibly hard today to finance a fiber network. I picture the practical problem of somehow trying to get commitments from farmers or other wealthy individuals as part of the process of funding and building a broadband network. As somebody who focuses mostly on financing fiber networks this would largely kill funding new networks. To get the primary borrower and all of the ‘rich’ people coordinated in order to close a major financing is something that would drive most lenders away – it’s too complicated to be practically applied. The FCC might want to consult with a few bankers before pushing this idea too far.

But there is a more fundamental issue, and the FCC blog touches upon it. I’m trying to imagine the FCC passing a rule that would require people to disclose their income to some commercial company that wants to build a fiber network. I’m not a lawyer, but that sounds like it would bump against all sorts of constitutional issues, let alone practical ones. For example, can you really picture having to report your income to AT&T? And then I go back to the farmers again. Farmers don’t make a steady income – they have boom years and bust years. Would we put them on or off the hook for contributing towards a fiber network based upon their most recent year of income?

I certainly applaud the Commissioners for thinking outside the box, and that is a good thing when it leads to discussions of ways to improve the funding process. I will be the first to tell you that the current USF distributions are not always sensible and equitable and there is always room for improvement. Some of the ideas suggested by the blog have been discussed in the past and it never hurts to revisit ideas. But what most amazes me about the suggestions made by this blog is that the proposed solutions would require a heavy regulatory hand – and this FCC, or at least its new Chairman, has the goal of reducing regulation. To impose a means test or income test would go in the opposite direction and would require a new layer of intrusive regulations.

Can the States Regulate Internet Privacy?

Since Congress and the FCC have taken steps to remove restrictions on ISPs using customer data, a number of states and even some cities have taken legislative steps to reintroduce some sort of privacy restrictions on ISPs. This is bound to end up in the courts at some point to determine where the authority lies to regulate ISPs.

Congress just voted in March to end restrictions on the ways that ISPs can use customer data, leading to a widespread fear that ISPs could profit from selling customer browsing history. Since then all of the large telcos and cable companies have made public statements that they would not sell customer information in this way, but many of these companies have histories that would indicate otherwise.

Interestingly, a new bill has been introduced in Congress called the BROWSER Act of 2017 that would add back some of the restrictions imposed on ISPs and would also make those restrictions apply to edge providers like Google and Facebook. The bill would give the authority to enforce the privacy rules to the Federal Trade Commission rather than the FCC. The bill was introduced by Rep. Marsha Blackburn who was also one of the architects of the earlier removal of ISP restrictions. This bill doesn’t seem to be getting much traction and there is a lot of speculation that the bill was mostly offered to save face for Congress for taking away ISP privacy restrictions.

Now states have jumped in to fill the void. Interestingly the states looking into this are from both sides of the political spectrum which makes it clear that privacy is an issue that worries everybody. Here is a summary of a few of the state legislative efforts:

Connecticut. The proposed law would require consumer buy-in before any “telecommunication company, certified telecommunications provider, certified competitive video service provider or Internet service provider” could profit from selling such data.

Illinois. The privacy measures proposed would allow consumers to be able to ask what information about them is being shared. The bills would also require customer approval before apps can track and record location information on cellphones.

Massachusetts. The proposed legislation would require customer buy-in for sharing private information. It would also prohibit ISPs from charging more to customers who don’t want to share their personal information (something AT&T has done with their fiber product).

Minnesota. The proposed law would stop ISPs from even recording and saving customer information without their approval.

Montana. The proposed law there would prohibit any ISPs that share customer data from getting any state contracts.

New York. The proposed law would prohibit ISPs from sharing customer information without customer buy-in.

Washington. One proposed bill would require written permission from customers to share their data. The bill would also prohibit ISPs from denying service to customers that don’t want to share their private information.

Wisconsin. The proposed bill essentially requires the same restrictions on privacy that were included in the repealed FCC rules.

This has even made it down to the City level. For example, Seattle just issued new rules for the three cable providers with a city franchise telling them not to collect or sell customer data without explicit customer permission or else face losing their franchise.

A lot of these laws will not pass this year since the new laws were introduced late into the legislative sessions for most states. But it’s clear from the laws that have been proposed that this is a topic with significant bipartisan support. One would expect a lot of laws to be introduced and enacted in legislative sessions that will occur later this year or early next year.

There is no doubt that at some point this is going to result in lawsuits to resolve the conflict between federal and state rules. An issue of this magnitude will almost certainly end up at the Supreme Court at some point. But as we have seen in the past, during these kinds of legislative and legal fights the status of any rules is muddy. And that generally means that ISPs are likely to continue with the status quo until the laws become clear. That likely means that ISPs won’t openly be selling customer data for a few years, although one would think that the large ones have already been collecting data for future use.