Lifeline on the Line?

There are two bills currently in Congress worth monitoring:

Rural Spectrum. The first is the Advancing Innovation and Reinvigorating Widespread Access to Viable Electromagnetic Spectrum (AIRWAVES) Act (S. 1682) in the Senate. Introduced by Senators Cory Gardner (R-CO) and Maggie Hassan (D-NH), it aims to encourage the FCC to continue to free up spectrum for both commercial and unlicensed use.

There are still significant chunks of spectrum restricted to government use, and other blocks that are underutilized due to restrictions the FCC has placed on their usage in the past. The FCC needs to keep looking hard at every viable slice of spectrum to make sure each is being put to its best use.

The one feature of this bill that bothers me is that it would set aside 10% of the proceeds from any future spectrum auction to directly fund wireless broadband infrastructure in rural areas. The goal of freeing more spectrum for rural broadband is obviously a good one, particularly freeing more unlicensed spectrum. But as we’ve seen from the CAF II program, handing billions of dollars to a company like AT&T to beef up cellular towers does not automatically equate to better rural broadband. The federal government needs to stop subsidizing AT&T and Verizon and instead aim any federal funding at getting real rural broadband. The 10% set-aside is nothing more than a subsidy for these giant companies.

Lifeline Reform. A group of nineteen Republicans in Congress recently introduced a bill that would curtail the federal Lifeline program. That program is part of the Universal Service Fund and provides a $9.25 monthly telecommunications subsidy for lower-income homes, which today can be used for landline telephone, cellular telephone or landline broadband, with only one subsidy allowed per household.

The legislation would eliminate the Lifeline subsidy for cellular service. The wording is not clear, but it seems to also eliminate the use of the subsidy for broadband. The discussion of the bill in the press makes it sound like the intent is to restore the program to its original goal of subsidizing only landline telephones. With the diminished interest in landlines that original goal now seems largely out of touch.

If enacted this would significantly reduce the payments from the Lifeline fund, and the most troublesome aspect of the bill is that it would then send the excess Lifeline dollars to the US Treasury. The entire Universal Service Fund is funded by monthly surcharges on interstate telecom services. Every telephone subscriber, cellular subscriber and interstate transport customer currently pays these fees. The current surcharge is a whopping 17.1% added to interstate telecom services – far higher than when the fund first started. The surcharge is now so high that it pushes service providers to engage in arbitrage and define services as something other than interstate – in effect to cheat and lie about what they are selling.

The FCC and Congress have always worked hard to avoid defining the USF surcharge as a tax – instead it’s defined as a ‘fee’ with the purpose of funding the same kinds of telecom services in rural America that are available in urban America. But if Congress raids the USF fund it will have made it clear that the surcharge is just another federal tax. If Congress is going to reduce the outflow from the USF fund then it needs to reduce the monthly fees as well.

It’s been obvious for years that the USF funding base needs to be expanded, with the logical expansion being to add the fee to broadband services. If that were done the fee would be reduced to a tiny percentage of the cost of broadband instead of the 17.1% surcharge on interstate telecom services. But for reasons I can’t understand it still seems to be off-limits to put a tax on broadband services, even though broadband is now the primary product sold by the industry.
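
The arithmetic behind that claim is straightforward. Here is a minimal sketch of the base-broadening math; the dollar figures (the annual USF outlay and nationwide broadband revenue) are my own assumptions purely for illustration, not actual USF data:

```python
# A minimal sketch of the base-broadening arithmetic. Every dollar figure
# here is an assumption for illustration, not actual USF data.

usf_annual_need = 9e9                        # assumed annual USF outlay
interstate_base = usf_annual_need / 0.171    # base implied by a 17.1% surcharge

broadband_revenue = 150e9                    # hypothetical nationwide broadband revenue
expanded_base = interstate_base + broadband_revenue

print(f"Surcharge on interstate-only base: {usf_annual_need / interstate_base:.1%}")
print(f"Surcharge on the expanded base:    {usf_annual_need / expanded_base:.1%}")
```

However the real numbers shake out, the principle holds: the same dollar need spread over a much bigger revenue base yields a far smaller rate.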

It offends me as both a taxpayer and a telecom industry guy that the payments made into the Universal Service Fund may now just become another hidden federal tax. The USF has done a lot of good over the years to bring better telecom and broadband to rural America. If this funding is not going to be used for that mission then the fee charged to customers needs to be reduced.

I think recent bad press might have prompted this bill. It’s been reported that there is fraud in some of the cellular Lifeline programs. But the way to clean that up is to cut off the offending service providers, not to penalize people for whom Lifeline is their only source of Internet connectivity. Studies have shown that for most people in the cellular Lifeline program the cellphone is their only connection to the Internet. These subsidies are not being used to buy expensive iPhones, as the press sometimes insinuates; the participating carriers supply inexpensive basic phones that come with a small capped amount of voice minutes and data. A lot of the subsidized phones go to the homeless and other marginalized parts of society, and the Lifeline phones provide them with a connection to services that would otherwise be out of their reach. I just can’t see the logic behind keeping the subsidy for landlines but not cellphones.

A New FCC Definition of Broadband?

Section 706 of the Telecommunications Act of 1996 requires that the FCC annually review broadband availability in the country. Further, that section of the law requires the FCC to take immediate action if it finds that broadband is not being deployed fast enough. This is the law that in the past prompted the FCC to set a definition of broadband – first set at 4/1 Mbps a decade ago and then updated to 25/3 Mbps in 2015. The FCC felt it couldn’t measure broadband deployment without a benchmark.

In this year’s annual proceeding the FCC has suggested a change in the definition of broadband. It is proposing a minimum benchmark of 10/1 Mbps to define cellular broadband. On its face that doesn’t sound like a bad idea, since almost everybody uses cellular broadband at times and it would be good to know that the cellular companies have a speed target to shoot for.

But I am alarmed at how the FCC wants to use the proposed cellular broadband standard. It is suggesting that cellular service meeting the 10/1 Mbps standard can be considered a substitute for a landline connection meeting the 25/3 Mbps test. This would represent a huge policy shift at the FCC, because using the cellular standard would allow it to claim that most Americans can get broadband – and that would relieve it of having to take any action to make broadband better in the country.
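
To make the stakes concrete, here is a minimal sketch of how counting 10/1 Mbps cellular as a substitute inflates the count of ‘served’ households. The household speeds below are hypothetical, purely for illustration:

```python
# A sketch of how the proposed substitution changes "served" counts.
# The household data below is hypothetical, purely for illustration.

LANDLINE_BENCHMARK = (25, 3)   # 25/3 Mbps, the 2015 landline definition
CELLULAR_BENCHMARK = (10, 1)   # 10/1 Mbps, the proposed cellular benchmark

def meets(speed, benchmark):
    down, up = speed
    bench_down, bench_up = benchmark
    return down >= bench_down and up >= bench_up

# (best landline speed, best cellular speed) for three hypothetical households
households = [((30, 5), (12, 2)),   # has real landline broadband
              ((6, 1),  (12, 2)),   # DSL too slow, but 10/1 cellular available
              ((0, 0),  (4, 1))]    # neither option meets any benchmark

served_old = sum(meets(ll, LANDLINE_BENCHMARK) for ll, _ in households)
served_new = sum(meets(ll, LANDLINE_BENCHMARK) or meets(cell, CELLULAR_BENCHMARK)
                 for ll, cell in households)

print(f"Served under landline-only test: {served_old} of {len(households)}")
print(f"Served if cellular substitutes:  {served_new} of {len(households)}")
```

The second household gets reclassified as ‘served’ without its actual broadband options changing at all – which is exactly the policy sleight of hand that worries me.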

We can’t be particularly surprised by this shift in policy, because now-Chairman Ajit Pai vociferously objected when the FCC increased the definition of broadband to 25/3 Mbps in January 2015. He argued at the time that the speed definition should not be increased and that both satellite and cellular broadband ought to be considered substitutes for landline broadband.

But as almost anybody with a broadband connection can tell you, speed is not the only parameter that matters. Speed matters for folks in a busy broadband home like mine when different family members are trying to make simultaneous broadband connections. But even homes with lower broadband needs care about more than speed. The limiting factor with cellular data is the stingy amount of total downloads allowed in a month – even the new ‘unlimited’ cellular plans are capped at 20 to 25 gigabytes per month. And satellite data not only has stingy data caps but also suffers from latency issues that mean a satellite customer can’t take part in real-time activities on the web such as VoIP, distance learning or live streaming video.
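
Some quick arithmetic shows why a 20 to 25 GB cap doesn’t go very far. The streaming rate below is an assumption on my part, roughly what an HD video stream consumes:

```python
# Back-of-the-envelope: how long a 25 GB monthly cap lasts for video.
# The 5 Mbps streaming rate is an assumption, roughly HD quality.

cap_gb = 25
stream_mbps = 5

gb_per_hour = stream_mbps * 3600 / 8 / 1000   # megabits/sec -> GB per hour
hours_until_cap = cap_gb / gb_per_hour

print(f"One {stream_mbps} Mbps stream uses {gb_per_hour:.2f} GB per hour")
print(f"A {cap_gb} GB cap is gone after about {hours_until_cap:.0f} hours of video")
```

That works out to roughly eleven hours of video for the whole month – a few evenings of use for a typical household.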

There are several possible motives for this policy shift. First, this could just be an attempt by the FCC to take the pressure off of having to promote faster broadband everywhere. If its annual Section 706 examination concludes that most people in the country have broadband, then it doesn’t have to push expensive federal programs to expand broadband coverage. But there is also the potential that this was prompted by the cellular companies, which want even more federal money to expand their rural cellular networks. AT&T has already been given billions in the CAF II proceeding largely to improve rural cellular towers.

Regardless of the motivation this would be a terrible policy shift. It would directly harm two huge groups of people – rural America and the many urban pockets without good broadband. This ruling would immediately mean that all urban areas, along with a lot of rural America, would be considered to have broadband today.

I don’t think this FCC has any concept of what it’s like living in rural America. There are already millions of households that use cellular or satellite broadband. I’ve heard countless stories from households with schoolkids who spend upwards of $500 per month for cellular broadband – and even at that price these homes closely monitor and curtail their usage.
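
A bit of hypothetical arithmetic shows why those bills are so punishing. Every figure below is an assumption of mine, used only to illustrate the per-gigabyte gap between cellular and landline broadband:

```python
# Rough arithmetic on the $500-per-month anecdote. All of these figures
# are assumptions, used only to show the per-gigabyte gap.

cellular_bill = 500     # assumed monthly spend from the anecdote
cellular_gb = 100       # assumed total data across the family's plans

landline_bill = 60      # assumed price of a typical landline connection
landline_gb = 250       # assumed monthly usage for a broadband household

print(f"Cellular: ${cellular_bill / cellular_gb:.2f} per GB")
print(f"Landline: ${landline_bill / landline_gb:.2f} per GB")
```

Under these assumptions a rural family pays on the order of twenty times more per gigabyte than an urban household with a landline connection.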

There are also huge swaths of rural America that barely have cellular voice service, let alone 10/1 Mbps cellular broadband. I was recently in north-central Washington state and drove for over an hour with zero AT&T cell coverage. And even where there is cellular voice service, the quality of broadband diminishes with distance from a cell tower. People living close to a tower might get okay cellular data speeds, but those even just a few miles away get greatly diminished broadband.

I know that Chairman Pai has two kids at home in Arlington, Virginia. There he surely has fast broadband available from Comcast, and if he’s lucky he also has a second fast alternative in Verizon FiOS. Before the Chairman decides that cellular broadband ought to be a substitute for a landline connection, I would challenge him to cut off his home broadband connection and use only cellular service for a few months. That would give him a taste of what it’s like living in rural America.

Broken Promises by Big ISPs

One of the most frustrating things for regulators has to be when giant ISPs renege on regulatory deals they’ve negotiated and don’t follow through on their promises. Books could be written listing all of the times big ISPs have promised to do something and then never done it.

I was reminded of one such deal when I read that New York City is suing Verizon over its broken promise to bring FiOS fiber to the city. The lawsuit states that almost a million households are still unable to get FiOS, although the company promised full coverage when it got a franchise from the city in 2008. In that agreement Verizon promised to bring fiber service to the whole city by 2014 – specifically, to bring fiber, in conduit, directly in front of, behind, or otherwise adjacent to every residential building in the city.

Verizon had a similar longstanding dispute with the State of Pennsylvania. Back in 2002 the company promised to bring DSL coverage to 80% of the state as a prerequisite for being relieved of a lot of state regulatory oversight. But Verizon never completed many of the needed upgrades, and huge parts of rural Pennsylvania still didn’t have DSL a decade later.

I wrote a blog a few months back about Charter in New York. There the state had found that the cable modems deployed by the company were not technically capable of delivering anything close to the speeds that the company was advertising. Charter agreed to fix the problem, but five years later had made almost no upgrades and was recently sued by the State.

I could list examples all day long; there have been disputes all across the country with major telcos and cable companies that made deals with regulators and then either ignored the agreements or implemented them in a half-hearted manner.

The problem is that there are no regulatory penalties big enough to deter an ISP from breaking its promises. Fines have been levied, but they are never nearly as big as the profits or savings the ISPs realize by ignoring their agreements with regulators. For example, it’s unlikely that lawsuits or penalties will be able to force Verizon to finish the FiOS build in New York City. I am sure the company built to the parts of NYC that made economic sense and decided, for whatever reason, that there is not sufficient payback to justify building to the remaining parts of the city.

And that’s what regulators fail to recognize – big ISPs make decisions based upon the anticipated return for stockholders. I think it’s likely that in many of these cases the big ISPs had no intention of complying with their agreements from the start. The cynical side of me says that they are often willing to take the upsides of these kinds of deals – be that decreased regulation or the ability to complete a merger – while knowing up front that they are unlikely to ever complete what they have agreed to do.

I think we are likely to see another round of broken promises in a few years as we approach the end of the FCC’s CAF II program. The big telcos accepted over $9 billion over six years to improve rural broadband to speeds of at least 10 Mbps. I’ve been getting feedback from a lot of areas of the country that those deployments seem to be behind schedule. It will come as no surprise if one or more of the big telcos spends the CAF II funding without bringing broadband to the promised households, or delivers speeds under the promised levels. The FCC recently issued a warning telling the carriers that it expects them to fulfill their CAF II commitments – and I suspect that warning is due to the same kind of rumblings I’ve been hearing.

But ultimately the FCC doesn’t really have any way to make these telcos complete the builds. They might withhold future funding from the telcos, but as the FCC keeps eliminating regulation it is going to have very little ability to enforce the original CAF II agreements or to take any steps to really penalize the telcos.

The saddest part of these various broken promises is that millions of real people get hurt. It’s been reported that there are significant pockets of residents in urban areas like New York City that still don’t have even one broadband provider. There are huge rural swaths of the country that are desperate for any kind of broadband – which is what CAF II is supposed to deliver for the first time. But we need to be realistic: big ISPs often do not keep their promises, whether deliberately or not. And perhaps it’s finally time to stop making these big deals with companies that have a history of breaking them.

Regulating Edge Providers

The year is only half over and already it seems like this might be the most interesting year for regulations we’ve had in my lifetime. It seems like a lot of the telecom regulations we’ve lived with for decades are being reconsidered and that nothing is guaranteed to stay the same.

Perhaps the most novel new idea I’ve heard comes from Steve Bannon in the White House. He believes that Google and Facebook have become so dominant that they should be regulated as utilities. He envisions this being done in much the same manner as is done with telephone and cable companies.

It’s not an entirely novel concept – the European Union has kicked around ideas for curbing the power of big software companies like Microsoft, Google and Facebook. I find the concept a little strange coming out of this administration, since it seems to be largely anti-regulation and intent on lowering regulations for both telephone and cable companies. Trying to regulate these companies would have to mean a lot of new regulations.

The first question that popped into my head when I heard this was to wonder what a government would actually regulate at these companies. The European Union went after Google in 2016 for requiring that cellphones default to the Google search engine and the Chrome browser. In 2015 it objected that Google used its market power to insist that cellphones use the Android operating system. But these kinds of issues are about abuse of monopoly power, and there are already rules in the US that can tackle them, should the government care to do so. I don’t think this is what Bannon has in mind.

It seems like it would be a real challenge to regulate the main business lines of the two companies. You can’t regulate prices because Google and Facebook are free to users – they don’t directly sell anything to their users on their core platforms. If these companies are large it’s because they have platforms that a lot of people want to use. People have plenty of alternate social media platforms and search engines to choose from. They seem to use these two companies because they offer something people want – and I really can’t imagine how you regulate that.

It’s also hard to envision a single country effectively regulating these entities. We already know what that looks like by seeing how these companies operate in China. Probably lesser known is that there are many other countries where the companies offer something different than what we see in the US. My guess is that regulation wouldn’t fundamentally change these companies – but it could force them to modify their public face, something their many users would probably strongly resent.

I think perhaps the best argument against regulating these two companies is that there is no guarantee that they will maintain their current market dominance, or even survive as companies over the long haul.

The online world has proven to be fickle and people’s collective tastes change over time. Already US teenagers have largely bailed on Facebook and view it as a platform for their parents and grandparents. I know my daughter only maintains a presence on the platform to communicate with older relatives and communicates with her friends elsewhere. Facebook may have over a billion users today, but there is no guarantee that something better won’t come along within a few decades and strip away a lot of that market power.

Google faces an even bigger long-term problem. Google relies on people making searches on computers and cellphones, and a lot of tech experts predict that search engines will be passé within a few decades. They predict that people will instead talk directly to an AI-based personal assistant to perform most of the tasks that cellphones do today.

Both Google and Facebook make most of their money today from advertising. But in a future world where everybody communicates through a smart personal assistant, the direct interface between people and web platforms like Google or Facebook might nearly disappear. The advertising value of the Google search engine will evaporate if your smart personal assistant is making choices for you based upon your preferences. In an AI-driven future both search engines and social media are likely to be replaced by something drastically different.

The conclusion I reach is that government is not really in a position to regulate the ever-changing world of the big edge providers. Facebook or Google may hold a dominant position in their market niches today but could be in a very different place in a decade. Just go back and make a list of the big technology players of twenty years ago. It would have been a waste of time to regulate AOL, CompuServe or the other platforms that dominated the web then – those companies were supplanted by something people found more valuable. Regulation, by definition, assumes a predictable world, something that is unlikely in the edge provider space.

Local, State or Federal Regulation?

Last week the FCC clarified its intentions for the Broadband Deployment Advisory Committee (BDAC). This group was tasked with exploring a wide range of topics with the goal of finding ways to lower barriers for broadband deployment.

The BDAC was divided into subgroups with each examining issues such as speeding up access to poles and conduits, or how to streamline the morass of local regulations of such things as rights-of-ways that can slow down fiber deployment.

There has been a huge amount of buzz in the industry, since the expectation has been that the FCC would act to impose federal rules that ‘fix’ some of the most important impediments to competition. That expectation was bolstered on several occasions by speeches from new FCC Chairman Ajit Pai hinting that the FCC was willing to take steps to lower barriers to broadband deployment.

But FCC Senior Counsel Nicholas Degani just clarified that the FCC’s intention is not to create new regulations, but rather to create ‘model codes’ that it hopes cities and states around the country will use to make it easier to deploy broadband.

We’ll have to wait a while to see if the FCC really can refrain from issuing new regulations. Chairman Pai has said many times that he favors ‘light touch’ regulation, and the agency is in the process of relaxing or undoing many regulations from the past. But one thing I have repeatedly seen from regulators over the years is that they love to regulate. It will take major restraint for the FCC not to try to ‘fix’ the many problems that the BDAC is highlighting. This will be the ultimate test of whether they really are anti-regulation.

Frankly, some of the issues that the BDAC has been exploring cry out for some sort of regulatory relief. For example, in some parts of the country it takes so long and is so expensive to get onto poles that it’s nearly impossible to implement a business plan that needs pole access. And it is extremely frustrating for a national company that deploys fiber everywhere to work with local rules that vary widely from city to city.

Part of what is pushing this effort is the fact that everybody expects a massive investment in new fiber over the next decade, as fiber is built to bring bandwidth to homes and as we deploy 5G networks. Everybody recognizes that there are impediments that add delay and cost to those deployments.

At the same time that the FCC has been looking at these issues, there are numerous state attempts to create regulatory rules to fix some of these problems. A number of states have already created regulations aimed at making it easier to do things like get access to poles. But state efforts vary widely in their motivation. There are some states looking hard at imposing statewide rules that balance the needs of competitors, network owners and municipalities.

But there are other attempts, prompted by the big cellular companies and ISPs, that would run roughshod over the rights of pole owners and municipalities. These efforts are driven in part by model legislation developed by ALEC and funded by the big companies. Many of these rules attempt to set low nationwide rates for pole attachments and to force streamlined timelines that ignore local conditions.

Finally, there are efforts being made by many cities to make it easier to deploy broadband. Most cities understand that they need fiber everywhere to remain competitive with other cities. Yet these efforts are often ineffective because cities, by definition, have a lot of stakeholders to satisfy. When a city looks at changing local rules it ends up having to give a lot of weight to issues such as the environment, aesthetics, historic preservation, safety and unions – which makes it nearly impossible to create rules that favor fiber deployment over these other concerns.

Fixing these issues is a problem that may never find the right solution. We live in a country where cities have been granted varying degrees of control over things like rights-of-way that affect network deployments. Fiber deployment is not the first issue that has pitted federal, state and local regulators against each other while trying to solve the same problems. It’s likely that if either the FCC or the states try to strongarm cities we will see a pile of lawsuits challenging any egregious decisions. And that just leads to delays, since disputed laws don’t go into effect. I hope we can find solutions that don’t lead to those lawsuits, because the worst kind of regulation is one that sits in limbo in some court for years. Nobody is likely to make any significant new investment in that environment.

The Consequences of Killing Network Neutrality

It looks almost certain that the FCC is going to kill Title II regulation, and with it net neutrality. Just as happened the last go-around, the FCC has already received millions of comments asking it not to kill net neutrality. And if you read the press you find dire predictions of the consequences that will result from the death of net neutrality. But as somebody with a decent understanding of how broadband and the associated money flow in the industry, I don’t think it will be as dire as critics predict – and I think there will also be unanticipated consequences.

Impact on Start-ups – the Cost of Access. One of the dire predictions is that a new start-up that uses a lot of broadband – the next Netflix, Vine or Snapchat – won’t be able to gain the needed access from carriers, or that their access will be too expensive. Let me examine that conjecture:

  • Let me follow the flow of money that a start-up needs to spend to be on the web. Their largest direct cost is the cost of uploading their content onto the web through an ISP. The pricing for bulk access has always favored the bigger players – per gigabyte, it’s more expensive today for a company that uploads a gigabyte per day than for somebody that uploads a terabyte.
  • The normal web service doesn’t pay anything to then deliver its content to customers. Customers buy various speeds of download and use the product at will. Interestingly, it’s only the largest content providers that might run into issues without net neutrality. The big fights a few years ago on this issue were between Netflix and the largest ISPs. The Netflix volumes had grown so gigantic that the big ISPs wanted Netflix to contribute to the big cost of the electronics the ISPs were deploying to distribute the service. The only way there would be some cost for start-ups to terminate content would be if the ISPs somehow created a new access fee to get onto their networks. But that sounds largely impractical – bytes are bytes, and they don’t exactly carry the name and billing address of the party that dumped the traffic on the web.
  • Some content, like live video, is a complicated web product. You can’t just dump it on the web at one location in the country and hope it maintains quality everywhere it ends up. There are already companies that act as intermediaries for streaming video, carrying out the caching and other functions needed to maintain video quality. Even big content providers like SlingTV don’t tackle this alone.
  • Finally, new vendors will arise to assist start-ups by aggregating their traffic with others. We already see that today with Amazon, which bundles the content of over 90 content providers on its video platform. The content providers benefit by taking advantage of the delivery mechanisms Amazon has in place. This is obviously working, and it’s hard to see how the end of net neutrality would stop somebody like Amazon from being a super-bundler. I think wholesalers like Amazon would fill the market gap for start-ups.

Paid Prioritization. The other big worry voiced by fans of Title II regulation is that it stops paid prioritization, or Internet fast lanes. There are both good and bad possible consequences of that.

  • It’s silly to pretend that we don’t already have significant paid prioritization – it’s called peering. The biggest content providers like Google, Netflix and Amazon have negotiated peering arrangements where they deliver traffic directly to ISPs in specific markets. The main benefit for the content providers is reduced latency and delay, but it also saves them from buying normal uploads into the open Internet. For example, instead of dumping content aimed at Comcast in Chicago onto the open web, these big companies deliver the Chicago-bound traffic directly to Comcast. These arrangements save money for both parties. And they are very much paid prioritization, since smaller content providers have to route through the major Internet POPs instead.
  • On the customer side of the network, I can envision ISPs offering paid prioritization as a product to customers. Customer A may choose to have traffic from a medical monitoring company always get priority, customer B might choose a gaming service and customer C might choose a VoIP connection. People have never had the option of choosing which broadband connections they value most, and I could see this being popular – if it really works.
  • And that leads into the last big concern. The big fear about paid prioritization is that any service that doesn’t have priority is going to suffer in quality. But will that really happen? I have a fairly good broadband connection at 60 Mbps. That connection can already deliver a lot of different things at the same time. Let’s say that Netflix decided to pay my ISP extra to get guaranteed priority to my house. That might improve my Netflix reception, although it already seems pretty good. But on my 60 Mbps connection would any other service really suffer if Netflix has priority? From what I understand about the routing of Internet traffic, any delays caused by such prioritization would be minuscule – on the order of microseconds to fractions of a millisecond, which would be imperceptible to me (see the rough arithmetic after this list). I can already crash my Internet connection today if I try to download more content than it can handle at once. But as long as a customer isn’t doing that, I have a hard time seeing how prioritization will cause much of a problem – or even why somebody like Netflix would pay an ISP extra for it. They are already ensuring a quality connection through peering and other network arrangements, and I have a hard time understanding how anything at the customer end of the transaction would make much difference. This could matter for those on slow broadband connections – but their primary problem is lack of speed, and they are already easily overwhelmed by too much simultaneous traffic.
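
Here is the rough arithmetic I have in mind, as a minimal sketch. The packet size and queue depths are assumptions for illustration:

```python
# Rough arithmetic behind the claim that prioritization delays are tiny.
# Packet size and link speed are assumptions for illustration.

link_bps = 60 * 1_000_000     # a 60 Mbps connection
packet_bits = 1500 * 8        # a typical full-size Ethernet frame

serialization_s = packet_bits / link_bps
print(f"One packet occupies the link for {serialization_s * 1e6:.0f} microseconds")

# If a prioritized flow jumps a few packets ahead of yours, the added wait
# is just a few of these intervals -- around a millisecond or less.
for queued in (1, 3, 5):
    print(f"Behind {queued} prioritized packet(s): {queued * serialization_s * 1e3:.1f} ms")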

I am not as fearful of the end of net neutrality as many because I think the Internet operates differently than most people imagine. I truly have a hard time seeing how ending net neutrality will really change the way I receive broadband at my home. However, I do have big concerns about the end of Title II regulation, and I fear things like data caps and my ISP using my personal information. I think most folks’ real concern is actually Title II regulation, but that’s too esoteric a topic, so we all seem to be using the term ‘network neutrality’ as a substitute for it.

FCC Reverses 2016 Privacy Ruling

The FCC adopted an order formally recognizing that the privacy rules passed by the Tom Wheeler FCC are cancelled and that the FCC will revert to the privacy rules that were in effect before. The action is mostly a clarification, because Congress already passed H.J. Res 34, a Congressional Review Act resolution that nullified the actions of the last FCC.

This action means a number of things. For regulated telephone providers (both LECs and CLECs) it means that all of the previous rules generally referred to as the Customer Proprietary Network Information (CPNI) rules are back in effect. Those rules are codified in FCC Rules Section 64.2009(e) and (c), and they include:

  • An obligation to not disclose telephone customer data without permission from the customer.
  • An annual certification demonstrating compliance with the CPNI rules. This filing will again be due next year, no later than March 1.
  • Compliance with various recordkeeping rules that would demonstrate compliance should a carrier ever be audited.

The FCC also reminded non-regulated ISPs that while they are not directly subject to the CPNI rules, they are still subject to Section 222 of the Communications Act, which says that all carriers must take reasonable and good faith steps to protect customer privacy.

The rules passed by the last FCC would have brought all ISPs under the same regulations as the telcos. And in doing so the rules went further than in the past and required that any service provider get customer buy-in before using their data. Customers were to be given the option to allow ISPs to use their data for any purpose, to allow ISPs to use data just for marketing to the customer, or to opt out and choose full privacy.

One of the big public fears voiced in opposition to the congressional action that reversed the privacy rules is that ISPs are now free to use customer information in any manner and could even go so far as to ‘sell the browsing history’ of customers on the open market. If ISPs misuse customer broadband data in too egregious a manner, I guess we’ll have to wait for a specific complaint under the Section 222 rules to see what level of protection customers actually have.

All of the big ISPs have come out and said that they would never sell customer browsing data, and even under the older rules still in place, directly selling specific customer data might well be illegal.

But we know that the big ISPs have all made plans to monetize customer data, and many of them have been doing so for years. The most likely use of customer data will be for the biggest ISPs to engage in the same kind of advertising being done by Google and Facebook. The social media companies have built detailed profiles of their customers, something that advertisers find valuable. But the ISPs have a big advantage over the social media companies in that they know a lot more about customers, including all of the web searches they make and all of the web sites they visit. The big ISPs all have branches of their business focusing on this kind of advertising, and even a smaller one, Altice, recently purchased a company that creates targeted advertising based upon customer profiles.

There was an article in Forbes earlier this year by Thomas Fox-Brewster speculating that targeted advertising is what the ISPs really want. They look at the gigantic revenues being earned by Google and Facebook and want a piece of that action. He doesn’t believe that the ISPs will directly sell data, which might invite retaliation from future regulators. But he does speculate that over time customer information from the ISPs will leak into the public through the companies that use their data for targeted advertising. The web advertisers are not bound by any legal restrictions on using purchased data, and over time, as they run a series of ad campaigns, they could effectively build pretty detailed customer profiles.

Certainly this is of concern to many people. People are free to avoid services like Facebook or Google if they want to maintain privacy, but it takes a lot of effort to hide from their ISP. And while ISPs are probably never going to directly market a database of a given customer’s browsing history, as they use our data for advertising they are going to be giving away bits and pieces about each of us that over time can be reassembled into incredibly detailed profiles. Folks who are savvy and concerned about this will thwart the ISPs as much as possible through the use of VPNs and other tools that hide their web activity. But it’s likely that most people won’t do this, and I would expect the ISPs to pop onto the radar in a big way as advertisers over the next few years.

False Advertising of Broadband Speeds

There is another new and interesting regulatory battle happening at the FCC. The lobbying groups that represent the telcos and cable companies – NCTA, USTelecom and the ACA – are asking the FCC to make it harder for states to sue ISPs for making misleading claims about broadband speeds.

This request was brought about by a lawsuit by the State of New York against Charter Spectrum. I wrote about that case in a March blog and won’t repeat all of the particulars. Probably the biggest reason for the suit was that Charter had not made the major upgrades and improvements it had promised the state. But among the many complaints, the one that worried the other ISPs the most was that Charter was not delivering the speeds it was advertising.

This is an issue that affects many ISPs, except perhaps some fiber networks that routinely deliver the speeds they advertise. Customer download speeds can vary for numerous reasons, the primary one being that networks bog down under heavy demand. But the New York complaint against Charter was not about data speeds that slow down in the evenings; rather, the complaint was that Charter advertised and sold data products that were not capable of ever reaching the advertised speeds.

Charter is perhaps the best poster child for this issue, and not just because of the New York case. On their national website they advertise only a speed of 60 Mbps download, with the caveat that this is an ‘up to’ speed. I happen to have Charter in Asheville, NC, and much of the time I am getting decent speeds at or near the advertised speed. But I work all over the country and I am aware of a number of rural Charter markets where the delivered speeds are far below that advertised 60 Mbps. Those markets look like the situation in New York, where the state accuses Charter of false advertising.
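
To illustrate the distinction the New York suit drew, here is a small sketch contrasting a network that merely bogs down at peak times with one whose gear can never reach the advertised speed. All of the test results below are hypothetical:

```python
# A minimal sketch of the distinction at issue: speeds that sag at peak
# hours versus gear that can never reach the advertised speed.
# All of the test results below are hypothetical.

advertised = 60  # Mbps, the 'up to' speed

congested = {"3am": 59, "noon": 57, "8pm": 31}   # capable, but bogs down
incapable = {"3am": 24, "noon": 23, "8pm": 15}   # never comes close

for label, tests in (("Congestion only", congested), ("Incapable gear", incapable)):
    best = max(tests.values())
    print(f"{label}: best observed {best} Mbps ({best / advertised:.0%} of advertised)")
```

In the first pattern the product at least demonstrably reaches the advertised speed off-peak; in the second, even the best observed test never comes close – which is what New York alleged.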

The filings ask the FCC to clarify that what Charter is doing is okay and that ISPs ought to be exempt from these kinds of suits. They argue that the ISP industry has always sold ‘up to’ speeds and that the practice fits under existing FCC regulations. And this is where it gets murky.

In its first attempt to introduce net neutrality the FCC ordered ISPs to disclose a lot of information to customers about their broadband products, including telling them the real speeds they could expect from their purchased broadband product. Much of that first net neutrality decision was struck down by a successful lawsuit from Verizon, which claimed that the FCC didn’t have the authority to regulate broadband in the way that order laid out.

But not all of that first order was reversed by the lawsuit – the provision that ISPs must disclose information about their network performance, fees, data caps, etc. survived. Since most of the original net neutrality order was reversed, though, the FCC put implementation of the remaining sections on hold. Last year the FCC finally implemented a watered-down version of the original rules, and in February of this year it excused smaller ISPs from having to make the customer disclosures. The large ISPs, however, are now required to report specific kinds of information to customers. The ISPs interpret the current FCC rules to mean that selling ‘up to’ data products is acceptable.

Where this gets really interesting from a regulatory perspective is that the FCC might not have the authority to deal with these requests much longer. The bulk of the FCC’s authority to regulate broadband (and thus to potentially shield the ISPs in this case, if they are complying with FCC regulations) comes from Title II regulation.

But the FCC seems likely to relinquish Title II authority and they have suggested that the authority to regulate ISP products should shift to the Federal Trade Commission. Unfortunately for the ISPs, the FTC has more often sided with consumers over big companies.

Since the FCC is in the process of eliminating Title II authority I wonder if it will even respond to the ISPs. To clarify that advertising ‘up to’ products is acceptable under Title II would essentially mean creating a new broadband regulation, something this FCC seems loath to do. I’ve seen several other topics recently that fall into this same no-man’s land – issues that seem to require Title II authority for the FCC to have jurisdiction. As much as the big ISPs complained about Title II, one has to wonder if they really want it to go away. They mostly feared the FCC using Title II to address pricing, but there are a lot of other issues, like this request, where broadband regulation by the FCC might work in the ISPs’ favor.

Tackling Pole Attachment Issues

In January new FCC Chairman Ajit Pai announced the formation of a new federal advisory committee – the Broadband Deployment Advisory Committee (BDAC). This group has broken into sub-groups to examine various ways that the deployment of broadband could be made easier.

I spoke last week to the Sub-Committee for Competitive Access to Broadband Infrastructure, i.e. poles and conduits. This group might have the hardest task of all, because getting access to poles remains one of the most challenging parts of launching a new broadband network. Most of the issues raised by a panel of experts at the latest meeting of this committee are nearly the same issues that have been discussed since the Telecommunications Act of 1996 gave telecom competitors access to this infrastructure.

Here are some of the issues that still make it difficult for anybody to get onto poles. Each is a short synopsis; pages could be written about the detailed specifics involved in each of these topics:

Paperwork and Processes. It can be excruciatingly slow for a fiber overbuilder to get onto poles, and time is money. Processes and paperwork are thrown at a new attacher that often seem designed for no other reason than to slow things down. This is further exacerbated when the pole owner (such as AT&T) is going to compete with the new attacher, giving the owner an incentive to slow-roll the process, as has happened in several cities with Google Fiber.

Cooperation Among Parties. Even when the paperwork isn’t a barrier, one of the biggest delays in getting onto poles can be the requirement to coordinate with all of the existing attachers on a given pole. If the new work requires changes to existing attachments, those attachers must be notified and must give permission for the work to be done. Attachers are not always responsive, particularly when the new attacher will be competing with them.

Who Does the Work? Pole owners or existing attachers often require that a new attacher use a contractor that they approve to make any changes to a pole. Getting into the schedule for these approved contractors can be another source of delay if they are already busy with other work. This process can get further delayed if the pole owner and the existing attachers don’t have the same list of approved contractors. There are also issues in many jurisdictions where the pole owner is bound by contract to only use union workers – not a negative thing, but one more twist that can sometimes slow down the process.

Access Everywhere. There are still a few groups of pole owners that are exempt from having to allow attachers onto their poles. The 1996 Act made an exception for municipalities and rural electric cooperatives for some reason. Most of these exempt pole owners voluntarily work with those that want access to their poles, but there are some that won’t let any telecom competitor on their poles. I know competitive overbuilders who were ready to bring fiber to rural communities only to be denied access by electric cooperatives. In a few cases the overbuilder decided to pay a higher price to bury new fiber, but in others the overbuilder gave up and moved on to other markets.

Equity. A new attacher will often find that much of the work needed to be performed to get onto poles is largely due to previous attachers not following the rules. Unfortunately, the new attacher is still generally on the hook for the full cost of rearranging or replacing poles even if that work is the result of poor construction practices in the past coupled with lax inspection of completed work by pole owners.

Enforcement. Perhaps one of the biggest flaws in the current situation is enforcement. While there are numerous federal and state laws governing the pole attachment process, in most cases there are no remedies other than a protracted lawsuit against a pole owner or against an existing attacher that refuses to cooperate with a new attacher. There is no reasonable and timely remedy to make a recalcitrant pole owner follow the rules.

And enforcement can go the other way. Many of my clients own poles and they often find that somebody has attached to their poles without notifying them or following any of the FCC or state rules, including paying for the attachments. There should be penalties, perhaps including the removal of maverick pole attachments.

Wireless Access. There is a whole new category of pole attachments for wireless devices that raises a whole new set of issues. The existing pole attachment rules were written for those that want to string wires from pole to pole, not for placing devices of various sizes and complexities on existing poles. Further, wireless attachers often want to attach to light poles or traffic signal poles, for which there are no existing rules.

Solutions. It’s easy to list the problems, and the Sub-Committee for Competitive Access to Broadband Infrastructure is tasked with suggesting solutions to them. Most of these problems have plagued the industry for decades and there are no easy fixes. Since many of the problems stem from pole or wire owners that won’t comply with the current attachment rules, there is no easy fix unless there is a way to force them to comply. I’ll be interested to see what this group recommends to the FCC. Since the sub-committee contains many different factions from the industry, it will be interesting to see if they can reach consensus on any issue.

Do We Need International Digital Laws?

German Chancellor Angela Merkel said a few weeks ago that the world needs international regulation of digital technology, much like we have international regulation for financial markets and banking.

She says that without some kind of regulation, isolated ‘islands’ of bad digital actors can emerge that are a threat to the rest of the world. I am sure her analogy is a reference to the handful of islands around the globe that play that same maverick role in banking.

We now live in a world where a relatively small number of hackers can cause incredible harm. For instance, while never definitively proven, it appears that North Korean hackers pulled off the major hack of Sony a few years ago. And there are accusations across western democracies that the Russians have been using hacking to interfere with elections.

Merkel certainly has a valid point. Small ‘islands’ of hackers are one of the major threats we face in the world today. They can cause incredible economic harm. They threaten basic infrastructure like electric grids. And they make it risky for anybody to be on the Internet at a time when broadband access is becoming an integral part of the lives of billions.

There currently are no international laws aimed at fighting the nefarious practices of bad hackers or at punishing them for their crimes. Merkel wasn’t specific about possible remedies; she said that the US and Germany have undertaken discussions on the topic but that they haven’t gone very far. There are certainly a few things that would make sense at the international level:

  • Make certain kinds of hacking an international crime so that hacker criminals can more easily be pursued across borders.
  • Create a forum for governments to better coordinate monitoring hackers and devising solutions for blocking or stopping them.
  • Make laws to bring cryptocurrency under the same international auspices as other currencies.

But as somebody who follows US telecom regulation in this blog, I wonder how fruitful such regulations would be. We live in a world where hackers always seem to be one step ahead of the security industry that works to block them. The cat and mouse game between hackers and security professionals is constantly changing, and I have to wonder how any set of rules could be nimble enough to make a difference.

This does not mean that we shouldn’t have an international effort to fight against the bad actors – but I wonder if that cooperation might best be technical cooperation rather than the creation of regulations that might easily be out of date as they are signed into law.

Any attempt to create security regulations also has to wrestle with the fact that a lot of what we think of as hacking is probably government-sponsored cyberwarfare. How do we tell the difference between cyber-criminals and cyber-warriors? In a murky world where it’s always going to be hard to know who specifically wrote a given piece of code, how do we tell the criminal bad guys from the government bad guys?

I also see a dilemma in that any agreed-upon international laws must, by definition, filter back into US law. We now have an FCC that is trying to rid itself of having to regulate broadband. Assuming that Title II regulation is reversed, I have to wonder if the FCC would even be able to require ISPs to comply with international laws at a time when there might not be many US laws that can be enforced on them.

It makes sense to me that there ought to be international cooperation in identifying and stopping criminal hackers and others that would harm the web. But this may be the rare issue where the governments of the world engage in many of the same practices as the bad actors – and that makes me wonder if there can ever be any real cooperation between governments to police or control bad practices on the web.