California Lowers the Definition of Broadband

California Governor Jerry Brown just signed a bill into law that lowers the official definition of broadband in the state while also providing state funding to upgrade rural broadband. The bill, AB 1665, goes into effect immediately. It lowers the state's definition of broadband to 10 Mbps down and 1 Mbps up, and it goes even further by lowering the definition of an unserved customer to somebody who can't get speeds of 6 Mbps down and 1 Mbps up.

The bill reinstates a telecom tax that will provide a $300 million fund intended to improve rural broadband. The California press believes that the fund will largely go to AT&T and Frontier, which both lobbied hard for the bill. My reading of the bill is that the incumbent carriers get first shot at the funding, and anybody else gets a chance only if the incumbents decline it. In practical terms, assuming those two companies take the funding, almost none of this money will be available to anybody who wants to build something faster in unserved areas.

We know that state funding done the right way can be a tremendous boon to broadband expansion. Consider, for example, the Minnesota DEED grants that have coaxed dozens of telecom providers to expand fiber networks deep into unserved and underserved areas of the state. It’s commonly understood that it can be hard to justify bringing fiber to rural areas, but some grant funding can be an effective tool to attract private money to fund the rest.

We also understand today that there are huge economic benefits for areas that have good broadband. The farmers in Minnesota who benefit from the grant program there are going to have a competitive advantage over farmers elsewhere who have little or no broadband. I've been looking at the IoT and other fiber-based technologies on the horizon for farming that are going to vastly increase productivity.

We also know that having good broadband benefits the small communities in rural America as well. These communities have been experiencing brain drain and economic flight as people are forced to go to metropolitan areas to find work. But broadband opens up work-at-home opportunities that ought to make it possible for families to thrive in rural America.

This move by California is a poor decision on many levels. First, it funnels money to the incumbent providers to make tiny tweaks to their existing networks so that existing broadband is just a little better. The new 10/1 Mbps threshold is also nothing more than a legislative definition of broadband with no relevance in the real world. Many homes already need more broadband than that, and as household broadband demand grows, a 10/1 Mbps connection will become inadequate for every home.

Another reason this is a bad idea is that the incumbents there are already making improvements to increase broadband to the 10/1 Mbps level. AT&T took $361.4 million of FCC CAF II funding that is to be used to upgrade broadband to 141,500 homes in California. That works out to $2,554 per home passed. Frontier took another $36.6 million, or $2,853 per home passed, to improve broadband to 12,800 homes. That federal money requires that speeds increase to the 10/1 Mbps level. This state funding will be added on top of the large federal amounts these two companies have already received from the government.
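
For anybody who wants to check the per-home math, here is a quick sketch; the inputs are the rounded figures cited above, so the Frontier result lands a few dollars off the published per-home number:

```python
# CAF II support per home passed, using the rounded award figures cited above.
awards = {
    "AT&T (California)": (361_400_000, 141_500),    # dollars, homes passed
    "Frontier (California)": (36_600_000, 12_800),
}

for carrier, (dollars, homes) in awards.items():
    print(f"{carrier}: ${dollars / homes:,.0f} per home passed")
# AT&T: ~$2,554; Frontier: ~$2,859 (the published $2,853 figure presumably
# reflects more precise inputs than the rounded numbers used here)
```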

AT&T has also already said that it plans to meet its CAF II obligations by upgrading rural cellular speeds. Frontier is mostly going to improve DSL on ancient copper and is now also looking at using point-to-multipoint wireless technology to meet its CAF II obligations.

I don’t know how much it’s going to cost these companies to upgrade their rural customers to 10/1 Mbps. But the federal funding might be enough to pay for all of it. Adding the state funding means it’s likely that these two companies will make an immediate profit from upgrading rural customers to barely adequate broadband speeds. As we’ve seen many times in the past, this bill is good evidence that the big companies get value out of their lobbying efforts. The losers in all of this are the homes that won’t get anything faster than CAF II broadband. This $300M could have been used as matching grants to bring much faster broadband to many of these homes.

 

CAF II and Wireless

Frontier Communications just announced that they are testing the use of wireless spectrum to complete the most rural portions of their CAF II build-out requirement. The company accepted $283 million per year for six years ($1.7 billion total) to upgrade broadband to 650,000 rural homes and businesses. That’s a little over $2,600 per location passed. The CAF II program requires that fund recipients increase broadband to speeds of at least 10 Mbps down and 1 Mbps up.

Frontier will be using point-to-multipoint radios where a transmitter is mounted on a tower and the broadband signal is then sent to a small antenna at each customer's location. Frontier hasn't said what spectrum they are using, but in today's environment it's probably a mix of 2.4 GHz and 5 GHz WiFi spectrum and perhaps also some 3.65 GHz licensed spectrum. Frontier, along with CenturyLink and Consolidated, told the FCC a year ago that they would be interested in using the spectrum in the 'citizens' radio band' between 3.7 GHz and 4.2 GHz for this purpose. The FCC opened a docket looking into this spectrum in August and comments in that docket were due to the FCC last week.

I have mixed feelings about using federal dollars to launch this technology. On the plus side, when done right this technology can deliver bandwidth up to 100 Mbps, and in a full deployment speeds can be engineered to deliver a consistent 25 Mbps download. But those kinds of speeds require an open line-of-sight to customers, tall towers that are relatively close to customers (within 3 – 4 miles), and towers that are fiber fed.
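
To make the distance constraint concrete, free-space path loss grows by about 6 dB every time the distance doubles, and real links lose additional margin to trees and terrain. The following is my own back-of-the-envelope sketch of the standard free-space formula at the WiFi frequencies mentioned above, not anything from Frontier's engineering:

```python
import math

def free_space_path_loss_db(distance_km: float, freq_ghz: float) -> float:
    """Standard free-space path loss in dB (distance in km, frequency in GHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# Doubling the distance adds ~6 dB of loss even with a perfect line-of-sight;
# foliage, terrain, and antenna misalignment only make things worse.
for miles in (1, 4, 8):
    km = miles * 1.609
    for freq in (2.4, 5.0):
        loss = free_space_path_loss_db(km, freq)
        print(f"{miles:>2} mi @ {freq} GHz: {loss:5.1f} dB")
```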

But when done poorly the technology delivers much slower broadband. There are WISPs using the technology to deliver speeds that don't come close to the FCC's 10/1 Mbps requirement. They often can't get fiber to their towers, and they often serve customers who are much farther than the ideal distance from a tower. Luckily there are also many WISPs using the technology to deliver great rural broadband.

The line-of-sight issue is a big one and this technology is a lot harder to make work in places with lots of trees and hills, making it a difficult delivery platform in Appalachia and much of the Rockies. But the technology is being used effectively in the plains and open desert parts of the country today.

I see downsides to funding this technology with federal dollars. The primary concern is that the technology is not long-lived. The electronics are not generally expected to last more than seven years and then the radios must be replaced. Frontier is using federal dollars to get this installed, and I am sure that the $2,600 per passing is enough to completely fund the deployment. But are they going to keep pouring capital into replacing radios regularly over time? If not, these deployments would be a sick joke to play on rural homes – giving them broadband for a few years until the technology degrades. It’s hard to think of a worse use of federal funds.
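
The replacement math is worth sketching. Here is a back-of-the-envelope illustration; the CAF II support figure and the seven-year electronics life come from this post, but the radio replacement cost is a purely hypothetical number of my own:

```python
# Lifecycle sketch. The support per passing and ~7-year life are from this
# post; the per-subscriber radio replacement cost is HYPOTHETICAL, chosen
# only to illustrate the ongoing capital burden.
caf_support_per_passing = 2_600   # one-time federal support per location
radio_life_years = 7              # rough life of point-to-multipoint radios
radio_replacement_cost = 400      # hypothetical cost to swap one radio

annual_capex = radio_replacement_cost / radio_life_years
print(f"Keeping each subscriber's radio alive costs roughly ${annual_capex:.0f}"
      f" per year, every year, after the one-time ${caf_support_per_passing:,}"
      " in federal support is spent.")
```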

Plus, in many of the areas where the technology is useful there are already WISPs deploying point-to-multipoint radios. It seems unfair to use federal dollars to compete against firms that have made private investments in the identical technology. The CAF money ought to be used to provide something better.

I understand Frontier’s dilemma. In the areas where they took CAF II money they are required to serve everybody who doesn’t have broadband today. My back-of-the envelope calculations tells me that the CAF money was not enough for them to extend DSL into the most rural parts of the CAF areas since extending DSL means building fiber to feed the DSLAMs.

As I have written many times, I find the whole CAF program to be largely a waste of federal dollars. Using up to $10 billion to expand DSL, point-to-multipoint wireless, and, in the case of AT&T, cellular wireless is a poor use of our money. That same amount of money could have seeded matching grants that would be building a lot of fiber to these same customers. We only have to look at state initiatives like the DEED grants in Minnesota to see that government grant money induces significant private investment in fiber. And as much as the FCC doesn't want to acknowledge it, building anything less than fiber is nothing more than a Band-Aid. We can and should do better.

When Customers Use Their Data

In a recent disturbing announcement, Verizon Wireless said it will be disconnecting service to 8,500 rural customers this month for using too much data on their cellphones. The customers are scattered across 13 states and are a mix of those with unlimited and limited data plans.

Verizon justifies this because these customers are using data where Verizon has no cell towers of its own, meaning that these customers are roaming on cellular data networks owned by somebody else. Since Verizon pays for roaming, the company says that these customers cost more in roaming charges than what the company collects in monthly subscription fees.

Verizon may well have a good business case for discontinuing these particular data customers if they are losing money on each customer. But the act of disconnecting them opens up a lot of questions and ought to be a concern to cellular customers everywhere.

This immediately raises the question of ‘carrier of last resort’. This is a basic principle of utility regulation that says that utilities, such as traditional incumbent telephone companies, must reasonably connect to everybody within their service territory. Obviously cellular customers don’t fall under this umbrella since the industry is competitive and none of the cellular companies have assigned territories.

But the lines between cellular companies and telcos are blurring. As AT&T and Verizon take down rural copper they are offering customers a wireless alternative. But in doing so they are shifting these customers from being served by a regulated telco to a cellular company that doesn't have any carrier of last resort obligations. And that means that once customers are converted to cellular, Verizon or AT&T would be free to cut them loose at any time and for any reason. That should scare anybody who loses their rural copper lines.

Secondly, this raises the whole issue of Title II regulation. In 2015 the FCC declared that broadband is a regulated service, and that includes cellular data. This ruling brought cable companies and wireless companies under the jurisdiction of the FCC as common carriers. And that means that customers in this situation might have grounds for fighting back against what Verizon is doing. The FCC has the jurisdiction to regulate and to intervene in these kinds of situations if they regulate the ISPs as common carriers. But the current FCC is working hard to reverse that ruling and it’s doubtful they would tackle this case even if it was brought before them.

Probably the most disturbing thing about this is how scary it is for the folks being disconnected. Rural homes do not want to use cellular data as their only broadband connection because it's some of the most expensive broadband in the world. But many rural homes have no choice since this is their only alternative for doing the things they need to do with broadband. While satellite data is available almost everywhere, the incredibly high latency on satellite data means that it can't be used for things like maintaining a connection to a school server to do homework or to a work server to work at home.

One only has to look at rural cellular networks to understand the dilemma many of these 8,500 households might face. The usable distance for a data connection from a cellular tower is only a few miles at best, much like the circles around a DSL hub. It is not hard to imagine that many of these customers actually live within range of a Verizon tower but still roam on other networks.

Cellular roaming is an interesting thing. Every time you pick up your cellphone to make a voice or data connection, your phone searches for the strongest signal available and grabs it. This means that the phones of rural customers who don't live right next to a tower must choose between competing weaker signals. Customers in this situation might be connected to a non-Verizon tower without it being obvious to them. Most cellphones have a tiny symbol that warns when users are roaming, but since voice roaming stopped being an issue most of us ignore it. And it's difficult or impossible on most phones to choose which tower to connect to. Many of the customers being disconnected might have always assumed they were using the Verizon network, but largely it's not something customers have much control over.
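
To make the 'strongest signal wins' behavior concrete, here is a deliberately oversimplified sketch. Real handsets weigh bands, signal quality, and operator priority lists, so treat this as an illustration of the principle only:

```python
from dataclasses import dataclass

@dataclass
class Tower:
    operator: str
    signal_dbm: float  # received signal strength; closer to 0 is stronger

def pick_tower(towers: list[Tower]) -> Tower:
    """Grossly simplified handset behavior: latch onto the strongest signal."""
    return max(towers, key=lambda t: t.signal_dbm)

# A rural home: a Verizon tower exists but is weaker than a partner's tower.
nearby = [Tower("Verizon", -112.0), Tower("Regional partner", -98.0)]
chosen = pick_tower(nearby)
print(f"Connected to: {chosen.operator}")  # the partner network, i.e. roaming
```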

I just discussed yesterday how we are now in limbo when it comes to regulating the broadband practices of the big ISPs. This is a perfect example of that situation because it's doubtful that the customers being disconnected have any regulatory recourse against what is happening to them. And that bodes poorly for rural broadband customers in general – just one more reason why being a rural broadband customer is scary.

Broadband Regulation is in Limbo

We have reached a point in the industry where it’s unclear who regulates broadband. I think a good argument can be made that nobody is regulating broadband issues related to the big ISPs.

Perhaps the best evidence of this is a case now in the Ninth Circuit Court of Appeals in San Francisco. This case involves a 2014 complaint against AT&T by the Federal Trade Commission based on the way AT&T throttled unlimited wireless data customers. The issue got a lot of press at the time, when AT&T started restricting data usage in 2011 for customers who hit some arbitrary (and unpublished) data threshold in a month. Customers got shuttled back to 3G and even 2G data speeds and basically lost the ability to use their data plans. The press and the FTC saw this as an attempt by AT&T to drive customers off their grandfathered unlimited data plans (which were clearly not unlimited).

AT&T had argued at the FTC that they needed to throttle customers who use too much data as a way to manage and protect the integrity of their networks. The FTC didn't buy this argument and ruled against AT&T. As big companies almost always do, AT&T appealed the decision. The District Court in California sided with the FTC, and AT&T appealed again, which is the current case in front of the Ninth Circuit. AT&T is making some interesting claims in the case and is arguing that the Federal Trade Commission rules don't allow the FTC to regulate common carriers.

The FTC rules include a 'common carrier exemption' that was established in Section 5 of the original FTC Act that created the agency. The exemption is in place to recognize that telecom common carriers are regulated instead by the FCC. There are similar carve-outs in the FTC rules for other industries that are regulated in part by other federal agencies.

The common carrier exemption doesn't relieve AT&T and other telecom carriers from all FTC regulation – it just means that the FTC can't intercede in areas where the FCC has clear jurisdiction. But any practices of telecom carriers that are not specifically regulated by the FCC fall under FTC regulation, since the agency is broadly tasked with policing unfair and deceptive practices at large corporations.

AT&T is making an interesting argument in this appeal. They argue that since they are now deemed to be a common carrier for their data business under the Title II rules implemented in the net neutrality order, they should be free of all FTC oversight.

But there is an interesting twist to this case: the current FCC filed an amicus brief in the appeal saying they think the FTC has jurisdiction over some aspects of the broadband business, such as privacy and data security issues. It is this FCC position that creates uncertainty about who actually regulates broadband.

We know this current FCC wants to reverse the net neutrality order, and so they are unwilling right now to tackle any major issues that arise from those rules. In this particular case AT&T’s throttling of customers occurred before the net neutrality decision and at that time the FCC would not have been regulating cellular broadband practices.

But now that broadband is considered to be a common carrier service, it's pretty clear that the topic is something the FCC has jurisdiction over today. Yet we have an FCC that is extremely reluctant to take on the issue because doing so would give legitimacy to the net neutrality rules they want to eliminate.

The FCC’s position in this case leads me to the conclusion that, for all practical purposes, companies like AT&T aren’t regulated at all for broadband issues. The prior FCC made broadband a common carrier service and gave themselves the obligation to regulate broadband and to tackle issues like the one in this case. But the new FCC doesn’t want to assert that authority and even goes so far as to argue that many broadband related issues ought to be regulated by the FTC.

This particular case gets a little further muddled by the timing since AT&T’s practices predate Title II regulation – but the issue at the heart of the case is who regulates the big ISPs. The answer seems to be nobody. The FCC won’t tackle the issue and AT&T may be right that the FTC is now prohibited from doing so. This has to be a huge challenge for a court because they are now being asked who is responsible for regulating the case in front of them. That opens up all sorts of possible problems. For example, what happens if the court rules that the FCC must decide this particular case but the agency refuses to do so? And of course, while this wrangling between agencies and the courts is being settled it seems that nobody is regulating AT&T and other broadband providers.

Lowering the Official Speed of Broadband

The FCC’s intention to kill net neutrality is getting all of the headlines, but there is another quieter battle going on at the FCC that has even bigger implications for rural America.

The last FCC under Chairman Tom Wheeler raised the definition of broadband in 2015 to 25/3 Mbps, up from the archaic definition of 4/1 Mbps. In doing so the FCC set the speed based upon the way that an average household uses broadband. At the time many people argued that the FCC's way of measuring broadband need was somewhat contrived – and perhaps it was, because it's really a challenge to define how much broadband a home needs. It's not as easy as just adding up the various web connections, as I described in a recent blog.

The FCC is now considering lowering the definition of broadband down to 10/1 Mbps. That would be a dagger to the heart of rural broadband, as I will discuss below.

One only has to look at the big ISPs to see that the FCC is ignoring the realities of the market. The big cable companies have all set minimum broadband speeds above the 25/3 Mbps current FCC broadband definition. Charter’s base broadband product for a new customer is 60 Mbps. Depending upon the market Comcast’s base speeds are 50 Mbps or 75 Mbps. AT&T says they are starting to back out of their DSL business because their fastest U-verse product only has speeds up to 50 Mbps. These big ISPs all get it and they know that customers are only happy with their broadband connection when it works without problems. And providing more speed than 25/3 Mbps is how these companies are satisfying that customer demand.

Unfortunately the FCC’s definition of broadband has huge real life implications. The big urban ISPs won’t change what they are doing, but a lower threshold could kill attempts to improve rural broadband. The FCC has a mandate from Congress to take steps to make sure that everybody in the country has adequate broadband. When the FCC increased the definition to 25/3 Mbps they instantly recognized that 55 million people didn’t have broadband. And that forced them to take steps to fix the problem. Since 2015 there has been a lot of rural broadband construction and upgrades made by cable networks in small town America and the latest estimates I’ve seen say that the number of those without 25/3 Mbps broadband is now down to around 39 million. That’s still a lot of people.

If the current FCC lowers the definition to 10/1 Mbps then many of those 39 million people will instantly be deemed to have broadband after all. That would take the FCC off the hook to try to solve the rural broadband gap. To really show that this is just a political decision, the FCC is also contemplating counting a cellular broadband connection as an acceptable form of broadband. In doing so they will then be able to declare that anybody that can get this new slower speed on a cellphone has an adequate broadband solution.

Of course, when I say this is all just politics, there were those who said the same thing when the Wheeler FCC raised the definition to 25/3 Mbps. And those critics might have been right. In 2015 there were a lot of homes happy with speeds less than 25/3 Mbps, and that definition might have been a bit of a stretch for the average home.

But when you take all of the politics out of it, the reality is that the amount of broadband that homes need keeps growing. Any attempt to define broadband will be obsolete within a few years as broadband usage continues on the path of doubling every three years. A home that needed 15 or 20 Mbps download in 2015 might now easily need more than 25/3 Mbps. That's how the math behind geometric growth manifests itself.
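
The compounding math is simple to check. Starting from the 15 Mbps example above and doubling every three years:

```python
# Household demand that doubles every three years, starting from a home
# that needed 15 Mbps in 2015 (the example used above).
base_year, base_need_mbps = 2015, 15
for year in (2015, 2018, 2021, 2024):
    need_mbps = base_need_mbps * 2 ** ((year - base_year) / 3)
    print(f"{year}: ~{need_mbps:.0f} Mbps")
# 2015: ~15 Mbps, 2018: ~30 Mbps, 2021: ~60 Mbps, 2024: ~120 Mbps
```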

It is disheartening to see the FCC playing these kinds of political games. They only need to visit any rural kid trying to do homework to understand that 10/1 Mbps broadband on a cellphone is not broadband. They only need to talk to somebody in rural America who can't take a good-paying work-at-home job because they don't have good broadband. They only need to talk to farmers who are losing productivity due to the lack of a good broadband connection. And they only need to talk to rural homeowners who can't find a buyer for a home that doesn't have broadband.

This is too critical an economic issue for the country to let the definition of broadband change according to the politics of the administration in office. Rather than political haggling over the official definition of broadband we ought to try something new. For example, we could set a goal that rural America ought to have at least half of the average broadband speed available in urban America. Using a metric that people can understand would take the politics out of this, and it's a metric that companies like Akamai already quantify and measure. The amount of broadband that homes need is a constantly growing figure, and pinning it down with one number is always going to be inadequate. So maybe it's time to remove politics from the issue and make it fact-based.

Decommissioning Rural Copper, Part 2

In the last blog I wrote about my belief that AT&T and Verizon want out of the rural wireline business. They both have plans to largely walk away from their rural copper networks and replace landline copper services with cellular service. Today I want to talk about what regulators ought to do with those networks.

When these two giant telcos walk away from rural copper they will inevitably harm rural America. While many homes will get the ‘privilege’ of now buying highly-priced cellular-based broadband, other homes are going to find themselves without telephone service if they happen to live in one of the many cellular dead zones. Such homes will not only be unable to benefit from cellular broadband, but if they have poor cell service they will find themselves cut off from voice communications as well.

As somebody who has traveled extensively in rural America I can tell you that there are a lot more cellular dead zones than people realize. And it's not only farms; there are county seats in rural America where it's difficult to get a working cellphone signal inside of buildings.

As part of this transition both companies are going to walk away from a huge amount of existing copper cable. I think this copper cable is an incredibly valuable asset and that regulators ought not to allow them to tear it down.

The copper wire network today goes almost everywhere in rural America. Congressional laws and FCC policies led to most homes in the country getting access to the copper network. These copper wires occupy a valuable space on existing telephone poles – on the majority of rural poles the only two sets of wires are the power lines at the top and the telephone wires at the bottom.

If these copper wires are kept in place they could greatly reduce the cost of building rural fiber. It is far cheaper when building fiber to ‘lash’ the fiber onto an existing set of cables than to hang fiber from scratch. It was this construction technique that allowed Verizon to build a lot of its FiOS fiber network – they lashed fiber onto existing telephone wires. And my guess is that when Verizon decommissions urban copper they are still going to leave a lot of the copper wires in place as a guidewire for their fiber.

If these telcos are going to walk away from these copper wires, then they ought to be required to keep them in place for use by somebody else to hang fiber. Many states might force the big telcos to tear down the copper wires since they will eventually create safety hazards as they break away from poles if they aren’t maintained. But if somebody else is willing to take over that maintenance then it shouldn’t be an issue.

I can picture a regulatory process whereby some other carrier is allowed to come in and 'claim' the abandoned wires once they are empty of customers. That would allow fiber overbuilders or rural communities to claim this copper as an asset.

There is some salvage value to copper wires, and it's possible, though not probable, that the value of the copper could exceed the cost to tear it down. So I can see the telcos fighting such an idea as a confiscation of their assets. But these rural wires have been fully depreciated for decades and the telcos have earned back the cost of these copper lines many times over. I believe that by abandoning the wires and depriving some homes of wireline service, the big telcos will have forfeited any rights they might have to the remaining assets.

Anybody claiming the abandoned copper could use it in two ways. First, in many cases there is still life left in the copper, as witnessed by Frontier and CenturyLink rehabbing old rural copper with upgraded DSL. Local communities or small carriers could use the copper to provide the better services that the big telcos have refused to offer over the last few decades.

But more importantly these wires represent the cheapest path forward for building rural fiber. Anybody taking over the old copper can save a lot of fiber construction costs by lashing fiber onto the existing copper. If our nationwide goal is really to get better broadband to rural America, then offering abandoned copper to fiber builders might be one of the easiest tools available to help the process along.

The big telcos abandoned rural America decades ago. They stopped doing routine maintenance on rural copper and slashed the number of rural technicians. They now want to walk away from that copper and instead force rural America to buy cellular services at inflated prices. We owe it to the folks who paid for this copper many times over to get some benefit from it and to offer an alternative to the new rural cellular monopolies.

Regulating Online Video Content

The Kommission für Zulassung und Aufsicht der Medienanstalten (ZAK) – the German equivalent of our FCC – recently concluded that OTT services ought to be regulated the same way as other broadcast radio and television networks. Specifically they were looking at Twitch.tv, the web gaming service, but the ruling could have far-reaching consequences.

I think the ruling raises two questions. First, should any regulatory body be regulating video content on the Internet? Second, why are we still heavily regulating cable TV?

The European press is lambasting the order as nothing more than a money grab. One of the benefits of regulating anything is charging fees for that regulation. Like many regulatory bodies around the world, the ZAK is largely funded by fees charged to the companies it regulates (which is largely true for the FCC as well). This means that regulators have a perverse incentive to regulate things, even if they don't need to be regulated.

The idea of regulating a worldwide web 'channel' like a TV station is absurd. For those of you who may not know Twitch.tv, it's the primary gaming network for worldwide gamers and is owned by Amazon. It's a huge platform that works like YouTube, where over 17,000 'partners' post gaming content into 'channels.' The platform averages 625,000 simultaneous viewers at any given time, making it one of the most popular web platforms in the world.

So regulating Twitch.tv would be the same as regulating YouTube. It's a platform where virtually all of the content is created by others. Other than extracting fees from the platform for the privilege of regulating it, it's hard to understand what else the ZAK could regulate. Twitch.tv and YouTube are open platforms and only function because they allow anybody to post content. Both platforms will take down offensive content or content that violates copyrights if asked to do so. But the platforms, by the very way they operate, have no control over the content that is posted. I'm at a total loss as to what the ZAK thinks it can regulate.

You also have to wonder how effective any regulation would be. There are a huge number of smaller web platforms that might fall into the same category as Twitch.tv. It's hard to imagine anybody being able to launch a new platform if they are expected to comply with different rules in a hundred countries. But it's also hard to envision the ZAK doing anything other than somehow trying to ban from the whole country any platform that refuses to comply with their regulations. I don't think the ZAK understands the political ramifications of banning a platform used by all the young tech-savvy programmers (and hackers) in their country!

But thinking about this makes me ask why we are still regulating cable companies in the US. There are slews of FCC rules that dictate things like channel line-ups. It’s FCC rules that force cable companies to still offer basic, expanded basic, and premium tiers of service. It’s now pretty clear that few consumers are happy with this structure. The average household only watches about a dozen channels monthly regardless of the size of the tiers they purchase. It is the requirement for these tiers that has allowed the programmers to force programs onto cable companies that they don’t really want.

It is the cable tiers that have forced up the price of cable. Households pay huge monthly bills to watch a dozen channels – all because the regulations force channel line-ups containing a hundred or more channels that the household isn't interested in.

And cable companies are now competing against companies that don’t have these same restraints. Companies like SlingTV can put together any channel line-up they want with no regulatory constraints telling them what they can or can’t offer. Surveys have always shown that people would rather buy just those channels that they want to watch. And yet cable companies in the US are not allowed to compete head-on with OTT providers.

It would be easy to blame the FCC for not keeping up with the times. However, the most draconian cable rules come directly from Congress, and the FCC's hands are tied from deviating from rules that are embedded in law. We are now at a time when we really need to reconsider these old rules. The cable companies are being forced to sell programming that customers don't want to pay for. The whole industry would benefit if cable companies were free to pursue packages that people actually want to buy. Freeing up all video providers to offer what customers want is a far better solution than trying to drag web companies into becoming regulated cable companies.

How Much Speed Do We Really Need?

There is a lot of buzz floating around the industry that the FCC might lower the official definition of broadband, which is currently 25 Mbps down and 3 Mbps up. Two of the current FCC commissioners, including the chairman, opposed setting that definition a few years back. Lowering the speeds would let the FCC off the hook for the requirement by law to make sure that the whole country can get broadband. If they lower the definition, then voila, millions more Americans would be declared to have adequate broadband.

So today I thought I'd take a look at the download speeds we really need at our homes. You may recall that back when the FCC set the 25/3 Mbps definition they made a list of the broadband speeds needed for typical activities. In doing so they tried to create profiles of some typical American households. That attempt was awkward, but it was a good starting point for examining household bandwidth needs. I'm updating their list a bit for things that people do today, which is already different than just a few years ago. Consider the following web activities:

  • Web Background 5 Mbps
  • Web Browsing 1 – 2 Mbps
  • Online Class 1 – 2 Mbps
  • Social Media 1 – 2 Mbps
  • Streaming Music 3 Mbps
  • Voice over IP 2 Mbps
  • SD Video stream 1 – 3 Mbps
  • HD Video Stream 4 – 6 Mbps
  • 4K Video Stream 15 – 20 Mbps
  • Gaming 1 – 3 Mbps
  • Skype / Video Conference 1 – 3 Mbps
  • Big File Downloader 50 Mbps

People don’t agree with all of these listed speeds because there are no standards for how the web works. For example, by using different compression schemes a video stream from Netflix is not identical to one from Amazon. And even from one source there is variation since an action move takes more bandwidth than something like a stand-up comedy routine.

It’s important to remember that broadband demand can come from any device in your house – desktop, laptop, smartphone, tablet, etc. It’s also important to note that these are speed requirements for a single user. If two people in the house are watching an separate video, then you have to double the above number.

What the FCC failed to consider back when they set the speed definition is that households need enough bandwidth to handle the busiest times of the day. What matters is the number of activities a home can do simultaneously on the web, with most families being busiest in the evenings. There might be somebody on social media and somebody watching an HD movie, while somebody else is doing homework while also using a smartphone to swap pictures.

There is another issue to consider when trying to do simultaneous tasks on the Internet – packet loss. The connection between the ISP and a customer gets more congested when it's trying to process multiple data streams at the same time. Engineers describe this as packet collision – which sounds like some kind of bumper-car ride but is an apt way to describe the phenomenon. Most home routers are not sophisticated enough to handle too many simultaneous streams. Packets get misdirected or lost, and the router asks the originator to send the missing packets again. The busier the router, the more packet interference. This is sometimes called 'overhead' in the industry, and this overhead can easily grow to 15% or more of the total traffic on a busy connection, meaning it takes 15% more bandwidth to complete a task than if that task were the only thing using the broadband connection.

There is another kind of interference in homes with a WiFi network, and it has to do with the way that WiFi works. When a WiFi router gets requests from many devices at the same time, it gets overwhelmed and stops transmitting, then reinitiates and sends packets to the first device that gets its attention. In a busy network environment the WiFi router will stop and restart constantly as it tries to satisfy all of the devices asking for service. This behavior was designed into the WiFi specification as a way to ensure that WiFi can serve multiple devices. This WiFi overhead can also easily add 15% or more to the network demand.
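
Putting the activity list and the two overhead factors together shows how quickly an evening busy hour adds up. The household below is hypothetical; the activity speeds are taken from the list earlier in this post (midpoints where a range was given), and both 15% factors are the rough figures described in the last two paragraphs:

```python
# Rough evening busy-hour demand for a hypothetical family, using activity
# speeds from the list above plus the ~15% packet-loss overhead and ~15%
# WiFi overhead described in the preceding paragraphs.
evening_activities_mbps = {
    "HD video stream": 5.0,
    "Social media": 1.5,
    "Online class / homework": 1.5,
    "Smartphone photo swapping": 2.0,  # assumed comparable to web browsing
    "Web background": 5.0,
}

raw = sum(evening_activities_mbps.values())
with_packet_overhead = raw * 1.15                  # retransmitted packets
with_wifi_overhead = with_packet_overhead * 1.15   # WiFi contention

print(f"Raw demand: {raw:.1f} Mbps")
print(f"With overheads: {with_wifi_overhead:.1f} Mbps")
# 15 Mbps of raw demand becomes ~19.8 Mbps once both overheads pile on,
# and that's before anybody starts a 4K stream or a big download.
```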

Anybody who lives in a home with active users understands how networks can get overwhelmed. How many of you have been frustrated trying to watch a movie when others in the house are using the Internet? Even big bandwidth can be overwhelmed. I have a friend who has a 100 Mbps fiber connection on Verizon FiOS. He went to watch a video and it wouldn’t stream. He found that his two teenage sons were each using half a dozen gaming streams at the same time and had basically exhausted his fast bandwidth pipe.

The FCC can tinker with the official definition of broadband since that is their prerogative. But what they can't do is define how much bandwidth any given home really needs. The funny thing is that the big ISPs all understand this issue. The cable companies have unilaterally increased speeds across the board to urban customers several times in recent years, and in most markets they offer speeds considerably faster than the current FCC definition of broadband. These ISPs know that if they were only delivering 25 Mbps they would be overwhelmed with customers complaining about their connections. Those complaints are the real proof of how much bandwidth many homes need. If the FCC lowers the definition of broadband then they have put on blinders and are ignoring how homes really use broadband today. If they lower the speed definition it's hard to see it as anything other than a political move.

FCC Takes a New Look at 900 MHz

The FCC continues its examination of the best use of spectrum and released a Notice of Inquiry on August 4 looking at the 900 MHz band of spectrum. They want to know if there is some better way to use the spectrum block. They are specifically looking at the spectrum between 896-901 MHz and 935-940 MHz.

The FCC first looked at this frequency in 1986 and the world has changed drastically since then. The band is currently divided into 399 narrowband channels grouped into 10-channel blocks. Licensed use of the spectrum varies by MTA (Major Trading Area), where channels have been allocated according to local demand from commercial users.
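
As a quick sanity check on that channel plan, dividing one of the paired 5 MHz blocks by 399 channels yields the classic 12.5 kHz land-mobile channel width:

```python
# Each of the paired blocks (896-901 MHz and 935-940 MHz) is 5 MHz wide
# and is divided into 399 narrowband channels.
block_khz = 5_000
channels = 399
print(f"{block_khz / channels:.1f} kHz per channel")  # ~12.5 kHz, the
# traditional land-mobile channel width; channels come in 10-channel blocks.
```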

One of the more common uses of the spectrum is SMR service (Specialized Mobile Radio), the frequency that's been used in taxis and other vehicle fleets for many years. The other use is more commonly referred to as B/ILT (Business/Industrial Land Transportation). This supports radios in work fleets and is used widely to monitor and control equipment (such as monitoring water pumps in a municipal water system). The frequency was also widely used historically for public safety and police networks using push-to-talk walkie-talkies (although cellphones have largely taken over that function).

The FCC currently identifies 2,700 sites used by 500 licensees in the country that are still using B/ILT radios and technologies. These uses include security at nuclear power plants (including public alert notifications), flood warning systems, smart grid monitoring for electric networks, and monitoring of petroleum refineries and natural gas distribution systems.

But we live in a bandwidth hungry world. One of the characteristics of this spectrum is that it’s largely local in nature (good for distances of up to a few miles, at most). When mapping the current uses of the frequency it’s clear that there are large portions of the country where the spectrum is not being used. And this has prompted the FCC to ask if there is a better use of the spectrum.

The FCC typically finds ways to accommodate existing users, and regardless of any changes made it's unlikely that they are going to cut off use of the spectrum at nuclear plants, electric grids and water systems. But to a large degree the spectrum is being underutilized. Many of the older uses of the spectrum, such as walkie-talkies and push-to-talk radios, have been supplanted by newer technologies using other spectrum. With that said, there are still some places where the old radios of this type remain in use.

The FCC’s action was prompted by a joint proposal by the Enterprise Wireless Alliance (EWA) and Pacific DataVision (PDV). This petition asks for the frequency to be realigned into three 3 MHz bands that can be used for wireless broadband and two 2 MHz bands that could be used to continue to support the current narrowband uses of the spectrum. They propose that the broadband channels be auctioned to a single user in each BTA but that the narrowband uses continue to be licensed upon request in the same manner as today.

This docket is a perfect example of the complexities the FCC always has to deal with in changing the way we use spectrum. The big question that always has to be addressed is what to do with existing users of the spectrum. Any new allocation plan is going to cause many existing users to relocate, either within the 900 MHz block or to spectrum elsewhere. It's generally been the practice of the FCC to make new users of spectrum pay to relocate the older uses that must be moved. And so the FCC must make a judgment call about whether forcing relocation makes monetary sense.

The FCC also always has to deal with technical issues like interference. Changing the use of the band from numerous narrowband channels to a few wideband channels will change the interference patterns with other nearby spectrum. And so the FCC must determine whether a spectrum change is likely to cause more problems than it solves.

This particular band is probably one of the simpler such tasks the FCC can tackle. While the users of the spectrum perform critical tasks with the current spectrum, there is not an unmanageable number of current users and there are also large swaths of the US that have no use at all. But still, the FCC does not want to interfere with the performance at nuclear plants, petroleum refineries or electric grids.

For anybody who wants to read more about how the FCC looks at spectrum, here is FCC Docket 17-200. The first thing you will notice is that this document, like most FCC documents dealing with wireless spectrum, is amongst the most jargon-heavy the FCC produces. But when talking about spectrum the jargon is useful because the discussions must be precise. The docket is a good primer on the complications involved in changing the way we use spectrum. There has been a recent clamor from Congress to free up more spectrum for cellular broadband, and this docket is a good example of how complex an undertaking that can be.

Big ISPs Want to be Regulated

I’ve always contended that the big ISPs, regardless of their public howling, want to be regulated. It is the nature of any company that is regulated to complain about regulation. For the last decade as AT&T and Verizon made the biggest telecom profits ever they have released press release after press release decrying how regulation was breaking their backs. The big telcos and cable companies spent the last few years declaring loudly that Title II regulation was killing incentives to make investments, while spending record money on capital.

A few months ago Comcast, Charter, and Cox filed an amicus brief in a lawsuit making its way through the U.S. Court of Appeals for the Ninth Circuit. In that brief they asked the federal appeals court to restore the Federal Trade Commission's jurisdiction over AT&T. The specific case being reviewed had to do with deceptive AT&T marketing practices when they originally offered unlimited cellular data plans. It turns out that AT&T throttled customer speeds once customers reached the meager threshold of 3 – 5 GB per month.

In 2014 the FTC sued AT&T over the practice, and that's the case now under appeal. It's a bit extraordinary to see big ISPs siding with the government against another ISP, and the only plausible explanation for the brief is that these companies want a stable regulatory environment. In the brief the cable companies expressed the desire to “reinstate a predictable, uniform, and technology-neutral regulatory framework that will best serve consumers and businesses alike.”

That one sentence sums up very well the real benefit of regulation to big companies. As much as they might hate to be regulated, they absolutely hate making huge investments in new product lines in an uncertain regulatory environment. When a big ISP knows the rules, they can plan accordingly.

One scenario that scares the big ISPs is living in an environment where regulations can easily change. That’s where we find ourselves today. It’s clear that the current FCC and Congress are planning on drastically reducing the ‘regulatory burden’ for the big ISPs. That sounds like an ideal situation for the ISPs, but it’s not. It’s clear that a lot of the regulations are being changed for political purposes and big companies well understand that the political pendulum swings back and forth. They dread having regulations that change with each new administration.

We only have to go back a few decades to see this in action. The FCC got into and then back out of the business of regulating cable TV rates several times in the late 1970s and the 1980s. This created massive havoc for the cable industry. It created uncertainty, which hurt their stock prices and made it harder for them to raise money to expand. The cable industry didn’t become stable and successful until Congress finally passed several pieces of cable legislation to stop these regulatory swings.

Big companies are also not fond of being totally deregulated. That is the basis for the amicus brief in the AT&T case. The big ISPs would rather be regulated by the FTC than be unregulated. The FTC might occasionally slap them with big fines, but the big companies are smart enough to know that they have more exposure without regulation. If the FTC punishes AT&T for its marketing practices, that's the end of the story. The alternative is for AT&T to have to fend off huge class action lawsuits seeking damages far larger than anything the FTC would impose. There is an underlying safety net in being regulated, and the big ISPs understand and can quantify the risk of engaging in bad business practices.

In effect, as much as they say they hate being regulated, big companies like the safety of hiding behind regulators who protect them as much as they protect the public. It's that safety net that allows a big ISP to invest billions of capital dollars.

I really don’t think the FCC is doing the big ISPs any favors if they eliminate Title II regulations. Almost every big ISP has said publicly that they are not particularly bothered by the general principles of net neutrality – and I largely believe them. Once those rules were put into place the big companies made plans based upon those rules. The big ISPs did fear that some future FCC might use Title II rules to impose rate regulation – much as the disaster with the cable companies in the past. But overall the regulation gives them a framework to safely invest in the future.

I have no doubt that the political pendulum will eventually swing the other way – because it always does. And when we next get a Democratic administration and Congress, we are likely to see many of the regulations being killed by the current FCC put back into place by a future one. That's the nightmare scenario for a big ISP – to find that they have invested in a business line that is frowned upon by future regulators.