California Lowers the Definition of Broadband

California Governor Jerry Brown just signed a bill into law that lowers the official definition of broadband in the state while also providing state funding to upgrade rural broadband. The bill, AB 1665, goes into effect immediately. It lowers the state’s definition of broadband to 10 Mbps down and 1 Mbps up, and it goes even further by lowering the definition of an unserved customer to somebody who can’t get speeds of at least 6 Mbps down and 1 Mbps up.

The bill reinstates a telecom tax that will provide a $300 million fund intended to be used to improve rural broadband. The California press believes that the fund will largely go to AT&T and Frontier, which both lobbied hard for the bill. My reading of the bill is that the incumbent carriers have first shot at the funding and anybody else only gets it when they don’t take it. In practical terms, assuming those two companies take the funding, almost none of this money would be made available to anybody who wants to build something faster in unserved areas.

We know that state funding done the right way can be a tremendous boon to broadband expansion. Consider, for example, the Minnesota DEED grants that have coaxed dozens of telecom providers to expand fiber networks deep into unserved and underserved areas of the state. It’s commonly understood that it can be hard to justify bringing fiber to rural areas, but some grant funding can be an effective tool to attract private money to fund the rest.

We also understand today that there are huge economic benefits for areas that have good broadband. The farmers in Minnesota who benefit from the grant program there are going to have a competitive advantage over farmers elsewhere who have little or no broadband. I’ve been looking at the IoT and other fiber-based technologies on the horizon for farming that are going to vastly increase productivity.

We also know that having good broadband benefits the small communities in rural America as well. These communities have been experiencing brain drain and economic flight as people are forced to go to metropolitan areas to find work. But broadband opens up work-at-home opportunities that ought to make it possible for families to thrive in rural America.

This move by California is a poor decision on many levels. First, it funnels money to the incumbent providers to make tiny tweaks to their existing networks so that existing broadband gets just a little better. The new 10/1 Mbps threshold is also nothing more than a legislative definition of broadband and has no relevance in the real world. Many homes already need more broadband than that, and as household broadband demand grows, a 10/1 Mbps connection will become inadequate for every home.

Another reason this is a bad idea is that the incumbents there are already making improvements to increase broadband to the 10/1 Mbps level. AT&T took $361.4 million of FCC CAF II funding that is to be used to upgrade broadband to 141,500 homes in California. That works out to $2,554 per home passed. Frontier took another $36.6 million, or $2,853 per home passed, to improve broadband to 12,800 homes. That federal money requires that speeds be increased to at least 10/1 Mbps. This state funding will be additive to the large federal amounts that these two companies have already received from the government.
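To put the per-home math in one place, here’s a quick back-of-the-envelope calculation using the award sizes and home counts above. The Frontier result comes out a few dollars different from the quoted figure, which I assume is just rounding in the published numbers.

```python
# Back-of-the-envelope check of the per-home CAF II amounts cited above.
# The award sizes and home counts come straight from the text.
caf_ii_awards = {
    "AT&T (California)":     (361_400_000, 141_500),
    "Frontier (California)": (36_600_000, 12_800),
}

for carrier, (award, homes) in caf_ii_awards.items():
    print(f"{carrier}: ${award / homes:,.0f} per home passed")
```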

AT&T has also already said that it plans to meet its CAF II obligations by upgrading rural cellular speeds. Frontier is mostly going to improve DSL on ancient copper and also is now looking at using point-to-point wireless technology to meet the CAF II obligations.

I don’t know how much it’s going to cost these companies to upgrade their rural customers to 10/1 Mbps. But the federal funding might be enough to pay for all of it. Adding the state funding means it’s likely that these two companies will make an immediate profit from upgrading rural customers to barely adequate broadband speeds. As we’ve seen many times in the past, this bill is good evidence that the big companies get value out of their lobbying efforts. The losers in all of this are the homes that won’t get anything faster than CAF II broadband. This $300M could have been used as matching grants to bring much faster broadband to many of these homes.

 

When Disaster Strikes

As many of you who read this blog know, I lived for nearly a decade on St. Croix in the US Virgin Islands. Since all three of the US Virgin Islands as well as Puerto Rico got devastated by the two recent hurricanes, I thought I’d talk a bit about how our industry responds to disasters.

Disasters have always been with us in the telecom industry. There have been other hurricanes that have knocked down wires and poles in the past. I have a number of clients who have experienced crippling ice storms. I have clients in the West who have lost networks to wildfires. And I’ve had clients all over the country who have suffered from massive flooding.

I witnessed the impact of a hurricane on St. Croix when category 3 Hurricane Omar hit the island in 2008. The hurricane itself was bad enough. A wall of water came down the hill behind my house, burst through the French doors at the rear of my house and streamed through and out my front door. Then, at the very end of the storm I took a direct hit from a tornado – one of the impacts of hurricanes that is often forgotten.

As bad as these storms are, it’s the aftermath that is the most devastating. I was without power for over six weeks, meaning that my consulting work came to a screeching halt. But it took those whole six weeks anyway to clean the mud out of my house and to cut up the hundred dead trees around the property, including a magnificent hundred-year-old mahogany tree. And while there are always mosquitoes in the Caribbean, after the flooding from a hurricane they come in dense clouds, making it miserable to work outside. What I remember most about that period is that my world shrank and all of my energy was needed to deal with the effects of the storm. I also learned how much I rely on electricity, refrigeration and lights in a place that gets dark at 6:00 PM every day. It’s mind-boggling to think that there are millions of Americans who will be without power for months.

A category 3 hurricane is strong enough to send trees crashing through overhead wires, and so there were wires down all over the island. But only a minimal number of poles were broken, and so the task of restoring power and telephone wires just needed lots of crews with cherry-pickers. Our island was the only place hit by Omar, and crews from St. Thomas, Tortola and Puerto Rico came to help with the recovery. The island was so grateful that we threw a huge, well-deserved parade and party for the repair crews when they were finished.

It was the response from work crews from other islands that made all of the difference. We see the same thing here in the US all of the time. One of my clients got devastated by Hurricane Katrina and work crews from all over the US rushed to help. We see this after every stateside disaster as telecom and power crews from elsewhere rush to aid a utility in trouble.

And that is the big problem right now in the Caribbean. St. Thomas and the British Virgin Islands got devastated by Hurricane Irma. The storm was so strong that it snapped the majority of the utility poles in St. Thomas, meaning the work effort needed to restore the island is going to be massive. Since St. Croix got only minor damage in that storm it became the staging area for the work effort to help St. Thomas and St. John. But then two weeks later St. Croix and Puerto Rico were flattened by Hurricane Maria.

We now have the unprecedented situation where all of the islands in the region lost their utility infrastructure at the same time. This presents an almost unsolvable logistical challenge of somehow getting the resources in place to get the islands back up and running. As bad as the Virgin Islands are right now, it’s almost impossible for the mind to grasp the amount of damage in Puerto Rico, with its rough terrain and 3 million people still without power.

No utility can shoulder the cost of the repair efforts from a bad natural disaster. In the US the federal government has always jumped in to fund some of the needed recovery. The crews that rush in to help don’t ask first about getting paid and they assume they will eventually be reimbursed for their costs. The FCC quickly approved $76.9 million towards the recovery effort for the Virgin Islands and Puerto Rico. But that’s just a start on the cost of fixing the damage – I have colleagues working on St. Thomas and their first quick estimate of the utility damage there was almost $60 million. I imagine the final number for all of the islands is going to be astronomical.

I know that if there was an easy way to get there, many of the telco and power companies in the US would be sending crews to help the islands. It’s going to be hard enough just getting the needed poles, cables and electronics to the islands. It’s frustrating to know that the logistics challenges mean that the repair will take a long time. It won’t be surprising to still see parts of Puerto Rico without electricity six months from now – and that is heartbreaking.

When Customers Use Their Data

In a disturbing recent announcement, Verizon Wireless said it will be disconnecting service to 8,500 rural customers this month for using too much data on their cellphones. The customers are scattered across 13 states and are a mix of those with unlimited and limited data plans.

Verizon justifies this because these customers are using data where Verizon has no direct cell towers, meaning that these customers are roaming on cellular data networks owned by somebody else. Since Verizon pays for roaming, the company says that these customers are costing it more in roaming charges than what it collects in monthly subscription fees.

Verizon may well have a good business case for discontinuing these particular data customers if they are losing money on each customer. But the act of disconnecting them opens up a lot of questions and ought to be a concern to cellular customers everywhere.

This immediately raises the question of ‘carrier of last resort’. This is a basic principle of utility regulation that says that utilities, such as traditional incumbent telephone companies, must reasonably connect to everybody within their service territory. Obviously cellular customers don’t fall under this umbrella since the industry is competitive and none of the cellular companies have assigned territories.

But the lines between cellular companies and telcos are blurring. As AT&T and Verizon take down rural copper they are offering customers a wireless alternative. But in doing so they are shifting these customers from being served by a regulated telco to a cellular company that doesn’t have any carrier of last resort obligations. And that means that once converted to cellular, Verizon or AT&T would be free to cut these customers loose at any time and for any reason. That should scare anybody who is losing their rural copper lines.

Secondly, this raises the whole issue of Title II regulation. In 2015 the FCC declared that broadband is a regulated service, and that includes cellular data. This ruling brought cable companies and wireless companies under the jurisdiction of the FCC as common carriers. And that means that customers in this situation might have grounds for fighting back against what Verizon is doing. The FCC has the jurisdiction to regulate and to intervene in these kinds of situations if they regulate the ISPs as common carriers. But the current FCC is working hard to reverse that ruling and it’s doubtful they would tackle this case even if it was brought before them.

Probably the most disturbing thing about this is how scary it is for the folks being disconnected. Rural homes do not want to use cellular data as their only broadband connection because it’s some of the most expensive broadband in the world. But many rural homes have no choice since cellular is their only alternative for doing the things they need to do with broadband. While satellite data is available almost everywhere, the incredibly high latency on satellite data means that it can’t be used for things like maintaining a connection to a school server to do homework or to connect to a work server to work at home.

One only has to look at rural cellular networks to understand the dilemma many of these 8,500 households might face. The usable distance for a data connection from a cellular tower is only a few miles at best, much like the circles around a DSL hub. It is not hard to imagine that many of these customers actually live within range of a Verizon tower but still roam on other networks.

Cellular roaming is an interesting thing. Every time you pick up your cellphone to make a voice or data connection, your phone searches for the strongest signal available and grabs it. This means that the phones of rural customers that don’t live right next to a tower must choose between competing weaker signals. Customers in this situation might be connected to a non-Verizon tower without it being obvious to them. Most cellphones have a tiny symbol that warns when users are roaming, but since voice roaming stopped being an issue most of us ignore it. And it’s difficult or impossible on most phones to choose which tower to connect to. Many of these customers being disconnected might have always assumed they actually were using the Verizon network. But largely it’s not something that customers have much control over.

I just discussed yesterday how we are now in limbo when it comes to regulating the broadband practices of the big ISPs. This is a perfect example of that situation because it’s doubtful that the customers being disconnected have any regulatory recourse to what is happening to them. And that bodes poorly for rural broadband customers in general – just one more reason why being a rural broadband customer is scary.

Broadband Regulation is in Limbo

We have reached a point in the industry where it’s unclear who regulates broadband. I think a good argument can be made that nobody is regulating broadband issues related to the big ISPs.

Perhaps the best evidence of this is a case that is now in the Ninth Circuit Court of Appeals in San Francisco. This case involves a 2014 complaint against AT&T by the Federal Trade Commission based on the way that AT&T throttled unlimited wireless data customers. The issue got a lot of press at the time when AT&T started restricting data usage in 2011 for customers when they hit some arbitrary (and unpublished) data threshold in a month. Customers got throttled back to 3G and even 2G data speeds and basically lost the ability to use their data plans. The press and the FTC saw this as an attempt by AT&T to drive customers off their grandfathered unlimited data plans (which were clearly not unlimited).

AT&T had argued at the FTC that they needed to throttle customers who use too much data as a way to manage and protect the integrity of their networks. The FTC didn’t buy this argument and ruled against AT&T. As it almost always does, the company appealed the decision. The District Court in California affirmed the ruling and AT&T appealed again, which is the current case in front of the Ninth Circuit. AT&T is making some interesting claims in the case and is arguing that the Federal Trade Commission rules don’t allow the FTC to regulate common carriers.

The FTC rules include a ‘common carrier exemption’ that was established in Section 5 of the original FTC Act that created the agency. This exemption is in place to recognize that telecom common carriers are regulated instead by the FCC. There are similar carve-outs in the FTC rules for other industries that are regulated in part by other federal agencies.

The common carrier exemption doesn’t relieve AT&T and other telecom carriers from all FTC regulation – it just means that the FTC can’t intercede in areas where the FCC has clear jurisdiction. But any practices of telecom carriers that are not specifically regulated by the FCC then fall under FTC regulations since the agency is tasked in general with regulating all large corporations.

AT&T is making an interesting argument in this appeals case. They argue that since they are now deemed to be a common carrier for their data business under the Title II rules implemented in the net neutrality order, they should be free of all FTC oversight.

But there is an interesting twist to this case because the current FCC filed an amicus brief in the appeal saying that they think that the FTC has jurisdiction over some aspects of the broadband business such as privacy and data security issues. It is this FCC position that creates uncertainty about who actually regulates broadband.

We know this current FCC wants to reverse the net neutrality order, and so they are unwilling right now to tackle any major issues that arise from those rules. In this particular case AT&T’s throttling of customers occurred before the net neutrality decision and at that time the FCC would not have been regulating cellular broadband practices.

But now that cellular broadband is considered to be a common carrier service it’s pretty clear that the topic is something that the FCC has jurisdiction over today. But we have an FCC that is extremely reluctant to take on this issue because it would give legitimacy to the net neutrality rules they want to eliminate.

The FCC’s position in this case leads me to the conclusion that, for all practical purposes, companies like AT&T aren’t regulated at all for broadband issues. The prior FCC made broadband a common carrier service and gave itself the obligation to regulate broadband and to tackle issues like the one in this case. But the new FCC doesn’t want to assert that authority and even goes so far as to argue that many broadband-related issues ought to be regulated by the FTC.

This particular case gets a little further muddled by the timing since AT&T’s practices predate Title II regulation – but the issue at the heart of the case is who regulates the big ISPs. The answer seems to be nobody. The FCC won’t tackle the issue and AT&T may be right that the FTC is now prohibited from doing so. This has to be a huge challenge for a court because they are now being asked who is responsible for regulating the case in front of them. That opens up all sorts of possible problems. For example, what happens if the court rules that the FCC must decide this particular case but the agency refuses to do so? And of course, while this wrangling between agencies and the courts is being settled it seems that nobody is regulating AT&T and other broadband providers.

Lowering the Official Speed of Broadband

The FCC’s intention to kill net neutrality is getting all of the headlines, but there is another quieter battle going on at the FCC that has even bigger implications for rural America.

The last FCC under Chairman Tom Wheeler raised the definition of broadband in 2015 to 25/3 Mbps, up from the archaic definition of 4/1 Mbps. In doing so the FCC set the speed based upon the way that an average household uses broadband. At the time many people argued that the FCC’s way of measuring broadband need was somewhat contrived – and perhaps it was, because it’s really a challenge to define how much broadband a home needs. It’s not as easy as just adding up the various web connections, as I described in a recent blog.

The FCC is now considering lowering the definition of broadband down to 10/1 Mbps. That would be a dagger to the heart of rural broadband, as I will discuss below.

One only has to look at the big ISPs to see that the FCC is ignoring the realities of the market. The big cable companies have all set minimum broadband speeds above the 25/3 Mbps current FCC broadband definition. Charter’s base broadband product for a new customer is 60 Mbps. Depending upon the market Comcast’s base speeds are 50 Mbps or 75 Mbps. AT&T says they are starting to back out of their DSL business because their fastest U-verse product only has speeds up to 50 Mbps. These big ISPs all get it and they know that customers are only happy with their broadband connection when it works without problems. And providing more speed than 25/3 Mbps is how these companies are satisfying that customer demand.

Unfortunately the FCC’s definition of broadband has huge real-life implications. The big urban ISPs won’t change what they are doing, but a lower threshold could kill attempts to improve rural broadband. The FCC has a mandate from Congress to take steps to make sure that everybody in the country has adequate broadband. When the FCC increased the definition to 25/3 Mbps it instantly recognized that 55 million people didn’t have broadband. And that forced it to take steps to fix the problem. Since 2015 there has been a lot of rural broadband construction and upgrades made by cable networks in small-town America, and the latest estimates I’ve seen say that the number of those without 25/3 Mbps broadband is now down to around 39 million. That’s still a lot of people.

If the current FCC lowers the definition to 10/1 Mbps then many of those 39 million people will instantly be deemed to have broadband after all. That would take the FCC off the hook to try to solve the rural broadband gap. To really show that this is just a political decision, the FCC is also contemplating counting a cellular broadband connection as an acceptable form of broadband. In doing so they will then be able to declare that anybody that can get this new slower speed on a cellphone has an adequate broadband solution.

Of course, when I say this is all just politics there are those that said the same thing when the Wheeler FCC raised the definition to 25/3 Mbps. At that time critics might have been right. In 2015 there were a lot of homes that were happy with speeds less than 25/3 Mbps and that definition might have been a little bit of a stretch for the average home.

But when you take all of the politics out of it, the reality is that the amount of broadband that homes need keeps growing. Any attempt to define broadband will be obsolete within a few years as broadband usage continues on the path of doubling every three years. A home that needed 15 or 20 Mbps of download in 2015 might now easily need more than 25/3 Mbps. That’s how the math behind geometric growth manifests itself.
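To see how quickly that growth overtakes any fixed definition, here’s a simple sketch that projects the 15 and 20 Mbps example homes forward, assuming demand keeps doubling every three years:

```python
# A simple illustration of the "doubling every three years" growth curve
# described above. The 2015 starting points are the example households
# from the text, not measured data.
def projected_need_mbps(mbps_in_2015, year):
    return mbps_in_2015 * 2 ** ((year - 2015) / 3)

for start in (15, 20):
    needs = ", ".join(f"{year}: {projected_need_mbps(start, year):.0f} Mbps"
                      for year in (2015, 2018, 2021))
    print(f"Home needing {start} Mbps in 2015 -> {needs}")
```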

It is disheartening to see the FCC playing these kinds of political games. They only need to go visit any rural kid trying to do homework to understand that 10/1 Mbps broadband on a cellphone is not broadband. The FCC only needs to go talk to somebody in rural America who can’t take a good-paying work-at-home job because they don’t have good broadband. They only need to go and talk to farmers who are losing productivity due to lack of a good broadband connection. And they only need to talk to rural homeowners who can’t find a buyer for their home that doesn’t have broadband.

This is too critical of an economic issue for the country to let the definition of broadband change according to the politics of the administration in office. Rather than political haggling over the official definition of broadband we ought to try something new. For example, we could set a goal that rural America ought to have at least half of the average broadband speeds available in urban America. Using some kind of metric that people can understand would take the politics out of this, and it’s a metric that companies like Akamai already quantify and measure. The amount of broadband that homes need is a constantly growing figure, and pinning it down with one number is always going to be inadequate. So maybe it’s time to remove politics from the issue and make it fact-based.

Are You Texting Your Customers?

In the last year I’ve found all sorts of my outside interactions now involve texting. I get texts from the dentist affirming an appointment, texts from a furniture company making sure I was home during a delivery, and texts from AT&T wireless for my cellular billing. All these various businesses have found that texting saves them money. Yet I have only a few ISP clients that make wide use of texting. I find that a bit surprising because I can think of a number of ways that texting can be a big money saver for an ISP.

The most obvious one is that texting can prevent unneeded truck rolls. Every ISP I know says that truck rolls are expensive, and there is nothing more wasteful than making a truck roll to a customer who is not at home. I’m sure that is why the furniture company sent the text, and they would not have tried to deliver if I wasn’t at home. Better yet, texting puts a technician into direct contact with the customer and allows them to work out a plan if a customer isn’t home.

But there is probably even a bigger savings in the way that AT&T uses texting. They send me a text each month when they bill me and invite me to view my bill online. This saves them from having to mail a paper bill – something that makes no sense to somebody like me that uses autopay to pay my cellular bill. I can’t imagine I would ever open an AT&T paper bill and they would be spending money and margin to send me one. Many of my clients tell me that today that over half of their customers pay by bank debit or credit card and there is a huge savings from not mailing paper bills to these customers.

I do have a few clients that use texting and they report some other significant savings. For example, they say that texting has greatly reduced their uncollectible billing. They say that it’s far more effective to prompt customers immediately if they are late in paying their bills, and that most customers promptly pay when reminded. That’s particularly effective if you give them an immediate opportunity to pay the bill by credit card.

But the savings that surprised me a bit is the fact that companies that allow interactive texting with customers report that they have significantly reduced the number of calls to customer service. There are two primary issues that prompt the majority of calls to customer service – outages and billing inquiries.

I have a client who uses texts to inform customers about outages. Customers can get quickly frustrated if they don’t know what’s happening and when service will be restored. This client has tied texting into their OSS and network mapping system and can send texts to only those customers that have outages. And they can inform customers proactively of planned maintenance outages. They say this largely eliminates calls about outages and particularly works great after hours when they are not answering the phones.
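To make the idea concrete, here’s a minimal sketch of how that kind of targeted outage notification might look. The customer records, node names, and the send_sms() function are hypothetical stand-ins for whatever OSS/mapping system and SMS gateway a given ISP actually uses:

```python
# A minimal sketch of the approach described above: text only the customers
# served by network elements that are actually in an outage. All names and
# data here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    phone: str
    service_node: str  # the network node/segment that feeds this customer

def send_sms(phone, message):
    # Stand-in for a real SMS gateway call (e.g., an HTTP API request).
    print(f"SMS to {phone}: {message}")

def notify_outage(customers, affected_nodes, eta):
    """Text only the customers on affected nodes and return how many were notified."""
    notified = 0
    for customer in customers:
        if customer.service_node in affected_nodes:
            send_sms(customer.phone,
                     f"We're aware of an outage in your area. Estimated restoration: {eta}.")
            notified += 1
    return notified

# Example with made-up data:
customers = [
    Customer("A. Smith", "+15551230001", "node-12"),
    Customer("B. Jones", "+15551230002", "node-07"),
]
notify_outage(customers, affected_nodes={"node-12"}, eta="8:30 PM")
```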

Texting can also be a good way to answer a lot of billing inquiries. Texting can be a great tool for answering simple customer questions like their outstanding balance or the due date of their payment. It takes a lot less time for both the customer and the company to answer a simple question by text. This is a great way to communicate with customers (like me) who would always choose an option other than making a call and getting into a customer service queue.

There are a few issues with texting to be aware of. There are some archaic FCC rules that define requirements for when customers text you. This harkens back to the day when many people paid for each text message – something that barely exists any longer. But the rules are still in place and are something to be aware of. There are also rules about using texting as a form of marketing – again, something that can be done in a way that doesn’t violate the FCC rules.

There are a wide range of texting solutions. At one end of the spectrum your technicians can text customers from their cellphones. But in order to get all of the advantages listed above you will want a fully interactive texting platform that’s integrated into your OSS/BSS. Feel free to contact me and I can describe the best solutions on the market.

Regulating Online Video Content

Recently the Kommission für Zulassung und Aufsicht der Medienanstalten (ZAK) – the German equivalent of our FCC – concluded that OTT services ought to be regulated the same way as other broadcast radio and television networks. Specifically they were looking at Twitch.tv, the web gaming service, but the ruling could have far-reaching consequences.

I think the ruling raises two questions. First, should any regulatory body be regulating video content on the Internet? Second, why are we still heavily regulating cable TV?

The European press is lambasting the order as nothing more than a money grab. One of the benefits of regulating anything is charging fees for that regulation. Like many regulatory bodies around the world, the ZAK is largely funded by fees charged to the companies that it regulates (which is largely true for the FCC as well). This means that regulators have a perverse incentive to regulate things, even if they don’t need to be regulated.

The idea of regulating a worldwide web ‘channel’ like a TV station is absurd. For those of you that may not know about Twitch.tv, it’s the primary gaming network for worldwide gamers. It’s owned by Amazon. It’s a huge platform and works like YouTube where over 17,000 ‘partners’ post gaming content into ‘channels.’ The platform averages 625,000 simultaneous viewers at any given time, making it one of the most popular web platforms in the world.

So regulating Twitch.tv would be the same as regulating YouTube. It’s a platform where virtually all of its content is created by others. Other than extracting fees from the platform for the privilege of regulating it, it’s hard to understand what else the ZAK could regulate. Twitch.tv and YouTube are open platforms and only function because they allow anybody to post content. Both platforms will take down offensive content or content that violates copyrights if they are asked to do so. But the platforms, by definition of the way they operate, have no control of the content that is posted. I’m at a total loss what the ZAK thinks they can regulate.

You also have to wonder how effective any regulation would be. There are a huge number of smaller web platforms that might fall into the same category as Twitch.tv. It’s hard to imagine anybody being able to launch a new platform if they are expected to comply with different rules in a hundred countries. And it’s hard to envision the ZAK doing anything other than trying to ban a non-compliant platform from the whole country. I don’t think the ZAK understands the political ramifications of banning a platform used by all the young tech-savvy programmers (and hackers) in their country!

But thinking about this makes me ask why we are still regulating cable companies in the US. There are slews of FCC rules that dictate things like channel line-ups. It’s FCC rules that force cable companies to still offer basic, expanded basic, and premium tiers of service. It’s now pretty clear that few consumers are happy with this structure. The average household only watches about a dozen channels monthly regardless of the size of the tiers they purchase. It is the requirement for these tiers that has allowed the programmers to force programs onto cable companies that they don’t really want.

It is the cable tiers that have forced up the price of cable. Households pay huge monthly bills to watch a dozen channels – all because the regulations force channel line-ups that contain a hundred or more channels that the household isn’t interested in.

And cable companies are now competing against companies that don’t have these same restraints. Companies like SlingTV can put together any channel line-up they want with no regulatory constraints telling them what they can or can’t offer. Surveys have always shown that people would rather buy just those channels that they want to watch. And yet cable companies in the US are not allowed to compete head-on with OTT providers.

It would be easy to blame the FCC for not keeping up with the times. However, the most draconian cable rules come directly from Congress and the FCC’s hands are tied from deviating from rules that are embedded in law. We are now at a time when we really need to reconsider these old rules. The cable companies are being forced to sell programming that customers don’t want to pay for. The whole industry would benefit if cable companies were free to pursue packages that people actually want to buy. Freeing up all video providers to offer what customers want is a far better solution than trying to drag web companies into becoming regulated cable companies.

How Much Speed Do We Really Need?

There is a lot of buzz floating around in the industry that the FCC might lower the official definition of broadband from the current 25 Mbps down and 3 Mbps up. Two of the current FCC commissioners, including the chairman, opposed setting that definition a few years back. Lowering the speeds would let the FCC off the hook for the requirement by law to make sure that the whole country can get broadband. If they lower the definition, then voila, millions more Americans would be declared to have adequate broadband.

So today I thought I’d take a look at the download speeds we really need at our homes. You may recall that back when the FCC set the 25/3 Mbps definition they made a list of the broadband speed needed to do typical activities. And in doing so they tried to create profiles of some typical American households. That attempt was awkward, but it was a good starting point for examining household bandwidth needs. I’m updating their list a bit for things that people do today, which are already different from just a few years ago. Consider the following web activities:

  • Web Background 5 Mbps
  • Web Browsing 1 – 2 Mbps
  • Online Class 1 – 2 Mbps
  • Social Media 1 – 2 Mbps
  • Streaming Music 3 Mbps
  • Voice over IP 2 Mbps
  • SD Video stream 1 – 3 Mbps
  • HD Video Stream 4 – 6 Mbps
  • 4K Video Stream 15 – 20 Mbps
  • Gaming 1 – 3 Mbps
  • Skype / Video Conference 1 – 3 Mbps
  • Big File Downloader 50 Mbps

Not everybody will agree with these listed speeds because there are no standards for how the web works. For example, by using different compression schemes a video stream from Netflix is not identical to one from Amazon. And even from one source there is variation, since an action movie takes more bandwidth than something like a stand-up comedy routine.

It’s important to remember that broadband demand can come from any device in your house – desktop, laptop, smartphone, tablet, etc. It’s also important to note that these are speed requirements for a single user. If two people in the house are watching separate videos, then you have to double the above numbers.

What the FCC failed to consider back when they set the speed definition is that households need enough bandwidth to handle the busiest times of the day. What matters is the number of activities a home can do simultaneously on the web, with most families being busiest in the evenings. There might be somebody on social media and somebody watching an HD movie, while somebody else is doing homework and also using a smartphone to swap pictures.

There is another issue to consider when trying to do simultaneous tasks on the Internet – packet loss. The connection between the ISP and a customer gets more congested when it’s trying to process multiple data streams at the same time. Engineers describe this as packet collision – which sounds like some kind of bumper-car ride – but it’s an apt way to describe the phenomenon. Most home routers are not sophisticated enough to handle too many streams at once. Packets get misdirected or lost and the router requests the missing packets to be sent again from the originator. The busier the router, the more packet interference. This is also sometimes called ‘overhead’ in the industry, and this overhead can easily grow to 15% or more of the total traffic on a busy connection, meaning it takes 15% more bandwidth to complete a task than if that task was the only thing occurring on the broadband connection.
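To make this concrete, here’s a rough evening-peak calculation for a hypothetical busy household, using the per-activity speeds from the list above plus the roughly 15% overhead just described:

```python
# A rough evening-peak calculation using the per-activity speeds listed
# earlier in this post. The activity mix is just an example household, and
# the 15% adder reflects the packet-loss overhead described above.
evening_activities_mbps = {
    "HD video stream": 5,
    "4K video stream": 18,
    "Online class": 2,
    "Social media": 2,
    "Web background": 5,
}

raw_demand = sum(evening_activities_mbps.values())       # 32 Mbps
with_overhead = raw_demand * 1.15                        # ~15% retransmission overhead
print(f"Simultaneous demand: {raw_demand} Mbps")
print(f"With ~15% overhead: {with_overhead:.0f} Mbps")   # already past a 25 Mbps connection
```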

There is another kind of interference that happens in homes that have a WiFi network. This is a different kind of interference that has to do with the way that WiFi works. When a WiFi network gets multiple requests for service, meaning that many devices in the home are asking for packets, the WiFi router gets overwhelmed easily and shuts down. It then reinitiates and sends packets to the first device that gets its attention. In a busy network environment the WiFi router will shut down and restart constantly as it tries to satisfy the many devices that need attention. This kind of interference was designed into the WiFi specification as a way to ensure that WiFi could satisfy the needs of multiple devices. This WiFi overhead can also easily add 15% or more to the network demand.

Anybody who lives in a home with active users understands how networks can get overwhelmed. How many of you have been frustrated trying to watch a movie when others in the house are using the Internet? Even big bandwidth can be overwhelmed. I have a friend who has a 100 Mbps fiber connection on Verizon FiOS. He went to watch a video and it wouldn’t stream. He found that his two teenage sons were each using half a dozen gaming streams at the same time and had basically exhausted his fast bandwidth pipe.

The FCC can tinker with the official definition of broadband since that is their prerogative. But what they can’t do is define for any given home how much bandwidth it really needs. The funny thing is that the big ISPs all understand this issue. The cable companies have unilaterally increased speeds across the board to urban customers several times in recent years and in most markets offer speeds considerably faster than the current FCC definition of broadband. These ISPs know that if they were only delivering 25 Mbps they would be overwhelmed with customers complaining about the connection. Those complaints are the real proof of how much bandwidth many homes need. If the FCC lowers the definition of broadband then they are putting on blinders and ignoring how homes really use broadband today. If they lower the speed definition it’s hard to see it as anything other than a political move.

More Pressure on WiFi

As if we really needed more pressure put onto our public WiFi spectrum, both Verizon and AT&T are now launching Licensed Assisted Access (LAA) broadband for smartphones. This is the technology that allows cellular carriers to mix LTE spectrum with the unlicensed 5 GHz spectrum for providing cellular broadband. The LAA technology allows for the creation of ‘fatter’ data pipes by combining multiple frequencies, and the wider the data pipe the more data that makes it to the end-user customer.

When carriers combine frequencies using LAA they can theoretically create a data pipe as large as a gigabit while only using 20 MHz of licensed frequency. The extra bandwidth for this application comes mostly from the unlicensed 5 GHz band and is similar to the fastest speeds that we can experience at home using this same frequency with 802.11ac. However, such high-speed bandwidth is only useful for a short distance of perhaps 150 feet, so the most practical use of LAA is to boost cellphone data signals for customers closest to a cell tower. That’s going to make LAA technology most beneficial in dense customer environments like busy downtown areas, stadiums, etc. LAA isn’t going to provide much benefit to rural cellphone towers or those along interstate highways.

Verizon recently did a demonstration of the LAA technology that achieved a data speed of 953 Mbps. They did this using three 5 GHz channels combined with one 20 MHz channel of AWS spectrum. Verizon used a 4X4 MIMO (multiple input / multiple output) antenna array and 256 QAM modulation to achieve this speed. The industry has coined the term four-carrier aggregation for the technology since it combines 4 separate bands of bandwidth into one data pipe. A customer would need a specialized MIMO antenna to receive the signal and also would need to be close to the transmitter to receive this kind of speed.
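For a rough sense of how those four carriers add up to nearly a gigabit, here’s a back-of-the-envelope sketch. The spectral efficiency figure is an assumption on my part (4 MIMO layers at roughly 3 bps/Hz each after coding and control overhead), not a vendor specification, and it assumes each 5 GHz LAA channel is 20 MHz wide:

```python
# A back-of-the-envelope look at why four aggregated 20 MHz carriers can
# approach a gigabit. The 12 bps/Hz value is an assumed effective spectral
# efficiency for 4x4 MIMO with 256 QAM, used purely for illustration.
carrier_bandwidths_mhz = [20, 20, 20, 20]   # one licensed AWS carrier + three 5 GHz carriers
spectral_efficiency = 12                    # assumed bps per Hz (4 layers x ~3 bps/Hz each)

total_mhz = sum(carrier_bandwidths_mhz)
peak_mbps = total_mhz * spectral_efficiency
print(f"Aggregated bandwidth: {total_mhz} MHz")
print(f"Rough peak rate: {peak_mbps} Mbps")  # ~960 Mbps, in line with the 953 Mbps demo
```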

Verizon is starting to update selected cell sites with the technology this month. AT&T has announced that they are going to start introducing LAA technology along with four-way carrier aggregation by the end of this year. It’s important to note that there is a big difference between the Verizon test with 953 Mbps speeds and what customers will really achieve in the real world. There are numerous factors that will limit the benefits of the technology. For one thing, there aren’t yet any handsets with the right antenna arrays and it’s going to take a while to introduce them. These antennas look like they will be big power eaters, meaning that handsets that try to use this bandwidth all of the time will have short battery lives. But there are more practical limitations. First is the distance limitation, and many customers will be out of range of the strongest LAA signals. A cellular company is also not going to try to make this full data connection using all 4 channels to one customer for several reasons, the primary one being the availability of the 5 GHz frequency.

And that’s where the real rub comes in with this technology. The FCC approved the use of this new technology last year, essentially giving the carriers access to the WiFi spectrum for free. The whole point of unlicensed spectrum is to provide data pipes for all of the many uses not made by licensed wireless carriers. WiFi is clearly the most successful achievement of the FCC over the last few decades. Providing big data pipes for public use has spawned gigantic industries, and it’s hard to find a house these days without a WiFi router.

The cellular carriers have paid billions of dollars for spectrum that only they can use. The rest of the public uses a few bands of ‘free’ spectrum, and uses it very effectively. To allow the cellular carriers to dip into the WiFi spectrum runs the risk of killing that spectrum for all of the other uses. The FCC supposedly is requiring that the cellular carriers not grab the 5 GHz spectrum when it’s already in use. But to anybody who understands how WiFi works that seems like an inadequate protection, because any use of this spectrum causes interference by definition.

In practical use if a user can see three or more WiFi networks they experience interference, meaning that more than one network is trying to use the same channel at the same time. It is the nature of this interference that causes the most problems with WiFi performance. When two signals are both trying to use the same channel, the WiFi standard causes all competing devices to go quiet for a short period of time, and then both restart and try to grab an open channel. If the two signals continue to interfere with each other, the delay time between restarts increases exponentially in a phenomenon called backoff. As there are more and more collisions between competing networks, the backoff increases and the performance of all devices trying to use the spectrum decays. In a network experiencing backoff the data is transmitted in short bursts between the times that the connection starts and stops from the interference.
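Here’s a toy illustration of that backoff behavior, just to show how quickly the average wait grows as collisions pile up. It’s a greatly simplified model, not a faithful 802.11 simulation:

```python
# A toy model of exponential backoff: every additional collision doubles the
# contention window a device picks its random wait from, so average wait
# times grow quickly. Greatly simplified for illustration only.
import random

def backoff_slots(collisions, cw_min=15, cw_max=1023):
    """Random backoff (in slot times) after a given number of collisions."""
    cw = min(cw_max, (cw_min + 1) * 2 ** collisions - 1)
    return random.randint(0, cw)

random.seed(1)
for collisions in range(6):
    waits = [backoff_slots(collisions) for _ in range(10_000)]
    print(f"{collisions} collisions -> average wait ~{sum(waits) / len(waits):.0f} slots")
```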

And this means that when the cellular companies use the 5 GHz spectrum they will be interfering with the other users of that frequency. That’s how WiFi was designed to work, and so the interference is unavoidable. This means other WiFi users in the immediate area around an LAA transmitter will experience more interference, and it also means a degraded WiFi signal for the cellular users of the technology – which is the reason they won’t get speeds even remotely close to Verizon’s demo speeds. But the spectrum is free for the cellular companies and they are going to use it, to the detriment of all of the other uses of the 5 GHz spectrum. With this decision the FCC might well have nullified the tremendous benefits that we’ve seen from the 5 GHz WiFi band.

FCC Takes a New Look at 900 MHz

The FCC continues its examination of the best use of spectrum and released a Notice of Inquiry on August 4 looking at the 900 MHz band of spectrum. They want to know if there is some better way to use the spectrum block. They are specifically looking at the spectrum between 896-901 MHz and 935-940 MHz.

The FCC first looked at this frequency in 1986 and the world has changed drastically since then. The frequency is currently divided into 399 narrowband channels grouped into 10-channel blocks. This licensed use of the spectrum varies by MTA (Major Trading Area), where channels have been allocated according to local demand from commercial users.
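As a quick sanity check on how narrow those channels are, dividing one of the 5 MHz blocks by the 399 channels implies a channel width of roughly 12.5 kHz, assuming uniform spacing:

```python
# Simple arithmetic on the band plan described above: each paired block
# (896-901 MHz and 935-940 MHz) is 5 MHz wide and holds the 399 narrowband
# channels, implying roughly 12.5 kHz per channel if the spacing is uniform.
block_khz = 5 * 1000    # one 5 MHz block, in kHz
channels = 399
print(f"Implied channel width: {block_khz / channels:.1f} kHz")  # ~12.5 kHz
```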

One of the more common uses of the spectrum is for SMR service (Specialized Mobile Radio), which is the frequency that’s been used in taxis and other vehicle fleets for many years. The other use is more commonly referred to as B/ILT (Business/Industrial Land Transportation). This supports radios in work fleets and is used widely to monitor and control equipment (such as monitoring water pumps in a municipal water system). The frequency was also widely used historically for public safety / police networks using push-to-talk walkie-talkies (although cellphones have largely taken over that function).

The FCC currently identifies 2,700 sites used by 500 licensees in the country that are still using B/ILT radios and technologies. These uses include security at nuclear power plants including public alert notifications, flood warning systems, smart grid monitoring for electric networks, and for monitoring petroleum refineries and natural gas distribution systems.

But we live in a bandwidth hungry world. One of the characteristics of this spectrum is that it’s largely local in nature (good for distances of up to a few miles, at most). When mapping the current uses of the frequency it’s clear that there are large portions of the country where the spectrum is not being used. And this has prompted the FCC to ask if there is a better use of the spectrum.

The FCC typically finds ways to accommodate existing users, and regardless of any changes made it’s unlikely that they are going to cut off use of the spectrum in nuclear plants, electric grids and water systems. But to a large degree the spectrum is being underutilized. Many of the older uses of the spectrum such as walkie-talkies and push-to-talk radios have been supplanted by newer technologies using other spectrum. With that said, there are still some places where the old radios of this type are still in use.

The FCC’s action was prompted by a joint proposal by the Enterprise Wireless Alliance (EWA) and Pacific DataVision (PDV). This petition asks for the frequency to be realigned into a paired 3/3 MHz block that can be used for wireless broadband and a paired 2/2 MHz block that could continue to support the current narrowband uses of the spectrum. They propose that the broadband channels be auctioned to a single user in each BTA but that the narrowband uses continue to be licensed upon request in the same manner as today.

This docket is a perfect example of the complexities that the FCC always has to deal with in changing the way that we use spectrum. The big question that always has to be addressed by the FCC is what to do with existing users of the spectrum. Any new allocation plan is going to cause many existing users to relocate within the 900 MHz block or to spectrum elsewhere. And it’s generally been the practice of the FCC to make new users of spectrum pay to relocate older uses of spectrum that must be moved. And so the FCC must make a judgment call about whether it makes monetary sense to force relocation.

The FCC also always has to deal with technical issues like interference. Changing the way the spectrum will be used from numerous narrowband channels to a few wideband channels is going to change the interference patterns with other nearby spectrum. And so the FCC must determine whether a spectrum change is likely to cause more problems than it solves.

This particular band is probably one of the simpler such cases the FCC can tackle. While the users of the spectrum perform critical tasks with the current spectrum, there is not an unmanageable number of current users and there are also large swaths of the US where the spectrum sees no use at all. But still, the FCC does not want to interfere with the performance at nuclear plants, petroleum refineries or electric grids.

For anybody who wants to read more about how the FCC looks at spectrum, here is FCC Docket 17-200. The first thing you will immediately notice is that this document, like most FCC documents dealing with wireless spectrum, is probably among the most jargon-heavy documents produced by the FCC. But when talking about spectrum the jargon is useful because the needed discussions must be precise. And it is a good primer on the complications involved in changing the way we use spectrum. There has been a recent clamor from Congress to free up more spectrum for cellular broadband, but this docket is a good example of how complex an undertaking that can be.