States and Net Neutrality

We now know how states are going to react to the end of net neutrality, and there have been several different responses so far. First, a coalition of 23 states filed a lawsuit challenging the FCC’s ability to eliminate net neutrality and Title II regulation of broadband. The lawsuit is mostly driven by blue states, but it also includes red states like Mississippi and Kentucky.

The lawsuit argues that the FCC has exceeded its authority in eliminating net neutrality. The lawsuit makes several claims:

  • The suit claims that under the Administrative Procedure Act (APA) the FCC can’t make “arbitrary and capricious” changes to existing policies. The FCC defended net neutrality for over a decade, and the claim is that the FCC’s ruling fails to provide enough justification for abandoning the existing policy.
  • The suit claims that the FCC ignored the significant public record filed in the case that details the potential harm to consumers from ending net neutrality.
  • The suit claims that the FCC exceeded its authority by reclassifying broadband service as a Title I information service rather than as a Title II telecommunications service.
  • Finally, the suit claims that the FCC ruling improperly preempts state and local laws.

Like past challenges of major FCC rulings, one would expect this suit to go through several levels of courts, perhaps even to the Supreme Court. It’s likely that the loser of the first ruling will appeal, and the process could take a year or longer. Generally, the first court to hear the case will determine quickly whether some or all of the FCC’s net neutrality order will be stayed until resolution of the lawsuit.

I lamented in a recent blog how partisan this and other FCCs have gotten. It would be a positive thing for FCC regulation in general if the courts put some cap on the ability of the FCC to create new policy without considering existing policies and the public record about the harm that can be caused by a shift in policy. Otherwise we face having this and future FCCs constantly changing the rules every time we get a new administration – and that’s not healthy for the industry.

A second tactic being used by states is to pass laws that effectively implement net neutrality at the state level. New York, New Jersey and Montana have passed laws that basically mimic the old FCC net neutrality rules. It’s an interesting tactic that is sure to trigger a lawsuit over states’ rights (and I have to imagine that somebody will challenge these laws). I’ve read a few lawyers who opine that this tactic has some legs, since the FCC largely walked away from regulating broadband and in doing so might have accidentally opened the door for the states to regulate the issue. If these laws hold up we would end up with a hodgepodge of net neutrality rules that vary by state – something that benefits nobody.

Another tactic being taken by states, and even a few cities, is to change purchasing rules so that any carrier that bids for government telecom business must adhere to net neutrality. This is an interesting tactic, and I haven’t seen anybody who thinks it is not allowed. Governments have wide latitude in deciding the rules for purchasing goods and services, and states already put many similar restrictions on purchasing. The only problem with this tactic will come if eventually all of the major carriers violate the old net neutrality rules, which could leave a state with few or no good choices of telecom providers.

As usual, California is trying a slightly different tactic. The state wants to require that carriers adhere to net neutrality if they use state-owned telecom facilities or facilities that were funded by the state. Over the years California has built fiber of its own and has also given out grants for carriers to build broadband networks. This includes a recently announced grant program that is likely to go largely to Frontier and CenturyLink. If this law is upheld it could cause major problems for carriers that have taken state money in the past.

It’s likely that there are going to be numerous lawsuits challenging different aspects of the various attempts by states to protect net neutrality. And there are likely to also be new tactics tried by states during the coming year to further muddy the picture. It’s not unusual for the courts to finally decide the legitimacy of major FCC decisions. But there are so many different tactics being used here that we are likely to get conflicting rulings from different courts. It’s clearly going to take some time for this to all settle out.

One interesting aspect of all of this is how the FCC will react if its cancellation of net neutrality is put on hold by the courts. If that happens it means that some or all of net neutrality will still be the law of the land. The FCC always has the option to enforce or not enforce the rules, so you’d suspect that it wouldn’t do much about ISPs that violate the spirit of the rules. But more importantly, the FCC is walking away from regulating broadband as part of killing Title II regulation. It is actively shifting some regulatory authority, such as over privacy, to the FTC. It seems to me that this shouldn’t be allowed until the various lawsuits are resolved. I think the one thing we can count on is that this is going to be a messy regulatory year for broadband.

Gigabit LTE

Samsung just introduced Gigabit LTE into the newest Galaxy S8 phone. This is a technology with the capability to significantly increase cellular speeds, and it makes me wonder whether the cellular carriers will really rush to implement 5G for cellphones.

Gigabit LTE still operates under the 4G standards and is not an early version of 5G. There are three components of the technology:

  • Each phone has a 4X4 MIMO antenna, which is an array of four tiny antennae. Each antenna can make a separate connection to the cell tower.
  • The network must implement carrier aggregation. Both the phone and the cell tower must be able to combine the signals from the various antennas and frequency bands into one coherent data path.
  • Finally, the new technology utilizes the 256 QAM (Quadrature Amplitude Modulation) protocol which can cram more data into the cellular data path.
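The three components above can be combined into a rough peak-rate calculation. This is a back-of-envelope sketch, not the actual 3GPP math: the symbol rate and the 25% overhead figure are round-number assumptions, but it shows how four MIMO streams, 8-bit 256 QAM symbols and multiple aggregated carriers multiply together toward a nominal gigabit:

```python
# Back-of-envelope LTE downlink peak-rate model (illustrative assumptions,
# not the 3GPP spec): a 20 MHz LTE carrier carries roughly 100 resource
# blocks x 12 subcarriers x 14 OFDM symbols per millisecond, and we assume
# ~25% of capacity goes to control channels, reference signals and coding.

SYMBOLS_PER_SEC_20MHZ = 100 * 12 * 14 * 1000   # ~16.8M modulation symbols/s
OVERHEAD = 0.25                                # assumed protocol overhead

def peak_mbps(bits_per_symbol, mimo_streams, carriers=1):
    raw = SYMBOLS_PER_SEC_20MHZ * bits_per_symbol * mimo_streams * carriers
    return raw * (1 - OVERHEAD) / 1e6

# Ordinary LTE: 2x2 MIMO, 64 QAM (6 bits/symbol), one 20 MHz carrier
print(round(peak_mbps(6, 2)))              # ~151 Mbps
# Gigabit LTE: 4x4 MIMO, 256 QAM (8 bits/symbol), three aggregated carriers
print(round(peak_mbps(8, 4, carriers=3)))  # ~1210 Mbps
```

The first figure lines up with today’s familiar LTE peak rates, which is why all three ingredients are needed to get anywhere near a gigabit on paper.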

The data speed that can be delivered to a given cellphone with this technology will depend on a number of different factors:

  • The nearest cell site to a customer needs to be upgraded to the technology. I would speculate that this new technology will be phased in at the busiest urban cell sites first, then to busy suburban sites and then perhaps to less busy sites. It’s possible that a cellphone could make connections to multiple towers to make this work, but that’s a challenge with 4G technology and is one of the improvements promised with 5G.
  • The amount of data speed that can be delivered is going to vary widely depending upon the frequencies being used by the cellular carrier. If this uses existing cellular data frequencies, then the speed increase will be a combination of the impact of adding four data streams together, plus whatever boost comes from using 256 QAM, less the new overheads introduced during the process of merging the data streams. There is no reason that this technology could not use the higher millimeter wave spectrum, but that spectrum will use different antennae than lower frequencies.
  • The traffic volume at a given cell site is always an issue. Cell sites that are already busy with single-antenna connections won’t have the spare connections available to give a cellphone more than one channel. Thus, a given connection could consist of anywhere from one to four channels at any given time.
  • Until the technology gets polished, I’d have to bet that this will work a lot better with a stationary cellphone rather than one moving in a car. So expect this to work better in downtowns, convention centers, etc.
  • And as always, the strength of a connection to a given customer will vary according to how far a customer is from the cell site, the amount of local interference, the weather and all of those factors that affect radio transmissions.

I talked to a few wireless engineers and they guessed that this technology using existing cellular frequencies might create connections as fast as a few hundred Mbps in ideal conditions. But they could only speculate on the new overheads created by adding together multiple channels of cellular signal. There is no doubt that this will speed up cellular data for a customer in the right conditions, with the right phone near the right cell site. But adding four existing cellular signals together will not get close to a gigabit of speed.
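To put numbers on the engineers’ guess, here is a sketch with assumed real-world inputs: a plausible single-stream LTE speed on existing bands, the 8/6 bits-per-symbol boost of 256 QAM over 64 QAM, and a guessed 15% penalty for the new overheads of merging four streams. The per-stream speeds and the overhead figure are assumptions, not measurements:

```python
# Rough estimate of real-world Gigabit LTE speeds on existing cellular
# frequencies. All inputs are assumptions for illustration only.

def realistic_mbps(per_stream_mbps, streams=4, qam_boost=8/6, merge_overhead=0.15):
    # four MIMO streams, the 256 QAM modulation gain, minus merging overhead
    return per_stream_mbps * streams * qam_boost * (1 - merge_overhead)

for single in (30, 50, 70):   # plausible single-stream speeds today, in Mbps
    print(single, "->", round(realistic_mbps(single)), "Mbps")
```

The outputs land in the low hundreds of Mbps, which is consistent with the engineers’ "few hundred Mbps in ideal conditions" and well short of a gigabit.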

It will be interesting to see how the cellular companies market this upgrade. They could call this gigabit LTE, although the speeds are likely to fall far short of a gigabit. They could also market this as 5G, and my bet is that at least a few of them will. I recall back at the introduction of 4G LTE that some carriers started marketing 3.5G as 4G, well before there were any actual 4G deployments. There has been so much buzz about 5G now for a year that the marketing departments at the cellular companies are going to want to tout that their networks are the fastest.

It’s an open question as to when we are going to start hearing about this. Cellular companies run a risk in touting a new technology if most bandwidth-hungry users can’t yet utilize it. One would think they will want to upgrade some critical mass of cell sites before really pushing this.

It’s also going to be interesting to see how faster cellphone speeds affect the way people use broadband. Today it’s miserable to surf the web on a cellphone. In a city environment most connections are more than 10 Mbps today, but browsing doesn’t feel that fast because of shortfalls in the cellphone operating systems. Unless those operating systems get faster, there might not be much noticeable difference with a faster connection.

Cellphones today are already capable of streaming a single video stream, although with more bandwidth the streaming will get more reliable and will work under more adverse conditions.

The main impediment to faster cellphones really changing user habits is the data plans of the cellular carriers. Most ‘unlimited’ plans have major restrictions on using a cellphone to tether data for other devices, and it’s that tethering that could make cellular data a realistic substitute for a home landline connection. My guess is that until we reach a time when mini-cell sites are spread everywhere, the cellular carriers are not going to let users treat cellular data the same as landline data. Until cellphones are allowed to utilize the broadband available to them, faster cellular data speeds might not have much impact on the way we use our cellphones.

Broadband Regulation in Limbo

The ruling earlier this week by the US Court of Appeals for the 9th Circuit highlights the current weak state of broadband regulation. The case is one that’s been around for years and stems from AT&T’s attempt to drive customers off of their original unlimited cellphone data plans. AT&T began throttling unlimited customers when they reached some unpublished threshold of data use, in some cases as little as 2 GB in a month. AT&T then lied to the FCC about the practice when the agency inquired. This ruling allows the FTC suit against AT&T to continue.

The ruling demonstrates that the FTC has some limited jurisdiction over common carriers like AT&T. However, the clincher came when the court ruled that the FTC only has jurisdiction over issues where the carriers aren’t engaging in common-carrier services. This particular case involves AT&T not delivering a product they promised to customers and thus falls under FTC jurisdiction. But the court made it clear that future cases involving direct common carrier functions, such as abuses of net neutrality, would not fall under the FTC.

This case clarifies the FTC’s limited jurisdiction over ISPs and contradicts the FCC’s statements that the FTC is going to be able to step in and take its place on most matters involving broadband. The court has made it clear that is not the case. FCC Chairman Ajit Pai praised this court ruling and cited it as a good example of how the transition of jurisdiction to the FTC is going to work as promised. But looking at the details of the ruling, that is not true.

This court ruling makes it clear that there is no regulatory body now in charge of direct common carrier issues. For instance, if Netflix and one of the ISPs got into a big fight about paid prioritization, there would be nowhere for Netflix to turn. The FCC would refuse to hear the case. The FTC wouldn’t be able to take the case since it involves a common carrier issue. And while a court might take the case, it would have no basis on which to make a ruling. As long as the ISP didn’t break any other kind of law, such as reneging on a contract, a court would have no legal basis on which to rule for or against the ISP’s behavior.

That means not only that broadband is now unregulated, but also that there is nowhere to complain about abuse by ISPs until that abuse violates some existing law. That is the purest definition of limbo that I can think of for the industry.

To make matters worse, even this jumbled state of regulation is likely to be made more muddled soon by the courts hearing the various net neutrality suits. Numerous states have sued the FCC for various reasons, and if past practice holds, the courts are liable to put some or all of the FCC’s net neutrality decision on hold.

It’s hard to fathom what that might mean. For example, if the courts were to put the FCC’s decision to cancel Title II regulation on hold, then that would mean that Title II regulation would still be the law of the land until the net neutrality lawsuits are finally settled. But this FCC has made it clear that they don’t want to regulate broadband and they would likely ignore such a ruling in practice. The Commission has always had the authority to pick and choose cases it will accept and I’m picturing that they would refuse to accept cases that relied on their Title II regulation authority.

That would be even muddier for the industry than today’s situation. Back to the Netflix example, if Title II regulation was back in effect and yet the FCC refused to pursue a complaint from Netflix, then Netflix would likely be precluded from trying to take the issue to court. The Netflix complaint would just sit unanswered at the FCC, giving Netflix no possible remedy, or even a hearing about their issues.

The real issue that is gumming up broadband regulation is not the end of Title II regulation. Title II regulation only went into effect with the 2015 net neutrality decision, and the FCCs before that had no problem tackling broadband issues. The real problem is that this FCC is washing its hands of broadband regulation, and supposedly tossed that authority to the FTC – something the court just made clear can’t work in the majority of cases.

This FCC has exposed a flaw in its mandate from Congress, in that it feels it is not obligated to regulate broadband. So I guess the only fix will be for Congress to make the FCC’s jurisdiction, or lack of jurisdiction, clear. Otherwise, we couldn’t even count on a future FCC reversing course, because it’s now clear that the decision to regulate or not regulate broadband is up to the FCC and nobody else. The absolute worst long-term outcome would be future FCCs regulating or not regulating depending upon changes in the administration.

My guess is that AT&T and the other big ISPs are going to eventually come to regret where they have pushed this FCC. There are going to be future disputes between carriers, and the ISPs are going to find that the FCC cannot help them, just as it can’t help anybody complaining against them. That’s a void that is going to serve this industry poorly.

Public Networks and Privacy

I’ve been investigating smart city applications, and one of the features that many smart network vendors are touting is expanded public safety networks that can provide cameras and other monitoring devices for police, making it easier to monitor neighborhoods and solve crimes. This seems like something most police departments have on their wish list, because cameras work 24/7 and can see things that people are never likely to witness.

The question I ask today is whether this is what America wants. There are a few examples of cities with ubiquitous video surveillance, like London, but will that kind of surveillance work in the US?

I think we’ve gotten our first clue from Seattle. The City installed a WiFi mesh network using Aruba wireless equipment in 2013 with a $3.6 million grant from the Department of Homeland Security. The initial vision for the network was that it would be a valuable tool to provide security in the major port in Seattle as well as provide communications for first responders during emergencies. At the time of installation the city intended to also make the surveillance capabilities available to numerous departments within the City, not just to the police.

But when the antennas, like the one shown with this blog, went up in downtown Seattle in 2013, a number of groups began questioning the city about their surveillance policies and the proposed use of these devices. Various groups including the ACLU voiced concerns that the network would be able to track cellphones, laptops and other devices that have MAC addresses. This could allow the City to gather information on anybody moving in downtown or the Port and might allow the City to do things like identify and track protesters, monitor who enters and leaves downtown buildings, track the movement of homeless people who have cellphones, etc.

Privacy groups and the ACLU complained to the City that the network effectively was a suspicionless surveillance system that monitors the general population and is a major violation of various constitutional rights. The instant and loud protests about the network caught City officials by surprise and by the end of 2013 they deactivated the network until they developed a surveillance policy. The city never denied that the system could monitor the kinds of things that citizens were wary of. That surveillance policy never materialized, and the City recently hired a vendor to dismantle the network and salvage any usable parts for use elsewhere in the City.

I can think of other public outcries that have led to discontinuance of public monitoring systems, particularly speed camera networks that catch and ticket speeders. Numerous communities tried that idea and scrapped it after massive citizen outrage. New York City installed a downtown WiFi network a few years ago that was to include security cameras and other monitoring devices. From what I read they’ve never yet activated the security features, probably for similar reasons. A web search shows that other cities like Chicago have implemented a network similar to Seattle’s and have not gotten the negative public reaction.

The Seattle debacle leads to the question of what is reasonable surveillance. The developers of smart city solutions today are promoting the same kinds of features contained in the Seattle network, plus new ones. Technology has advanced since 2013, and newer systems promise to include the first generation of facial recognition software along with the ability to identify people by their walking gait. These new monitoring devices won’t just track people with cellphones; they can identify and track everybody.

I think there is probably a disconnect between what smart city vendors are developing and what the public wants out of their city government. I would think that most citizens are in favor of smart city solutions like smart traffic systems that would eliminate driving backups, such as changing the timing of lights to get people through town as efficiently as possible.

But I wonder how many people really want their City to identify and track them every time they come within reach of one of the City’s monitors. The information gathered by such monitors can be incredibly personal: it identifies where somebody is, with a time stamp. The worry is not just that a City might misuse such personal information; the IT security folks I’ve talked to believe that many municipal IT networks are also susceptible to hacking.

In the vendors’ defense, they are promoting features that already function well. Surveillance cameras and other associated monitors are tried and true technologies. Some of the newer features like facial recognition are cutting edge, but surveillance systems installed today can likely be upgraded with software changes as the technology gets better.

I know I would be uncomfortable if my city installed this kind of surveillance system. I don’t go downtown except to go to restaurants or bars, but what I do is private and is not the city’s business. Unfortunately, I suspect that city officials all over the country will be enamored by the claims from smart city vendors and will be tempted to install these kinds of systems. I just hope there is enough public discussion of city plans so that the public understands what their city is planning. I’m sure there are cities where the public will support this technology, but plenty of others where citizens will hate the idea. Just because we have the technical capability to monitor everybody doesn’t mean we ought to.

The Infrastructure Plan and Broadband

The administration finally published their infrastructure plan last week. The document is an interesting read, particularly for those with a financial bent like me. There is no way this early to know if this plan has any chance to make it through Congress, or how much it might change if it does pass. But it’s worth reviewing because it lets us know what the government is thinking about infrastructure and rural broadband.

First, the details of the plan:

  • The proposed plan provides $200B of federal funding over 10 years;
  • $100B goes to states in the form of a 20% matching grant for new projects that won’t require additional future federal spending;
  • $50B is set aside as a grant program for rural infrastructure; states can use the money as they wish;
  • $20B goes to projects that are in the demonstration phase of new technologies and that can’t easily attract other financing;
  • $20B would go towards existing federal loan programs, including the Transportation Infrastructure Finance and Innovation Act (TIFIA) and the Water Infrastructure Finance and Innovation Act (WIFIA);
  • Finally, $10B would be used to create a revolving fund that would allow the purchase, rather than the lease, of federal infrastructure.
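As a quick sanity check, the line items above do add up to the $200B headline figure (figures in $ billions, taken from the plan summary):

```python
# Sum the proposed allocations and confirm they match the $200B total.
allocations = {
    "state incentive grants (20% match)": 100,
    "rural infrastructure grants": 50,
    "transformative demonstration projects": 20,
    "existing loan programs (TIFIA/WIFIA)": 20,
    "federal capital revolving fund": 10,
}

total = sum(allocations.values())
print(total)   # 200
```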

The funding for the program is a bit murky, as you would expect at this early stage. It appears that some of the funding comes from existing federal highway infrastructure funds, and one might suppose those funds will still be aimed at highways.

This plan gives governors and state legislators a lot of new money to disperse, meaning that every state is likely to tackle this in a different way. That alone is going to mean a varied approach to funding or not funding rural broadband.

The plan is completely silent on broadband funding. This makes sense since the plan largely hands funds to the states. The program does not promote rural broadband, but it certainly does not preclude it. The most likely source of any funding for rural broadband is the $50B rural pot. We’ll have to wait and see what strings are attached to that money, but the proposal would largely hand it to the states and let them decide how to use it.

The larger $100B pot is to be used to provide up to 20% of the funding for projects and there are very few rural fiber projects that don’t need more than 20% assistance to make them viable. If the 20% funding basis is firm for this pot of funding I can’t see it being used much for broadband.

States are not going to like this $100B funding pool because it completely flips the role of the federal government in infrastructure funding. Today the federal government supplies as much as 80% of the funding for many road and bridge projects, and this plan flips that 80% onto the states. Because of this, states are likely to view the plan as an overall long-term decrease in federal infrastructure spending. The trade-off for the flip, though, is that the money is immediate and the states get to decide what to fund. Today, when the feds are involved it can take literally decades to finish some road projects.
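The flip is easy to illustrate with a hypothetical $100M road project. The project size is made up for illustration; the 80% and 20% shares come from the discussion above:

```python
# Compare the state's share of a hypothetical $100M project under the
# current ~80% federal match versus the plan's 20% federal match.
project_cost = 100_000_000               # hypothetical project, in dollars

today_federal = project_cost * 80 // 100  # feds cover up to 80% today
plan_federal = project_cost * 20 // 100   # plan caps the federal share at 20%

print(project_cost - today_federal)  # state share today: $20M
print(project_cost - plan_federal)   # state share under the plan: $80M
```

The state’s share of the same project quadruples, which is why states are likely to read this as a net cut in federal support.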

The overall goal of the plan is to promote private investment in infrastructure projects, which contrasts with today, when almost all infrastructure projects are 100% government funded. Historically, public/private partnerships (PPPs) have played only a small role in US infrastructure spending. PPPs have been successful in other countries in helping to build infrastructure projects on time and on budget – a vast improvement over government projects that routinely go over on both. But incorporating PPP financing into infrastructure spending is going to take a change of mindset. That challenge is complicated by the fact that most of this spending will be dispersed by the states, and it’s the states that will or will not embrace PPPs, so we’ll probably see a varied response across the country.

One of the most interesting ideas embedded in the plan is that projects should be funded in such a way as to cover the long-term maintenance costs of a project. That’s a big change from today where roads, bridges and other major projects are constructed with no thought given about the funding for the ongoing maintenance, or even for related costs of a project for environmental and other ancillary costs. This is going to force a change in the way of thinking about infrastructure to account for the full life-cycle cost of a project up-front.

I’ve read a few private reports from investment houses and their take on the plan. The analyses I’ve seen conclude that the vast majority of the money will go to highway, bridge and water projects. That might mean very little for rural broadband.

One thing is for sure, though. If something like this plan becomes law then the power to choose infrastructure projects devolves largely to states rather than the federal government. Today states propose projects to the feds, but under this plan the states would be able to use much of the federal funding as they see fit.

There are states that already fund some rural broadband infrastructure, and you might suppose those states would funnel some of this new funding into those programs. But there are other states, some very rural, that have rejected the idea of helping to fund broadband. Expect a widely varying response if the states get the power to choose projects.

In summary, this plan is not likely going to mean any federal broadband grant program. But states could elect to take some of this funding, particularly the $50B rural fund, and use it to promote rural broadband. But there are likely to be as many different responses to this funding as there are states. We have a long way to go yet to turn this proposal into concrete funding opportunities.

Another FCC Giveaway

The FCC just voted to implement a plan to give up to $4.53 billion to the cellular carriers over the next ten years to bring LTE cellular voice and data to the most remote parts of America. While this sounds like a laudable goal, this FCC seems determined to hand out huge sums of money to the biggest telecom companies in the country. This program is labeled Mobility Fund II and will be awarded through an auction among the cellular companies.

As somebody who travels frequently in rural America there certainly are still a lot of places with poor or no cellphone coverage. My guess is that the number of people that have poor cellphone coverage is greater than what the FCC is claiming. This fund is aimed at providing coverage to 1.4 million people with no LTE cellphone coverage and another 1.7 million people where the LTE coverage is subsidized.
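Dividing the fund by the FCC’s own coverage counts gives a sense of the per-person subsidy. This is a simple back-of-envelope calculation that ignores the ten-year timing and any difference between the two groups:

```python
# Per-person subsidy implied by the Mobility II fund, using the FCC's
# coverage counts cited above.
fund = 4.53e9            # $4.53B over ten years
uncovered = 1_400_000    # people with no LTE coverage
subsidized = 1_700_000   # people with only subsidized LTE coverage

per_person = fund / (uncovered + subsidized)
print(round(per_person))   # roughly $1,461 per covered person
```

And if the FCC’s coverage counts understate the problem, as I suspect, the real per-person figure is lower still.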

Recall that the FCC’s knowledge of cellphone coverage comes from the cellphone companies who claim better coverage than actually exists. Cellphone coverage is similar to DSL where the quality of signal to a given customer depends upon distance from a cellphone tower. Rural America has homes around almost every tower that have crappy coverage and that are probably not counted in these figures.

My main issue with the program is not the goal – in today’s world we need cellphone coverage extended to even the most remote places in the country. My problem is that the perceived solution is to hand yet more billions to the cellular carriers – money that could instead fund real broadband in rural America. Additionally, the ten-year implementation period is far too long. That’s an eternity to wait for an area with no cellular coverage.

I think the FCC had a number of options other than shelling out billions to the cellular companies:

  • The FCC could require the cellular companies to build out these areas at their own expense as a condition of having taken the licensed spectrum. Other than Sprint, these companies are extremely profitable right now and just got more profitable because of the recent corporate tax-rate reductions. The FCC has always had build-out requirements for spectrum, and it could make building out the rural areas mandatory as a condition for retaining the spectrum licenses in the lucrative urban areas.
  • The FCC could instead give unused spectrum to somebody else who is willing to use it. The truth is that the vast majority of licensed spectrum sits unused in rural America. There is no reason that spectrum can’t come with a use-it-or-lose-it provision so that unused spectrum reverts to the FCC to give to somebody else. There are great existing wireless technologies that work best with licensed spectrum, and it’s aggravating to see the spectrum sit unused but still unavailable to those who might use it.
  • Finally, the FCC could force the cellular carriers to use towers built by somebody else. I work with a number of rural counties that would gladly build towers and the necessary fiber to provide better cellphone coverage. It would cost the cellular carriers nothing more than the cell site electronics if others were to build the needed core infrastructure.

This idea of handing billions to the big telecom companies is a relatively new one. Obviously the lobbyists of the big companies have gained influence at the FCC. It’s not just this FCC that is favoring the big companies. Originally the CAF II program was going to be available to everybody using reverse auction rules. But before that program was implemented the Tom Wheeler FCC decided to instead just give the money to the big telcos if they wanted it. The telcos even got to pick and choose and reject taking funding for remote places which will now be auctioned this summer.

That same CAF II funding could have been used to build a lot of rural fiber or other technologies that would have provided robust broadband networks. But instead the telcos got off the hook by only having to upgrade to 10/1 Mbps – a speed that was already obsolete at the time of the FCC order.

Now we have yet another federal program that is going to shovel more billions of dollars to big companies to provide broadband that will supposedly meet a 10/1 Mbps speed. But as with CAF II, the carriers will get to report the results of the program to the FCC, and I have no doubt that they will claim success even if coverage remains poor. Honestly, as an advocate for rural broadband there are days when you just want to bang your head against a wall. It’s hard to watch billions and billions wasted that could have brought real broadband to numerous rural communities.

A Hybrid Model for Rural America

Lately I’ve looked at a lot of what I call a hybrid network model for bringing broadband to rural America. The network involves building a fiber backbone to support wireless towers while also deploying fiber to any pockets of homes big enough to justify the outlay. It’s a hybrid between point-to-multipoint wireless and fiber-to-the-home.

I’ve yet to see a feasible business model for building rural FTTP without some kind of subsidy. There are multiple small telcos building fiber to farms using some subsidy funding from the A-CAM portion of the Universal Service Fund. And there are state broadband grant programs that are helping to build rural fiber. But otherwise it’s hard to justify building fiber in places where the cost per passing is $10,000 per household or higher.

The wireless technology I’m referring to is a point-to-multipoint wireless network using a combination of frequencies including WiFi and 3.65 GHz. The network consists of placing transmitters on towers and beaming signals to dishes at a customer location. In areas without massive vegetation or other impediments this technology can now reliably deliver 25 Mbps download for 6 miles and higher bandwidth closer to the tower.

A hybrid model makes a huge difference in financial performance. I’ve now seen an engineering comparison of the costs of all-fiber and a hybrid network in half a dozen counties and the costs for building a hybrid network are in the range of 20% – 25% of the cost of building fiber to everybody. That cost reduction can result in a business model with a healthy return that creates significant positive cash over time.
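To make the comparison concrete, here is a minimal sketch of the cost math, assuming a hypothetical county of 4,000 households and using the $10,000-per-passing fiber figure and the 20% – 25% hybrid ratio cited above. All other numbers are illustrative assumptions, not figures from an actual engineering study.

```python
# Illustrative cost comparison of an all-fiber build vs. a hybrid
# fiber/wireless build, per the rough ratios cited in the text.

def cost_per_passing(total_cost: float, households: int) -> float:
    """Average network cost per household passed."""
    return total_cost / households

def hybrid_cost(all_fiber_cost: float, ratio: float = 0.25) -> float:
    """Hybrid builds run roughly 20%-25% of fiber-to-everybody;
    model the high end (25%) here."""
    return all_fiber_cost * ratio

households = 4_000                       # hypothetical rural county
all_fiber = 10_000 * households          # $10,000 per passing (from the text)
hybrid = hybrid_cost(all_fiber)          # ~25% of the all-fiber cost

print(f"All-fiber build:         ${all_fiber:,.0f}")
print(f"Hybrid build:            ${hybrid:,.0f}")
print(f"Hybrid cost per passing: ${cost_per_passing(hybrid, households):,.0f}")
```

Even this back-of-the-envelope version shows why the hybrid model can clear a return hurdle that an all-fiber build cannot: the capital requirement drops by roughly three quarters while the network still reaches every household.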

There are numerous rural WISPs that are building wireless networks using wireless backhaul rather than fiber to get bandwidth to the towers. That solution might work at first, although I often see new wireless networks of this sort that can’t deliver the 25 Mbps bandwidth to every customer due to backhaul constraints. It’s guaranteed that the bandwidth demands from customers on any broadband network will eventually grow to be larger than the size of the backbone feeding the network. Generally, over a few years a network using wireless backhaul will bog down at the busy hour while a fiber network can keep up with customer bandwidth demand.

One key component of the hybrid network is to bring fiber directly to customers that live close to the fiber. This means bringing fiber to any small towns or even small pockets of 20 or more homes that are close together. It also means giving fiber to farms and rural customers that happen to live along the fiber routes. Serving some homes with fiber helps to hold down customer density on the wireless portion of the network – which improves wireless performance. Depending on the layout of a rural county, a hybrid model might bring fiber to as much as 1/3 of the households in a county while serving the rest with wireless.

Another benefit of the hybrid model is that it moves fiber deeper into rural areas. This can provide the basis for building more fiber in the future or else upgrading wireless technologies over time for rural customers.

A side benefit of this business plan is that it often involves building a few new towers. Areas that need towers typically already have poor or nonexistent cellular coverage. The new towers can make it easier for the cellular companies to fill in their footprint and get better cellular service to everybody.

One reason the hybrid model can succeed is the high customer penetration rate that comes when building the first real broadband network into a rural area that’s never had it. I’ve now seen the customer numbers from numerous rural broadband builds and I’ve seen customer penetration rates range between 65% and 85%.
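The penetration range above translates directly into the revenue side of the business model. This quick sketch shows the spread, using a hypothetical $70 monthly ARPU (an assumed figure, not one from the text) against the 65% – 85% take rates cited:

```python
# Take-rate revenue sketch for a first-time rural broadband build.
# The $70 ARPU is a hypothetical assumption for illustration only.

def monthly_revenue(passings: int, penetration: float, arpu: float = 70.0) -> float:
    """Monthly subscriber revenue = passings x take rate x ARPU."""
    return passings * penetration * arpu

for rate in (0.65, 0.85):
    rev = monthly_revenue(1_000, rate)
    print(f"{rate:.0%} penetration: ${rev:,.0f}/month per 1,000 passings")
```

The 20-point swing in take rate is worth roughly 30% more revenue on the same network investment, which is why first-mover builds in unserved areas pencil out so much better than overbuilds.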

Unfortunately, this business plan won’t work everywhere, due to the limitations of wireless technology. It’s much harder to deploy a wireless network of this type in an area with heavy woods or lots of hills. This is a business plan for the open plains of the Midwest and West, and anywhere else with large areas of open farmland.

County governments often ask me how they can get broadband to everybody in their county. In areas where the wireless technology will work, a hybrid model seems like the most promising solution.

Setting the FCC Definition of Broadband

In the recently released 2018 Broadband Progress Report the FCC reluctantly kept the official definition of broadband at 25/3 Mbps. I say reluctantly because three of the Commissioners were on record for either eliminating the standard altogether or else reverting back to the older definition of 10/1 Mbps.

I’m guessing the Commissioners gave in to a lot of public pressure to keep the 25/3 standard. Several Commissioners had also taken a public stance that they wanted to allow cellular data to count the same for a household as landline broadband – and that desire was a big factor behind the push to lower the definition, since cellphones rarely meet the 25/3 speed standard.

The deliberation on the topic this year raises the question of whether there is some way to create a rule that would better define the speed of needed broadband. It’s worth looking back to see how the Tom Wheeler FCC came up with the 25/3 definition. They created sample profiles of the way that households of various sizes are likely to want to use broadband. In doing so, they added together the bandwidth needed for various tasks such as watching a movie or supporting a cellphone.

But the FCC’s method was too simple and used the assumption that various simultaneous uses of broadband are additive. They added together the uses for a typical family of four which resulted in bandwidth needs greater than 20 Mbps download, and used that as the basis for setting the 25/3 standard. But that’s not how home broadband works. There are several factors that affect the actual amount of bandwidth being used:

For example, doing simultaneous tasks on a broadband network increases the overhead on the home network. If you are watching a single Netflix stream, the amount of needed bandwidth is predictable and steady. But if three people in a home are each watching a different Netflix stream, the amount of needed bandwidth is greater than the sum of the three theoretical streams. When your ISP and your home router try to receive and untangle multiple simultaneous streams, packets collide, get lost, and have to be retransmitted. This is described as adding ‘overhead’ to the transmission process. Depending on the nature of the data streams the amount of collision overhead can be significant.

Almost nobody wires the signal from their ISP directly to all of their devices. Instead we use WiFi to move data around to various devices in the home. A WiFi router has an overhead of its own that adds to the overall bandwidth requirement. As I’ve covered in other blogs, a WiFi network is not impacted only by the things you are trying to do in your home; it is also slowed when it pauses to recognize demands for connection from your neighbor’s WiFi network.

Any definition of home broadband needs should reflect these overheads. If a household actually tries to download 25 Mbps of usage from half a dozen sources at the same time on a 25 Mbps connection, the various overheads and collisions will nearly crash the system.
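A toy model makes the non-additive point concrete. Here each concurrent stream beyond the first inflates total demand by a fixed overhead factor to stand in for collisions and retransmissions; the 10%-per-extra-stream figure is an invented illustration, not a measured value:

```python
# Toy model: simultaneous streams are not simply additive because
# each extra concurrent stream adds collision/retransmission overhead.
# The 10% overhead per extra stream is an assumed figure.

def effective_demand(stream_mbps: list, overhead_per_extra: float = 0.10) -> float:
    """Sum of raw stream rates, inflated by overhead for each
    simultaneous stream beyond the first."""
    raw = sum(stream_mbps)
    extras = max(len(stream_mbps) - 1, 0)
    return raw * (1 + overhead_per_extra * extras)

# Three simultaneous 5 Mbps Netflix streams:
print(effective_demand([5.0, 5.0, 5.0]))  # 18.0 Mbps, not the naive 15 Mbps
```

The exact overhead factor varies with the traffic mix, but the shape of the problem is the same: the more simultaneous uses, the further actual demand drifts above the simple sum that the FCC’s profiles assumed.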

The FCC’s definition of broadband also needs to reflect the real world. For example, most of the unique programming created by Netflix and Amazon Prime is now available in 4K. I bought a large TV last year and we now watch 4K when it’s available. That means a stream of 15-20 Mbps download. That stream forced me to upgrade my home WiFi network to bring a router into the room with the TV.

The FCC’s speed definition finally needs to consider the busy hour of the day – the time when a household uses the most broadband. That’s the broadband speed that the home needs.

We know household bandwidth needs keep increasing. Ten years ago I was happy with a 5 Mbps broadband product. Today I have a 60 Mbps product that seems adequate, but I know from tests I did last year that I would be unhappy with a 25 Mbps connection.

The FCC needs a methodology that would somehow measure actual download speeds at a number of homes over time to understand what homes are really using for bandwidth. There are ways that this could be done. For example, the FCC could do something for broadband similar to what Nielsen does for cable TV. The FCC could engage one of the industry firms that monitor broadband usage such as Akamai to sample a large number of US homes. Volunteer homes meeting specific demographics could agree to have their bandwidth usage monitored. The accumulated data from these sample homes would provide real-life bandwidth usage as a guide to setting the FCC’s definition of broadband. Rather than changing the official speed periodically, the FCC could adjust the definition as dictated by the real-world data.
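One plausible way to turn such measurements into a standard would be to set the definition at a high percentile of observed busy-hour demand across the sample. This is a sketch of that idea, with invented measurement data; nothing here reflects an actual FCC methodology:

```python
# Sketch of a Nielsen-style sampling approach: measure busy-hour
# usage at volunteer homes, then set the broadband definition at a
# high percentile of observed demand. Data values are invented.

import statistics

def definition_from_sample(busy_hour_mbps: list, percentile: int = 90) -> float:
    """Return the given percentile of measured busy-hour usage (Mbps)."""
    cuts = statistics.quantiles(busy_hour_mbps, n=100)  # 99 cut points
    return cuts[percentile - 1]

# Hypothetical busy-hour measurements (Mbps) from ten sample homes
sample = [8, 12, 15, 18, 20, 22, 25, 28, 32, 40]
print(f"Suggested definition: {definition_from_sample(sample):.1f} Mbps")
```

A real program would need thousands of homes, demographic weighting, and seasonal averaging, but the core calculation is this simple: the standard tracks what households actually demand at the busiest hour rather than a negotiated round number.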

The FCC does some spot checking today of the broadband speeds as reported by the ISPs that feed the national broadband map. But that sampling is random and periodic and doesn’t provide the same kind of feedback that a formal ongoing measuring program would show. We have tools that could give the FCC the kind of feedback it needs. Of course, there are also political and other factors used in setting the official definition of broadband, and so perhaps the FCC doesn’t want real facts to get in the way.

Building Fiber to Anchor Institutions

The Schools, Health & Libraries Broadband Coalition (SHLB) announced a strategy to bring broadband to every anchor institution in the continental US. They estimate this would cost between $13 and $19 billion. They believe this would act as a first step to bring broadband to unserved and underserved rural communities.

While this sounds like a reasonable idea, we’ve tried this before and it largely hasn’t worked. Recall that the BTOP program in 2009 and 2010 funded a lot of middle mile fiber projects that brought broadband deeper into parts of the country that didn’t have enough fiber. That program required the BTOP middle mile fiber providers to serve all anchor institutions along the path of their networks and was a smaller version of this same proposal.

We’re approaching a decade later and a lot of the communities connected by BTOP middle mile grants still don’t have a last mile broadband network. There are some success stories, so I don’t want to say that middle mile fiber has no value – but for the most part nobody is making that last mile investment in rural areas just because the BTOP middle mile fiber was built.

BTOP isn’t the only program that has built fiber to anchor institutions. There are a number of states and counties that have built fiber networks for the express purpose of serving anchor institutions. There are also numerous fiber networks that have been built by school systems to support the schools.

In many cases I’ve seen these various anchor institution networks actually hurt potential last mile fiber investment. Anybody that is going to build rural fiber needs as many ‘large’ customers as it can get to help offset building expensive rural fiber. I’ve had clients who were thinking about building fiber to a small rural town only to find out that the school, city hall and other government locations already had inexpensive broadband on an existing fiber network. Taking those revenues out of the equation can be enough to sink a potential business plan.

At least BTOP fiber required that the network owners make it easy for last mile providers to get reasonably priced backbone access on their networks. Many of the state and school board networks are prohibited from allowing any commercial use of their network. I’ve never understood these prohibitions against sharing spare pairs of government fiber with others, but they are fairly common. Most come from state edicts that are likely prompted by the lobbyists for the big carriers.

I’m sure I’ll take some flak for my position, but I’ve seen the negative results of this idea too many times in the real world. Communities get frustrated when they see a gigabit connection at a school or City Hall when nobody else in the area has decent broadband. I’ve even seen government staff and officials who have fast broadband in their offices turn a deaf ear to the rest of the community that has poor or no broadband.

To make matters worse, many of the BTOP networks have run into economic difficulties. The companies that invested in BTOP bought into the hype that the middle mile fiber networks would attract last mile fiber investments, and they counted on those extra revenues for long-term viability. But a significant portion of the BTOP middle mile networks ended up being fiber to nowhere. Companies funded by BTOP needed to bring matching capital, and a number of the BTOP providers have had to sell their networks at a huge discount and walk away from their unpaid debt since the revenues to cover debt payments never materialized.

This also raises the question of who is going to maintain the enormous miles of fiber that would be built by this proposal. Somebody has to pay the electric bill to keep the fiber lit. Somebody needs to do routine maintenance as well as fix fiber cuts and storm damage. And somebody has to pay to periodically replace the electronics on the network, which have an average economic life of around ten years.
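The ongoing-cost question above lends itself to a rough estimate. This sketch totals the three cost buckets the text names for a hypothetical middle-mile network; every dollar figure is an assumed placeholder, since actual costs vary widely with terrain, power rates, and network design:

```python
# Rough annual operating cost for a middle-mile fiber network:
# routine maintenance, electric power, and a straight-line reserve
# to replace electronics on their ~10-year economic life.
# All dollar figures are hypothetical placeholders.

def annual_operating_cost(route_miles: float,
                          maintenance_per_mile: float = 300.0,
                          power_per_site: float = 1_200.0,
                          sites: int = 10,
                          electronics_cost: float = 500_000.0,
                          electronics_life_years: int = 10) -> float:
    """Maintenance + power + electronics replacement reserve, per year."""
    maintenance = route_miles * maintenance_per_mile
    power = power_per_site * sites
    replacement_reserve = electronics_cost / electronics_life_years
    return maintenance + power + replacement_reserve

# A hypothetical 200-mile middle-mile route
print(f"${annual_operating_cost(200):,.0f} per year")
```

The point of the exercise is that these costs never go away: even a fully grant-funded network needs a funded owner of record for decades, or the fiber eventually goes dark.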

I feel certain I will get an inbox full of comments about this blog. I’m bound to get stories telling me about some of the great success stories from the BTOP networks – and they do exist. There are cases where the middle mile fiber made it easier for some ISP to build last mile fiber to a rural community. And certainly a lot of extremely rural schools, libraries and other anchor institutions have benefitted from the BTOP requirement to serve them. But I believe the stories of failure outnumber the successes.

I seriously doubt that this FCC and administration would release this much money for any kind of rural broadband. But this is the kind of idea that can catch the interest of Congress and that could somehow get funded. There is no politician in DC who will take a stance against schools and libraries.

I can think of much better ways to spend that much money in ways that would bring broadband solutions to many whole rural communities, not just to the anchor institutions. That’s not enough money to fix all of our rural broadband issues, but it would be a great start, particularly if distributed in a grant program for last mile projects that requires matching private investment.

Federal Funding for Broadband Infrastructure

There is a lot of speculation that we might be seeing some money aimed at broadband due to the budget passed by Congress on February 9. That bill contains $20 billion for infrastructure spending spread evenly in fiscal years 2018 and 2019. In a floor speech as part of the vote, Senate Minority Leader Charles Schumer said the money will go towards “existing projects for water and energy infrastructure as well as expanding broadband to rural regions and improving surface transportation”.

Any broadband money that comes out of this funding will have to be spent quickly by the government. Fiscal year 2018 is already almost half over and ends on September 30 of this year. It’s likely that any grants coming out of such money would have to be awarded before that September date to count as spending in this fiscal year. In order to move that fast I’m guessing the government is going to have to take shortcuts and use processes already in place. That probably means using the BTOP grant forms and processes again.

The short time frame for any of this funding also likely means that only ‘shovel-ready’ projects will be considered. But that aligns with statements made by the administration last year when talking about infrastructure projects. Anybody hoping to go after such grants better already have an engineered project in mind.

Assuming that funding follows the BTOP funding program, there were a few issues in those grants that ought to be kept in mind:

  • The grants favored areas that had little or no broadband. This is going to be more muddled now since a lot of rural America is seeing, or soon will be seeing broadband upgrades from the CAF II and A-CAM programs funded by the FCC. It’s doubtful that the big telcos are updating the national databases for these upgrades on a timely basis, so expect mismatches and challenges from them if somebody tries to get funding for an area that’s just been upgraded.
  • The BTOP grants required that anybody that wanted funding had to already have the matching funds in place. There were some notable BTOP failures from winners who didn’t actually have the funding ready, and I expect tighter restrictions this time.
  • There were several requirements that added a lot of cost to BTOP programs – a requirement to pay prevailing wages along with environmental and historic preservation reviews. There has been talk in Congress about eliminating some of these requirements, and hopefully that would happen before any funding. But that would take Congressional action soon.
  • The BTOP process surprisingly awarded a number of projects to start-up companies. Some of these start-ups struggled and a few failed, and it will be interesting to see if start-ups face more scrutiny this time. The BTOP process also made it difficult, but not impossible, for local governments to get the funding.

If there is going to be any money allocated for broadband, it’s going to have to be announced soon, and one would think the deadline to ask for this funding will have to come soon as well – in very early summer at the latest.

The alternative to a federal grant program would be to award the $20 billion as block grants to states. If that happens it might be bad news for rural broadband. There are only a handful of states that have created state broadband grant programs. Any state with an existing program could easily shuttle some of this funding into broadband.

States without existing broadband programs will have a harder time. Most states will need legislative approval to create a broadband grant program and would also have to create the mechanisms for reviewing and approving these grants – a process that we’ve seen take a year in the few states that are already doing this.

It’s been almost two weeks since the budget was passed and I’ve read nothing about how the $20 billion will be used. Regardless of the path chosen, if any of this money is going to go to rural broadband we need to know how it will work soon, or else the opportunity for using the money this year will likely be lost.