Spectrum and 5G

All of the 5G press talks about how 5G is going to bring gigabit wireless speeds everywhere. But that is only going to be possible with millimeter-wave spectrum, and even then it requires a reasonably short distance between sender and receiver as well as bonding together multiple signals using MIMO antennas.

It’s a shame that we’ve let the wireless marketers equate 5G with gigabit speeds, because that’s what the public is going to expect from every 5G deployment. As I look around the industry I see a lot of other uses for 5G that will produce speeds far slower than a gigabit. 5G is a standard that can be applied to any wireless spectrum and that brings some benefits over earlier standards. 5G makes it easier to bond multiple channels together to reach one customer. It also can increase the number of connections that can be made from any given transmitter – with the biggest promise being that the technology will eventually allow connections to large numbers of IoT devices.

Anybody who follows the industry knows about the 5G gigabit trials. Verizon has been loudly touting its gigabit 5G connections using the 28 GHz frequency and plans to launch the product in up to 28 markets this year. They will likely use this as a short-haul fiber replacement to allow them to more quickly add a new customer to a fiber network or to provide a redundant data path to a big data customer. AT&T has been a little less loud about their plans and is going to launch a similar gigabit product using 39 GHz spectrum in three test markets soon.

But there are also a number of announcements for using 5G with other spectrum. For example, T-Mobile has promised to launch 5G nationwide using its 600 MHz spectrum. This is traditional cellular spectrum that is great for carrying signals for several miles and for going around and through obstacles. T-Mobile has not announced the speeds it hopes to achieve with this spectrum. But the data capacity of 600 MHz is limited, and bonding numerous signals together for one customer will create something faster than LTE, but not spectacularly so. It will be interesting to see what speeds they can achieve in a busy cellular environment.

Sprint is taking a different approach and is deploying 5G using its 2.5 GHz spectrum. They have been testing massive MIMO antennas that contain 64 transmit and 64 receive channels. This spectrum doesn’t travel far when used for broadcast, so the technology is best suited to small cell deployments. The company claims to have achieved speeds as fast as 300 Mbps in trials in Seattle, but that would require bonding together a lot of channels, so a commercial deployment is going to be a lot slower in a congested cellular environment.

Outside of the US there seems to be growing consensus to use 3.5 GHz – the Citizens Broadband Radio Service (CBRS) band. That raises the interesting question of which frequencies will end up winning the 5G race. In every new wireless deployment the industry needs to reach an economy of scale in the manufacture of both the radio transmitters and the cellphones or other receivers. Only then can equipment prices drop to the point where a 5G-capable phone will be similar in price to a 4G LTE phone. So the industry at some point soon will need to reach a consensus on the frequencies to be used.

In the past we rarely saw a consensus; rather, some manufacturer and wireless company won the race to get customers and dragged the rest of the industry along. This has practical implications for early adopters of 5G. For instance, somebody buying a 600 MHz phone from T-Mobile is only going to be able to use that data function when near a T-Mobile tower or mini-cell. Until industry consensus is reached, phones that use a unique spectrum are not going to be able to roam on other networks like happens today with LTE.

Even phones that use the same spectrum might not be able to roam on other carriers if those carriers are using the frequency differently. There are now 5G standards, but we know from past wireless deployments that true portability between networks often takes a few years as the industry works out the bugs. This interoperability might be sped up a bit this time because it looks like Qualcomm has an early lead in the manufacture of 5G chipsets. But there are other chip manufacturers entering the game, so we’ll have to watch this race as well.

The word of warning to buyers of first-generation 5G smartphones is that they are going to have issues. For now it’s likely that the MIMO antennas will use a lot of power and drain cellphone batteries quickly. And the ability to reach a 5G data signal is going to be severely limited for a number of years as the cellular providers extend their 5G networks. Unless you live and work in the heart of one of the trial 5G markets, these phones are likely to be a bit of a novelty for a while – but they will still give a user bragging rights for getting a fast data connection on a cellphone.

Edging Closer to Satellite Broadband

A few weeks ago Elon Musk’s SpaceX launched two test satellites that are the first in a planned low-orbit satellite network that will blanket the earth with broadband. The eventual network, branded as Starlink, will consist of 4,425 satellites deployed at 700 miles above earth and another 7,518 deployed at around 210 miles of altitude.

Getting that many satellites into orbit is a daunting logistical task. To put this into perspective, the nearly 12,000 satellites needed are twice the number of satellites that have been launched in history. It’s going to take a lot of launches to get these into the sky. SpaceX’s workhorse rocket, the Falcon 9, can carry about ten satellites at a time. They have also tested a Falcon Heavy system that could carry 20 or so satellites at a time. Even with a weekly launch of the larger rocket, that’s still roughly 600 launches and would take about 11.5 years. To put that number into perspective, the US led the world with 29 successful satellite launches last year, with Russia second with 21 and China third with 16.
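The launch arithmetic above can be sanity-checked with a short calculation. The satellite counts come from the FCC filing; the 20-satellites-per-launch payload is a rough assumption for illustration, not a confirmed SpaceX figure:

```python
# Rough launch-cadence estimate for the proposed Starlink constellation.
# Satellite counts are from the FCC filing; the payload per launch is an
# assumed round number, not an official SpaceX specification.

LEO_SATS = 4_425    # satellites planned at ~700 miles altitude
VLEO_SATS = 7_518   # satellites planned at ~210 miles altitude
total_sats = LEO_SATS + VLEO_SATS

sats_per_heavy_launch = 20                           # assumed Falcon Heavy payload
launches = -(-total_sats // sats_per_heavy_launch)   # ceiling division
years_at_weekly_cadence = launches / 52

print(f"{total_sats} satellites -> {launches} launches "
      f"(~{years_at_weekly_cadence:.1f} years at one launch per week)")
```

With these assumptions the sketch lands at 598 launches, or about 11.5 years of uninterrupted weekly launches – which is why the comparison to 29 US launches in a whole year is so striking.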

SpaceX is still touting this as a network that can make gigabit connections to customers. I’ve read the FCC filing for the proposed network several times, and it looks to me like that kind of speed will require combining signals from multiple satellites to a single customer. I have to wonder if that’s practical when talking about deploying this network to tens of millions of simultaneous subscribers. It’s likely that their standard bandwidth offering is going to be something significantly less.

There is also a big question to me about the capacity of the backhaul network that carries signals to and from the satellites. It’s going to take some major bandwidth to handle the volume of broadband users that SpaceX has in mind. We are seeing landline long-haul fiber networks today that are stressed and reaching capacity. The satellite network will face the same backhaul problems as everybody else and will have to find ways to cope with a world where broadband demand doubles every 3 years or so. If the satellite backhaul gets clogged or if the satellites get oversubscribed, then the quality of broadband will degrade like with any other network.
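That doubling rate compounds quickly. Here is a small sketch of what “demand doubles every 3 years” implies over a decade – the starting value of 100 is an arbitrary illustration, not a measured figure:

```python
# Compound growth when broadband demand doubles every 3 years.
# The starting demand of 100 (arbitrary units) is purely illustrative.

def demand_after(years, doubling_period=3.0, start=100.0):
    """Projected demand after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for y in (0, 3, 6, 9, 12):
    print(f"year {y:2d}: {demand_after(y):7.1f}")
```

A network sized for today’s demand would need 16 times the capacity within 12 years, which is the core of the backhaul worry.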

Interestingly, SpaceX is not the only one chasing this business plan. For instance, billionaire Richard Branson wants to build a similar network that would put 720 low-orbit satellites over North America. Telesat has launched two different test satellites and also wants to deploy a large satellite network. Boeing also announced intentions to launch a 1,000-satellite network over North America. It’s sounding like our skies are going to get pretty full!

SpaceX is still predicting that the network is going to cost roughly $10 billion to deploy. There’s been no talk of consumer prices yet, but the company obviously has a business plan – Musk wants to use this business as the primary way to fund the colonization of Mars. But pricing is an issue for a number of reasons. The satellites will have some finite capacity for customer connections. In one of the many articles I read, I saw that the goal for the network is 40 million customers (I don’t know if that’s the right number, but there is some number of simultaneous connections the network can handle). 40 million customers sounds huge, but with a current worldwide population of over 7.6 billion people it’s minuscule for a worldwide market.
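Put against the worldwide market, that customer cap is a one-line calculation. Both figures below are the article’s numbers, and the 40 million cap is explicitly uncertain:

```python
# Share of the world's population a rumored 40M-customer cap represents.
# The 40M figure is an unconfirmed goal reported in press coverage.
target_customers = 40_000_000
world_population = 7_600_000_000  # rough 2018 figure

share = target_customers / world_population
print(f"{share:.2%} of the world's population")  # roughly half a percent
```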

There are those predicting that this will be the salvation for rural broadband. But I think that’s going to depend on pricing. If this is priced affordably then there will be millions in cities who would love to escape the cable company monopoly, and who could overwhelm the satellite network. There is also the issue of local demand. Only a limited number of satellites can see any given slice of geography. The network might easily accommodate everybody in Wyoming or Alaska, but won’t be able to do the same anywhere close to a big city.

Another issue is worldwide pricing. A price that might be right in the US might be ten times higher than what will be affordable in Africa or Asia. So there is bound to be pricing differences based upon regional incomes.

One of the stickier issues will be the reaction of governments that don’t want citizens using the network. There is no way China is going to let citizens bypass the Great Firewall by going through these satellites. Repressive regimes like North Korea will likely make it illegal to use the network. And even democratic countries like India might not like the idea – last year they turned down free Internet from Facebook because it wasn’t an ‘Indian’ solution.

The bottom line is that this is an intriguing idea. If the technology works as promised, and if Musk can find the money and figure out the logistics to get this launched, it’s going to be another new source of broadband. But satellite networks are not going to solve the world’s broadband problems, because they will only be able to help a small percentage of the world’s population. With that said, a remote farm in the US or a village in Africa is going to love this when it’s available.

Abandoned Telecom Infrastructure

I saw an article about Merrill, Oregon, where the city was wrestling with what to do with an abandoned cable TV network hanging on poles in the city. It’s actually a fairly common occurrence to have abandoned telecom property on poles, and I’ve been contacted by a number of cities over the years wondering how to deal with the situation.

In this particular case the historic cable system in the city was operated by Rapid Communications out of Texas. That company sold cable properties to a number of companies in 2008 and the Merrill system went to Almega Cable Company, which stopped offering service in the City and went out of business in 2011.

There are all sorts of telecom assets that have been abandoned on poles and defunct cable companies are only one example. I saw a lot of WiFi mesh networks abandoned fifteen years ago as operators folded and never retrieved their equipment. There are numerous CLECs that folded in the late 1990s and that walked away from fiber networks on poles.

Having an abandoned set of wires on poles complicates the lives of any other pole users in the market. The unused wires take up space on poles and make it hard for anybody else to add additional wires onto the pole.

Abandoned networks also create havoc for the normal pole attachment process. This process requires buy-in from existing utilities to move or rearrange cables to make room for a new attacher. A new attacher can be paralyzed if they are unable to create the required clearance from existing wires.

In the end I’ve almost always seen the responsibility for getting rid of the network fall to the local government. Somebody has to go through the process of making certain there is no remaining active owner of the network before it can be condemned. Generally the pole owner is not willing to take on that role unless they have a need of their own to add wires to the poles.

Merrill is now undertaking the task of condemning the network. They have to follow the law and post public notices to make sure that nobody claims rights to the cables. In the case of a cable company, the city not only has to deal with the wires on poles, but also with customer drops and pedestals scattered throughout the community.

Merrill is hoping that some new carrier will want to use the cable network for overlashing fiber. Overlashing is the process of tying the fiber to existing wires and is generally the lowest-cost method of fiber construction. But even if they find a taker for the offer, my guess is that the new fiber provider is not going to want to assume ownership of the coaxial cables, since that would give them liability for any issues or problems with the old wiring. So the city might end up owning the cables in perpetuity. If they don’t find a buyer, the city will have to pay to have the cables removed – although in today’s market there might be enough value in the copper inside the coaxial cables to offset the cost of removal.

We are going to see a lot more abandoned assets on poles in the future. We are just now entering a period when numerous companies are going to want to hang wireless devices of all types on poles. Some of these devices are tiny and I’ve seen others that are the size of a dorm refrigerator. It’s inevitable that some of the wireless deployments will fail, or that the wireless companies will lose the customers served by a given device.

Over time a significant inventory of abandoned wireless devices will likely grow in most cities. And unlike an abandoned cable network, my guess is that it’s often going to be hard to know which wireless devices have been abandoned or even who owns many of them. Cities ought to be considering ordinances today that require the companies that deploy wireless devices to somehow notify them of what they are doing and to also clearly label the ownership of each device.

But there is a movement at the FCC, in Congress and in state legislatures to institute rules for wireless carriers that would override any local rules. Such blanket rules are going to hinder cities in the coming decades as they try to deal with abandoned assets clogging their pole lines. Most of the proposed new rules I’ve seen don’t address this issue, which will make it messy to deal with later.

Is AT&T Violating Net Neutrality?

I got a text on my AT&T cellphone last month that told me that my wireless plan now includes sponsored data. Specifically they told me that I could now stream movies and other content from DirecTV or U-Verse TV without the video counting against my monthly data cap. This has been available to AT&T post-paid customers for a while, but now is apparently available to all customers. What I found most interesting about the message was that it coincided with the official end of net neutrality.

AT&T is not the first cellular company to do this. Verizon tried this a few years ago, although that attempt was largely unsuccessful because they didn’t offer much content that people wanted to watch. T-Mobile does something similar with their Binge On program, but since most of their data plans are unlimited, customers can watch anything on their phones, not just the Binge On video.

The sponsored data from AT&T would be a direct violation of net neutrality if it were still in effect, and it is a textbook example of paid prioritization. By excusing the DirecTV content from cellular data caps they have created an advantage for DirecTV over its competitors. It doesn’t really matter that AT&T also happens to own DirecTV, and I imagine that AT&T is now shopping this same idea around to other video providers.

So what is wrong with what AT&T is doing? Certainly their many customers that buy both AT&T cellphones and DirecTV will like the plan. Cellular data in the US is still some of the most expensive data in the world and letting customers watch unlimited video from a sponsored video provider is a huge benefit to customers. Most people are careful to not go over monthly data limits, and that means they carefully curtail watching video on cellphones. But customers taking advantage of sponsored video are going to watch video that would likely have exceeded their monthly data cap – it doesn’t take more than a handful of movies to do that.

AT&T has huge market power, with almost 140 million cellphone users on their network at the end of last year. Any video provider they sponsor is going to gain a significant advantage over other video providers. AT&T customers who like watching video on their cellphones are likely to pick DirecTV over Comcast or any other video provider.

It’s also going to be extremely tempting for AT&T to give prioritized routing to DirecTV video – which means implementing the Internet fast lane. AT&T is going to want their cellular customers to have a quality experience, and they can do that by making sure that DirecTV video has the best connections throughout their network. They don’t necessarily have to throttle other video to make DirecTV better – they can just make sure that DirecTV video gets the best possible routing.

I know that to many people the AT&T plan is going to feel somewhat harmless. After all, they are bundling together their own cellular and video products. But it’s a short step from here for AT&T to start giving priority to content from others who are willing to pay for it. It’s not too hard to imagine them offering the same plan to Netflix, YouTube or Facebook.

If this plan expands beyond AT&T’s own video, we’ll start seeing the negative impacts of paid prioritization:

  • Only the biggest companies like Netflix, Facebook or Google can afford to pay AT&T for the practice. This is going to shut out smaller video providers and start-ups. Already in the short history of the web we’ve seen a big turnover in the popular platforms on the web – gone or greatly diminished are earlier platforms like AOL, CompuServe and Prodigy. But with the boost given by paid prioritization the big companies today will get a step-up to remain as predominant players on the web. Innovation is going to be severely hampered.
  • This is also the beginning of a curated web where many people only see the world through the filter of the predominant web services. We already see that phenomenon a lot today, but when people are funneled to only using the big web services this will grow and magnify.
  • It’s not hard to imagine the next step where we see reduced price data plans that are ‘sponsored’ by somebody like Facebook. Such platforms will likely make it a challenge for customers to step outside their platform. And that will lead to a segmentation and slow death of the web as we know it.

Interestingly, the Tom Wheeler FCC told AT&T that this practice was unacceptable. But through the change of administration AT&T never stopped the practice and is now expanding it. It’s likely that courts are going to stay some or all of the net neutrality order until the various lawsuits on the issue get resolved. But AT&T clearly feels emboldened to move forward with this, probably since they know the current FCC won’t address the issue even if net neutrality stays in effect.

Industry Shorts – March 2018

Following are a few topics that I find interesting, but which are too short to cover in a full blog:

Surge in Online Video Subscriptions. The number of households buying online video is surging. Netflix added almost 2 million US and 6.36 million international customers in the 4th quarter of 2017 – 18% more than in the same quarter a year earlier. There is also a growing number of vMVPD customers. At the end of last year CBS All Access had nearly 5 million customers. Showtime OTT also had nearly 5 million customers. Sling TV now has nearly 2 million customers. AT&T’s DirecTV Now hit the 1 million customer mark in December. PlayStation Vue reported 670,000 customers in mid-December. The new YouTube TV service has about 300,000. Hulu is also growing but doesn’t separately report its live TV customers from its video-on-demand customers (reported at 17 million total in December). Note that Hulu lets customers buy a single TV series or movie without needing a subscription.

Cellphone Data Usage Growth. According to the research firm NPD, the average US smartphone now consumes 31.4 GB of data per month. This is combined usage across cellular and WiFi and is evidence that people are starting to really accept the small screen for video. The figure is up over 25% from a year earlier. The firm reports that video accounts for 83% of the usage.

The number of people willing to use a cellphone for video has also surged. NPD reports that 67% of cellphone users watched at least one video during the 3rd quarter of 2017, up from 57% in the 2nd quarter. Another research firm, Strategy Analytics, reported that worldwide cellular data usage grew 115% in 2017 – more than doubling.

Global Streaming Doubled in 2017. Conviva, which provides systems to monitor and analyze online usage, also reports that online video viewing more than doubled last year. They report 12.6 billion viewing hours of online video in 2017, measured across 2.4 billion viewing devices. Of that viewing, 58% came from North America, 21% from Europe, 19% from Asia and 2% from the rest of the world.

Satellite TV Taking the Brunt of Cord Cutting. For some reason cord cutting seems to be hitting the two big satellite TV providers even harder than the landline cable companies. Dish Networks and DirecTV together lost 4.7% of their subscribers in the fourth quarter of 2017. We can only speculate about the reasons for the shift. The bundles of the landline cable companies make it harder for customers to drop their cable subscription. Offsetting this, many satellite customers are in rural areas where there is often not a good broadband alternative to cable. But perhaps the FCC’s CAF II and A-CAM programs are speeding up rural broadband enough for households to consider cutting the cord. It should be noted that AT&T is pushing their DirecTV Now product more than their satellite TV, which also might account for part of the shift away from satellite TV.

Apple Jumps into Programming. Apple has quietly gotten into the programming business. They’ve allocated over $1 billion in 2018 for the creation of new content. They’ve landed some big-name talent such as Steven Spielberg, Jennifer Aniston and Reese Witherspoon for projects. Apple doesn’t have a content platform, and the industry is buzzing with speculation about how they might market and distribute the content.

Pirated Video Content on the Rise. Sandvine reports that 6.5% of North American households have accessed pirated video content in the last year. I’ve read reports from Canada of companies openly advertising pirated content, including providing settop boxes that can download IPTV content directly from the Internet. Yesterday’s blog talked about new efforts by content owners to force ISPs to police copyright infringement.

The Newest Battle of Copyright Infringement

For years the big ISPs have paid lip service to complaints about customers who violate copyrights by sharing content on the web for music and video. Every big ISP has had a process in place that was intended to police violation of the Digital Millennium Copyright Act (DMCA).

The owners of copyrighted materials have long complained that the ISP response to violators has been weak and ineffective. And they are right, in that most ISPs notify customers who are accused of violating copyrights, but there have been few or no consequences for violators.

However, that might now be changing due to a lawsuit that’s been in the courts for a few years. Music label BMG sued Cox Communications for not providing adequate protection of its copyrighted music. Recently the 4th Circuit Court, on appeal, reversed the original verdict against Cox. However, in doing so the court threw out Cox’s primary defense, which was that they were protected by the ‘safe harbor’ provisions that are part of the DMCA.

The safe harbor rules protect ISPs like Cox against damages from customer theft of copyrighted materials. Removing the safe harbor means that the owners of copyrighted materials can seek and win damages against ISPs if they don’t take adequate steps to protect copyrights. In the specific case against Cox, the BMG issue was that Cox didn’t do anything to deter repeat offenders.

There are apparently a lot of repeat offenders – customers who share a lot of copyrighted material – so this ruling instantly got the attention of other big ISPs. Comcast responded last week by notifying customers of a new policy for repeat offenders of copyright theft. The new policy has several progressive stages of severity:

  • Customers notified of DMCA violations might be forced to log in fresh to their broadband account, and in doing so will probably have to agree to abide by the company’s DMCA policy before getting access. Customers might also have to talk to Comcast customer service before they can log into their broadband account.
  • Customers who continue to violate DMCA policies after this first stage face termination of their broadband and all other Comcast services.

This is going to have a chilling effect on those that share copyrighted materials. A majority of people live in markets where the cable company offers the best broadband, and losing the home broadband connection is drastic. I have to assume that telcos will come up with similar policies, meaning that DSL also won’t be a refuge for anybody who continues to violate copyrights.

There have always been people who share content. The old public bulletin boards were full of copyrighted songs and pictures that could be shared. Over time this morphed into Napster and other file-sharing services. Today there are still a number of sharing sites on Tor and elsewhere on the web. And people have figured out how to use Kodi and other technologies to capture and share copyrighted video files.

Although they don’t want to play the role of policeman, I suspect the big ISPs will be forced to at least somewhat enforce policies like the one Comcast just initiated. There has always been a big tug of war between ISPs and content owners. This new response from Comcast shows that content owners now have the upper hand. It certainly means that those who continue to share copyrighted materials will face eventually losing their broadband. In today’s world that’s a severe penalty.

Smaller ISPs need to pay attention to this and watch what the big companies are doing. I wouldn’t be surprised to see BMG or some other content owner sue a smaller ISP to make the point that this applies to everybody – and nobody wants to be that ISP. If the big ISPs really enforce this, then small ISPs need to follow suit and figure out an effective way to police and deter repeat copyright violators.

 

Public Networks and Privacy

I’ve been investigating smart city applications and one of the features that many smart network vendors are touting is expanded public safety networks that can provide cameras and other monitoring devices for police, making it easier to monitor neighborhoods and solve crimes. This seems like something most police departments have on their wish list, because cameras are 24/7 and can see things that people are never likely to witness.

The question I ask today is whether this is what America wants. There are a few examples of cities with ubiquitous video surveillance, like London, but is that kind of surveillance going to work in the US?

I think we’ve gotten our first clue from Seattle. The City installed a WiFi mesh network using Aruba wireless equipment in 2013 with a $3.6 million grant from the Department of Homeland Security. The initial vision for the network was that it would be a valuable tool to provide security in the major port in Seattle as well as provide communications for first responders during emergencies. At the time of installation the city intended to also make the surveillance capabilities available to numerous departments within the City, not just to the police.

But when the antennas, like the one shown with this blog, went up in downtown Seattle in 2013, a number of groups began questioning the city about their surveillance policies and the proposed use of these devices. Various groups including the ACLU voiced concerns that the network would be able to track cellphones, laptops and other devices that have MAC addresses. This could allow the City to gather information on anybody moving in downtown or the Port and might allow the City to do things like identify and track protesters, monitor who enters and leaves downtown buildings, track the movement of homeless people who have cellphones, etc.

Privacy groups and the ACLU complained to the City that the network effectively was a suspicionless surveillance system that monitors the general population and is a major violation of various constitutional rights. The instant and loud protests about the network caught City officials by surprise and by the end of 2013 they deactivated the network until they developed a surveillance policy. The city never denied that the system could monitor the kinds of things that citizens were wary of. That surveillance policy never materialized, and the City recently hired a vendor to dismantle the network and salvage any usable parts for use elsewhere in the City.

I can think of other public outcries that have led to discontinuance of public monitoring systems, particularly speed camera networks that catch and ticket speeders. Numerous communities tried that idea and scrapped it after massive citizen outrage. New York City installed a downtown WiFi network a few years ago that was to include security cameras and other monitoring devices. From what I read they’ve never yet activated the security features, probably for similar reasons. A web search shows that other cities like Chicago have implemented a network similar to Seattle’s and have not gotten the negative public reaction.

The Seattle debacle leads to the question of what is reasonable surveillance. The developers of smart city solutions today are promoting the same kinds of features contained in the Seattle network, plus new ones. Technology has advanced since 2013 and newer systems are promising to include the first generation of facial recognition software and also the ability to identify people by their walking gait. These new monitoring devices won’t just track people with cellphones and can identify and track everybody.

I think there is probably a disconnect between what smart city vendors are developing and what the public wants out of their city government. I would think that most citizens are in favor of smart city solutions like smart traffic systems that would eliminate driving backups, such as changing the timing of lights to get people through town as efficiently as possible.

But I wonder how many people really want their city to identify and track them every time they come within reach of one of the city’s monitors. The information gathered by such monitors can be incredibly personal – it identifies where somebody is, with a timestamp. The worry is not just that a city might misuse such personal information; IT security people I’ve talked to believe that many municipal IT networks are susceptible to hacking.

In the vendors’ defense, they are promoting features that already function well. Surveillance cameras and other associated monitors are tried and true technologies. Some of the newer features like facial recognition are cutting edge, but surveillance systems installed today can likely be upgraded with software changes as the technology improves.

I know I would be uncomfortable if my city installed this kind of surveillance system. I don’t go downtown except to go to restaurants or bars, but what I do is private and is not the city’s business. Unfortunately, I suspect that city officials all over the country will become enamored by the claims from smart city vendors and will be tempted to install these kinds of systems. I just hope that there is enough public discussion of city plans so that the public understands what their city is planning. I’m sure there are cities where the public will support this technology, but plenty of others where citizens will hate the idea. Just because we have the technical capability to monitor everybody doesn’t mean we ought to.

The Infrastructure Plan and Broadband

The administration finally published their infrastructure plan last week. The document is an interesting read, particularly for those with a financial bent like me. There is no way this early to know if this plan has any chance to make it through Congress, or how much it might change if it does pass. But it’s worth reviewing because it lets us know what the government is thinking about infrastructure and rural broadband.

First, the details of the plan:

  • The proposed plan provides $200B of federal funding over 10 years;
  • $100B goes to states in the form of a 20% grant for new projects that won’t require additional future federal spending;
  • $50B is set aside as grants to states for rural infrastructure, which the states can use as they wish;
  • $20B goes to projects that are in the demonstration phase of new technologies and that can’t easily attract other financing;
  • $20B would go toward existing federal loan programs, including the Transportation Infrastructure Finance and Innovation Act (TIFIA) and the Water Infrastructure Finance and Innovation Act (WIFIA);
  • Finally, $10B would be used to create a revolving fund that would allow the purchase, rather than the lease, of federal infrastructure.
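As a quick sanity check, the allocations listed above do add up to the plan’s $200B total. A minimal sketch (the category labels are my own shorthand, not the plan’s official names):

```python
# Allocations from the proposed infrastructure plan, in billions of dollars.
# Labels are informal shorthand for the categories described above.
allocations = {
    "state grants (20% federal match)": 100,
    "rural infrastructure grants": 50,
    "demonstration-phase projects": 20,
    "existing loan programs (TIFIA, WIFIA)": 20,
    "federal capital revolving fund": 10,
}

total = sum(allocations.values())
print(f"Total: ${total}B")  # → Total: $200B
```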

The funding for the program is a bit murky, as you would expect at this early stage. It appears that some of the funding comes from existing federal highway infrastructure funds, and one might suppose those funds will still be aimed at highways.

This plan gives governors and state legislators a lot of new money to disburse, meaning that every state is likely to tackle this in a different way. That alone is going to mean a varied approach to funding, or not funding, rural broadband.

The plan is completely silent on broadband funding. This makes sense since the plan largely hands funds to the states. The program does not promote rural broadband, but it certainly does not preclude it. The most likely source of any funding for rural broadband would come out of the $50B rural pot of funding. We’ll have to wait and see what strings are attached to that money, but the proposal would largely hand this money to states and let them decide how to use it.

The larger $100B pot is to be used to provide up to 20% of the funding for projects, and there are very few rural fiber projects that don’t need more than 20% assistance to be viable. If the 20% funding basis is firm for this pot of funding, I can’t see it being used much for broadband.
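To put the 20% cap in perspective, here’s a back-of-the-envelope sketch. The $10M project cost is entirely hypothetical; only the 20% and 80% shares come from the discussion of the plan:

```python
# Hypothetical $10M rural fiber project (cost invented for illustration).
# Compare the plan's proposed 20% federal share with the traditional
# 80% federal share on many road and bridge projects.
project_cost = 10_000_000

for federal_share in (0.20, 0.80):
    grant = project_cost * federal_share
    local = project_cost - grant
    print(f"{federal_share:.0%} federal: grant ${grant:,.0f}, "
          f"raised locally or privately ${local:,.0f}")
```

Under the 20% model the sponsor has to find $8M elsewhere; under the traditional 80% model, only $2M. That gap is why few rural fiber projects can use this pot.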

States are not going to like this $100B funding pool because it completely flips the role of the federal government in infrastructure funding. Today the federal government supplies as much as 80% of the funding for many road and bridge projects, and this plan flips that 80% onto the states. Because of this, states are likely to view the plan as an overall long-term decrease in federal infrastructure spending. The trade-off for the flip, though, is that the money is immediate and the states get to decide what to fund. Today, when the feds are involved, it can take literally decades to finish some road projects.

The overall goal of the plan is to promote private investment in infrastructure projects, which contrasts with today, when almost all infrastructure projects are 100% government funded. Historically public/private partnerships (PPPs) have played only a small role in US infrastructure spending. PPPs have been successful in other countries at helping build infrastructure projects on time and on budget – a vast improvement over government projects that routinely go over on both. But incorporating PPP financing into infrastructure spending is going to take a change of mindset. That challenge is going to be complicated by the fact that most of this spending will be dispersed by the states. And it’s the states that will or will not embrace PPPs, so we’ll probably have a varied response across the country.

One of the most interesting ideas embedded in the plan is that projects should be funded in such a way as to cover the long-term maintenance costs of a project. That’s a big change from today, where roads, bridges and other major projects are constructed with no thought given to funding ongoing maintenance, or even to related environmental and other ancillary costs. This is going to force a change in the way we think about infrastructure, accounting for the full life-cycle cost of a project up-front.

I’ve read a few private reports from investment houses with their take on the plan. The analyses I’ve seen conclude that the vast majority of the money will go to highway, bridge and water projects. That might mean very little for rural broadband.

One thing is for sure, though. If something like this plan becomes law then the power to choose infrastructure projects devolves largely to states rather than the federal government. Today states propose projects to the feds, but under this plan the states would be able to use much of the federal funding as they see fit.

There are states that already fund some rural broadband infrastructure, and you might suppose those states would shuttle some of this new funding into those programs. But there are other states, some very rural, that have rejected the idea of helping to fund broadband. Expect a widely varying response if the states get the power to choose projects.

In summary, this plan is not likely to mean any federal broadband grant program. But states could elect to take some of this funding, particularly the $50B rural fund, and use it to promote rural broadband. There are likely to be as many different responses to this funding as there are states. We have a long way to go yet to turn this proposal into concrete funding opportunities.

Building Fiber to Anchor Institutions

The Schools, Health & Libraries Broadband Coalition (SHLB) announced a strategy to bring broadband to every anchor institution in the continental US. They estimate this would cost between $13 and $19 billion. They believe this would act as a first step to bring broadband to unserved and underserved rural communities.

While this sounds like a reasonable idea, we’ve tried this before and it largely hasn’t worked. Recall that the BTOP program in 2009 and 2010 funded a lot of middle mile fiber projects that brought broadband deeper into parts of the country that didn’t have enough fiber. That program required the BTOP middle mile fiber providers to serve all anchor institutions along the path of their networks and was a smaller version of this same proposal.

We’re now approaching a decade later, and a lot of the communities connected by BTOP middle mile grants still don’t have a last mile broadband network. There are some success stories, so I don’t want to say that middle mile fiber has no value – but for the most part nobody is making that last mile investment in rural areas just because the BTOP middle mile fiber was built.

BTOP isn’t the only program that has built fiber to anchor institutions. There are a number of states and counties that have built fiber networks for the express purpose of serving anchor institutions. There are also numerous fiber networks that have been built by school systems to support the schools.

In many cases I’ve seen these various anchor institution networks actually hurt potential last mile fiber investment. Anybody that is going to build rural fiber needs as many ‘large’ customers as it can get to help offset the cost of building expensive rural fiber. I’ve had clients who were thinking about building fiber to a small rural town only to find out that the school, city hall and other government locations already had inexpensive broadband on an existing fiber network. Taking those revenues out of the equation can be enough to sink a potential business plan.
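The effect of losing anchor revenues can be sketched with a toy business case. Every number below is invented for illustration; the point is only how much the payback period stretches when the anchors are already served:

```python
# Hypothetical small-town fiber business case (all figures invented)
# showing how losing anchor-institution revenue can sink a plan.
build_cost = 2_000_000            # cost to build fiber to the town
residential_revenue = 300 * 70    # 300 homes at $70/month
anchor_revenue = 5 * 1_500        # 5 anchors (school, city hall, etc.) at $1,500/month
monthly_opex = 12_000             # operating costs

def payback_years(monthly_revenue):
    """Simple payback period: build cost divided by annual operating margin."""
    margin = monthly_revenue - monthly_opex
    if margin <= 0:
        return float("inf")       # the project never pays back
    return build_cost / (margin * 12)

print(f"With anchors: {payback_years(residential_revenue + anchor_revenue):.1f} years")
# → With anchors: 10.1 years
print(f"Without anchors: {payback_years(residential_revenue):.1f} years")
# → Without anchors: 18.5 years
```

In this sketch, pulling five anchor customers out of the model nearly doubles the payback period, which is the kind of shift that turns a fundable project into one no lender will touch.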

At least BTOP fiber required that the network owners make it easy for last mile providers to get reasonably priced backbone access on their networks. Many of the state and school board networks are prohibited from allowing any commercial use of their network. I’ve never understood these prohibitions against sharing spare pairs of government fiber with others, but they are fairly common. Most come from state edicts that are likely prompted by lobbyists for the big carriers.

I’m sure I’ll take some flak for my position, but I’ve seen the negative results of this idea too many times in the real world. Communities get frustrated when they see a gigabit connection at a school or City Hall when nobody else in the area has decent broadband. I’ve even seen government staff and officials who have fast broadband in their offices turn a deaf ear to the rest of the community that has poor or no broadband.

To make matters worse, many of the BTOP networks have run into economic difficulties. The companies that invested in BTOP bought into the hype that the middle mile fiber networks would attract last mile fiber investments, and they counted on those extra revenues for long-term viability. But a significant portion of the BTOP middle mile networks ended up being fiber to nowhere. Companies funded by BTOP needed to bring matching capital, and a number of BTOP providers have had to sell their networks at a huge discount and walk away from unpaid debt because the revenues needed to cover debt payments never materialized.

This also raises the question of who is going to maintain the enormous mileage of fiber that would be built under this proposal. Somebody has to pay the electric bill to keep the fiber lit. Somebody needs to do routine maintenance as well as fix fiber cuts and storm damage. And somebody has to pay to periodically replace the electronics on the network, which have an average economic life of around ten years.

I feel certain I will get an inbox full of comments about this blog. I’m bound to get stories telling me about some of the great success stories from the BTOP networks – and they do exist. There are cases where the middle mile fiber made it easier for some ISP to build last mile fiber to a rural community. And certainly a lot of extremely rural schools, libraries and other anchor institutions have benefitted from the BTOP requirement to serve them. But I believe there are more stories of failure that offset the success stories.

I seriously doubt that this FCC and administration would release this much money for any kind of rural broadband. But this is the kind of idea that can catch the interest of Congress and that could somehow get funded. There is no politician in DC who will take a stance against schools and libraries.

I can think of much better ways to spend that much money in ways that would bring broadband solutions to whole rural communities, not just to the anchor institutions. That’s not enough money to fix all of our rural broadband issues, but it would be a great start, particularly if distributed through a grant program for last mile projects that requires matching private investment.

Federal Funding for Broadband Infrastructure

There is a lot of speculation that we might be seeing some money aimed at broadband due to the budget passed by Congress on February 9. That bill contains $20 billion for infrastructure spending spread evenly across fiscal years 2018 and 2019. In a floor speech as part of the vote, Senate Minority Leader Charles Schumer said the money will go toward “existing projects for water and energy infrastructure as well as expanding broadband to rural regions and improving surface transportation”.

Any broadband money that comes out of this funding will have to be spent quickly by the government. Fiscal year 2018 is already almost half over and ends on September 30 of this year. It’s likely that any grants coming out of such money would have to be awarded before that September date to count as spending in this fiscal year. In order to move that fast I’m guessing the government is going to have to take shortcuts and use processes already in place. That probably means using the BTOP grant forms and processes again.

The short time frame for any of this funding also likely means that only ‘shovel-ready’ projects will be considered. But that aligns with statements made by the administration last year when talking about infrastructure projects. Anybody hoping to go after such grants better already have an engineered project in mind.

Assuming that funding follows the BTOP funding program, there were a few issues in those grants that ought to be kept in mind:

  • The grants favored areas that had little or no broadband. This is going to be more muddled now since a lot of rural America is seeing, or soon will be seeing, broadband upgrades from the CAF II and A-CAM programs funded by the FCC. It’s doubtful that the big telcos are updating the national databases for these upgrades on a timely basis, so expect mismatches and challenges from them if somebody tries to get funding for an area that’s just been upgraded.
  • The BTOP grants required that anybody who wanted funding already have the matching funds in place. There were some notable BTOP failures from winners who didn’t actually have the funding ready, and I expect tighter restrictions this time.
  • There were several requirements that added a lot of cost to BTOP projects – a requirement to pay prevailing wages along with environmental and historic preservation reviews. There has been talk in Congress about eliminating some of these requirements, and hopefully that would happen before any funding is awarded. But that will take Congressional action soon.
  • The BTOP process surprisingly awarded a number of projects to start-up companies. Some of these start-ups have struggled and a few failed, and it will be interesting to see if the rules make it harder for start-ups this time. The BTOP process also made it difficult, but not impossible, for local governments to get funding.

If there is going to be any money allocated for broadband, it’s going to have to be announced soon, and one would think the deadline to ask for this funding will also come soon – in very early summer at the latest.

The alternative to a federal grant program would be to award the $20 billion as block grants to states. If that happens it might be bad news for rural broadband. There are only a handful of states that have created state broadband grant programs. Any state with an existing program could easily shuttle some of this funding into broadband.

States without existing broadband programs will have a harder time. Most states will need legislative approval to create a broadband grant program and would also have to create the mechanisms for reviewing and approving these grants – a process that we’ve seen take a year in the few states that are already doing this.

It’s been almost two weeks since the budget was passed and I’ve read nothing about how the $20 billion will be used. Regardless of the path chosen, if any of this money is going to go to rural broadband we need to know how it will work soon, or else the opportunity for using the money this year will likely be lost.