Keeping Up With the Rest of the World

One of my readers sent me an article announcing a fiber-to-the-home expansion in Nepal. The ISP is Vianet Communications P. Ltd., which uses Nokia GPON technology. The company built the first FTTH network in the country in 2011 and already serves 10 of the 75 districts. The current expansion will bring fiber to an additional 4 districts, with another district already scheduled after that. Vianet is a commercial ISP that began as a dial-up provider in the capital city of Kathmandu and is now expanding across the country with fiber. By the end of the year the ISP will have 200,000 customers on fiber, with a minimum customer speed of 100 Mbps.

You don’t have to look hard to see similar stories around the world. Romania now has the fastest broadband in Europe and is ranked sixth fastest in the world. Romania’s broadband success story is unique in that the fiber networks have largely been built by small neighborhood ISPs stringing up their own fiber – there are local news articles that joke about the country having fiber-to-the-tree. The country had almost no telecom infrastructure at the end of the Cold War, and local entrepreneurs and neighborhood groups have tackled the task of building the needed fiber infrastructure.

I’ve often heard it said that one of the reasons the rest of the world has more fiber than we do is that the governments in those countries build the infrastructure. However, when you look closely at countries like Nepal and Romania, it’s commercial ISPs that are building fiber, not the government. Singapore has had the fastest broadband in the world for years, and its fiber was built by three ISPs. There are similar stories everywhere you look.

If ISPs are able to build fiber in Nepal and Romania, why do ISPs here have such a hard time doing the same? There are a few key reasons.

Big ISPs in the US are driven by the quarterly earnings expected by Wall Street. They get crucified for not maximizing profits, and none of them can undertake a major expansion that would earn infrastructure returns of 7% – 12%. It doesn’t matter that the ISP business is a cash cow that spins off piles of cash once the business is mature – the big ISPs are structured such that they really can’t consider building fiber to residents.

Years ago Verizon took a hit for tackling FiOS, and even then the company was very disciplined and only built where construction costs were low. People currently praise AT&T for passing over 10 million homes and businesses with fiber – but their network is the very definition of cherry picking, serving a few homes here and a few homes there, near their existing fiber nodes.

There are plenty of smaller US ISPs that would love to build more fiber, but they have a hard time raising the money. Fifty years ago banks were the primary source of infrastructure lending, but over time, for various reasons, they stopped making the long-term loans necessary to support a fiber network. The big banks are also Wall Street driven, and banks earn a significantly higher return on equity by churning shorter-term notes than by tying up money for 20 – 30 years.

One only has to visit a FISPA convention, the association for fiber overbuilders, to find numerous companies that would gladly tackle more fiber projects if they could borrow the money. Just about every member of FISPA will tell you that borrowing money is their biggest challenge.

The countries building fiber have found ways to overcome these issues. The ISPs there are able to borrow money to expand fiber networks, and their banks love the guaranteed long-term steady returns from broadband. The countries I’ve mentioned have one natural advantage over many parts of the US in their higher population density. Nepal has 29 million people and is about the same size as Michigan. Romania is a little smaller than Oregon, with a population of 19 million. However, they have other challenges – Nepal has some of the most challenging topography in the world. Both countries are far poorer than the US, and yet they are finding ways to get fiber built, because like everywhere, there is a big demand for broadband.

I’ve said many times in this blog that we need government help to build fiber in the rural parts of the country. That’s due simply to the cost of a fiber network calculated per household – the numbers don’t work in most rural places. However, I’ve created hundreds of fiber business plans, and it generally looks feasible to build fiber in most other places in the country. Yet there is no flood of ISPs building fiber in our towns, cities and suburbs. Detractors of municipal fiber always say that our broadband problems ought to be solved by the private sector – but I look around, and in 95% of America the private sector hasn’t shown up.

The End of Satellite TV?

DirecTV launched its most recent satellite in May of 2015. The company has launched 16 satellites in its history, and with twelve still in service it is the largest commercial satellite company in the world. AT&T, the owner of DirecTV, announced at the end of last year that there will be no future satellite launches. Satellites don’t last forever, and that announcement marks the beginning of the death of DirecTV. The satellites launched before 2000 are now defunct, and the satellites launched after that will start going dark over time.

AT&T is instead going to concentrate on video service delivered over the web. They are now pushing customers to subscribe to DirecTV Now or WatchTV rather than the satellite service. We’ve already seen evidence of this shift: DirecTV was down to 19.6 million customers, having lost a net of 883,000 customers in the first three quarters of 2018. The other satellite company, Dish Network, lost 744,000 customers in the same 9-month period.

DirecTV is still the second largest cable provider, now 2.5 million customers smaller than Comcast, but 3 million customers larger than Charter. It can lose a few million customers per year and still remain a major cable provider for a long time.

In much of rural America, the two satellite companies are the only TV option for millions of customers. Households without good broadband don’t have the option of going online. I was at a meeting with rural folks last week who described their painful attempts to watch even a single SD-quality stream on Netflix.

For many years the satellite providers competed on price and were able to keep prices low since they didn’t have to maintain a landline network and the associated technician fleet. However, both satellite providers look to have abandoned that philosophy. DirecTV just announced rate increases that range from $3 to $8 per month for various packages. They also raised the price for regional sports networks by $1. Dish just announced rate increases that average $6 per month across its packages. These are the two largest rate increases in the history of these companies and will shrink the price difference between satellite and terrestrial cable.

These rate increases will make it easier for rural cable providers to compete, since many of them have tried to keep rates within a reasonable range of the satellite providers.

In the long run, the consequences of not having the satellite option will create even more change in a fast-changing industry. For years the satellite companies have been the biggest competitors of the big cable companies – and they don’t just serve rural America. I recently did a survey in a community of 20,000 where almost half of the households use satellite TV. As the satellite companies drop subscribers, some of those subscribers will revert to traditional cable providers. The recent price increases ought to accelerate that shift.

Nobody has a crystal ball for the cable industry. Just a year ago there seemed to be an industry-wide consensus that we were going to see a rapid acceleration of cord cutting. While cord cutting gets a lot of headlines, it hasn’t yet grown to nearly the magnitude of change we saw when households dropped telephone landlines. Surprisingly, even after nearly a decade of landline losses, around 40% of homes still have a landline. Will we see the same thing with traditional cable TV, or will the providers push customers online?

Recently I’ve seen a spate of articles talking about how it’s becoming as expensive to buy online programming as it is to stick with cable companies, and if this becomes the public perception, we might see a slowdown in the pace of cord cutting. It’s possible that traditional cable will be around for a long time. The satellite companies lost money for many years, mostly due to low prices. It’s possible that after a few more big rate increases these companies might become profitable and reconsider their future.

Windstream Turns Focus to Wireless

Windstream CEO Tony Thomas recently told investors that the company plans to stress wireless technology over copper going into the future. The company has been using point-to-point wireless to serve large businesses for several years, and has more recently been using fixed point-to-multipoint wireless technology to satisfy some of its CAF II build-out requirements.

Thomas says that the fixed wireless technology blows away what the company can provide over its old copper plant with DSL. In places with flat and open terrain like Iowa and Nebraska, the company is seeing rural residential broadband speeds as fast as 100 Mbps with wireless – far faster than can be obtained with DSL.

Thomas also said that the company is interested in fixed 5G deployments, similar to what Verizon is now starting to deploy – putting 5G transmitters on poles to serve nearby homes. He says the company is interested in the technology in places where it is ‘fiber rich’. While Windstream serves a lot of extremely rural locations, it also serves a significant number of towns and small cities in its incumbent service areas that might be good candidates for 5G.

The emphasis on wireless deployments puts Windstream on the same trajectory as AT&T. AT&T has made it clear numerous times to the FCC that the company would like to tear down rural copper wherever it can and serve those customers with wireless. AT&T’s approach differs in that AT&T will be using its licensed cellular spectrum and 4G LTE in rural markets, while Windstream would use unlicensed spectrum like various WISPs do.

This leads me to wonder if Windstream will join the list of big telcos that largely ignore their existing copper plant moving into the future. Verizon has done its best to sell rural copper to Frontier and seems to be largely ignoring its remaining copper plant – it’s the only big telco that didn’t even bother to chase the CAF II money that could have been used to upgrade rural copper.

The new CenturyLink CEO made it clear that the company has no desire to make additional investments that earn ‘infrastructure returns’, meaning investments in last mile networks, both copper and fiber. You can’t say that Frontier doesn’t want to continue supporting copper, but the company is clearly cash-strapped and is widely reported to be skipping needed upgrades and repairs to rural copper networks.

The transition from copper to wireless is always scary for a rural area. It’s great that Windstream can now deliver speeds up to 100 Mbps to some customers. However, the reality of wireless networks is that some customers are always out of reach of the transmitters. These customers may face physical impediments, such as sitting in a valley or behind a hill, out of line-of-sight of the towers. Or customers might simply live too far from a tower, since every wireless technology only works for some fixed distance from a tower, depending upon the specific spectrum being used.

It makes no sense for a rural telco to operate two networks, and one has to wonder what happens to the customers who can’t get the wireless service when the day comes that the copper network gets torn down. This has certainly been one of the concerns at the FCC when considering AT&T’s requests to tear down copper. The current FCC has relaxed the hurdles for tearing down copper, so this situation is bound to arise. In the past the telcos had carrier-of-last-resort obligations to anybody living in the service area. Will they be required to somehow get a wireless signal to the customers who fall between the cracks? I doubt that anybody will force them to do so. It’s not far-fetched to imagine customers living within a regulated telco’s service area who can’t get telephone or broadband service from the telco.

Customers in these areas also have to be concerned with the future. We have wide experience that the current wireless technologies don’t last very long. We’ve seen electronics wear out and become functionally obsolete within seven years. Will Windstream and the other telcos chasing the wireless technology path dedicate enough capital to constantly replace electronics? We’ll have to wait for that answer – but experience says that they will cut corners to save money.

I also have to wonder what happens to the many parts of the Windstream service areas that are too hilly or too wooded for the wireless technology. As the company becomes wireless-oriented, will it ignore the parts of the company stuck with copper? I recently visited some heavily wooded rural counties where local Windstream staff had told residents that the copper upgrades they’ve already seen (which did not seem to make much difference) were the last upgrades they might ever get. If Windstream joins the list of other big telcos that ignore rural copper, then these networks will die a natural death from neglect. The copper networks of all of the big telcos are already old, and it won’t take much neglect to push them into a final death spiral.

Can Cable Fight 5G?

The big cable companies are clearly worried about 5G. They look at the recently introduced Verizon 5G product and understand that they are going to see something similar over time in all of their metropolitan markets. Verizon is selling 5G broadband – currently at 300 Mbps, with promises of faster speeds in the future – for $70 standalone or $50 for those who also have Verizon cellular.

This is the nightmare scenario for the cable companies because they have finally grown to the point where they enjoy a near monopoly in most markets. They have successfully competed with DSL, and quarter after quarter they have been taking DSL customers from the telcos. In possibly the last death knell for DSL, both Comcast and Charter recently increased the speeds of their base products to at least 200 Mbps. Those speeds make it hard for anybody to justify buying DSL at 50 Mbps or slower.

The big cable companies have started to raise broadband rates to take advantage of their near-monopoly situation. Charter just recently raised bundled broadband prices by $5 per month – the biggest broadband price increase I can remember in a decade or more. Last year a major Wall Street analyst advised Comcast that their basic broadband price ought to be $90.

But now comes fixed 5G. It’s possible that Verizon has found a better bundle than the cable companies because of the number of households that already have cellphones. It’s got to be tempting for households to buy fast broadband for only $50 per month in a bundle.

This fixed 5G competition won’t come overnight. Verizon is launching 5G in urban markets where it already has fiber. Nobody knows how fast Verizon will really roll out the product, due mostly to the distrust earned by a string of earlier Verizon hype about 5G. But over time fixed 5G will hit markets, and assuming Verizon is successful, others will follow into the market. I’m already seeing places where companies like American Tower are building 5G ‘hotels’ at poles – vaults large enough to accommodate several 5G providers at the same location.

We got a clue recently about how the cable companies might fight back against 5G. A number of big cable companies, including Comcast, Charter, Cox and Midco, announced that they will be implementing the new 10 Gbps technology upgrade from CableLabs. These cable companies just recently introduced gigabit service using DOCSIS 3.1. It looks like the cable companies will fight 5G with speed – advertising speeds far faster than 5G and trying to win the speed war.

But there is a problem with that strategy. Cable systems with the DOCSIS 3.1 upgrade can clearly offer gigabit speeds, but in reality cable networks aren’t ready or able to deliver that much speed to everybody. Fiber networks can easily deliver a gigabit to every customer, and with an electronics upgrade can offer 10 Gbps to everybody, as is happening in parts of South Korea. But cable networks have an inherent weakness that makes gigabit speeds problematic.

Cable networks are still shared networks, and all of the customers in a node share the bandwidth. Most cable nodes are still large, with 150 – 300 customers in each neighborhood node, and some with many more. If even a few customers start really using gigabit speeds, then the speed for everybody else in the node will deteriorate. That’s the issue that caused cable networks to bog down in the evenings a decade ago. Cable companies fixed the problem then by ‘splitting’ the nodes, meaning that they built more fiber to reduce the number of homes in each node. If the cable companies want to really start pushing gigabit broadband, and even faster speeds, then they face that same dilemma again and will need another round, or even two rounds, of node splits.
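
A back-of-the-envelope calculation shows why just a few heavy users can swamp a node. The node capacity and subscriber counts in this sketch are illustrative assumptions, not figures from any cable operator:

    # Rough sketch of shared bandwidth in a cable node. The capacity and
    # subscriber counts are illustrative assumptions, not operator data.
    node_capacity_gbps = 2.0    # assumed usable DOCSIS 3.1 downstream per node
    homes_in_node = 200         # mid-range of the 150 - 300 figure above

    heavy_users = 3             # subscribers actually pulling a full gigabit
    leftover_gbps = node_capacity_gbps - heavy_users * 1.0

    per_home_mbps = max(leftover_gbps, 0) / (homes_in_node - heavy_users) * 1000
    print(f"Left for the other {homes_in_node - heavy_users} homes: "
          f"{leftover_gbps:.1f} Gbps ({per_home_mbps:.1f} Mbps each)")
    # Just three true gigabit users exceed the assumed node capacity, which
    # is why serious gigabit offerings force node splits (fewer homes per node).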

For now I have serious doubts about whether Comcast and Charter are even serious about their gigabit products. Comcast’s gigabit today costs $140 plus $10 for the modem. The prices are lower in markets where the company is competing against fiber, and customers can also negotiate contract deals to get the gigabit price closer to $100. Charter has similar pricing – on Oahu, where there is competition, they offer a gigabit for $105, and their prices elsewhere seem to be around $125.

Both of these companies are setting gigabit prices far above Google Fiber’s $70 gigabit. The current cable company gigabit is not a serious competitor to Verizon’s $50 – $70 price for 300 Mbps. I have a hard time thinking the cable companies can compete on speed alone – it’s got to be a combination of speed and price. The cable companies can compete well against 5G if they are willing to price a gigabit at the $70 Verizon 5G price and then use their current $100+ price for 10 Gbps. That pricing strategy would cost them a lot of money in node upgrades, but they would be smart to consider it. The biggest cable companies have already admitted that their ultimate network needs to be fiber – but they’ve been hoping to milk the existing coaxial networks for another decade or two. Any work they do today to reduce node size would be one more step toward an eventual all-fiber network – and could help to stave off 5G.

It’s going to be an interesting battle to watch, because if we’ve learned anything in this industry it’s that it’s hard to win customers back after you lose them. The cable companies currently have most of the urban broadband customers and they need to act now to fight 5G – not wait until they have lost 30% of the market.

Facebook Takes a Stab at Wireless Broadband

Facebook has been exploring two technologies in its labs that it hopes will make broadband more accessible for the many communities around the world that have poor or zero broadband. The technology I’m discussing today is Terragraph, which uses an outdoor 60 GHz network to deliver broadband. The other is Project ARIES, an attempt to beef up the throughput on low-bandwidth cellular networks.

The Terragraph technology was originally intended as a way to bring street-level WiFi to high-density urban downtowns. Facebook looked around the globe and saw many large cities that lack basic broadband infrastructure – it’s nearly impossible to fund fiber in third world urban centers. The Terragraph technology uses 60 GHz spectrum and the 802.11ay standard – a technology combination originally branded as WiGig.

Using 60 GHz and 802.11ay together is an interesting choice for an outdoor application. On a broadcast basis (as a hotspot) this frequency only carries between 35 and 100 feet, depending upon humidity and other factors. The original intended use of WiGig was as an indoor gigabit wireless network for offices. The 60 GHz spectrum won’t pass through anything, so it was intended as a wireless gigabit link within a single room. 60 GHz faces problems as an outdoor technology since the frequency is absorbed by both oxygen and water vapor. But numerous countries have released 60 GHz as unlicensed spectrum, making it available without costly spectrum licenses, and the channels are large enough to deliver real bandwidth even with the physical limitations.

It turns out that a focused beam of 60 GHz spectrum will carry up to about 250 meters when used as backhaul. The urban Terragraph network planned to mount 60 GHz units on downtown poles and buildings. These units would act as hotspots and also form a backhaul mesh network between units. This is similar to the municipal WiFi networks we saw being tried in a few US cities almost twenty years ago. The biggest downside to the urban idea is the lack of cheap handsets that can use this frequency.
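
To see why a tightly focused beam can reach 250 meters while a hotspot only reaches 100 feet, consider a rough link budget. The figures below are textbook approximations, not Terragraph specifications:

    # Rough 60 GHz link-budget sketch for a 250-meter backhaul hop.
    # All figures are textbook approximations, not Terragraph specifications.
    import math

    f_mhz = 60_000   # 60 GHz expressed in MHz
    d_km = 0.25      # ~250 meters, the backhaul reach cited above

    fspl_db = 32.44 + 20 * math.log10(d_km) + 20 * math.log10(f_mhz)
    oxygen_db = 15 * d_km    # oxygen absorption runs ~15 dB/km near 60 GHz

    print(f"Free-space loss:   {fspl_db:.0f} dB")    # ~116 dB
    print(f"Oxygen absorption: {oxygen_db:.1f} dB")  # ~3.8 dB
    # High-gain antennas at both ends of a focused link can make up a
    # ~120 dB loss; a hotspot's broad, low-gain beam cannot, which is why
    # the same spectrum only reaches 35 - 100 feet in hotspot mode.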

Facebook then pivoted from the urban idea and completed a trial of the technology in a different network design. Last May Facebook worked with Deutsche Telekom to deploy a fixed Terragraph network in Mikebuda, Hungary. This is a small town of about 150 homes covering 0.4 square kilometers – about 100 acres. This is drastically different from a dense urban deployment, with a housing density far lower than US suburbs – it’s similar to many small rural towns in the US, with large lots and empty spaces between homes. The only broadband in the town had been DSL, with about 100 customers.

In a fixed mesh network, every unit deployed is part of the mesh: each unit delivers bandwidth into its home and also bounces the signal onward to the next home. In Mikebuda the two companies decided that the ideal network would serve 50 homes (I’m not sure why they couldn’t serve all 100 of the DSL customers). The network delivers about 650 Mbps to each home, although each home is limited to about 350 Mbps by the 802.11ac WiFi routers inside the home. This is a big improvement over the 50 Mbps DSL being replaced.

The wireless mesh network is quick to install, and the network was up and running to homes within two weeks. The mesh configures itself and can instantly reroute and heal around a bad mesh unit. The biggest local drawback is the need for pure line-of-sight, since 60 GHz can’t tolerate any foliage or other impediments – tree trimming was needed to make this work.

Facebook envisions this fixed deployment as a way to bring bandwidth to the many smaller towns that surround most cities. However, they admit that in the third world the limitation will be backhaul bandwidth, since there typically isn’t much middle-mile fiber outside of cities – figuring out how to get the bandwidth to a small town is a bigger challenge than serving the homes within it. Even in the US, the cost of bandwidth to reach a small town is often the limiting factor in affordably building a broadband solution. In the US this will be a direct competitor to 5G for serving small towns. The Terragraph technology has the advantage of using unlicensed spectrum, but ISPs are going to worry about the squirrelly nature of 60 GHz spectrum.

Assuming that Facebook can find a way to standardize the equipment and get it into mass production, this is another interesting wireless technology to consider. Current point-to-multipoint wireless networks don’t work as well in small towns as they do in rural areas, and this might provide a different way for a WISP to serve a small town. In the third world, however, the limiting factor for many of the candidate markets will be getting backhaul bandwidth to the towns.

The Physics of Millimeter Wave Spectrum

Many of the planned uses for 5G rely upon millimeter wave spectrum, and as with every wireless technology, the characteristics of the spectrum define both the benefits and the limitations of the technology. Today I’m going to take a shot at explaining the physical characteristics of millimeter wave spectrum without using engineering jargon.

Millimeter wave spectrum falls in the range of 30 GHz to 300 GHz, although there has been no discussion yet in the industry of using anything higher than 100 GHz. The term millimeter wave describes the shortness of the radio waves, which are only a few millimeters or less in length. The 5G industry is also using spectrum with slightly longer wavelengths, such as 24 GHz and 28 GHz – but these frequencies share many of the same operating characteristics.

There are a few reasons why millimeter wave spectrum is attractive for transmitting data. It can carry a lot of data, which is what prompts talk of using millimeter wave spectrum to deliver gigabit wireless service. If you think of radio in terms of waves, then the higher the frequency, the greater the number of waves emitted in a given period of time. For example, if each wave carries one bit of data, then a 30 GHz transmission can carry more bits in one second than a 10 GHz transmission, and a lot more bits than a 30 MHz transmission. It doesn’t work exactly like that, but it’s a decent analogy.

This wave analogy also explains the biggest limitation of millimeter wave spectrum – the much shorter effective distances. All radio waves naturally spread from a transmitter, and here too, thinking of waves in a swimming pool is a good analogy. The further across the pool a wave travels, the more dispersed its strength. When you send a big wave across a swimming pool it’s still pretty big at the other end, but when you send a small wave it’s often impossible to even notice it on the other side of the pool. The small waves at millimeter length die off faster. With a higher frequency the waves are also closer together. Using the pool analogy, when waves are packed tightly together they can more easily bump into each other and become hard to distinguish as individual waves by the time they reach the other side of the pool. This is part of the reason why shorter millimeter waves don’t carry as far as other spectrum.
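
For readers who want a number to attach to the analogy, the textbook free-space path loss formula quantifies how much a signal weakens with distance and frequency. This calculation is standard radio engineering, not anything specific to a carrier’s deployment:

    # Textbook free-space path loss: how much a radio signal weakens with
    # distance and frequency (isotropic antennas, no rain or obstacles).
    import math

    def fspl_db(distance_m, freq_hz):
        c = 3.0e8  # speed of light in m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    distance = 300 * 0.3048   # 300 feet converted to meters
    for f in (0.6e9, 30e9, 60e9):
        print(f"{f / 1e9:>4.1f} GHz: {fspl_db(distance, f):.0f} dB at 300 feet")
    # Prints ~67 dB at 600 MHz vs. ~101 dB at 30 GHz and ~107 dB at 60 GHz.
    # Every tenfold increase in frequency costs another 20 dB (a 100x weaker
    # signal), before counting oxygen, rain, or obstacles.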

It would be possible to send millimeter waves further by using more power – but the FCC limits the allowed power for all radio frequencies to reduce interference and for safety reasons. High-power radio waves can be dangerous (think of the radio waves in your microwave oven). The FCC’s low-power limitation greatly reduces the carrying distance of this short spectrum.

The delivery distance for millimeter waves can also be impacted by a number of local environmental conditions. In general, shorter radio waves are more susceptible to disruption than longer waves. All of the following can affect the strength of a millimeter wave signal:

  • Mechanical resonance. Molecules of air in the atmosphere naturally resonate (think of this as vibrating molecules) at millimeter wave frequencies, with the biggest natural interference coming at 24 GHz and 60 GHz.
  • Atmospheric absorption. The atmosphere naturally absorbs (or cancels out) millimeter waves. For example, oxygen absorption is highest at 60 GHz.
  • Scattering. Millimeter waves are easily scattered. For example, a millimeter wave is roughly the same size as a raindrop, so rain will scatter the signal.
  • Brightness temperature. This refers to the noise that air and water molecules themselves emit at millimeter wave frequencies, which degrades the signal.
  • Line-of-sight. Millimeter wave spectrum doesn’t pass through obstacles and will be stopped by leaves and almost everything else in the environment. This happens to some degree with all radio waves, but at lower frequencies (with longer wavelengths) the signal can still be delivered by passing through or bouncing off objects in the environment (such as a neighboring house) and still reach the receiver. However, millimeter waves are so short that they can’t recover from a collision with an object between the transmitter and receiver, so the signal is lost upon hitting almost anything.

One interesting aspect of this spectrum is that the antennas used to transmit and receive millimeter waves are tiny – you can squeeze a dozen or more antennas into a square inch. One drawback of using millimeter wave spectrum for cellphones is that it takes a lot of power to operate multiple antennas, so this spectrum won’t be practical for cellphones until we get better batteries.

However, the primary drawback of small antennas is the small target area for receiving a signal. It doesn’t take much spreading and dispersion of the signal to miss the receiver. For spectrum in the 30 GHz range, the full signal strength (and the maximum achievable bandwidth) only carries for about 300 feet. At greater distances the signal continues to spread and weaken, and the physics says that the maximum distance to get any decent bandwidth at 30 GHz is about 1,200 feet. It’s worth noting that a receiver at 1,200 feet receives significantly less data than one at a few hundred feet. With higher frequencies the distances are even shorter. For example, at 60 GHz the signal dies off after only 150 feet, and at 100 GHz the signal dies off within 4 – 6 feet.

To sum all of this up, millimeter wave transmission requires a relatively open path without obstacles. Even in ideal conditions a pole-mounted 5G transmitter isn’t going to deliver decent bandwidth past about 1,200 feet, with the effective amount of bandwidth decreasing once the signal travels more than 300 feet. Higher frequencies mean even less distance. Millimeter waves will perform better in places with few obstacles (like trees) or where there is low humidity. Using millimeter wave spectrum presents a ton of challenges for cellphones – the short distances are a big limitation, as is the extra battery drain needed to support multiple antennas. Any carrier that talks about deploying millimeter wave in a way that doesn’t fit the basic physics is exaggerating its plans.

Putting Skin in the Game for Broadband

Recently, Anne Hazlett, the Assistant to the Secretary for Rural Development at the USDA was quoted in an interview with Telecompetitor saying, “We believe the federal government has a role (in rural broadband), but we also need to see skin in the game from states and local communities because this is an issue that really touches the quality of life in rural America”.

This is a message that I have been telling rural communities for at least five years. Some communities are lucky enough to be served by an independent telco or an electric cooperative that is interested in expanding into fiber broadband. However, for most of rural America there is nobody that will bring the broadband they need to survive as a community.

Five years ago this message was generally not received well because local communities didn’t feel enough pressure from citizens to push hard for a broadband solution. But the world has changed and now I often hear that lack of broadband is the number one concern of rural counties and towns with poor broadband. We now live in a society where broadband has grown to become a basic necessity for households similar to water and electricity. Homes without broadband are being left behind.

When I’m approached today by a rural county, one of the first questions I ask them is if they have considered putting money into broadband. More and more rural areas are willing to have that conversation. In Minnesota I can think of a dozen counties that have decided they will pledge $1 million to $6 million to get broadband to the unserved parts of their county – these are pledges to make outright grants to help pay for the cost of a fiber network.

States are also starting to step up. Just a few years ago there were only a few states with grant programs to help jump-start rural broadband projects. I need to start a list to get a better count, but there are now at least a dozen states that either have or are in the process of creating a state broadband grant program.

I don’t want to belittle any of the state broadband grant programs, because any state funding helps to bring broadband to places that would otherwise not get it. But all of the state broadband grant programs are far too small. Most of the existing state grant programs allocate between $10 million and $40 million annually toward solving a broadband problem that I’ve seen estimated at $40 – $60 billion nationwide. The grants are nice and massively appreciated by the handful of customers who benefit from each grant – but this doesn’t really fit into the category of putting skin in the game at the state level.

The federal programs are the same way. The current e-Connectivity program at $600 million sounds like a lot of assistance for broadband. But this money is not all grants, and a significant amount of it will be loans that have to be repaid. Even if this were 100% grant money, if the national cost to bring rural fiber is $60 billion, then this year’s program would fund 1% of the national broadband shortfall – all we need to do is duplicate the program for a century to solve the broadband deficit. And if this program were spread evenly across the country, it would be only $12 million per state.
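
The arithmetic behind those two figures is simple to verify, using the numbers cited above:

    # Checking the two figures above: the program's share of the estimated
    # national shortfall, and the per-state amount if spread evenly.
    program = 600e6          # e-Connectivity funding, in dollars
    national_need = 60e9     # estimated national rural fiber cost
    states = 50

    print(f"Share of national need: {program / national_need:.0%}")             # 1%
    print(f"Per state if spread evenly: ${program / states / 1e6:.0f} million")  # $12 million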

For many years we’ve been debating if government ought to help in funding rural broadband. In some ways it’s hard to understand why we are having this debate since in the past the country quickly got behind the idea of the government helping to fund rural electricity, rural telephony and rural roads. It seemed obvious that the whole country benefits when these essential services are brought to everybody. I’ve never seen any criticism that those past programs weren’t successful – because the results of these efforts were instantly obvious.

There is nobody anywhere asking governments to outright pay for broadband networks – although some local governments are desperate enough to consider this when there is no other solution. Building rural fiber – which is what everybody wants – is expensive and putting skin in the game means helping to offset enough of the cost in order to enable a commercial provider to make a viable business plan for fiber.

I wrote a blog in December that references a study by economists at Purdue who estimate that the benefit of rural fiber is around $25,000 per household. I look at the results of the study and think it’s conservative – but even if the results are a little high, this ought to be all of the evidence we need to justify governments at all levels putting more skin in the game.

When I see a rural county with a small population talking about pledging millions of dollars towards getting broadband I see a community that is really putting skin in the game, because that is a major financial commitment. For many counties this will be the largest amount of money they have ever spent for anything other than roads. By contrast, a state grant program of $20 million per year when the state budget might be $20 billion is barely acknowledging that broadband is a problem in their state.

I’m sure I’m going to hear back from those who say I’m being harsh on the state and federal grant programs, and that any amount of funding is helpful. I agree, but if we are going to solve the broadband problem it means putting skin in the game – and by definition that means finding enough money to put a meaningful dent in the problem.

An End-run Around Broadband Regulation

In a recent article in Wired, Susan Crawford, the Harvard law professor who follows tech policy and telecom, writes about the long-term strategy of the big ISPs to avoid regulation. She discusses the attempt of ISPs to equate some of their actions with free speech – thus removing any such ISP actions from regulation.

The big ISPs aren’t the only big corporations adopting this strategy, which has been enabled by the Citizens United v. Federal Election Commission decision in 2010. This landmark Supreme Court decision ruled that the free speech clause of the First Amendment prohibits the government from regulating independent expenditures for political communications by corporations. Corporations have been emboldened by that ruling to push to widen the definition of First Amendment rights for corporations. While not entirely accurate, the most common interpretation of that case is that corporations now have some of the same First Amendment rights as people, and corporations want to expand that list of rights.

The heart of the big ISP argument is that transmitting speech is protected by the First Amendment. The ISPs want to treat the act of transmitting a voice call or any data as protected speech – the same as speech between two people. Susan Crawford’s article describes the big ISP argument, and to a non-lawyer the logic is a bit hard to follow. However, what matters is that the big ISPs are hoping to get a favorable hearing of the issue should it ever make it to the Supreme Court – a ruling in their favor would effectively eliminate the possibility of regulating ISP broadband transmissions.

To anybody who is not a constitutional lawyer this seems like a silly argument. It’s clear to most of us that the big ISPs can best be classified as utilities. They sell services that we think of as utility products. The degree of competition differs by market, but even where there aren’t telecom monopolies, we understand that the big cable companies and telcos act like oligopoly providers and don’t vigorously compete with each other on price. I think the average person believes that the big ISPs’ services ought to be regulated to some extent, since we are all aware of ways that the big ISPs have abused their customers in the past.

The big ISPs are currently enjoying the least amount of regulation they’ve ever seen. The current FCC effectively walked away from regulating broadband. While there are still telephone and cable TV regulations on the books that derive from acts of Congress, the current FCC is regulating those products in the lightest manner possible.

However, the big ISPs know this could change in a hurry. In 2017 the courts upheld the prior FCC’s authority to impose net neutrality rules using Title II regulation. The ISPs understand that as administrations change, they could get a future FCC that is pro-consumer rather than pro-ISP. They also understand that a future Congress could pass new laws that provide for stricter regulation.

In fact, it’s almost inevitable that the regulatory pendulum will swing the other way – that’s how regulation of all industries has always worked. The government implements new regulations, and the companies that are regulated challenge those regulations and over time weaken them. When regulation becomes too lax, the government restarts the cycle with a new round of tougher regulations. The very nature of regulation leads to this cycle of swings between tougher and weaker regulation.

ISPs are their own worst enemy because, like all monopolies, they can’t seem to stop themselves from going too far in the way they treat customers. Just in recent news we saw the State of Minnesota suing Comcast for lying about hidden fees on cable bills. We also heard about the big wireless carriers selling real-time customer cellphone location data to the highest bidders, such as bounty hunters, after promising the government they would stop the practice. The big ISPs (and all monopolies) are unable to police themselves because the desire for greater profits always overrides common sense – which is the primary reason we regulate big corporations.

As a consumer I feel that the current FCC has gone too far toward deregulation, and as someone who understands regulation, I’ve always assumed the pendulum would swing the other way. You have to give the big ISP lawyers credit for thinking out of the box – they have found a tactic that they hope might remove them from the regulatory cycle. I think anybody who buys services from these big ISPs hopes they are unsuccessful in this effort.

Small Fiber Builders Making an Impact

The research firm RVA, LLC conducted a study for the Fiber Broadband Association looking at the number of homes and businesses that are now passed and/or served with fiber. The numbers show that smaller fiber providers are collectively having a big impact on the industry.

RVA found that as of September 2018 there were 18.4 million homes with fiber, up from 15 million a year earlier. To put that into perspective, at the end of 2017 there were just over 126 million US households, meaning that fiber has now made it into over 14% of US homes. What’s most impressive about that finding is that 2.7% of homes got fiber in that one-year period. The number of fiber households had been creeping up slowly over the decade, but the pace of deployment is accelerating.

RVA also looked at passings and says that 39.2 million homes, or 31%, are now passed with fiber. Comparing the 18.4 million fiber customers to the 39.2 million passings shows a fiber penetration rate of 47%. RVA also says that 1.6 million homes are passed by two fiber providers – no doubt in markets like Kansas City, Austin and the Research Triangle in North Carolina, where Google and the incumbents both built fiber. RVA shows that when accounting for homes that have no broadband at all, fiber networks are achieving a 60% penetration rate.
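
Those penetration figures hang together arithmetically. In the sketch below, the share of homes that buy broadband at all is my assumption, since RVA’s underlying figure isn’t cited here:

    # Sanity-checking the penetration math in the RVA numbers above.
    fiber_subs = 18.4e6
    fiber_passings = 39.2e6
    households = 126e6

    print(f"Fiber share of all homes: {fiber_subs / households:.0%}")        # ~15%
    print(f"Penetration of passings:  {fiber_subs / fiber_passings:.0%}")    # ~47%

    # Assumed share of homes that buy broadband at all (my assumption, not
    # an RVA figure); a value near 79% roughly reproduces the 60% claim.
    broadband_take_rate = 0.79
    print(f"Penetration among broadband homes: "
          f"{fiber_subs / (fiber_passings * broadband_take_rate):.0%}")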

Small fiber providers are collectively having a big impact on the industry. RVA says there are over 1,000 smaller fiber providers in the country. They quantify the overall market share of these providers as follows: smaller telcos (10.3%), fiber overbuilders (6.4%), cable companies (5.5%), municipalities (3.7%), real estate development integrators (1.1%) and electric cooperatives (0.5%).

In 2018 the small providers built to 29% of the new homes passed, with the rest built by four Tier 1 providers. RVA didn’t identify these big providers, but clearly the biggest fiber builder right now is AT&T. The company has built fiber to over 10 million passings in the past four years and says it will reach about 14 million passings by mid-2019. A lot of the AT&T fiber passings come from an aggressive plan to build to MDUs (apartment and condominium complexes). However, the company is also making fiber available to homes within close range of its numerous existing neighborhood fiber POPs that serve larger AT&T fiber customers.

The other big fiber builder right now is Altice. A little over a year ago the company announced plans to build fiber across its footprints from the Cablevision and Suddenlink acquisitions – nearly 8 million passings. The company seems to be fulfilling that promise, with a flurry of press releases in 2018 describing active fiber deployments. Altice is currently trying to sell off some of its European fiber networks to lighten its debt load and presumably raise the cash needed to complete the US fiber build.

Most other large providers have more modest fiber plans. We know that the CenturyLink fiber expansion that was hot news just two years ago is likely now dead. Verizon is now putting its effort into fixed 5G wireless. The big cable companies all build fiber in new subdivisions but have all committed to DOCSIS 3.1 on their existing cable networks.

Looking forward a few years, most of the new fiber is likely to come from smaller providers. AT&T hasn’t announced any plans past the 2019 schedule, and by then it will have effectively passed all of the low-hanging fruit within range of its existing fiber network. Altice says it will take until at least 2022 to finish its fiber construction. There are no other big companies with announced plans to build fiber.

All of this is good news for the US households lucky enough to get fiber. It’s always been industry wisdom that nobody would develop gigabit applications until there are enough fiber households to make them economically viable. While most customers on fiber probably subscribe to speeds less than a gigabit, there ought to finally be enough gigabit-capable fiber customers nationwide to create a gigabit market.

The Pushback Against Smart Cities

If you follow the smart city movement in the US you’ll quickly see that Kansas City, Missouri touts itself as the nation’s smartest city. The smart city movement got an early launch there when the City was announced as the first major market for Google Fiber. That gigabit fiber network attracted numerous small tech start-ups and the City also embraced the idea of being a technology leader.

The city’s primary smart city venture so far has been to bring smart city technology to a 54-block area downtown. But this area covers only about 1% of the total area of the City. The City is currently contemplating expanding the smart city into the neglected east side neighborhoods near downtown. This is an area with boarded-up storefronts and vacant lots, and the hope is that the smart city investment will give the area a boost as a way to kick-start economic development.

So far the primary smart city applications include smart parking, smart intersections, smart water meters and smart streetlights. The city also installed video surveillance cameras along the 2.2-mile downtown corridor. The existing deployment also includes public WiFi provided through 25 kiosks placed throughout the smart city neighborhood. As of last fall there had been a reported 2.7 million log-ins to the WiFi network.

In the east side expansion, WiFi will take on a more significant role, since it’s estimated that only 40% of the residents in that area have home broadband today – far below the national average of 85%. The city is also looking to implement a rapid transit bus line into the east side as part of the smart city expansion.

The new expansion into the east side is slated to have more surveillance, including new features like gunshot detectors. There is public fear that such a system could be used to disadvantage the largely minority population of the area.

The biggest hurdle to expanded smart city services is money. The initial deployment was done through a public-private partnership. The city contributed $3.7 million, which it largely borrowed. Sprint, which manages the WiFi network, contributed about $7 million, and Cisco invested $5 million. The cost to expand the smart city everywhere has been estimated at half a billion dollars.

It is the public-private partnerships that bring a troublesome aspect to the smart city concept. It’s been reported that Sprint collects data from those who log in to the free WiFi network – information like home zip code and the results of Internet searches. It’s also been reported that Sprint can track people who have ever subscribed to the service, even when they don’t log in. Sprint won’t say how it collects and uses customer data – but as we are learning throughout the tech world, it is the monetization of customer data that fuels many ISPs and online services.

There is also growing public concern about surveillance cameras. It’s starting to become clear that Americans don’t want to be tracked by cameras, especially now with the advent of decent facial recognition technology. We saw Seattle have to tear down a similar surveillance network before it ever went into service. We’re seeing huge pushback in Toronto about a proposed smart city network that includes surveillance.

We only have to look at China to see an extreme example of the misuse of this technology. The country is installing surveillance in public places and retail areas and tracks where people are and what they do. China has carried this to such an extreme that it is in the process of implementing a system that calculates a ‘citizen score’ for every person. The country goes so far as to notify employers of even minor infractions by employees, like jaywalking.

It’s going to be an uphill battle, perhaps one that can never be won, for US cities to implement facial recognition tracking. People don’t want the government tracking where they are and what they do every time they go out in public. The problem is magnified many times when private companies become part of the equation. As much as the people in Kansas City might not fully trust the City, they have far less reason to trust an ISP like Sprint. Yet smart city networks are so expensive that it’s hard to see them being built without private money – and those private partners want a chance to earn a return on their investment.