California Lowers the Definition of Broadband

California Governor Jerry Brown just signed a bill into law that lowers the official definition of broadband in the state while also providing state funding to upgrade rural broadband. The bill, AB 1665, goes into effect immediately. It lowers the definition of broadband in the state to 10 Mbps down and 1 Mbps up. But it goes even further and lowers the definition of an unserved customer to somebody who can’t get speeds of 6 Mbps down and 1 Mbps up.

The bill reinstates a telecom tax that will provide a $300 million fund intended to be used to improve rural broadband. The California press believes that the fund will largely go to AT&T and Frontier, which both lobbied hard for the bill. My reading of the bill is that the incumbent carriers have first shot at the funding and anybody else only gets it when they don’t take it. In practical terms, assuming those two companies take the funding, almost none of this money would be made available to anybody who wants to build something faster in unserved areas.

We know that state funding done the right way can be a tremendous boon to broadband expansion. Consider, for example, the Minnesota DEED grants that have coaxed dozens of telecom providers to expand fiber networks deep into unserved and underserved areas of the state. It’s commonly understood that it can be hard to justify bringing fiber to rural areas, but some grant funding can be an effective tool to attract private money to fund the rest.

We also understand today that there are huge economic benefits for areas that have good broadband. The farmers in Minnesota who benefit from the grant program there are going to have a competitive advantage over farmers elsewhere who have little or no broadband. I’ve been looking at the IoT and other fiber-based technologies on the horizon for farming that are going to vastly increase productivity.

We also know that good broadband benefits the small communities in rural America. These communities have been experiencing brain drain and economic flight as people are forced to go to metropolitan areas to find work. But broadband opens up work-at-home opportunities that ought to make it possible for families to thrive in rural America.

This move by California is a poor decision on many levels. First, it funnels money to the incumbent providers to make tiny tweaks to their existing networks so that existing broadband is just a little better. The new 10/1 Mbps threshold is nothing more than a legislative definition of broadband and has no relevance in the real world. Many homes already need more broadband than that, and as household broadband demand grows, a 10/1 Mbps connection will become inadequate for every home.

Another reason this is a bad idea is that the incumbents there are already making improvements to increase broadband to the 10/1 Mbps level. AT&T took $361.4 million of FCC CAF II funding that is to be used to upgrade broadband to 141,500 homes in California. That works out to $2,554 per home passed. Frontier took another $36.6 million, or $2,853 per home passed, to improve broadband to 12,800 homes. That federal money requires that speeds increase to at least 10/1 Mbps. This state funding will be in addition to the large federal amounts these two companies have already received from the government.
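
For anyone who wants to check that arithmetic, a quick back-of-the-envelope calculation using the CAF II totals cited above looks like the sketch below (the Frontier result differs slightly from the figure in the text because the published totals are rounded):

```python
# Rough cost-per-home-passed check using the California CAF II totals cited above.
# The inputs are rounded published figures, so results are approximate.

awards = {
    "AT&T (California)": {"funding": 361_400_000, "homes": 141_500},
    "Frontier (California)": {"funding": 36_600_000, "homes": 12_800},
}

for carrier, data in awards.items():
    per_home = data["funding"] / data["homes"]
    print(f"{carrier}: ${per_home:,.0f} per home passed")

# Prints roughly $2,554 and $2,859 per home passed -- in line with the
# $2,500 - $2,900 range discussed in this post.
```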

AT&T has also already said that it plans to meet its CAF II obligations by upgrading rural cellular speeds. Frontier is mostly going to improve DSL on ancient copper and is also now looking at using point-to-multipoint wireless technology to meet its CAF II obligations.

I don’t know how much it’s going to cost these companies to upgrade their rural customers to 10/1 Mbps. But the federal funding might be enough to pay for all of it. Adding the state funding means it’s likely that these two companies will make an immediate profit from upgrading rural customers to barely adequate broadband speeds. As we’ve seen many times in the past, this bill is good evidence that the big companies get value out of their lobbying efforts. The losers in all of this are the homes that won’t get anything faster than CAF II broadband. This $300M could have been used as matching grants to bring much faster broadband to many of these homes.

 

Will Banks Invest in Infrastructure Again?

Six local banks in Kentucky banded together to create a $150 million investment fund to support public private partnerships. The fund is called the Commonwealth Infrastructure Fund and is intended to provide debt financing to state and local PPP initiatives in the state.

You might not think this is newsworthy, but it is for several reasons. It’s one of only a handful of examples of bank debt being clearly earmarked for infrastructure investing. In this country virtually all debt for projects that involve the government is financed with municipal bonds. But this wasn’t always the case. While municipal bonds, or their equivalent, have been around for centuries, as recently as fifty years ago banks also played a big role in lending to municipal projects.

But for various reasons banks backed out of infrastructure investing. First, banks have backed away over the years from lending into long-term projects. Municipal projects are often of long duration and it’s not unusual to see infrastructure projects financed over 20 – 30 years. That’s a long time for a bank to tie up money and it also carries the risk of lending into future higher interest rates.

There have also been some spectacular failures with municipal bond defaults in places like New York City and Detroit. While the risk of lending to commercial businesses is a lot higher, the municipal defaults have added risk to lending to municipal-based projects. However, to offset this, the collateral on municipal loans can be extremely safe, particularly if default on a loan is backed by tax revenues.

It’s important to note that this particular fund is looking specifically at public private partnerships. That is a venture that benefits the government but is backed to some degree by private capital. PPPs come in many flavors. At one end of the spectrum are projects that are all private money, such as some recent projects where a commercial company built new schools and then leased them back to the government. At the other end of the spectrum are PPP projects where the government mostly finances the venture but a private firm largely operates it. A good example of this is the fiber network in Huntsville, AL where the city built the project and Google Fiber operates the business.

This fund is something that the country really needs. I’ve seen estimates that there are somewhere between $4 – $6 trillion of needed infrastructure improvements in the country. This ranges from deteriorating roads, crumbling overpasses and bridges, old government buildings, outdated schools, old dams and water projects, etc. But currently there is already over $3.7 trillion in outstanding municipal bond debt. The cities and states can’t begin to take on all of the additional debt needed to bring our infrastructure up to snuff. So we need private money to enter the picture and to help pay for projects where that makes sense.

Anybody lending into PPPs understands the relatively low returns from infrastructure investing. Municipal bonds today generally pay interest rates of 2% to 5%. A lot of private money has been chasing the higher returns of technology investing, but there are still plenty of sources of money like pension funds that are happy with long-term stable and predictable returns. All of the financiers I know say that they are seeing a renewed interest in long-term safe returns.

This Kentucky fund would be a perfect place to look for help with fiber projects. Kentucky is one of the states that still has huge amounts of its geography with poor or non-existent broadband. I would be surprised if the telcos in the state don’t show interest in the fund, assuming the fund would be interested in them.

Raising $150 million for infrastructure lending is only a drop in the bucket when looking at the big picture. But it’s a start and hopefully this will lure other banks and sources of debt and equity to give more consideration to infrastructure funding.

Bad Telecom Deals

FierceWireless recently published a short article listing the 10 worst telecom business moves of the last 10 years. And there are some clunkers on the list like Google’s purchase of Motorola, AT&T’s effort to buy T-Mobile and Time Warner Cable’s agreement to pay over $8 billion for the rights to broadcast the LA Dodgers.

One of the bad moves listed was Fairpoint’s purchase of Verizon’s customers and networks in Maine, New Hampshire and Vermont. Everything imaginable went wrong with that purchase, which closed in 2007. The transition to Fairpoint was dreadful. There were numerous network outages as the cords were cut to the Verizon network. Customers lost email access. They couldn’t place long distance calls out of state and many couldn’t even call customer service. Customers abandoned the company in droves, Fairpoint declared bankruptcy in 2009, and the company was recently sold to Consolidated.

There are other similar stories about companies that have bought large numbers of customers from the large telcos. Earlier this year there were reports of widespread customer dissatisfaction after Frontier bought a large swath of Verizon lines.

There are a number of lessons to be learned from the Fairpoint and similar transactions. First, it is exceedingly difficult to buy customers from the large telcos. The processes at the big companies are mind-numbingly complicated. I remember talking to a guy at AT&T years ago about the process of provisioning a new T1 to a customer. As we walked through the internal processes at the company I realized that nearly a dozen different departments at AT&T scattered across the country were involved in selling and connecting a single T1. It’s impossible for a new buyer to step into the middle of such complication – no matter what employees might come with the purchase of a property there will be numerous functions that the acquired folks don’t know how to do.

I recall helping a client buy a few exchanges from Verizon back in the 1990s. The buyer got literally zero records telling them the services that business customers were using. The buyer had to visit every business customer in the hopes of getting copies of bills, which were often undecipherable. I remember even years later that there were business customers that had working data circuits that the buyer didn’t entirely understand – they worked and their philosophy was to just never touch them.

The point of all of this is that the transition of a property from a big company always has major problems. No matter how long the transition process before conveying everything to the buyer, on the day the switch is thrown there are big holes. And this quickly leads to customer dissatisfaction.

The other issue highlighted by these transitions is that a buyer rarely has enough human resources ready to deal with the onslaught of problems that start immediately with the cutover. It can be massively time consuming to help even a single customer if you don’t have good enough records to know what services they have. Multiplying that times many customers spells disaster.

Not all sales of big telco properties are in massive piles and I’ve helped clients over the years to purchase smaller numbers of exchanges from the big telcos. I have several clients looking at potential purchases today, which highlights the other big problems with buying telco properties.

Today, any small buyer of a copper network probably only does so with a plan to convert the new acquisition to fiber-to-the-home. The condition of acquired copper plant is generally scarily bad. I can remember that Verizon let it be known for at least fifteen years that the whole state of West Virginia was for sale before Frontier finally bought it. Industry folks all knew that during that whole time Verizon had largely walked away from making any investments in the state or even doing anything beyond putting band-aids on maintenance problems. Frontier ended up with a network that barely limped along.

So a buyer has to ask how much value there really is in a dilapidated copper network. If a buyer spends ‘market’ rates to buy a telco property and then spends again to upgrade the acquisition they are effectively paying for the property twice. I’ve crunched the numbers and I’ve never been able to find a way to justify this.

I think we may have reached the point where existing copper networks have almost zero market value. Even with paying customers, the revenues generated from older copper networks are not high enough to support buying the exchange and then spending again to upgrade it. This is something that prospective buyers often don’t want to hear. But as I always advise, numbers don’t lie, and it’s become obvious to me that it’s not a good economic deal to invest in old copper networks. It usually makes more sense to instead overbuild the property and take the customers.

New Technology – October 2017

I’ve run across some amazing new technologies that hopefully will make it to market someday.

Molecular Data Storage. A team of scientists at the University of Manchester recently made a breakthrough with a technology that allows high volumes of data to be stored within individual molecules. They’ve shown the ability to create high-density storage that could save 25,000 gigabits of data on something the size of a quarter.

They achieved the breakthrough using molecules that contain the element dysprosium (that’s going to send you back to the periodic table) cooled to a temperature of -213 centigrade. At that temperature the molecules retain magnetic alignment. Previously this has taken molecules cooled to a temperature of -259 C. The group’s goal is to find a way to do this at -196 C, the temperature of affordable liquid nitrogen, which would make this a viable commercial technology.

The most promising use of this kind of dense storage would be in large data centers since it is 100 times denser than existing technologies. This would make data centers far more energy efficient while also speeding up computing. That kind of improvement matters since there are predictions that within 25 years data centers will be the largest users of electricity on the planet.

Bloodstream Electricity. Researchers at Fudan University in China have developed a way to generate electricity from a small device immersed in the bloodstream. The device uses stationary nanoscale carbon fibers that act like a tiny hydropower generator. They’ve named the device the ‘fiber-shaped fluidic nanogenerator’ (FFNG).

Obviously there will need to be a lot of testing to make sure that the devices don’t cause problems like blood clots. But the devices hold great promise. A person could use these devices to charge a cellphone or wearable device. They could be used to power pacemakers and other medical devices. They could be inserted to power chips in farm animals that could be used to monitor and track them, or used to monitor wildlife.

Light Data Storage. Today’s theme seems to be small, and researchers at Caltech have developed a small computer chip that is capable of temporarily storing data using individual photons. This is the first team that has been able to reliably capture photons in a readable state on a tiny device. This is an important step in developing quantum computers. Traditional computers store data as either a 1 or a 0, but quantum computers can also store data that is both a 1 and a 0 simultaneously. This has been shown to be possible with photons.

Quantum computing devices need to be small and operate at the nanoscale because they hold data only fleetingly until it can be processed, and nanochips can allow rapid processing. The Caltech device is tiny – around the size of a red blood cell. The team was able to store a photon for 75 nanoseconds, and the ultimate goal is to store information for a full millisecond.

Photon Data Transmission. Researchers at the University of Ottawa have developed a technology to transmit a secure message using photons that are carrying more than one bit of information. This is a necessary step in developing data transmission using light, which would free the world from the many limitations of radio waves and spectrum.

Radio wave data transmission technologies send one bit of data at a time with each passing wavelength. Being able to send more than one bit of data with an individual photon creates the possibility of sending massive amounts of data through the open atmosphere. Scientists have achieved the ability to encode multiple bits on a photon in the lab, but this is the first time it’s been done through the atmosphere in a real-world application.
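
The underlying math is simple: a photon prepared in one of several distinguishable states carries more than one bit. The sketch below shows the general scaling (it is illustrative only and is not the Ottawa team’s specific encoding scheme):

```python
import math

# Illustrative only: a photon prepared in one of d distinguishable states
# (for example, d orbital-angular-momentum modes) can carry log2(d) classical
# bits. This shows the general scaling, not any particular lab's encoding.

for d in (2, 4, 8, 16, 32):
    bits = math.log2(d)
    print(f"{d} distinguishable states per photon -> {bits:.0f} bits per photon")

# A conventional binary scheme (d = 2) carries 1 bit per photon; the 4D
# encoding mentioned below carries 2 bits per photon.
```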

The scientists are now working on a trial between two locations that are almost three miles apart and that will use a technology they call adaptive optics that can compensate for atmospheric turbulence.

There are numerous potential uses for the technology in our industry. This could be used to create ultrahigh-speed connections between a satellite and earth. It could be used to transmit data without fiber between locations with a clear line-of-sight. It could be used as a secure method of communications with airplanes since small light beams can’t be intercepted or hacked.

The other use of the technology is to leverage the ability of photons to carry more than one bit of data to create a new kind of encryption that should be nearly impossible to break. The photon data transmission allows for the use of 4D quantum encryption to carry the keys needed to encrypt and decrypt packets, meaning that every data packet could use a different encryption scheme.

The Competition Dilemma

One of the most perplexing issues for fiber overbuilders is what I call the competition dilemma. That is where the big cable companies like Comcast will match the prices of any major competitor in their footprint, making it impossible for a competitor to ever get a price advantage.

A lot of fiber overbuilders enter the market and hope to gain customers by offering lower prices. You saw this when Google Fiber offered a gigabit broadband connection for $70, and I see the same thing from many smaller ISPs. But any price advantage disappears if the large incumbent cable company matches the lower prices.

This is an interesting dilemma for municipal cable systems. They often enter the market with a goal of lowering prices in their market. And when the incumbent provider matches their prices the municipality has achieved its goal, since everybody in the city then benefits from lower prices.

But this comes at a cost. Lower prices mean lower margins, and any ISP that lowers prices is hurting their own bottom line. You would think that lower prices also hurt the incumbent providers, but the big ISPs have the advantage of being able to charge more in surrounding communities to offset lower margins where there is competition. They factor in competition when setting their nationwide prices, so it can be argued that competition doesn’t really hurt big companies at all – they make up for competitive losses by charging a little more everywhere else.
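
Here’s a purely hypothetical illustration of that blended-pricing math. Every number below is invented for the example – it is not any ISP’s actual footprint, price or discount:

```python
# Hypothetical illustration of how a nationwide ISP can offset competitive
# discounts. All numbers are invented for the example.

total_customers = 1_000_000
competitive_share = 0.20           # share of customers in competitive markets
competitive_discount = 10.00       # discount matched in competitive markets, $/month

# Revenue given up by matching the competitor's price
lost_revenue = total_customers * competitive_share * competitive_discount

# Price increase needed everywhere else to make the company whole
non_competitive_customers = total_customers * (1 - competitive_share)
offset_increase = lost_revenue / non_competitive_customers

print(f"Revenue given up in competitive markets: ${lost_revenue:,.0f}/month")
print(f"Increase needed in non-competitive markets: ${offset_increase:.2f}/month")
# With these assumptions a $10 discount for 20% of customers is offset by
# a $2.50 increase for the other 80%.
```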

There doesn’t seem to be any limit on how low an incumbent provider will go to match prices. Take the example of the cable TV product on the city-owned Click! Network in Tacoma, WA. For many years the city didn’t raise cable prices, and Comcast matched their low pricing. Over time the cable prices in Tacoma were over 30% lower than prices in the Tacoma suburbs and nearby cities like Seattle. The customers in the city benefitted from low cable rates, but the city was losing money on cable TV and over time raised their rates back to the market rates.

This issue is going to be in the news a lot more in the future. In a recent blog I talked about an analyst who believes that Comcast is going to double their broadband rates over the next few years. Even if their rate increases aren’t that drastic I think it’s obvious that they plan to raise rates. This is probably the number one reason they have been lobbying hard to get rid of Title II regulation, since that is the only tool that regulators could use to examine and react to broadband rate increases.

If Comcast and the other big ISPs undertake regular broadband price increases they will create an interesting dynamic in the industry. Anybody with a competing network is going to have to decide if they are going to raise rates to match them. It’s going to be tempting to do so because increases in broadband rates flow 100% straight to the bottom line. But if a competitor doesn’t raise rates, then it’s likely that the big ISPs will raise rates everywhere except where there is significant competition. And that would result in a big difference in broadband prices between markets with and without a competitor.

It’s also likely that as the big ISPs raise broadband rates they will be inviting competitors into the market. I create a lot of financial business plans and there are many markets where it’s hard to make a business case for building fiber at today’s broadband rates. But raise those rates and a lot more business plans become attractive.

The final issue raised by the competition dilemma is customer choice. Most cities desperately want competition in their markets because they can see the large cable companies becoming near-monopolies. One of the primary reasons why cities build fiber networks or lure ISPs to do so is to provide more choice. But you have to ask what kind of choice customers really get when there is no price difference between a competitor and the incumbents?

When Disaster Strikes

As many of you know who read this blog, I lived for nearly a decade on St. Croix in the US Virgin Islands. Since all three of the US Virgin Islands as well as Puerto Rico were devastated by the two recent hurricanes, I thought I’d talk a bit about how our industry responds to disasters.

Disasters have always been with us in the telecom industry. There have been other hurricanes that have knocked down wires and poles in the past. I have a number of clients who have experienced crippling ice storms. I have clients in the West who have lost networks to wildfires. And I’ve had clients all over the country who have suffered from massive flooding.

I witnessed the impact of a hurricane in St. Croix when category 3 hurricane Omar hit the island in 2008. The hurricane itself was bad enough. A wall of water came down the hill behind my house, burst through the French doors at the rear of my house and streamed through and out my front door. Then, at the very end of the storm I took a direct hit from a tornado – one of the often-forgotten side effects of hurricanes.

As bad as these storms are, it’s the aftermath that is the most devastating. I was without power for over six weeks, meaning that my consulting work came to a screeching halt. But it took those whole six weeks anyway to clean the mud out of my house and to cut up the hundred dead trees around the property, including a magnificent hundred-year-old mahogany tree. And while there are always mosquitoes in the Caribbean, after the flooding from a hurricane they come in dense clouds, making it miserable to work outside. What I remember most about that period is that my world shrank and all of my energy was needed to deal with the effects of the storm. I also learned how much I rely on electricity, refrigeration and lights in a place that gets dark at 6:00 PM every day. It’s mind-boggling to think that there are millions of Americans that will be without power for months.

A category 3 hurricane is strong enough to send trees crashing through overhead wires, and so there were wires down all over the island. But only a minimal number of poles were broken, so the task of restoring power and telephone wires just needed lots of crews with cherry-pickers. Our island was the only place hit by Omar and crews from St. Thomas, Tortola and Puerto Rico came to help with the recovery. The island was so grateful we threw a huge, well-deserved parade and party for the repair crews when they were finished.

It was the response from work crews from other islands that made all of the difference. We see the same thing here in the US all of the time. One of my clients got devastated by hurricane Katrina and work crews from all over the US rushed to help. We see this after every stateside disaster as telecom and power crews from elsewhere rush to aid a utility in trouble.

And that is the big problem right now in the Caribbean. St. Thomas and the British Virgin Islands got devastated by hurricane Irma. The storm was so strong that it snapped the majority of the utility poles in St. Thomas, meaning the work effort needed to restore the island is going to be massive. Since St. Croix got only minor damage in that storm it became the staging area for the work effort to help St. Thomas and St. John. But then two weeks later St. Croix and Puerto Rico were flattened by hurricane Maria.

We now have the unprecedented situation where all of the islands in the region lost their utility infrastructure at the same time. This presents an almost unsolvable logistical challenge of somehow getting the resources in place to get the islands back up and running. As bad as the Virgin Islands are right now, it’s almost impossible for the mind to grasp the amount of damage in Puerto Rico with its rough terrain and 3 million people still without power.

No utility can shoulder the cost of the repair efforts from a natural disaster this bad. In the US the federal government has always jumped in to fund some of the needed recovery. The crews that rush in to help don’t ask first about getting paid and they assume they will eventually be reimbursed for their costs. The FCC quickly approved $76.9 million towards the recovery effort for the Virgin Islands and Puerto Rico. But that’s just a start on the cost of fixing the damage – I have colleagues working on St. Thomas and their first quick estimate of the utility damage there was almost $60 million. I imagine the final number for all of the islands is going to be astronomical.

I know that if there were an easy way to get there, many of the telco and power companies in the US would be sending crews to help the islands. It’s going to be hard enough just getting the needed poles, cables and electronics to the islands. It’s frustrating to know that the logistics challenges mean that the repair will take a long time. It won’t be surprising to still see parts of Puerto Rico without electricity six months from now – and that is heartbreaking.

Cable Systems Aren’t All Alike

Big cable companies all over the country are upgrading their networks to DOCSIS 3.1 and announcing that they will soon have gigabit broadband available. Some networks have already been upgraded and we are seeing gigabit products and pricing springing up in various markets around the country. But this does not mean that all cable networks are going to be capable of gigabit speeds, or even that all cable networks are going to upgrade to DOCSIS 3.1. As the headline of this blog says, all cable systems aren’t alike. Today’s blog looks at what that means as it applies to available broadband bandwidth.

A DOCSIS cable network is effectively a radio network that operates only inside the coaxial cable. This is why you will hear cable network capacity described using megahertz, which is a measure of the frequency of a radio transmission. Historically cable networks came in various frequency sizes such as 350 MHz, 650 MHz or 1,000 MHz.

The size of the available frequency, in megahertz, describes the capacity of the network to carry cable TV channels or broadband. Historically one analog TV channel uses about 6 MHz of frequency – meaning that a 1,000 MHz system can transmit roughly 167 channels of traditional analog TV.

Obviously cable networks carry more channels than this, which is why you’ve seen cable companies upgrade to digital systems. The most commonly used digital compression scheme can squeeze six digital channels into the same frequency that carries one analog channel. There are new compression techniques that can squeeze even more digital channels into one slot.

In a cable network each slice of available frequency can be used either to transmit TV channels or to carry broadband. If a cable company wants more broadband capacity it must create room for the broadband by reducing the number of slots used for TV.

It is the overall capacity of the cable network, along with the number of ‘empty’ channel slots, that determines how much broadband the network can deliver to customers. A cable system needs roughly 24 empty channel slots to offer gigabit broadband download speeds. It’s a lot harder to carve out enough empty channels on smaller capacity networks. An older cable system operating at 650 MHz has significantly less capacity for broadband than a newer urban system operating at 1,000 MHz or more.
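
To put rough numbers on this, here is a simple sketch of the channel and capacity arithmetic. It assumes traditional 6 MHz channel slots and roughly 38 Mbps of usable downstream throughput per slot – typical of a 256-QAM DOCSIS channel – though real-world throughput varies by plant and modulation:

```python
# Rough capacity arithmetic for a coaxial cable system. Assumes traditional
# 6 MHz channel slots and about 38 Mbps of usable downstream throughput per
# slot (256-QAM, DOCSIS 3.0-style); actual throughput varies by plant and modulation.

def total_channel_slots(system_mhz, slot_mhz=6):
    """Approximate number of 6 MHz channel slots available in the system."""
    return round(system_mhz / slot_mhz)

def downstream_capacity_mbps(empty_slots, mbps_per_slot=38):
    """Approximate downstream broadband capacity from bonded empty slots."""
    return empty_slots * mbps_per_slot

for system in (350, 650, 1000):
    print(f"{system} MHz system: about {total_channel_slots(system)} channel slots")

# Roughly how much downstream capacity do 24 empty slots provide?
print(f"24 empty slots -> about {downstream_capacity_mbps(24)} Mbps downstream")
# That works out to roughly 912 Mbps -- which is why about 24 empty slots
# are needed for a gigabit tier.
```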

One of the primary benefits of DOCSIS 3.1 is the ability to combine any number of empty channels into a single broadband stream. But the task of upgrading many older networks to DOCSIS 3.1 is not just a simple issue of upgrading the electronics. If a cable company wants the faster broadband speeds they need to also upgrade the overall capacity of the network. And the upgrade from 350 MHz or 650 MHz to 1,000 MHz is often expensive.

The higher capacity network has different operating characteristics that affect the outside cable plant. For example, the placement and spacing of cable repeaters and power taps is different in a higher frequency network. In some cases the coaxial cable used in an older cable network can’t handle the higher frequency and must be replaced. So upgrading an older cable network to get faster speeds often means making a lot of changes in the physical cable plant. To add to the cost, this kind of upgrade also usually means having to change out most or all of the cable settop boxes and cable modems – an expensive undertaking when every customer has multiple devices.

The bottom line of all of this is that it’s not necessarily cheap or easy to upgrade older or lower-capacity cable networks to provide faster broadband. It takes a lot more than upgrading the electronics to get faster speeds and often means upgrading the physical cable plant and replacing settop boxes and cable modems. Cable operators with older networks have to do a cost/benefit analysis to see if it’s worth the upgrade cost to get faster broadband. Since most older cable systems are in rural small towns, this is one more hurdle that must be overcome to provide faster broadband in rural America.

TiVo’s Latest Video Trend Report

TiVo just released their Q2 2017 Online Video and Pay-TV Trends Report, and as usual it’s full of interesting statistics about the cable and video industries.

They looked in detail at those without a traditional pay-TV service – the cord cutters and cord nevers. They found the following:

  • One-fourth left cable TV during the last 12 months – proof that cord cutting is a real phenomenon.
  • 7% use an antenna to get over-the-air free TV.
  • Over 85% report that they don’t have cable TV because it costs too much.
  • Almost 46% use an OTT service like Netflix, Hulu or Amazon.

For people that still have a cable TV subscription:

  • 53% are satisfied with the value of their cable and another 31% are very satisfied. Those percentages are higher than past quarters, possibly due to the dissatisfied cord-cutters leaving cable.
  • In a statistic that might surprise many, only 15% are dissatisfied with the value of their cable subscription.
  • 47% of households now pay between $50 – $75 per month for cable. Over 36% of households spend more than $100 per month for cable, with 10% of households spending more than $150 per month.

TiVo looked at those who plan to change TV service in the next year:

  • 6% plan to cut the cord
  • 8% plan to change to another TV provider
  • 31% say that they have thought about cutting the cord.
  • 56% would change to an a la carte TV offering that lets them buy only the channels they want.
  • 39% said they would be more willing to cut the cord if there was some easy way to navigate between OTT providers.

Many are still buying premium movie channels:

  • 27% of households are buying HBO, up from 22% just a year ago.
  • 17% are buying Showtime
  • 17% are buying the Movie Channel
  • 13% buy a premium sports package
  • 13% buy Starz
  • 12% buy Cinemax

They also looked at TV viewing habits:

  • 89% of households watch TV on a daily basis
  • 67% of homes watch recorded content (DVR / DVD) content on a daily basis.
  • 63% watch OTT content on a daily basis

Households are largely loyal to a handful of content:

  • Over 80% of homes report that they watch 10 or fewer different channels of content.
  • 59% of households watch 5 or fewer different shows per week.
  • 83% of households watch 10 or fewer shows per week.

Households were asked what they like most about OTT services:

  • 59% like services where each family member can create their own profile.
  • 56% like the lower prices of the service
  • 46% like auto-play of episodes where the next show comes on automatically

TV Everywhere still doesn’t have universal acceptance:

  • Just over 50% of households are aware that their cable service offers TV Everywhere
  • Just over 1/3 of households actually use TV Everywhere to watch content on cell phones, tablets, etc.

CAF II and Wireless

Frontier Communications just announced that they are testing the use of wireless spectrum to complete the most rural portions of their CAF II build-out requirement. The company accepted $283 million per year for six years ($1.7 billion total) to upgrade broadband to 650,000 rural homes and businesses. That’s a little over $2,600 per location passed. The CAF II program requires that fund recipients increase broadband to speeds of at least 10 Mbps down and 1 Mbps up.

Frontier will be using point-to-multipoint radios where a transmitter is mounted on a tower with the broadband signal then sent to a small antenna at each customer’s location. Frontier hasn’t said what spectrum they are using, but in today’s environment it’s probably a mix of 2.4 GHz and 5 GHz WiFi spectrum and perhaps also some 3.65 GHz licensed spectrum. Frontier, along with CenturyLink and Consolidated, told the FCC a year ago that they would be interested in using spectrum in the ‘citizens’ radio band’ between 3.7 GHz and 4.2 GHz for this purpose. The FCC opened a docket looking into this spectrum in August and comments in that docket were due to the FCC last week.

I have mixed feelings about using federal dollars to launch this technology. On the plus side, if this is done right this technology can be used to deliver bandwidth up to 100 Mbps, but in a full deployment speeds can be engineered to deliver consistent 25 Mbps download speeds. But those kinds of speeds require an open line-of-sight to customers, tall towers that are relatively close to customers (within 3 – 4 miles) and towers that are fiber fed.
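
To illustrate why tower spacing matters so much, here is a simplified free-space path loss calculation. It is only a sketch – a real point-to-multipoint design also has to account for Fresnel-zone clearance, antenna gains, and margins for foliage and rain – and the 5.8 GHz frequency is just one of the spectrum options mentioned above:

```python
import math

# Simplified free-space path loss for a point-to-multipoint link. Real designs
# also need Fresnel-zone clearance and margins for foliage and rain; this is
# only meant to show why distance from the tower matters so much.

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 32.44 + 20*log10(d_km) + 20*log10(f_MHz)."""
    return 32.44 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

freq = 5800  # 5.8 GHz unlicensed band, one of the spectrum options mentioned above
for miles in (1, 2, 4, 8):
    km = miles * 1.609
    print(f"{miles} miles at {freq} MHz: {fspl_db(km, freq):.1f} dB of free-space loss")

# Every doubling of distance adds about 6 dB of loss, which is why customers
# much beyond the ideal 3 - 4 mile radius see significantly slower speeds.
```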

But when done poorly the technology delivers much slower broadband. There are WISPs using the technology to deliver speeds that don’t come close to the FCC’s 10/1 Mbps requirement. They often can’t get fiber to their towers and they will often serve customers that are much further than the ideal distance from a tower. Luckily there are many other WISPs using the technology to deliver great rural broadband.

The line-of-sight issue is a big one and this technology is a lot harder to make work in places with lots of trees and hills, making it a difficult delivery platform in Appalachia and much of the Rockies. But the technology is being used effectively in the plains and open desert parts of the country today.

I see downsides to funding this technology with federal dollars. The primary concern is that the technology is not long-lived. The electronics are not generally expected to last more than seven years and then the radios must be replaced. Frontier is using federal dollars to get this installed, and I am sure that the $2,600 per passing is enough to completely fund the deployment. But are they going to keep pouring capital into replacing radios regularly over time? If not, these deployments would be a sick joke to play on rural homes – giving them broadband for a few years until the technology degrades. It’s hard to think of a worse use of federal funds.

Plus, in many of the areas where the technology is useful there are already WISPs deploying point-to-multipoint radios. It seems unfair to use federal dollars to compete against firms who have made private investments to build the identical technology. The CAF money ought to be used to provide something better.

I understand Frontier’s dilemma. In the areas where they took CAF II money they are required to serve everybody who doesn’t have broadband today. My back-of-the-envelope calculations tell me that the CAF money was not enough for them to extend DSL into the most rural parts of the CAF areas, since extending DSL means building fiber to feed the DSLAMs.

As I have written many times, I find the whole CAF program to be largely a huge waste of federal dollars. Using up to $10 billion to expand DSL, point-to-multipoint wireless and, in the case of AT&T, cellular wireless is a poor use of our money. That same amount of money, used as matching grants, could be seeding a lot of fiber to these same customers. We only have to look at state initiatives like the DEED grants in Minnesota to see that government grant money induces significant private investment in fiber. And as much as the FCC doesn’t want to acknowledge it, building anything less than fiber is nothing more than a Band-aid. We can and should do better.

Local Government Funding for Fiber

There is an interesting new trend where local governments act as the banker for rural broadband projects. It’s a new twist on public private partnerships and a model that more communities should consider.

Consider these rural broadband projects in Minnesota.

  • First is RS Fiber. This is a new broadband cooperative that serves most of Sibley County and some of Renville County in Minnesota. Bonds were approved to fund 25% of a broadband project and those bonds are backed by the counties, some small cities and also by townships that are getting the fiber. The expectation is that the project will make the bond payments.
  • Next is in Swift County Minnesota. Federated Telephone Cooperative, an existing telephone company, was awarded $4.95 million to build fiber to rural homes in the county. The county approved general obligation bonds of $7.8 million to complete the project, or 60% of the funding.

Both projects are classic examples of a public private partnership. In these particular cases the company that will own and operate the network is a cooperative, but these same agreements could have been made with a for-profit telco or some other telecom provider as well.

These kinds of projects make sense for a number of reasons:

  • The process of approving bond financing is far faster than securing traditional funding for these kinds of projects.
  • Bonds for fiber can be financed over a long period of time – 20 to 30 years, while loan terms for commercial loans are usually shorter. Just like with a home mortgage, borrowing for a longer time period means lower annual debt payments, which is essential to make these projects financially feasible.
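
As a rough illustration of the difference the loan term makes, here is a standard level-payment (mortgage-style) amortization calculation. The project size and interest rates below are hypothetical:

```python
# Hypothetical illustration of how loan term affects the annual debt payment.
# Standard level-payment (annuity) formula; the principal and rates are invented.

def annual_payment(principal, annual_rate, years):
    """Level annual payment that amortizes the principal over the given term."""
    r = annual_rate
    return principal * r / (1 - (1 + r) ** -years)

principal = 10_000_000  # hypothetical $10M fiber project

# A 30-year bond at a municipal-style rate vs. a shorter commercial loan
for label, rate, years in (("30-year bond at 4%", 0.04, 30),
                           ("10-year commercial loan at 6%", 0.06, 10)):
    print(f"{label}: ${annual_payment(principal, rate, years):,.0f} per year")

# With these assumptions the 30-year bond costs roughly $578,000 a year versus
# roughly $1,359,000 a year for the 10-year loan -- the kind of difference that
# can determine whether a rural project cash-flows.
```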

In both cases the Counties and other local government entities have taken on the role of banker. The local governments will have no operational role in running the fiber business (a role they did not want). The Counties expect the bond payments to be covered by the fiber project. And since these networks are being built in rural areas with few other broadband alternatives, the new fiber ventures should get high customer penetration rates. But if the ventures fail, then the local governments are on the hook to cover any shortfalls in the bond payments.

These are both cases of local governments deciding that the need for rural broadband was great enough to risk taxpayer money to get this done. They also decided that the risk of not getting paid is low. The business cases show that even in the worst case the revenues from the projects should cover almost all costs, meaning that the downside risk to the Counties is minimal. In the case of RS Fiber, as a start-up new cooperative, they would not have been able to get any traditional funding without the seed money from the local governments.

This is a model that the rest of rural America should consider. Small ISPs like these cooperatives stand ready to serve a lot of rural America, but they often don’t have the financial wherewithal to do so. In these cases, a public private partnership with local government as the banker seemed to be the only way to make this happen.

Everywhere I travel in rural America homeowners and farmers want good broadband. They understand that it’s costly to build fiber to farms and small rural towns. But they also seem willing to help pay to make this work. I think if more rural counties would listen to their constituents they would take a harder look at this model.

Of course, a county needs to do their homework up front and make sure they know it’s a sound project and that the estimated cost of building the broadband network is accurate. But assuming there is a solid business plan, perhaps the most valuable role a county can tackle is that of being the banker to help new broadband builds get off the ground.