An Effective Federal Broadband Program, Part 2

I wrote a blog last week about the things the feds ought to avoid if they design a huge program to build rural broadband. The industry has been buzzing with the possibility that large amounts of federal money might become available for this purpose. But it’s not good enough just to avoid pitfalls. If we really want an effective plan to construct and operate rural broadband, there are some positive steps that need to be taken. This series of blogs looks at how to best design a federal broadband construction program to bring broadband to areas that currently don’t have it.

Build for the Future. It would be a huge mistake if a rural broadband expansion builds only to meet today’s definition of broadband. Cisco recently said that the average home today needs about 24 Mbps, which is nearly identical to the FCC’s current definition of broadband. Historically we have seen broadband speeds for customers double about every three years, though Cisco’s latest broadband report suggests this might have slowed to about every four years. Cisco predicts that by 2020 households will need almost 50 Mbps. Look out a decade and the math says that households will need nearly 140 Mbps.
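The doubling math above is easy to sketch. This is a simple exponential model, not anything Cisco publishes; the 24 Mbps baseline and the four-year doubling cycle are the assumptions from the paragraph above:

```python
def projected_speed(base_mbps, years_out, doubling_years):
    """Project household broadband demand, assuming demand keeps
    doubling on a fixed cycle (a simple model of the trend above)."""
    return base_mbps * 2 ** (years_out / doubling_years)

# Starting from ~24 Mbps today and doubling every 4 years:
print(projected_speed(24, 4, 4))   # 48.0 Mbps - "almost 50" by 2020
print(projected_speed(24, 10, 4))  # ~136 Mbps a decade out
```

The ten-year figure lands around 135 – 140 Mbps depending on the exact baseline, which is exactly why building to today’s 24 Mbps standard is shortsighted.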

It would be totally irresponsible to spend billions of federal dollars to build infrastructure that will be inadequate by the time it’s installed. The current CAF II program is a travesty because it is spending billions on DSL and cellular data to achieve 10/1 Mbps speeds and won’t even be completed until 2021. CAF II is not building broadband infrastructure – it’s spending gold-plated federal money to build a lead solution for rural broadband. It’s not going to be very long before all of the rural people getting CAF II networks will be screaming again for something better.

This means a federal broadband program should not be used to fund cellular wireless, point-to-point fixed wireless or DSL. Those technologies all have a place in the marketplace today, but they can’t come close to meeting tomorrow’s needs, so let’s not toss away billions of tax dollars on the wrong technologies.

Use Federal Loan Guarantees. A federal broadband program does not have to rely only on matching grants. The federal government has several loan guarantee programs that can be expanded to bring banks into the funding process. Banks love loan guarantees because they greatly reduce the risk of projects by having the federal government act as the backstop for bad loans. If the review process is done well and funding is only given to companies with a good chance of success, then there should be few loan defaults and the loan guarantee program would cost the federal government very little.

Don’t Forget the Small Towns. It’s easy when looking to fund a rural broadband solution to concentrate only on areas that are categorized as either unserved or underserved. But business plans to serve only the neediest customers are hard to make work. Rural business plans work best if they can also incorporate the small towns and county seats.

The stimulus grants ignored these towns because they are considered to have adequate broadband. That is shortsighted because small towns do not have networks that are up to snuff with urban networks. For example, they may have cable modems, but these little towns are unlikely to get upgraded to the next generation of cable electronics for a long time, if ever. If we want to have successful business plans then the funding needs to also cover the small towns in the middle of the unserved areas to help the service providers achieve an economy of scale.

Don’t Try to Serve Every Home. Any broadband program ought to have the goal of reaching as many homes as possible with the funding available. This means that there should not be rules that require that every customer within a Census block get broadband. Rural Census blocks can be large and can cover diverse topology. A Census block might have most customers along a river valley with a few homes high up on nearby mountains, or on the other side of a lake or river. In my experience designing rural networks, the hardest-to-reach 10% of the customers can easily represent 40% of the cost to build. If we want to stretch federal dollars we need flexible rules that allow for realistic business plans. There comes a point where the guy who built on the top of a mountain shouldn’t get broadband, just as it’s hard for him to get electricity or city water or other utilities. What matters more is stretching federal dollars smartly to serve as many homes as possible.
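The budget arithmetic behind skipping the hardest-to-reach homes is worth making concrete. The home count and total cost below are purely illustrative; only the 10%/40% split comes from my experience cited above:

```python
# Hypothetical build: 10,000 homes at a $30M total cost, where the
# hardest-to-reach 10% of homes represent 40% of the total cost.
homes, total_cost = 10_000, 30_000_000

hard_homes, hard_cost = homes // 10, total_cost * 40 // 100
easy_homes, easy_cost = homes - hard_homes, total_cost - hard_cost

per_home_hard = hard_cost / hard_homes   # $12,000 per home
per_home_easy = easy_cost / easy_homes   # $2,000 per home
print(per_home_hard / per_home_easy)     # 6.0 - six times the cost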

ESPN and the Cable Industry

I’ve been writing periodically about ESPN because they seem to be the poster child for what is happening to cable TV and to programmers in the country. It’s been obvious over the last year or two that ESPN is bleeding customers, and the many articles about them concentrate on that issue.

ESPN is a good bellwether for the industry because they are carried by practically every cable TV provider, and because their contracts require that the channel be carried in the expanded basic tier – the tier that generally has between 50 and 75 channels. Only a few tiny rural cable systems don’t carry ESPN since they carry only a small number of channels.

When ESPN loses customers it can only be for one of two reasons – people who cut the cord and drop cable altogether, or cord shavers who downsize to the smallest basic cable package. Basic cable is the small package of 10 – 15 channels that includes the local network affiliates, government channels and a few cheap throw-ins like shopping channels.

But it’s not easy to figure out the real number of cord cutters and cord shavers. The largest cable companies report total subscriber numbers each quarter but they don’t report on the packages that customers buy. Various analysts estimate the number of cord cutters each quarter, but they differ on these estimates – and I haven’t seen anybody try to estimate the number of cord shavers.

Nielsen tracks the number of customers of each cable network and that tells us how the various cable TV networks are faring. The latest article on ESPN comes from Sports TV Ratings, a website that tracks subscribers to the various sports networks. That site shows that ESPN lost 621,000 subscribers just last month (October 2016). That is an astounding number since ESPN has roughly 89 million customers – it’s a drop of seven-tenths of a percent, which annualized would be over 8% of ESPN customers.
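Those percentages check out with some quick arithmetic. The subscriber figures are the ones cited above; the annualized number compounds the monthly loss rate:

```python
monthly_loss, subscribers = 621_000, 89_000_000

monthly_rate = monthly_loss / subscribers      # about 0.70% per month
annualized = 1 - (1 - monthly_rate) ** 12      # compound 12 months of losses
print(f"{monthly_rate:.2%} monthly, {annualized:.1%} annualized")
```

Simple multiplication (12 × 0.70%) gives about 8.4%; compounding the shrinking base gives just over 8%. Either way the “over 8%” figure holds.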

But that number may not be a huge aberration. FierceCable reported earlier this year that ESPN had lost 2.2 million customers between February and August of this year, which is a clip of 440,000 lost customers per month. And the network has lost more than 11 million customers since its peak in 2013 when it had almost 100 million customers.

Trying to count cord shaving gets even more complicated because of OTT content. The cited drop of 621,000 ESPN customers comes from the Nielsen numbers for carriage on cable systems. This doesn’t include online services that carry ESPN. For instance, the basic package on Sling TV includes ESPN, and Goldman Sachs estimated that Sling TV will have almost 2 million customers by the end of this year. There are a number of new OTT offerings just hitting the market that will include the network, but for now Sling TV has most of the online ESPN subscribers.

ESPN has an advantage over many other networks in that it probably can add back customers by selling to people directly on the web. And so perhaps the network can find an equilibrium number of customers at some lower threshold than today. But this is not going to be true for a lot of other content. As an example, in October the Golf Channel lost 600,000 subscribers and The Major League Baseball Channel lost 515,000 customers – and those kinds of networks have very limited appeal on a standalone basis. That is the real story behind the losses at ESPN – the vast majority of cable networks are bleeding customers right now.

Some of the content providers are not too worried about the drop of US cable customers since they are picking up far greater numbers of new customers worldwide right now. But networks that are US-centric – sports, news, weather – are in for a rough ride over the next few years as the industry settles out to a new and lower norm. I think we can expect to see a transformation of sports programming as the numerous sports networks bleed customers. This probably means more emphasis on live programming and fewer sports networks.

A Year of Mergers

Our industry has seen many mergers over the years between the biggest companies in the sector. But for the most part big mergers that change the face of the industry have been sporadic. We had AOL buying Time Warner in 2000, Alcatel buying Lucent in 2006 and CenturyLink buying Qwest in 2011.

But now it seems like I can’t read industry news without seeing discussions of a new merger. During the last year or so we saw AT&T gobble up DirecTV, saw Alcatel-Lucent grabbed by Nokia and saw Charter buy Time Warner Cable and Bright House Networks. And we are now watching regulators sort out mergers, with Verizon trying to buy both XO Communications and Yahoo, CenturyLink wanting to buy Level 3 Communications and AT&T wanting to acquire Time Warner.

From reading Wall Street speculation it seems like the current merger mania in our industry is not over. The rumors are strong that CBS and Viacom will soon announce a merger. There is rampant speculation that several companies might try to outbid CenturyLink for Level 3. There are rumors that Comcast, Charter and Altice are interested in buying T-Mobile or Sprint. There are continuing rumors that Verizon wants to buy Dish Network to get permanent access to the huge swath of spectrum it owns. And there have been rumors for the last year that somebody ought to buy Netflix.

And these giant mergers aren’t just happening in telecom. We see Bayer buying Monsanto, Microsoft buying LinkedIn, Marriott buying Starwood, Tyco buying Johnson Controls, Protection 1 buying ADT, Sherwin-Williams buying Valspar and Fortis buying ITC Holdings.

It’s really hard in the telecom world to know if mergers are good or bad for the industry. Some mergers are clearly bad because they eliminate competition and create oligopolies at the top of the market. The rumored merger between CBS and Viacom is one such merger. Today there are only five major programmers in the country and this reduces that to four. A lot of the woes in the industry today are due to the greed of programmers and consolidation at the top of the industry can’t mean anything good.

But other mergers might be beneficial. Consider the impact of Comcast or Charter buying T-Mobile or Sprint. I just saw an article this week that showed that the wireless operations of AT&T and Verizon are still showing a gross margin of over 50%. It’s been clear to every consumer that cellular service is overpriced due to lack of meaningful competition. Perhaps one of the big cable companies could drive down cellular prices in an attempt to grab market share.

But on the flip side, letting these huge cable companies develop a quad play product is bad for anybody else that tries to compete with them for broadband. A new fiber overbuilder in a city would have an even bigger challenge if they try to displace a cable competitor that offers cellphone service bundled with their broadband. It’s been clear for a long time that lack of broadband competition is bad for consumers.

The underlying theme driving all of these mergers is that Wall Street has a never-ending appetite for increased earnings. That alone is often a good thing. Many times the companies being acquired are underperforming for some reason and a merger wakes them up to do better. Many mergers promise improved earnings through consolidation and a reduction in management and overhead costs.

But consider what mega-mergers in the telecom space more often mean. They mean that fewer and fewer companies control the vast majority of the market. And those giant companies are driven by Wall Street to increase earnings quarter after quarter forever – and at a pace and level that exceeds general inflation. You only have to do the math on that basic concept to realize that this means price increases for residential and business customers year after year to keep meeting higher earnings targets.
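To see why, it helps to run the math that the paragraph above alludes to. The 5% earnings-growth target and 2% inflation rate below are hypothetical, but they illustrate the gap being described:

```python
growth, inflation, years = 1.05, 1.02, 15

required = growth ** years        # earnings must roughly double (~2.08x)
price_level = inflation ** years  # while general prices rise only ~1.35x
print(required / price_level)     # ~1.54x increase in real terms
```

If customer counts are flat, that entire gap has to come from price increases on residential and business customers, year after year.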

Years ago we had Ma Bell, which controlled 95% of the phone business in the country. AT&T would have acted like any other commercial company except that its prices were heavily restricted by regulators. The big companies today face no such restraint – their stockholders pressure management to increase profits no matter the consequences. It is the chase for bigger earnings that has driven programming costs and cable TV rates up much faster than inflation for the last decade, to the point where the cable TV product costs more than many households are willing to pay.

I doubt we will see an end to these mergers, but if we don’t find a way to curb them the inevitable result will be a tiny number of companies controlling the whole sector, without any of the restrictions that were once put on companies like Ma Bell. It scares me sometimes to think that broadband rates are going to increase in the same manner that cable rates increased in the past. But when you look at what the big ISPs have to sell, it’s hard not to picture a scenario where earnings pressures do the same thing to broadband rates that they did to cable rates. That is going to do great harm to the country for the benefit of the stockholders of a few big companies.

Technology Predictions

Throughout history there are examples of people publicly declaring that something wasn’t possible and then, a few years later, the predicted impossible happened. There are some well-known examples of this in our own industry. Consider the following:

In 1961 Tunis Craven, an FCC Commissioner, said, “There is practically no chance communications space satellites will be used to provide better telephone, telegraph, television, or radio service inside the United States.” Four years later the first commercial communications satellite was launched.

In 1878 Sir William Preece, the chief engineer of the British Post Office declared, “The Americans have need of the telephone, but we do not. We have plenty of messenger boys.”

The same year an internal memo at Western Union opined that, “This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.”

The famous movie producer Darryl Zanuck was quoted in 1946 as saying, “Television won’t be able to hold on to any market it captures after the first six months. People will soon get tired of staring at a plywood box every night.”

In possibly the most famously wrong technology quote, Bill Gates said, “We will never make a 32-bit operating system.”

Along the same lines, Ken Olsen, the president of Digital Equipment Corporation, a major manufacturer of minicomputers, said in 1977, “There is no reason anyone would want a computer in their home.”

Our industry is full of predictions about the future. You always have to wonder which of these predictions will come true and which will prove to be totally wrong. It seems predictions are of two types. Some predictions are made about working technologies that aren’t going to make a dent in the marketplace. Our technology history is full of devices that nobody wanted to buy. But there are also many predictions made, like the examples above, about the limitations of technology and what is possible.

I know that technology improvements have taken me by surprise a few times. If in the early 2000s you had asked me to predict the top data speeds that could be achieved on telephone copper or on coax, I would have underestimated the speeds by an order of magnitude or more. At that time scientists in the labs all said that there were insurmountable interference issues that made higher frequencies unusable on both kinds of copper.

But some smart scientists doggedly worked on these problems and today we have G.fast, which delivers gigabit speeds over short distances of telephone copper. And we have the potential for incredibly fast speeds on coax. I’ve witnessed a lab test that put 6 gigabits through a piece of coax.

I make a lot of predictions in this blog. But one thing I’ve learned is that you should never say never when it comes to technology (well, except maybe for tabletop fusion power). Instead predictions are better made talking about the likelihood of something being achieved in the foreseeable future versus the distant future.

A good example of this is gigabit cellular service. Unfortunately some of the press has confused millimeter wave wireless with 5G cellular and there are articles all over the web talking about the coming gigabit cellular service. Is gigabit cellular possible? I would venture to say that in a lab setting with a small number of phones this might be possible today or in the near future.

But there are physics limitations that limit gigabit wireless speeds to short distances and for this technology to ever become pervasive we would have to massively rework all cellular infrastructure to literally surround ourselves with cellular transmitters. It is that limitation that means that this is an extremely unlikely application within any reasonable time frame. It’s certainly possible someday that we might be surrounded by tiny IoT devices that can somehow work as a mesh network to bounce around fast data signals. But there are a whole lot of technology breakthroughs needed first to implement such a technology. So is gigabit wireless possible – I think it is. Will we see it in our lifetimes other than perhaps in a few controlled settings – I predict not. Guess we’ll have to wait to see if I’m right.

An Effective Federal Broadband Program, Part 1

There are a lot of rumors flying around the industry that there is going to be a big nationwide federal program to fund rural broadband infrastructure. So I’ve been thinking about what such a program might look like. We have the experience, a few years back, of a few billion dollars being handed out for broadband by the stimulus plan. It’s vital to learn from past mistakes, and so today I look at lessons learned from earlier federal grant programs.

This is the first in a series of blogs that will look at how a federal broadband program could be done to get the most bang for the federal buck. We might only get one chance at this as a country, so I hope we can do this right.

So, in starting with lessons learned from the past, here are a few things that a nationwide federal broadband build-out should avoid:

Don’t Impose Unnecessary Restrictions. There were three rules associated with the stimulus grants that added a lot of cost and delay to projects. A federal project could get a lot more bang for its buck by eliminating the following:

  • Environmental Impact Studies. Telecom networks are built almost entirely in existing rights-of-way within a few feet of paved roads. So there is no reason to impose a time-consuming study to prove that a fiber cable won’t bother endangered plants or animals unless the fiber is being built outside of the existing rights-of-way.
  • Historical Preservation Rules. Having to check that fiber is not going to somehow disturb historic sites is also silly unless the fiber is being built across open fields. There should be no requirement to do archeological studies for work done in the narrow shoulders of existing highways that have been dug up in the past.
  • Prevailing Wages. I saw projects where requiring prevailing wages added 20% to the cost of the whole project. Prevailing wages sound like a good idea, but in practice large-city wage structures get imposed on construction companies that have been building in rural areas for decades. Making these companies pay much higher wages to employees who have worked for them for years is great for those employees, but it’s a terrible waste of federal dollars.

Don’t Overwhelm the Industry. A federal broadband buildout could be an order of magnitude larger than the stimulus program, and even that program overwhelmed the industry. There are only a finite (and small) number of consultants, engineers and construction companies available in the market, and if the government tries to build a lot of infrastructure in a hurry, then a lot of projects are going to be designed and built by companies with no experience.

The stimulus program also showed that it’s not hard to overwhelm the companies that make broadband products. The stimulus program caused a shortage of fiber and prices spiked. There was also a shortage of some kinds of common fiber electronics that delayed projects. It’s hard to imagine what would happen if we tried to build a lot faster than the stimulus program.

Don’t Give Money to Start-ups. The stimulus program gave a lot of money to start-up businesses and a number of these networks have not done well. There was unfortunately a lot of fiber built to nowhere with stimulus funds that even today is barely carrying any traffic. Existing carriers already have the underlying talents and systems in place that are needed to be a successful telecom company. It does no good to get fiber built to people’s homes unless the company doing so is poised to be a long-term successful ISP.

Hire Experienced People to Review Applications. There was no existing pool of experienced people to review the stimulus grant applications, and so the agencies involved scurried to try to find bodies. I’ve written about this before, but to see if the process was as bad as I feared I encouraged a guy who did my landscaping to apply to be a reviewer. He had done some computer coding years earlier but otherwise had zero experience with telecom. To both of our astonishment he was offered a position as a grant reviewer. If there are a lot of grant funds available there will be a ton of unworthy and faulty applications and it takes seasoned industry veterans to be able to distinguish the good ones from the bad ones.

Take Only Real Matching Funds. The stimulus grants required a significant amount of matching for the federal grant dollars. Unfortunately not all of the matching was cash – the program also accepted ‘in-kind’ matches. An in-kind match was supposed to be an asset with significant and quantifiable benefits to the project. I reviewed a number of successful grant applications and saw that many of them made outlandish claims of in-kind matches that the feds accepted. As an example, I saw one grant that claimed a huge dollar benefit for already having existing rights-of-way on state highways. The fact is that those same rights-of-way are available to anybody who meets the qualifications. But the in-kind match meant that the applicant didn’t need any actual matching cash to get the grant.

Get the Industry to Design the Grant Forms. I’ve been doing telephone accounting since the 1970s, and the stimulus grants asked for expenses and capital expenditures in a format that baffled me at times. Most telecom companies keep similar books and it’s not hard to ask for financial information in a way that everybody understands.

The CenturyLink – Level 3 Merger

CenturyLink just confirmed its bid to buy Level 3 Communications for $34 billion in the latest round of what looks like major industry consolidation. After Verizon’s purchase of XO Communications it looks like large nationwide fiber networks are going to be gobbled up by larger players.

But we can’t quite put this merger in the books yet. There have been rumors floating for the last year of others interested in the company. Just this summer there were strong rumors that Comcast wanted to buy Level 3. And now there is a lot of speculation that the big wireless companies are also interested in the company. So don’t be surprised by one or more counterbids.

Why is Level 3 wanted by so many large players? The easy answer is that they have a huge fiber network, but it’s more because they have a fiber network that goes to all of the right places. Big companies like Verizon and AT&T are already connected into all of the major fiber hubs around the country. But Level 3 is connected nearly everywhere else. Their network extends out to a huge number of tier two and three cities.

And more than that, Level 3 has a lot of connections to the big fiber users in local markets – the ISPs, large businesses, governments, school systems and cellular sites. The company has been busy for many years building fiber to places asking for big broadband.

This makes Level 3 a huge player in the Internet backhaul business. They are the ones that carry a lot of the Internet backbone to the smaller competitors of the giant incumbents. Level 3 also serves the supply side of the Internet and is a prime supplier of bandwidth to companies like Netflix, as well as the many large data centers for the other big web companies. Level 3’s revenues have been booming with the explosion of video traffic on the web.

CenturyLink is already a significant player nationwide for large businesses and governments. Before Qwest bought the old US West company they had built a significant nationwide fiber network and had vigorously pursued nationwide customers. That business has been extended and grown under CenturyLink and this acquisition would push the company to the top of the heap in the fiber business. There are so many benefits of the acquisition that nobody is questioning the sense of the merger (unlike the AT&T and Time Warner merger that has analysts scratching their heads).

I have a lot of clients that are going to be concerned about this acquisition (and others who will be once they understand the implications). Level 3 is one of the primary providers of fiber backhaul to reach the Internet for a huge number of small communities, and in many cases they are the only alternative to buying overinflated backhaul directly from the incumbents.

There are a lot of small ISPs and other users of broadband that are going to be worried about losing affordable backhaul – particularly those that compete with CenturyLink. The fear is not that the combined company will deny connectivity to these places, but that over time backhaul prices will rise for anybody who competes directly with CenturyLink. It wouldn’t take long for smaller competitors to be put at a competitive disadvantage.

There is another class of carriers that might not even know that the merger could harm them. It turns out Level 3 is the primary underlying carrier for most wholesale VoIP products sold to carriers. Level 3 has developed a product called local access that gives carriers connections into all of the right places to deliver VoIP traffic to the PSTN. When somebody today pays $6.50 to buy a wholesale VoIP line it’s likely that half of that money goes to Level 3. CenturyLink could gut the VoIP world and a lot of competitors by discontinuing or restricting that product.

So the concern with any merger like this is what it’s going to do to limit competition. Every big merger decreases competition significantly in some markets. This merger holds out the possibility of harming competition over the very large geographic footprint covered by CenturyLink. Big mergers like this almost always come with restrictions against bad behavior from the FCC or the Justice Department. But we’ve seen big telcos often ignore such restrictions within a few years after a big merger.

CenturyLink is not making this purchase to eliminate competition. There are numerous benefits directly to the company that are drivers of the transaction. But we know that over time companies act to limit competition when they have the ability to do so. We’ve seen this happen in huge ways with Comcast, Verizon and AT&T. We’ve not seen nearly as much anti-competitive behavior in the past from CenturyLink (and their predecessor Qwest) – but this merger puts them into the position to act like the other large companies if they so wish. And my cynical side says that the bigger a company gets, the more it benefits them to be anti-competitive.

Technology and Telecom Jobs

In case you haven’t noticed, the big companies in the industry are cutting a lot of jobs – maybe the biggest job cuts ever in the industry. These cuts are due to a variety of reasons, but technology change is a big contributor.

There have been a number of announced staff cuts by the big telecom vendors. Cisco recently announced it would cut as many as 5,500 jobs, or about 7% of its global workforce. Cisco’s job cuts are mostly due to the Open Compute Project, where the big data center owners like Facebook, Amazon, Google and Microsoft have turned to a model of developing and directly manufacturing their own routers, switches and data center gear. Cloud data services are meanwhile wiping out the need for corporate data centers as companies move most of their computing processes to the much more efficient cloud. Even customers that are still buying Cisco boxes are buying fewer of them, since the newest technology provides a huge increase in capacity over the old and they need fewer routers and switches.

Ericsson has laid off around 3,000 employees due to falling business. The biggest culprit for them is SDNs (Software Defined Networks). Most of the layoffs are related to cell site electronics. The big cellular companies are actively converting their cell sites to centralized control with the brains in the core. This will enable these companies to make one change and have it instantly implemented in tens of thousands of cell sites. Today that process requires upgrading the brains at each cell site and also involves a horde of technicians to travel to and update each site.

Nokia plans to lay off at least 3,000 employees and maybe more. Some of these layoffs are due to the final integration of the Alcatel-Lucent purchase, but they also stem from the technology changes that are affecting every vendor.

Cuts at operating carriers are likely to be a lot larger. A recent article published in the New York Times reported that internal projections from inside AT&T had the company planning to eliminate as many as 30% of their jobs over the next few years, which would be 80,000 people and the biggest telco layoff ever. The company has never officially mentioned a number but top AT&T officials have been warning all year that many of the job functions at the company are going to disappear and that only nimble employees willing to retrain have any hope of retaining a long-term job.

AT&T will be shedding jobs for several reasons. One is the big reduction in technicians needed to upgrade cell sites. But an even bigger reason is the company’s plan to decommission and walk away from huge amounts of its copper network. There is no way to know if the 80,000 number is valid, but even a reduction half that size would be gigantic.

And vendor and carrier cuts are only a small piece of the cuts that are going to be seen across the industry. Consider some of the following trends:

  • Corporate IT staffs are downsizing quickly due to the move of computing functions to the cloud. A huge number of technicians with Cisco certifications, for example, are finding themselves out of work as their companies eliminate their data centers.
  • On the flip side of that, huge data centers are being built to take over these same IT functions with only a tiny handful of technicians. I’ve seen reports where cities and counties gave big tax breaks to data centers because they expected them to bring jobs, but instead a lot of huge data centers are operating with fewer than ten employees.
  • In addition to employees, there are fleets of contractor technicians who do things like update cell sites, and those opportunities are going to dry up over the next few years. There will always be work for technicians brave enough to climb cell towers, but that is not a giant work demand.

It looks like over the next few years there are going to be a whole lot of unemployed technicians. Technology companies have always been cyclical, and it’s never been unusual for engineers and technicians to work for a number of different vendors or carriers during a career. But in the past, when one part of the industry downsized, the slack was mostly picked up somewhere else. This time we might be looking at a permanent downsizing. Once SDN networks are in place, the jobs for those networks are not coming back. Once most IT functions are in the cloud, those jobs aren’t coming back. And once the rural copper networks are replaced with 5G cellular, those jobs aren’t coming back.

Forced Arbitration

You may have noticed that the majority of consumer contracts and consumer terms of service documents now require arbitration to resolve disputes. These clauses are in almost every contract or terms of service you sign, such as when you take out a credit card, buy a new washer or open a new bank account. Arbitration is now included in most telco and ISP terms of service that customers must accept before buying a new service.

Arbitration is a process that has been used in the business world and in the telecom world for many years. With arbitration, two parties to a contract submit their dispute to one or more impartial persons for a final and binding decision, known as an award. These awards are made in writing and are generally final and binding on the parties in the case. Most contracts between carriers use arbitration because it’s a faster and less costly way for two carriers to resolve a dispute. Arbitration works well between companies that voluntarily agree to abide by a decision made by an arbitrator. Disputes are resolved more quickly than with a normal lawsuit, and since the decision is binding the two sides can move forward without having to worry about appeals.

But forcing arbitration on consumers when they buy products or services is a very different situation because the two parties are not equal. Consumers are forced to agree to binding arbitration by checking a box on a computer screen or by signing a receipt to accept a product. This is a very different relationship than one between two commercial companies, since the consumer has limited rights to start with and the agreements people must sign constrict their rights even more.

FCC Commissioner Mignon Clyburn has taken the position that forced arbitration is bad for consumers. There is also a movement in Congress, with a bill proposed by Senators Patrick Leahy and Al Franken to end the practice. They all argue that the arbitration process is heavily biased in favor of the big companies, like ISPs, that force arbitration.

And they are right. An individual consumer of broadband or cable service isn’t going to invoke a very costly arbitration process to resolve a billing dispute. It’s hard to imagine that would ever happen. And so the practical impact of forced arbitration is that big companies can overbill or abuse customers with no fear of having to make things right.

In the telephone world consumers used to be protected by tariffs that were on file with regulators. Those tariffs contained a lot of rules about how customer complaints would be resolved. For the most part the rules were not too heavily biased in either direction, and so the mere presence of the rules generally meant that a customer could reach an agreement with a telco over a dispute without having to go to a higher level. But if necessary the customer could complain to the state regulatory Commission. Most states had a few hundred such complaints per year and these generally led to a forced conversation between the telco and the customer to reach a resolution.

But tariffs are largely gone. Some, but not all, cable franchises have created rules to provide some consumer protection. And there is nothing like a tariff for broadband products.

The main reason large ISPs are using forced arbitration is not so that they don’t have to adjust customers’ bills. The arbitration provision makes it much harder to bring a class action lawsuit against a carrier for harming many customers. You can like or hate class action lawsuits. There certainly have been abuses in this area with unscrupulous lawyers filing such suits in the hope of reaching a settlement. But there are also many cases where these suits were the only way to get large companies to stop deceptive billing practices or other ways of ripping off customers.

What I find most interesting about Commissioner Clyburn’s position is that perhaps the FCC is now in the process of doing something about forced arbitration for telco products. Other government agencies, including the Department of Education, the Department of Defense, and the Consumer Financial Protection Bureau, are trying to crack down on the practice.

It’s possible that Title II regulation gives the FCC the authority to address the issue. I’m no lawyer and I have no idea if Title II regulation gives the FCC that power. But in the past, the protections built into tariffs largely flowed downhill to the states due to the FCC’s position on customer rights, and that authority stemmed from Title II regulation of telephone service.

I think small ISPs ought to check whether their customer terms of service contain forced arbitration. I would bet that many do, because it’s common for small companies to copy the terms of service from some larger company. If you have such a clause you ought to consider the message it gives to your customers. It says that they are basically powerless to sue you over a dispute. That may sound like a good thing, but ask how many times customers have actually sued you over one of your consumer products. Chances are that it’s zero. The fact is that small companies find ways to resolve issues with customers while big companies do not. Removing forced arbitration from customer contracts is more customer friendly, and that is probably in your best interest. Also, if the FCC makes this mandatory, you want to be able to say to your customers that you were ahead of the curve and never required forced arbitration.

Lies, Damned Lies and 5G

I’m not sure that there is a major industry that lies more to its customers than the cellular industry. The whole industry has spent the last decade touting its 4G LTE networks, when in fact the industry is just now installing the first cell sites that actually meet the 4G standard.

And now we are starting this cycle all over again with the industry buzz about how 5G is right around the corner. But it isn’t. And I will take bets that within the next year or so one of the cellular companies is going to tell their customers they now have a 5G network.

Every once in a while somebody in the industry tells a little bit of the truth. At a recent Qualcomm summit in Hong Kong, Roger Gurnani, the EVP and chief information and technology architect at Verizon, said that 5G is not a replacement for 4G and that LTE will be around for many years. And he is right, because it’s going to be at least ten years until a customer anywhere will be able to use a cellphone that meets the full 5G standard. But there is no way that anybody at one of the cellular companies is ever going to say that.

The 4G standard was established around 2008, and we are just now seeing US cell sites implementing what is being labeled LTE-Advanced, the first deployment that meets the full 4G standard. I say ‘around 2008’ because the effort to create the new 4G standard took two different paths, WiMAX and LTE, with different timelines. The standards for 5G are still under development and probably aren’t going to be finalized until late 2019.

How have the cellular companies been able to claim 4G all these years with a straight face (and without getting shut down by the Federal Trade Commission or hit with class action lawsuits)? The answer lies in the fact that the specification for a standard like 4G or 5G contains a lot of different components. To use a simple analogy, if there are ten technology improvements needed to migrate from 3G to 4G, then the cellular companies started touting 4G after only one or two of the upgrades. But until all of the improvements have been implemented, a customer cannot receive the actual promised benefits of the 4G standard.

A lot of this has to do with marketing hype. Think back to a decade ago when there was an arms race to be the first cellular company to have 4G. All the cellular commercials made 4G claims and we were bombarded by maps showing who had the best 4G coverage. But these claims were made by the marketing folks at the wireless companies and the fact is that all of those maps were a lie and nobody had 4G. Even now most people can’t get full 4G.

The cellular companies are also egged on by their vendors. Right now, being first to support 5G is all that the companies that make wireless equipment want to talk about, so if you go to an industry forum that is all you will hear. I’ve noticed numerous 5G summits being announced around the world, mostly led by vendors, to talk about the next generation of cellphones for which the standards are not even finished.

I see several problems with the inflated hype from the cellular companies. First is that customers don’t see much evidence of the upgrades from one technology to another because the upgrades are made incrementally in little steps. The first customers that bought a 4G cellphone didn’t get very much faster speeds than they had on 3G.

Today the average data speeds in the US on 4G connections are just over 7 Mbps. Some customers in some instances can do much better than that, but that is the average for the billions of connections made. When 4G is finally everywhere (and full 4G may never be put into more rural cell sites) that average speed ought to creep up to about 15 Mbps as long as cell sites aren’t overloaded. The first phones cited as 5G are probably not going to do much better than 4G, but as upgrades are implemented over time the 5G speeds are supposed to creep towards 50 Mbps.

And that is the second problem I see with the inflated claims of the cellular companies. By touting that much faster cellphones are right around the corner they are causing those who would build fiber landline networks to pause. I am sure that this is on purpose – one only has to read an AT&T or Verizon press release to see that is part of their motivation. But nobody would pause in building fiber if these companies were to tell the truth and say that 50 Mbps cellphone coverage might be possible in ten years. That is the real harm from these lies.

The Google Fiber Rumor

My wife went into a Best Buy this week, and when the salespeople there found out that we are fiber consultants, the first thing they wanted to know was if the rumors were true that Google Fiber is coming to our town early next year. They firmly believed this was the case and were really excited about the possibility.

I live in a small town halfway between Fort Myers and Sarasota in Florida. Google Fiber had been in talks with Tampa which is about two hours north of here. But I can’t imagine that our community is on anybody’s radar to build fiber. I live in a snowbird community, meaning we are where northerners come to get away from winter. For about six months a year this is a ghost town. Most of the houses in my neighborhood are dark for half the year. It’s hard to think that anybody would build fiber in an area where half the potential customers are gone half the year, and where a lot of the customers are elderly and not particularly interested in fiber broadband speeds.

But I find it intriguing that there is a strong rumor about getting fiber in this area. I am sure this rumor started with folks in Tampa, since Google Fiber has been in talks with that city for the last few years. I guess people assume that if Google comes there, it will come to the whole region.

But I think this rumor speaks to how much fiber is wanted. These salespeople were young techie guys and would be expected to be part of the fiber demographic. There are a number of people in every community who would love to buy fiber and who would sign up as soon as it is available. But the big question that still has to be answered is whether there are enough people willing to pay a premium price for fiber to even support a business plan.

It certainly doesn’t seem like Google Fiber has been going gangbusters. They don’t release customer numbers, but the general buzz in the industry is that they haven’t picked up as many customers as they had hoped for. And that, possibly more than any other factor, has probably led to their taking a ‘pause’ from new fiber expansion.

Building fiber networks is expensive. I create fiber business plans and have studied every size market possible, from farmlands to NFL cities. The one common feature of every fiber business is that there is some minimum customer penetration rate needed just to break even – with breaking even meaning being able to cover all of the costs of operations, including capital.
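To make the breakeven idea concrete, here is a minimal sketch of that calculation. Every number and the simple annual-cost-of-capital model are illustrative assumptions of mine, not figures from any actual business plan – a real study would model debt terms, churn, take-rate ramp-up and more.

```python
# Hypothetical breakeven sketch. All inputs are illustrative assumptions.

def breakeven_penetration(homes_passed, cost_per_passing, drop_cost,
                          monthly_revenue, monthly_opex_per_customer,
                          annual_capital_rate=0.08):
    """Return the lowest whole-percent penetration rate at which annual
    margin covers the assumed annual cost of capital, or None if no
    penetration rate up to 100% breaks even."""
    for pct in range(1, 101):
        customers = homes_passed * pct / 100
        # Capital: network past every home, plus a drop for each customer.
        capital = homes_passed * cost_per_passing + customers * drop_cost
        # Annual margin: revenue less operating cost, per customer.
        annual_margin = customers * 12 * (monthly_revenue - monthly_opex_per_customer)
        if annual_margin >= capital * annual_capital_rate:
            return pct
    return None

# Example: 10,000 homes, $1,400 per passing, $700 per drop,
# $70/month revenue, $30/month operating cost per customer.
rate = breakeven_penetration(10_000, 1_400, 700, 70, 30)
print(f"Breakeven penetration: {rate}%")  # → Breakeven penetration: 27%
```

Even with these favorable assumptions, the sketch lands at a penetration rate in the high twenties – which illustrates why a builder content with breakeven can justify markets that a profit-driven overbuilder cannot.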

When municipalities and cooperatives look to build fiber they want to make sure that they do a little better than breakeven. They will obviously be pleased if a fiber business does even better and spins off cash, but they worry more about having to subsidize a fiber business. And it is this perspective that makes it seem easier in some ways to build fiber to small towns and even to farms than to big cities. For these builders, breakeven is good enough.

But Google Fiber or CenturyLink or any of the other commercial fiber overbuilders want to do much better than breakeven. These big companies are beholden to shareholders who expect a significant increase quarter-over-quarter in profits and returns. And the need for significant profits means they have to get a lot of customers to meet their financial goals.

Frankly, high profit expectations and high capital costs don’t jibe very well. Fiber is infrastructure, and it’s a real challenge to get high returns out of any kind of infrastructure. Other utilities like electric or water are a lot more realistic and hope to make a modest but steady profit for a long time. If fiber overbuilders were being realistic they would have the same perspective – but tech companies are not utilities, and their stockholders are not going to be patient with slow, steady returns.

Google Fiber is now on hold and will consider expanding again if they can find a way to use wireless technology to build the last connection to customers. Assuming that such a technology lowers costs (not a given), then this would reset the bar and lower the breakeven needed – and also make it easier to make profits. But even then it’s going to cost a huge amount of money to build broadband in a city. Wireless networks like the ones Google is envisioning still require a lot of fiber, and that means they will still be an infrastructure-heavy business.

I think there is a good possibility that Google Fiber will never resume their expansion plans using an infrastructure model. This is going to disappoint millions who have been hoping for fiber like the guys at Best Buy. Google Fiber might still consider opportunities like Huntsville, Alabama where the city paid for the fiber network. But my guess is that the Google parent company doesn’t have a real appetite for infrastructure returns, and that is why Google Fiber is on hold.