OTT News – August 2017

It’s been a busy time in the OTT market with players coming and going and the choices available to customers growing more complicated and confusing.  Here are some of the bigger recent events in the industry.

Continued Cord Cutting. The major cable providers lost 946,000 cable customers in the second quarter – the worst quarterly loss ever. This puts cord cutting at an annual loss rate of 2.7% of customers, up from only 1% a year ago. It’s obvious that cord cutting is picking up momentum, and the wide variety of OTT viewing options has to be a contributor. Nielsen recently reported that 62% of homes now watch OTT content at least occasionally.

It’s getting harder for analysts to count cable customers. For example, Dish Network no longer reports the specific performance of its satellite service versus Sling TV. The losses for the quarter were also eased a bit by the fact that Charter began counting seasonal customers even when they go dormant, such as snowbirds in Florida who subscribe only in the winter but keep the account active.

ESPN / Disney OTT Offering. Disney announced that it will launch two new OTT offerings in 2019 – a standalone ESPN offering and a standalone Disney offering. As part of the announcement the company said it will withdraw Disney content from Netflix. The ESPN offering will not duplicate the cable version of the network and will not include things like the NFL and the NBA, but it will include Major League Baseball, the NHL, Major League Soccer, Grand Slam tennis events and college sports. Analysts think this offering is mandatory since ESPN has lost 13 million subscribers since 2011 and advertising revenues dropped 8% last quarter.

The standalone Disney offering is also interesting in that the company has decided to take Netflix on head-to-head. Because of contractual arrangements, Netflix will still have access to content produced by Disney, such as the numerous shows produced by Disney’s Marvel Studios. But starting in 2019 Disney is going to make new content available only on its own platform. This prompted Netflix to purchase Millarworld, a major comics producer.

NBC Closing Seeso. NBCUniversal says that it will shut down the Seeso OTT offering later this year. The offering consisted largely of NBC comedy and related entertainment such as Saturday Night Live and The Tonight Show Starring Jimmy Fallon.

This failure is a big warning to the many cable networks that have been contemplating the strategy of simply moving existing content online. Industry analysts say that taking linear content online as-is is not a recipe for success. It seems that the platform is just as important as the concept – the bigger platforms like Netflix keep customers engaged and enable them to move from show to show without ever leaving the platform. But it’s too easy for a customer to leave a limited-offering platform, which diminishes the perceived value of a subscription.

Facebook OTT Offering. Facebook has announced the launch of Watch, an OTT service that will include content from A&E, Univision and Major League Baseball, along with other programming such as worldwide soccer. For now the new service is being launched overseas with some limited US trials, but it is expected to reach the whole US market later this year.

The offering is being structured like YouTube to enable content creators to launch their own channels. Facebook is currently funding some content providers to seed content on the new service. The company hopes that in time the platform becomes self-sustaining and can be an alternative to the wildly popular YouTube. Facebook is counting on its ability to lure enough of its billion-plus users to the new platform to make it a success. The company’s goal is to keep people on its platform for more than just social networking.

Apple. Apple will be entering the OTT world and announced that it will spend $1 billion to create programming content over the next year. This puts Apple into rarefied company with Netflix, which is spending $6 billion, Amazon at $4.5 billion and HBO at $2 billion. There is no news yet on the nature or timing of an Apple OTT offering.

The Best Way to Bundle

I read an interesting quote recently in an article written by Mike Dano of FierceWireless. He interviewed Ronan Dunne, the EVP of Verizon Wireless, and quoted Mr. Dunne as saying, “In competitive markets, and the U.S. is one, if you’ve got real choice in the individual products, the cost of bundling is that you end up taking the second-best wireless product and you map it to the third-best TV bundle in order to get the cheapest broadband connection or fiber connection. No wonder you get $5 off at the end of the bill.”

That statement is a perfect lead-in to talk about the different ways to bundle. Mr. Dunne was referring to bundles like the one that AT&T does with DirecTV to try to get more video customers. That AT&T bundle is similar to what we see from most of the big ISPs. I wouldn’t even label these efforts as bundles, but rather as marketing specials that are designed to lure customers to buy specific product sets.

And Mr. Dunne is right. If you go to the web pages of all of the big ISPs you will see them splashed with really low-sounding special prices. By now most people have figured out that the price for these specials increases at the end of the special term. And people often find that even with these specials the actual price paid is higher, because the ISP loads them up with all sorts of extra fees and charges that were not described in the advertising.

But Mr. Dunne is making an even more important point in that these specials end up luring customers to buy the smallest and least profitable products that an ISP sells. In order to advertise a cheap web price the ISP will pair its slowest broadband product with a small cable TV package. When customers contact the company to buy this special, the customer service rep answering the phone has an uphill battle to talk the customer into anything better – because the customer already has the advertised low price in mind when they call. Mr. Dunne went on to say that this kind of bundling is not attractive to Verizon Wireless and that they would much rather sell premium products at a fair market price.

My clients face this same dilemma all of the time. I have some clients that take the exact opposite approach. They list all of their possible packages on the web, including those that might cost over $150 per month. But companies that do this face the opposite problem in that the high prices on the web might drive customers away from buying what they really want.

Many of my clients don’t post bundled pricing on their web sites for these exact reasons. They don’t want to lure people with false specials and they don’t want to chase customers away by talking about high prices. I see these clients taking several different approaches on how to handle bundling.

Some provide a discount for buying multiple services. For instance, they might discount $5 when somebody buys two products and $10 when they buy three. I’ve never particularly liked this kind of discounting for a few reasons. First, if a customer buys your lowest-margin products, such as your smallest cable package and a basic telephone line, then this discount might be giving away most of the margin on those small products. Meanwhile, another customer that buys the two highest-margin products gets the same discount. I also don’t like the message it sends – it implies that in general your products are overpriced.

I have other clients that don’t give any bundling discounts. They try to right-price each product on a standalone basis. They are not afraid to tell this to their customers and they take pride in believing that each product is a bargain at the price they sell it for. I like this approach because I like the math. If a company ends up giving some sort of bundling discount to most of its customers then it has given up margin on every one of them. If you do the math you’ll see that you’d make more money with no discounts even with significantly fewer customers. A $10 bundling discount is giving away $10 of bottom-line margin, which for most ISPs is a significant amount.
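
To make that math concrete, here is a minimal back-of-the-envelope sketch. The subscriber counts and the $40 per-customer margin are purely hypothetical numbers chosen for illustration, not figures from any client.

```python
# Back-of-the-envelope comparison: bundling discount vs. no discount.
# All figures below are illustrative assumptions, not actual client numbers.

monthly_margin_per_customer = 40.0   # assumed bottom-line margin on a typical package
bundle_discount = 10.0               # the $10 discount discussed above

customers_with_discount = 1000       # assumed customers won using the discount
customers_without_discount = 800     # assumed customers won at full price (20% fewer)

margin_discounted = customers_with_discount * (monthly_margin_per_customer - bundle_discount)
margin_full_price = customers_without_discount * monthly_margin_per_customer

print(f"Monthly margin with the $10 discount: ${margin_discounted:,.0f}")  # $30,000
print(f"Monthly margin with no discount:      ${margin_full_price:,.0f}")  # $32,000
```

Even after losing a fifth of the customers, the no-discount case comes out ahead in this example.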

I’ve always asked clients who give big bundling discounts if they think they are saving any money when customers buy multiple products. The answer I get back – when they really think about it – is that they don’t save much. I think a lot of small companies bundle because the big ISPs do it and they think it’s the only way to do business. But I look at companies like Google and many of my other clients that don’t bundle, and I see them getting market penetration similar to my clients that offer bundles.

There is no question that it’s harder to sell without the bundle. It largely means that a sales call with a customer needs to be consultative and a good salesperson will ask a customer to define what they really want before talking price. Then, if the price is too high they will work with a customer to find a compromise they can live with. This kind of sales approach is going to sell a lot more of your premium products. And it’s going to make customers better understand just what they are buying. I think a lot of the customers that buy the cheap advertised bundles are not really happy with their products and are likely to churn at the end of the contract. What they really might want is faster data speeds or more TV channels, but when they start the conversation with the ISP based upon getting the lowest price that real desire gets lost in the transaction.

The main point of this conversation is that ISPs really need to examine their bundling practices. Just copying the big companies might mean giving away a lot of bottom line needlessly. And offering big discounts to new customers might not be adding many new customers after considering the churn and loyalty from customers who only buy due to the specials.

 

Lifeline on the Line?

There are two bills currently in Congress worth monitoring:

Rural Spectrum. The first is called the Advancing Innovation and Reinvigorating Widespread Access to Viable Electromagnetic Spectrum (AIRWAVES) Act (S. 1682) in the Senate. This was introduced by Senators Cory Gardner (R-CO) and Maggie Hassan (D-NH) and aims to encourage the FCC to continue to free up spectrum for both commercial and unlicensed use.

There are still significant chunks of spectrum that are restricted to government use, and there are other blocks of spectrum that are underutilized due to various restrictions that the FCC has put in place on their usage in the past. The FCC needs to keep looking hard at every viable slice of spectrum to make sure we are getting the best use for each slice.

The one feature of this bill that bothers me is that it would set aside 10% of any future spectrum auctions to directly build wireless broadband infrastructure in rural areas. The goal of freeing more spectrum for rural broadband is obviously a good idea, particularly freeing more unlicensed spectrum. But as we’ve seen from the CAF II program, handing billions of dollars to a company like AT&T to beef up cellular towers does not automatically equate to providing better rural broadband. The federal government needs to stop subsidizing AT&T and Verizon and instead aim any federal funding towards getting real rural broadband. The 10% giveaway is nothing more than a subsidy for these giant companies.

Lifeline Reform. A group of nineteen Republicans in Congress recently introduced a bill that would curtail the federal Lifeline program. That program is part of the Universal Service Fund and provides a $9.25 monthly subsidy to lower-income homes for telecommunications, which today can be used for landline telephone, cellular telephone or landline broadband, with only one such subsidy allowed per household.

The legislation would eliminate the Lifeline subsidy for cellular service. The wording is not clear, but it seems to also eliminate the subsidy to be used for broadband. The discussion of the bill in the press makes it sound like the intent of this bill is to restore the program to its original goal of only subsidizing landline telephones. With the diminished interest in landlines that original goal now seems largely out of touch.

If enacted this would significantly reduce the payments from the Lifeline fund, and the most troublesome aspect of the bill is that it would then send these excess Lifeline dollars to the US Treasury. The entire Universal Service Fund is funded by monthly surcharges on interstate telecom services. Every telephone subscriber, cellular subscriber and interstate transport customer currently pays these fees. The current surcharge is a whopping 17.1% added to interstate telecom services – far higher than when the fund first started. The surcharge is now so high that it pushes service providers to engage in arbitrage and define services as something other than interstate – in effect to cheat and lie about what they are selling.

The FCC and Congress have always worked hard to avoid defining the USF surcharge as a tax – instead it’s defined as a ‘fee’ whose purpose is to fund the same kinds of telecom services in rural America that are available in urban America. But if Congress raids the USF fund it will have openly made clear that the surcharge is just another federal tax. If Congress is going to reduce the outflow from the USF fund then it also needs to reduce the monthly fees.

It’s been obvious for years that the USF funding base needs to be expanded, with the logical expansion being to add the fee to broadband services. If that were done the fee would drop to a tiny percentage of the cost of broadband instead of the 17.1% surcharge on interstate telecom services. But for reasons I can’t understand it still seems to be off-limits to put a fee on broadband services, even though broadband is now the primary product sold by the industry.
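
A simple worked example shows why broadening the base matters. The revenue figures below are rough placeholders chosen only so the starting point lands near the 17.1% surcharge mentioned above; they are not official USF or industry numbers.

```python
# Illustration of how expanding the USF funding base would shrink the surcharge.
# The revenue figures are rough placeholders, not official USF or industry numbers.

usf_outlay = 9.0e9                    # assumed annual amount the fund must collect
interstate_telecom_revenue = 53e9     # assumed current billable base
broadband_revenue = 200e9             # assumed additional base if broadband were included

current_rate = usf_outlay / interstate_telecom_revenue
expanded_rate = usf_outlay / (interstate_telecom_revenue + broadband_revenue)

print(f"Surcharge on interstate telecom only: {current_rate:.1%}")   # ~17.0%
print(f"Surcharge with broadband in the base: {expanded_rate:.1%}")  # ~3.6%
```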

It offends me as both a taxpayer and as a telecom industry guy if the payments made into the Universal Service Fund are now just going to become another hidden federal tax. The USF has done a lot of good over the years to bring better telecom and broadband to rural America. If this funding is not going to be used for that mission then the fee charged to customers needs to be reduced.

I think that recent bad press might have prompted this bill. It’s been reported that there is fraud in some of the cellular Lifeline programs. But the way to clean that up is to cut off the offending service providers, not to penalize the subscribers. Studies have shown that for most of the people in the cellular Lifeline programs the cellphone is their only source of Internet connectivity. These subsidies are not being used for expensive iPhones, as the press sometimes insinuates – the participating carriers supply inexpensive basic phones that come with a tiny capped amount of voice minutes and data downloads. A lot of the subsidized phones go to the homeless and other marginalized parts of society, and the Lifeline phones provide them with a connection to services that would otherwise be out of their reach. I just can’t see the logic behind keeping the subsidy for landlines but not cellphones.

A New FCC Definition of Broadband?

Section 706 of the Telecommunications Act of 1996 requires that the FCC annually review broadband availability in the country. Further, that section of law requires the FCC to take immediate action if it finds that broadband is not being deployed fast enough. This is the law that in the past prompted the FCC to set a definition of broadband – first set at 4/1 Mbps in 2010 and then updated to 25/3 Mbps in 2015. The FCC felt it couldn’t measure broadband deployment without a benchmark.

In this year’s annual proceeding the FCC has suggested a change in the definition of broadband. They are suggesting there should be a minimum benchmark of 10/1 Mbps used to define cellular broadband. That doesn’t sound like a bad idea since almost everybody uses cellular broadband at times and it would be good to know that the cellular companies have a speed target to shoot for.

But I am alarmed at how the FCC wants to use the new proposed cellular broadband standard. They are suggesting that cellular service that meets the 10/1 Mbps standard can be considered a substitute for a landline broadband connection that meets the 25/3 Mbps test. This would represent a huge policy shift at the FCC, because using the cellular standard would allow them to claim that most Americans can already get broadband. And that would eliminate the need for them to take any action to make broadband better in the country.

We can’t be particularly surprised by this shift in policy because now-Chairman Ajit Pai vociferously objected when the FCC increased the definition of broadband in January 2015 to 25/3 Mbps. He argued at the time that the speed definition of broadband should not be increased and that both satellite and cellular broadband ought to be considered as substitutes for landline broadband.

But as almost anybody with a broadband connection can tell you, speed is not the only parameter that matters. Speed matters in a busy broadband home like mine when different family members are trying to make simultaneous broadband connections. But even homes with lower broadband needs care about more than speed. The limiting factor with cellular data is the stingy amount of total downloads allowed in a month – the new ‘unlimited’ cellular plans are capped at 20 to 25 gigabytes per month. And satellite data not only has stingy data caps but also suffers from latency issues that mean a satellite customer can’t take part in any real-time activity on the web such as VoIP, distance learning or live streaming video.
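
To put those caps in perspective, here is a quick calculation. The per-hour figure for HD video is a rough rule of thumb I am assuming, not a carrier-published number.

```python
# Rough illustration of how quickly a 20-25 GB monthly cellular cap gets consumed.
# The HD video figure is a rule-of-thumb assumption, not a carrier-published number.

cap_gb = 22                     # assumed cap, the middle of the 20-25 GB range
hd_video_gb_per_hour = 3.0      # assumed consumption for HD streaming

hours_of_hd_video = cap_gb / hd_video_gb_per_hour
print(f"Hours of HD video per month before hitting the cap: {hours_of_hd_video:.0f}")  # ~7

# At the proposed 10 Mbps cellular benchmark, continuous full-speed use
# would exhaust the entire monthly cap in a matter of hours.
hours_at_10_mbps = (cap_gb * 8 * 1000) / 10 / 3600
print(f"Hours of continuous 10 Mbps use to exhaust the cap: {hours_at_10_mbps:.1f}")   # ~4.9
```

Seven or so hours of HD video for an entire month is nowhere near what a typical landline broadband household consumes, which is why the cap matters even more than the speed.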

There are several possible motives for this policy shift. First, this could just be an attempt by the FCC to take off the pressure of having to promote faster broadband everywhere. If their annual Section 706 examination concludes that most people in the country have broadband then they don’t have to push expensive federal programs to expand broadband coverage. But there is also the potential motive that this has been prompted by the cellular companies that want even more federal money to expand their rural cellular networks. AT&T has already been given billions in the CAF II proceeding to largely improve rural cellular towers.

Regardless of the motivation this would be a terrible policy shift. It would directly harm two huge groups of people – rural America and the many urban pockets without good broadband. This ruling would immediately mean that all urban areas would be considered to have broadband today along with a lot of rural America.

I don’t think this FCC has any concept of what it’s like living in rural America. There are already millions of households that rely on cellular or satellite broadband. I’ve heard countless stories from households with schoolkids who spend upwards of $500 per month for cellular broadband – and even at that price these homes closely monitor and curtail broadband usage.

There are also huge swaths of rural America that barely have cellular voice service, let alone 10/1 Mbps cellular broadband. I was recently in north-central Washington state and drove for over an hour with zero AT&T cell coverage. But even where there is cellular voice service, the quality of broadband diminishes with distance from a cell tower. People living close to a tower might get decent cellular data speeds, but those even just a few miles away see greatly diminished broadband.

I know that Chairman Pai has two kids at home in Arlington, Virginia. There he surely has fast broadband available from Comcast, and if he’s lucky he also has a second fast alternative from Verizon FiOS. Before the Chairman decides that cellular broadband ought to be a substitute for a landline connection I would challenge him to cut off his home broadband connection and use only cellular service for a few months. That would give him a taste of what it’s like living in rural America.

Consolidation of Fiber Networks

I’ve written a few recent blogs discussing the amount of fiber that’s going to be needed to support the 5G networks envisioned by Verizon and AT&T. This blog in particular cited a recent Deloitte study that estimates that the cost to build the fiber needed to support a ubiquitous 5G network nationwide would be $130 billion.

We know that adding fiber is now a high priority for Verizon. They announced in April a deal to buy over $1 billion of fiber from Corning over three years. (As an aside, all of the press releases and articles about that purchase say that amount buys 12.4 million miles of fiber per year, or 37.2 million miles of fiber in total. There are only a little over 4 million miles of roads in the US, so those figures must mean miles of individual fiber strands, not route-miles. Pardon the interruption, but misleading statistics drive me up the wall.)
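
As a sanity check on those press-release numbers, here is a quick calculation assuming a hypothetical 144-strand cable; the strand count is my assumption, not anything Verizon or Corning disclosed.

```python
# Sanity check on the Corning purchase figures quoted in the press.
# The 144-strand cable size is an assumption chosen for illustration.

strand_miles_total = 37.2e6     # 12.4 million strand-miles per year for three years
strands_per_cable = 144         # assumed strands in a typical backbone cable

route_miles = strand_miles_total / strands_per_cable
print(f"Route-miles if every cable were 144-strand: {route_miles:,.0f}")   # ~258,000

# 37.2 million route-miles would be roughly nine times the ~4 million miles of US
# roads, so the press figure only makes sense as miles of individual strands.
```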

We can almost be certain that Verizon plans to build fiber for backhaul to cell sites. There are around 250,000 cell towers in the country today, but the deployment of small neighborhood cell sites could explode that number into the millions. Years ago both Verizon and AT&T elected to let other companies build and own cell towers, which spun off a major new industry, and in that process both companies largely agreed to lease fiber transport to reach those towers. But as cellular industry margins tighten, the companies are now looking to directly own as many of those fiber routes as possible to hold down lease expenses.

While Verizon plans to build a lot of fiber, they are also on an obvious path to buy existing fiber networks that supply transport to cell towers. Last year they purchased XO Communications and just last week announced they were buying a Chicago-area fiber network from Wide Open West.

I have seen several analysts speculate that Verizon will be considering more fiber purchases. Interestingly, the analysts focus on the potential purchase of large ILECs like Consolidated or Cincinnati Bell, which both own a lot of fiber. But much of the fiber in these companies is last-mile fiber to reach customers, and it would be curious to see Verizon buy back into that business. Just last year the company sold off a significant chunk of its FiOS fiber network to Frontier, and it would be a major reversal of that strategy to turn around and invest this soon in last-mile fiber. We’ve seen big companies pivot before, but this would be possibly the biggest such change of mind our industry has ever seen.

I think it’s more likely that they will consider buying transport fiber networks rather than last-mile networks. The problem the company faces is that there are not that many big fiber providers left. CenturyLink recently purchased the largest such network when it acquired Level 3, which owns over 55,000 miles of fiber. The only other fiber transport networks left that own over 10,000 miles of fiber are Birch, Zayo, EarthLink, Cogent and Lightower/Fibertech. There are only another half dozen companies that own fiber transport networks of between 5,000 and 10,000 miles. I have to think that Verizon and AT&T have considered buying many of these companies over the last year or two.

There is one other set of big fiber networks that doesn’t get as much national attention. These are fiber transport networks built largely by consortiums of independent telephone companies. Most of these networks were constructed as a way for the telcos to gain cheap fiber transport to the world outside of their operating territories. Many of these smaller telcos were held hostage to incredibly expensive special access transport from the RBOCs, which made it difficult for them to buy affordable Internet access. Since they were originally built, a lot of these networks have expanded throughout their operating regions and now connect to cell towers, large businesses, governments, universities and other customers needing fiber transport.

Most of these ILEC-owned networks have joined together to form INDATEL, whose member networks have a widespread footprint. Through this consortium many of these networks are now interconnected, providing a nearly nationwide fiber footprint. The various members have POPs in all of the biggest cities in their regions but then also reach the smaller communities that have largely been ignored by most of the other fiber providers, with perhaps the exception of Level 3.

I have no idea if either Verizon or AT&T has considered buying these networks. For a company like Verizon these fiber routes would provide transport into many areas where they don’t have fiber today. The owners of these networks might want to explore the possibility of selling their networks. Now that the networks are in place the ILECs that built these networks are no longer isolated from the rest of the world. A sale would let them capitalize on their investment in fiber at a time when fiber networks have an all-time high valuation.

Of course, the downside to all of this is that if Verizon, AT&T and a few others like CenturyLink gobble up the few remaining independent fiber networks they will have a virtual monopoly on fiber transport. During the XO and Level 3 purchases there were a lot of comments filed with regulators expressing concern about the negative impact of fiber consolidation on competition. I’d hate to see us go back to the bad old days when the only option for transport was a handful of the big telcos.

Broken Promises by Big ISPs

One of the most frustrating things for regulators has to be when giant ISPs renege on regulatory deals they’ve negotiated and don’t follow through with their promises. Books could be written listing all of the times when big ISPs have promised to do something and then never did it.

I am reminded of one such deal when I read that New York City is suing Verizon over its broken promise to bring FiOS fiber to the city. The lawsuit states that almost a million households are still unable to get FiOS, although the company had promised full coverage when they got a franchise from the city in 2008. In that agreement Verizon promised to bring fiber service to the whole city by 2014. The agreement with the city required that Verizon bring fiber, in conduit, directly in front of, behind, or otherwise adjacent to every residential building in the City.

Verizon had a similar longstanding dispute with the State of Pennsylvania. Back in 2002 the company made a promise to bring DSL service to cover 80% of the state as a prerequisite for the company being relieved of a lot of regulatory oversight by the state. But Verizon never completed a lot of the needed upgrades and huge parts of rural Pennsylvania still didn’t have DSL a decade later.

I wrote a blog a few months back about Charter in New York. There the state had found that the cable modems deployed by the company were not technically capable of delivering anything close to the speeds that the company was advertising. Charter agreed to fix the problem, but five years later had made almost no upgrades and was recently sued by the State.

I could list more examples all day long and there have been disputes all across the country with major telcos and cable companies that have made deals with regulators and then either ignored the agreements or only implemented them in a half-hearted manner.

The problem is that there are really no regulatory penalties that are big enough to penalize an ISP for not doing what it promised. There have been fines levied, but those fines are never nearly as big as the profits or savings realized by the ISPs for ignoring the agreements with regulators. For example, it’s unlikely that lawsuits or penalties will be able to force Verizon to finish the FiOS build in New York City. I am sure the company built to the parts of NYC that made economic sense and decided, for whatever reason, that there is not sufficient payback to justify building to the remaining parts of the city.

And that’s what regulators fail to recognize – big ISPs make decisions based upon the anticipated return for stockholders. I think it’s likely that in many of these cases that the big ISPs had no intention of complying with their agreements from the start. The cynical side of me says that they are often willing to take the upsides associated with these kinds of deals – be that decreased regulation or the ability to complete a merger – while knowing up front that they are unlikely to ever complete whatever they have agreed to do.

I think we are likely to see another round of broken promises in a few years as we start moving towards the end of the FCC’s CAF II program. The big telcos accepted over $9 billion over six years to improve rural broadband to speeds of at least 10 Mbps. I’ve been getting feedback from a lot of areas in the country that those deployments seem to be behind schedule. It will certainly come as no surprise if one or more of the big telcos spends the CAF II funding without bringing broadband to the promised households, or else will deliver speeds under the promised levels. The FCC recently issued a warning to carriers telling them that it expects them to fulfill the CAF II commitments – and I suspect that warning is due to the same kind of rumblings I’ve been hearing.

But ultimately the FCC doesn’t really have any way to make these telcos complete the builds. They might withhold future funding from the telcos, but as the FCC keeps eliminating regulation it is going to have very little ability to enforce the original CAF II agreements or to take any steps to really penalize the telcos.

The saddest part of these various broken promises is that millions of real people get hurt. It’s been reported that there are significant pockets of residents in urban areas like New York City that still don’t have even one broadband provider. There are huge rural swaths of the country that are desperate for any kind of broadband, which is what CAF II is supposed to deliver for the first time. But I think we need to be realistic in that big ISPs often do not meet their promises – whether deliberately or not. And perhaps it’s finally time to stop making these big deals with companies that have a history of broken promises.

The Economics of Tower Transport

Many of my clients lease tower space and/or fiber transport to reach towers to wireless companies. Since most of my clients operate last-mile networks this is not usually a major source of revenue for them, but it is a significant one, and one of the more profitable things they sell.

I have been advising clients that we are in the midst of big changes in the cellular industry and that they should expect payments for cell tower connectivity to start dropping. Transport providers and cell tower owners that won’t renegotiate lower prices risk losing the business entirely.

Let’s look at AT&T as an example of this. AT&T has been aggressively pushing its vendors to lower prices. At an investor meeting last year AT&T’s president of technology operations told investors that the current industry model is not sustainable. And he is right. As I wrote in a recent blog the entire cellular industry seems to have crossed the threshold where cellular service is becoming a commodity, and that is putting huge pressure on the cellular companies to reduce costs.

Last year FierceWireless posted a letter that AT&T sent to many of its vendors telling them to expect to renegotiate rates and terms. In that letter AT&T said that it would be pushing for early termination of existing contracts with the expectation of lowering fees. The company said it would be looking for the ability to modify or upgrade existing towers for free, and it wants to eliminate any automatic price increases and instead have “rents reduced to competitive rates”.

There are two major costs for a cellular company to use somebody else’s tower. First, they must lease space on the tower, including paying for power and for space underneath to house equipment. Second, where AT&T doesn’t own the fiber connecting to a tower they also have to pay for fiber transport to reach it. And that transport is not cheap, because the bandwidth needed at towers is growing at a torrid pace. Just five years ago there were very few towers that needed more than a gigabit of bandwidth, but I’ve seen rural towers where the carriers are now asking for the right to grow over time to five gigabits. And everything I read about cellular data usage tells me that demand for bandwidth at towers will continue to grow rapidly.

Many of my clients operate in rural areas and some think that their physical isolation makes them immune from any price negotiations with the wireless companies. But I think they are wrong for several reasons.

  • First, I think a lot of the billions being spent through the FCC’s CAF II program are being used to construct fiber to rural towers. AT&T is spending most of the $2.5 billion it is receiving from that program to extend fiber into rural areas. And where they build fiber they won’t need to lease it from anybody else.
  • I also suspect that the cellular companies are working with Frontier and CenturyLink, the other two big recipients of CAF II money, to piggyback on that fiber expansion and reach cellular towers at a lower cost.
  • Both AT&T and Verizon are also undertaking significant fiber expansion, with one goal of that expansion being to cut transport costs. I believe they are doing the math and will build fiber to the towers where owning the fiber saves them money over the long run – with the places offering the most savings at the top of the list. If they sustain this kind of construction for five or ten years they will eventually be able to bypass most of the towers where they lease transport today. And the cellular companies should be doing this: if there are going to be lower margins in the cellular business then they ought to use their capital, while they have it, to permanently reduce operating costs.
  • I also suspect that, while AT&T and Verizon are competitors, they are cooperating to reach the more rural cell sites and have transport swap arrangements in place that save them both money.
  • Finally, these companies have been buying fiber network providers, like Verizon’s purchase last year of XO Communications. It would not be surprising to see them continue to buy companies that provide cell site transport.

The cellular companies and their partners don’t communicate well with smaller transport and cell tower owners. I suspect that many of my clients will only get an inkling that a cellular company is going to bypass them when they get a cancellation notice for their contract. So I have been encouraging folks to reach out to the cellular companies to renegotiate terms and prices. I think that those willing to do so might be able to keep this as a long-term revenue stream, but those that want to stick with higher historical prices will eventually get bypassed and will lose the revenue stream altogether. It’s a tough call, because some places are remote enough that they may never be bypassed – but it’s a crap shoot to guess if your own region is on the fiber-expansion list.
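
The math the carriers are likely doing for each tower is a simple build-versus-lease payback comparison. Here is a minimal sketch with entirely hypothetical costs; real construction costs and transport lease rates vary enormously by market.

```python
# Simple build-vs-lease payback sketch for fiber to a single cell site.
# All dollar figures are hypothetical placeholders, not actual market rates.

miles_of_new_fiber = 10            # assumed distance from existing fiber to the tower
cost_per_mile = 30_000             # assumed rural construction cost per mile
monthly_transport_lease = 3_000    # assumed current lease payment for transport

build_cost = miles_of_new_fiber * cost_per_mile
payback_years = build_cost / (monthly_transport_lease * 12)

print(f"One-time build cost:  ${build_cost:,}")            # $300,000
print(f"Payback vs. leasing:  {payback_years:.1f} years")  # ~8.3 years

# Towers with short paybacks rise to the top of the carriers' construction lists;
# remote sites with long fiber runs are the ones most likely to keep being leased.
```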

A 5G Timeline

Network World recently published their best guess at a timeline for 5G cellular deployment. As happens with all new technologies that make a big public splash, the actual deployment is likely to take a lot longer than what the public expects.

They show the timeline as follows:

  • 2017 – Definition, specification, requirements, technology development and technology field tests
  • 2019/20 – Formal specifications
  • 2021 – Initial production service rollouts
  • 2025 – Critical mass
  • 2030+ – Phase-out of 4G infrastructure begins

There is nothing surprising about this timeline. In the cellular world we saw something similar with the roll-outs of both 3G and 4G, and there is no reason to think that 5G will be introduced any faster. An incredible number of things must fall into place before 5G can be widely available.

Just to be clear, this timeline is talking about the use of the 5G standard for cellular service, as opposed to the same 5G terminology that is being used to describe high-speed radio connections that deliver broadband over short distances. The use of the term 5G is going to confuse the public for years, until some point where we will need different names for the two different technologies.

Like with any new technology, it will probably be fifteen years until there is equipment that incorporates the full 5G specification. We are just now finally seeing a full implementation of fully-compliant 4G electronics. This means that early 5G roll-outs will only implement a few of the new features of 5G. Just like with 4G we can then expect successive future 5G roll-outs as new features are introduced and the technology inches forward. We won’t go straight to 5G, but will work our way through 4.1G and 4.2G until we finally get to the full 5G specification.

Here are just a few of the things that have to happen before 5G cellular is widely deployed.

  • Standards have to be completed. Some of the first generation standards will be completed by the end of this year, but that’s not the end of the standards process. There will be continued standards developed over the next few years that look at the practical issues of deploying the technology.
  • Then equipment must be developed that meets the new standards. While many wireless companies are already working on this, it takes a while to go from lab prototype to mass production.
  • True field trials are then needed. In the wireless world we have always seen a big difference between the capabilities that can be tested in a lab and the real performance in differing outdoor environments. Real field trials can’t proceed until there is finished, non-prototype equipment that can be tested in many different settings.
  • Then the cellular companies have to start deploying the equipment into the field. That means not only upgrading the many existing cell towers, but it’s going to mean deploying into smaller neighborhood cell sites. As I’ve written about recently, this means building a lot of new fiber and it means solving the problems of deploying small cell sites in neighborhoods. If we’ve learned anything from the recent attempt by the cell companies to deploy small 4G cell sites it’s that these two issues are going to be a major impediment to 5G deployment. Just paying for all of the needed fiber is a huge hurdle.
  • One of the biggest challenges with a new cellular technology is introducing it into handsets. Handset makers will like the cachet of selling 5G, but the biggest issue with cellphones is battery power, and it’s going to be costly and inefficient to put the more complicated 5G big-MIMO antennas into handsets. That’s going to make the first generation of 5G handsets expensive. This is always the catch-22 of a new cellular technology – handset makers don’t want to commit to making big volumes of more expensive phones until customers can actually use the new technology, and the carriers won’t deploy too much of the 5G technology until there are enough handsets in the world to use it. I’ve seen some speculation that this impasse could put a real hitch in 5G cellular deployment.

To a large degree the cellular industry is its own worst enemy. They have talked about 5G as the savior of all of our bandwidth problems, when we know that’s not true. Let’s not forget that when 4G was first introduced the industry touted ubiquitous 100 Mbps cellphone connections – something that is still far above our capabilities today. One thing not shown on the timeline is when we finally get actual 5G capabilities on our cellphones. It’s likely to be 15 years from now, at about the time when we have shifted our attention to 6G.

Regulating Edge Providers

The year is only half over and already it seems like this might be the most interesting year for regulations we’ve had in my lifetime. It seems like a lot of the telecom regulations we’ve lived with for decades are being reconsidered and that nothing is guaranteed to stay the same.

Perhaps the most novel new idea I’ve heard comes from Steve Bannon in the White House. He believes that Google and Facebook have become so dominant that they should be regulated as utilities. He envisions this being done in much the same manner as is done with telephone and cable companies.

It’s not an entirely novel concept, and the European Union has kicked around ideas for curbing the power of big software companies like Microsoft, Google and Facebook. I find the concept a little strange coming out of this administration, since it is largely anti-regulation and seems intent on lowering regulations for both telephone and cable companies. Trying to regulate these companies would have to mean a lot of new regulations.

The first question that popped into my head when I heard this was what a government would actually regulate at these companies. The European Union went after Google in 2016 for its practice of requiring that cellphones default to the Google search engine and the Chrome browser. In 2015 the EU objected that Google used its market power to insist that cellphones use the Android operating system. But these kinds of issues are related to abuse of monopoly power, and there are already rules in the US that can tackle them, should the government care to do so. I don’t think this is what Bannon has in mind.

It seems like it would be a real challenge to regulate the main business lines of the two companies. You can’t regulate prices because Google and Facebook are free to users. They don’t directly sell anything to their users on their core platforms. If these companies are large it’s because they have a platform that a lot of people want to use. People have a lot of options for using alternate social media platforms or search engines. People seem to use these two companies because they offer something people want – and I really can’t imagine how you can regulate that.

It’s also hard to envision a single country really regulating these entities. We already know what that looks like today by seeing how these big companies operate in China. Probably lesser known is that there are many other countries where the companies offer something different than what we see in the US. My guess is that regulation wouldn’t fundamentally change these companies – but it could force them to modify the public face of the company, something that their many users would probably strongly resent.

I think perhaps the best argument against regulating these two companies is that there is no guarantee that they are going to maintain their current market dominance, or even survive as companies for the long-haul.

The online world has proven to be fickle and people’s collective tastes change over time. Already today US teenagers have largely bailed on Facebook and view it as a platform for their parents and grandparents. I know my daughter only maintains a presence on the platform to communicate with older relatives and that she communicates with her friends elsewhere. Facebook may have over a billion users today, but that doesn’t guarantee that something better won’t come along over the next few decades and erode a lot of that market power.

Google faces an even bigger long-term problem. Google relies on people making searches on computers and cellphones. There are a lot of tech experts predicting that search engines will be passe within only a few decades. They predict that people will begin talking directly to an AI-based personal assistant to perform most of the tasks that cellphones do today.

Both Google and Facebook make most of their money today from advertising. But in a future world where everybody communicates through a smart personal assistant, the direct interface between people and web platforms like Google or Facebook might nearly disappear. The advertising value of the Google search engine will disappear if your smart personal assistant is making choices for you based upon your preferences. In an AI-driven future both search engines and social media are likely to be replaced by something drastically different.

The conclusion I reach is that government is not really in a position to regulate the ever-changing world of the big edge providers. Facebook or Google may have a dominant position in their market niches today but could be in a very different place in a decade. Just go back and make a list of the big technology players of twenty years ago. It would have been a waste of time to regulate AOL, CompuServe or the other platforms that dominated the web then. Those companies were usurped by something people found to be of more value. Regulation, by definition, assumes a predictable world – something that is unlikely in the edge provider world.

ATSC 3.0 – More Spectrum for Broadband?

This past February the FCC approved the voluntary adoption of the new ATSC 3.0 over-the-air standard for television stations. There will be around twenty different standards included within the final protocol, defining such things as better video and audio compression, picture improvement using high dynamic range (HDR), a wider range of colors, the ability to use immersive sound, better closed captioning, an advanced emergency alert system, better security through watermarking and fingerprinting, and the ability to integrate IP delivery.

The most interesting new feature of the new standard is that it allows programmers to tailor their TV transmission signal in numerous ways. The one that is of the most interest to the telecom world is that the standard will allow a TV broadcaster to compress the existing TV transmission into a tiny slice of the spectrum which would free up about 25 Mbps of wireless bandwidth per TV channel.

A TV station could use that extra frequency itself or could sell it to others. Broadcasters could use the extra bandwidth in a number of ways. For example, it’s enough bandwidth to transmit their signal in 4K. Stations could also transmit their signal directly to cellphones and other mobile devices. TV stations could instead use the extra bandwidth to enhance their transmissions with immersive sound or virtual reality. They could also use it to transmit additional digital channels inside one slice of spectrum.

But my guess is that a lot of TV stations are going to lease the spectrum to others. This is some of the most desirable spectrum available. The VHF bands range from 30 MHz to 300 MHz and the UHF bands from 300 MHz to 3 GHz. The spectrum has the desirable characteristics of being able to travel for long distances and of penetrating easily into buildings – two characteristics that benefit TV or broadband.

The first broadcasters that have announced plans to implement ATSC 3.0 are Sinclair and Nexstar. Together they own stations in 97 markets, including 43 markets where both companies have stations. The two companies are also driving a consortium of broadcasters that includes Univision and Northwest Broadcasting. This spectrum consortium has the goal of being able to provide a nationwide bandwidth footprint, which they think is essential for maximizing the economic value of leasing the spectrum. But getting nationwide coverage is going to require adding a lot more TV stations to the consortium, which could be a big challenge.

All this new bandwidth is going to be attractive to wireless broadband providers. One has to think that the big cellular companies will be interested in the bandwidth. This also might be an opportunity for the new cellular players like Comcast and Charter to increase their spectrum footprint. But it could be used in other ways. For instance, this could be used by some new provider to communicate with vehicles or to monitor and interface with IoT devices.

The spectrum could provide a lot of additional bandwidth for rural broadband. It’s likely that in metropolitan areas that the extra bandwidth is going to get gobbled up to satisfy one or more of the uses listed above. But in rural areas this spectrum could be used to power point-to-multipoint radios and could add a huge amount of bandwidth to that effort. The channels are easily bonded together and it’s not hard to picture wireless broadband of a few hundred Mbps.
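
For a rough sense of the capacity involved, the arithmetic is simple. The channel count below is my own assumption about what a rural provider might aggregate; the roughly 25 Mbps per channel figure comes from the discussion above.

```python
# Quick estimate of fixed-wireless capacity from bonded ATSC 3.0 channels.
# The channel count is an assumption; actual availability varies by market.

freed_mbps_per_channel = 25     # rough per-channel figure cited above
channels_bonded = 8             # assumed number of channels a provider could bond

total_mbps = freed_mbps_per_channel * channels_bonded
print(f"Aggregate shared bandwidth: {total_mbps} Mbps")   # 200 Mbps
```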

But this may never come to pass. Unlike WiFi, which is free, or 3.65 GHz, which can be cheaply licensed, this spectrum is likely to be costly. And one of the major benefits of the spectrum – the ability to travel for long distances – is also a detriment for many rural markets. Whoever is using this spectrum in urban areas is going to worry about interference from rural uses of the spectrum.

Of course, there are other long-term possibilities. As stations upgrade to the new standard they will have essentially reduced their need for spectrum. Since the TV stations were originally given this spectrum to transmit TV signals I can’t think of any reason they should automatically be allowed to keep and financially benefit from the freed spectrum. They don’t really ‘own’ the spectrum – it was provided to them originally by the FCC to launch television technology. There are no other blocks of spectrum I can think of that are granted in perpetuity.

TV station owners like Sinclair and Nexstar are salivating over the huge potential windfall that has come their way. I hope, though, that the FCC will eventually see this differently. One of the functions of the FCC is to equitably allocate spectrum to best meet the needs of all users. If the TV stations keep the spectrum then the FCC will have ceded its spectrum management authority, and it will be TV stations that determine the future spectrum winners and losers. That can’t be in the best interests of the country.