The Four Internets

Kieron O’Hara and Wendy Hall authored a paper for the Centre for International Governance Innovation (CIGI) looking at how Internet data flows from country to country. CIGI is a non-partisan think tank that studies issues related to multilateral international governance, exploring policies in many areas aimed at improving international trade and the exchange of ideas. The group was founded with donations from the founders of Research In Motion (RIM), the maker of the BlackBerry, matched by the Canadian government.

O’Hara and Hall argue that there are four separate models of the Internet operating in the world today. These models differ in the way they view the use of data:

Silicon Valley Open Internet. This is the original vision for the Internet, in which data should flow freely with few or no restrictions on how it can be used. This model favors Tor and platforms that allow people to exchange data privately without being monitored.

Beijing Paternal Internet. Often referred to as the Great Firewall of China, the Chinese Internet closely monitors Internet usage. Huge armies of censors monitor emails, social media, and websites, searching for any behavior not sanctioned by the state. The Chinese government blocks foreign apps and web platforms that won’t adhere to its standards. Other authoritarian countries have their own walled-off versions of the Beijing model.

Brussels Bourgeois Internet. The European Internet favors the open nature of the Silicon Valley Internet, but then heavily regulates Internet behavior to protect privacy and to try to restrict what it considers to be bad behavior on the Internet. This is the Internet that values people over web companies.

Washington DC Commercial Internet. Washington DC views the Internet as just another market and fosters hands-off policies that essentially equate to zero regulation of the big web players, favoring profitability over privacy and people.

The authors say there may be a fifth internet model emerging, which is the Russian model where the government actively uses the Internet for propaganda purposes.

These various models of the Internet matter as world commerce continues to move online and growing volumes of data are exchanged between countries. We are now reaching the point where the different models conflict. Simple things, like the kinds of personal data that can be recorded and exchanged with an ecommerce transaction, now differ around the world.

We are already seeing big differences arise for how countries treat their own data. For example, the Chinese, Russians, and Indians are insisting that data of all kinds created within the country should be stored in servers within the country and not easily be shared outside. That kind of restriction equates to the creation of international boundaries for the exchange of data. This is likely to grow over time and result in international commerce flowing through some version of data customs rather than flowing freely.

The paper asks some interesting questions on how we resolve these sorts of issues. For example, could there be some sort of international global data space for e-commerce, which would be treated differently than the exchange of other kinds of data?

The issues highlighted in the paper are real ones that are likely to start making news over the next few years. For example, the Chinese shopping site Alibaba is poised to offer a serious challenge to Amazon in the US. Considering the concern in the US of espionage by Chinese firms like Huawei, will the US somehow restrict a Chinese firm from conducting e-commerce within the US (and gathering data on US citizens)? Multiply that one example by hundreds of similar concerns that exist between countries and it’s not hard to picture a major splintering of the international Internet.

Gaming Migrates to the Cloud

We are about to see a new surge in demand for broadband as major players in the game industry have decided to move gaming to the cloud. At the recent Game Developers Conference in San Francisco both Google and Microsoft announced major new cloud-based gaming initiatives.

Google announced Stadia, a platform that they tout as letting people play games on any device from anywhere with a broadband connection. During the announcement they showed a live streaming game being transferred from desktop to laptop to cellphone. Microsoft announced the new xCloud platform that lets Xbox gamers play a game from any connected device. Sony has been promoting online play between gamers for many years and now also offers some cloud gaming on the Playstation Now platform.

OnLive tried this in 2011, offering a platform that was played in the cloud using OnLive controllers, but without needing a computer. The company failed due to the quality of broadband connections in 2011, but also due to limitations at the gaming data centers. Both Google and Microsoft now operate regional data centers around the country that house state-of-the-art whitebox routers and switches that are capable of handling large volumes of simultaneous gaming sessions. As those companies have moved large commercial users to the cloud they created the capability to also handle gaming.

The gaming world was ripe for this innovation. Current gaming ties gamers to gaming consoles or expensive gaming computers. Cloud gaming brings mobility to gamers and also eliminates the need to buy expensive gaming consoles. This move to the cloud probably signals the beginning of the end for the Xbox, Playstation, and Nintendo consoles.

Google says it will support some games at the equivalent of an HD video stream, at 1080p and 60 frames per second. That equates to about 3 GB of download per hour. But most of the Google platform is going to operate at 4K video quality, requiring download speeds of at least 25 Mbps per gaming stream and using 7.2 GB of data per hour. Nvidia has been telling gamers that they need 50 Mbps per 4K gaming connection.
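
Since a game stream is a continuous video feed, converting its bitrate into hourly data volume is simple arithmetic. Here is a minimal sketch in Python using the figures cited above; note that the 7.2 GB per hour figure implies an effective stream of roughly 16 Mbps, so the 25 Mbps recommendation presumably includes headroom.

```python
# Convert a continuous stream bitrate (Mbps) into hourly data volume (GB).
# The bitrates below are illustrative, taken from the figures cited in the text.

def gb_per_hour(mbps: float) -> float:
    """Gigabytes downloaded during one hour of a continuous stream at `mbps`."""
    bits_per_hour = mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1_000_000_000   # bits -> bytes -> gigabytes

print(gb_per_hour(6.7))   # ~3.0 GB/hour, roughly the 1080p/60fps figure
print(gb_per_hour(16))    # 7.2 GB/hour, matching the cited 4K usage
print(gb_per_hour(25))    # ~11.3 GB/hour if the stream actually ran at 25 Mbps
```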

This shift has huge implications for broadband networks. First, streaming causes the most stress on local broadband networks since the usage is continuous over long periods of time. A lot of ISP networks are going to start showing data bottlenecks when significant numbers of additional users stream 4K connections for hours on end. Until ISPs react to this shift, we might return to the days when broadband networks bogged down in prime time.

This is also going to increase the need for download and upload speeds. Households won’t be happy with a connection that can’t stream 4K, so they aren’t going to be satisfied with the 25 Mbps connection that the FCC says is broadband. I have a friend with two teenage sons who each run two simultaneous game streams while watching a streaming gaming TV site. It’s good that he is able to buy a gigabit connection on Verizon FiOS, because his sons alone are using a continuous broadband connection of at least 110 Mbps, and probably more.

We are also going to see more people looking at network latency. The conventional wisdom is that the gamer with the lowest-latency connection has an edge, which is why gamers value fiber over cable modems and cable modems over DSL.

This is also going to bring new discussion to the topic of data caps. Gaming industry statistics say that the average serious gamer plays 16 hours per week. Obviously, many play longer than the average. My friend with the two teenagers is probably looking at a household total of at least 30 GB per hour of broadband download usage plus a decent chunk of upload usage. Luckily for my friend, Verizon FiOS has no data cap. Many other big ISPs like Comcast start charging for data usage over one terabyte per month – a number that won’t be hard to reach for a household with gamers.
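
To see how quickly a gaming household blows past a one-terabyte cap, here is a rough back-of-the-envelope sketch; the hours per week and the per-stream figures are assumptions for illustration, not measured values.

```python
# Rough monthly data usage for the household described above.
# Every input here is an assumption chosen to illustrate the point.

GB_PER_HOUR_PER_STREAM = 7.2   # one 4K game stream, per the earlier figure
CONCURRENT_STREAMS = 4         # two sons running two game streams each
VIDEO_GB_PER_HOUR = 3.0        # the streamed gaming TV site, roughly an HD feed
HOURS_PER_WEEK = 30            # well above the 16-hour 'average serious gamer'

weekly_gb = HOURS_PER_WEEK * (CONCURRENT_STREAMS * GB_PER_HOUR_PER_STREAM
                              + VIDEO_GB_PER_HOUR)
monthly_gb = weekly_gb * 52 / 12

print(f"{monthly_gb:,.0f} GB per month")   # ~4,100 GB, roughly 4x a 1 TB cap
```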

I think this also opens up the possibility for ISPs to sell gamer-only connections. These connections could be routed straight to peering arrangements with Google or Microsoft to guarantee the fastest path through the ISP network and wouldn’t mix gaming streams with other household broadband traffic. Many gamers will pay extra to have a speed edge.

This is just another example of how the world finds ways to use broadband when it’s available. We’ve obviously reached a time when online gaming can be supported. When OnLive tried this there were not enough households with fast enough connections, there weren’t fast enough regional data centers, and there wasn’t a peering network in place where ISPs connect directly to big data companies like Google and bypass the open Internet.

The gaming industry is going to keep demanding faster broadband and I doubt they’ll be satisfied until we have a holodeck in every gamer’s home. But numerous other industries are finding ways to use our increasing household broadband capacity and the overall demand keeps growing at a torrid pace.

 

Comcast the ISP

Occasionally I see a statistic that really surprises me. I just read a quote from Dave Watson, the president of Comcast Cable, who told investors that Comcast has an overall 47% broadband market penetration rate, meaning that 47% of the households in their footprint buy broadband from Comcast. I would have guessed that their market penetration was higher. He did say that they have markets where they exceed a 60% market share.

There are a few reasons why their overall market share isn’t higher. For one thing, the company’s footprint overlaps many of the big markets where Verizon competes with fiber-based FiOS. The company also competes with fiber overbuilders like US Internet in Minneapolis and Sonic in the Bay Area that are chipping away at broadband customers. Comcast also competes against a few municipal fiber overbuilders, such as in Chattanooga, where the city-based fiber ISP has won the lion’s share of the market. It’s clear that fiber is a formidable competitor for any cable company.

Comcast also faces significant competition in the MDU market, where numerous companies are vying to serve large apartment buildings and complexes. For example, a big percentage of AT&T’s fiber expansion goal of passing 12 million potential customers has been achieved by building fiber to large MDUs all around the country. There are also a number of successful ISPs that compete nationwide in the large MDU market.

Comcast, like all of the big cable companies, was a latecomer in competing for the business market. Historically the cable companies didn’t build their network in business districts and the telcos and CLECs gained early control of this market. Comcast and other cable companies now compete vigorously in the business market, but this is the one market segment that is competitive almost everywhere.

It is clear that Comcast is winning the battle against DSL. Comcast added 1.35 million broadband customers in 2018, while the telcos collectively lost nearly half a million customers.

I believe that the secret to the recent Comcast success is offering faster broadband speeds. The company has upgraded to DOCSIS 3.1 and now offers gigabit broadband. More importantly, the company has unilaterally increased speeds across the board several times to gain a significant speed advantage over DSL. The most recent increase last year boosted the base product to 200 Mbps. It’s now an easy marketing advantage for the company to contrast this with the top DSL speed of 50 Mbps. Comcast is betting that speed wins, and looking at the trend of their customer counts versus the telcos, they seem to be right.

Comcast is also benefitting from the fact that many homes now find themselves bumping against the speed limits on slower products. Many homes that use multiple devices simultaneously are starting to find that a broadband speed of even 50 Mbps isn’t adequate for the way they want to use broadband. We are finally reaching the point where even the best DSL is becoming obsolete for many families. This trend is certainly accelerating and we saw 3.5 million new cord-cutting households last year who now watch all video online.

Even knowing all of the above market trends I was still surprised by the 47% market share. My firm does broadband surveys and we’ve never seen a Comcast or Charter market share below 60% in the markets we’ve studied. Of course, our experience is biased by the fact that we are only studying markets where somebody is thinking about building fiber, and there are undoubtedly Comcast markets that are considerably higher or lower than the 47% average market share.

I expect the Comcast market share to keep climbing. I think they have now won the war with DSL and in those markets where they aren’t facing a fiber competitor they will continue to pick up customers who realize they need more speed. As the household demand for broadband continues to double every three years, the migration from DSL to cable broadband is likely to accelerate.

I think it’s likely that telcos with copper networks are starting to lose steam. As they keep losing DSL customers, one has to wonder how much money the telcos will spend on advertising to support a sinking market. Just as I’m always surprised to find that there are still a few million dial-up customers across the country, I think we have reached the tipping point for DSL, and it will start to be considered a dying technology. It might take another decade for DSL to finally die, but that slow death is finally underway.

FCC Looking at Rural Spectrum Rules

The FCC released a Notice of Proposed Rulemaking on March 15, in WT Docket No. 19-38. This NPRM asks if there are changes to spectrum rules that might make spectrum more easily available for small carriers and in rural markets.

This NPRM was required by the MOBILE NOW Act that was included in the Ray Baum’s Act that reauthorized the FCC. That Act required the FCC to ask the following questions:

  • Should the FCC establish a new program, or modify existing programs, to make it easier to partition, disaggregate, or lease spectrum in rural areas and to improve spectrum access by small carriers?
  • Should the FCC allow ‘reaggregation’ of spectrum that has been partitioned or disaggregated on the secondary market, up to the size of the original market area?
  • Would relaxing performance requirements for partitioned or disaggregated licenses make it easier for small carriers to use rural spectrum?
  • Are there any procedural changes that would make it easier to transfer spectrum to small carriers?
  • Are there incentives the FCC can provide to encourage spectrum license holders to lease or sell spectrum to small carriers that will serve rural areas?

If the FCC is serious about helping to solve the rural broadband divide they need to take a hard look at the suggestions various parties will make in this docket. The docket notes that there have been over 1,000 assignments of spectrum over the last decade, but most of these have been from speculators (who buy spectrum with the goal to sell and not use) assigning spectrum to the larger carriers. There are not many examples where the big spectrum holders have peeled off portions of their spectrum for rural use.

Today most spectrum is being used in urban areas but not deployed in the surrounding rural areas. It’s hard to fault the cellular companies for this practice. The low customer density in rural areas doesn’t require cellular carriers to deploy the same mix of spectrum needed to satisfy urban cellular bandwidth needs.

This unused spectrum could be used to provide spectacular fixed wireless broadband – something that is not really a significant part of the business plan of cellular companies. With newer techniques for combining multiple frequencies to serve a single customer, the availability of more swaths of spectrum could be used to significantly increase rural broadband speeds.

There are also regulatory reasons for the pool of unused rural spectrum. The cellular carriers have always lobbied hard to have spectrum auctioned to cover huge geographic footprints. It’s a lot easier for the carriers and the FCC not to bother with auctioning off rural coverage areas separately. The FCC’s coverage rules are also so lax that a spectrum license holder can satisfy deployment requirements by deploying spectrum in the urban areas while ignoring the rural parts of a license area. The FCC has also been extremely lax in enforcing deployment requirements, and license holders in some cases have gone a decade without deploying spectrum and without fear of losing the license.

The big cellular companies have opposed making it easier to deploy spectrum in rural areas. They have some legitimate concerns about interference, but there are technical solutions to guard against it. The big companies mostly don’t want to deal with smaller users of the spectrum. I would expect them to file comments in this docket saying that the existing system is adequate. Today’s rules already allow for leasing or partitioning of spectrum, and the big companies don’t want new rules that might force them to work with rural providers.

Probably the most interesting question in the docket is the one asking if there are incentives that would drive the big license holders to work with smaller providers. I can think of several solutions, but the easiest one is what I call ‘use it or lose it’. The FCC ought to change the rules to be able to reclaim licensed spectrum that isn’t being used. The rules should not allow the deployment of spectrum in a city to tie up the use of that same spectrum for a huge surrounding rural area.

While the MOBILE NOW Act required the issuance of this NPRM within a year, it doesn’t require the FCC to act on any of the suggestions made by respondents to the NPRM. I would strongly encourage anybody interested in using rural spectrum to contact their members of Congress and ask them to encourage the FCC to take this NPRM seriously. Over the last two years it’s hard to point to any actions of this FCC that support rural broadband over the interests of the large carriers. The big wireless companies don’t want the hassle of dealing with smaller providers – but that’s the right thing to do. Spectrum ought to benefit all parts of the country, not just the urban areas.

Another Rural Wireless Provider?

T-Mobile announced the start of a trial for a fixed wireless broadband product using LTE. The product is being marketed as “T-Mobile Home Internet”. The company will offer the product by invitation only to some existing T-Mobile cellular customers in “rural and underserved areas”. The company says they might connect as many as 50,000 customers this year. The company is marketing the product as 50 Mbps broadband, with a monthly price of $50 and no data cap. The company warns that speeds may be curtailed during times of network congestion.

The company further says that their ultimate goal is to offer speeds of up to 100 Mbps, but only if they are allowed to merge with Sprint and gain access to Sprint’s huge inventory of mid-range spectrum. They said the combination of the two companies would enable them to cover as many as 9.5 million homes with 100 Mbps broadband in about half of US zip codes.

There are positive aspects to the planned deployment, but also a number of issues that make me skeptical. One positive aspect is that some of the spectrum used for LTE passes through trees better than the spectrum used for the fixed wireless technology being widely deployed in the open plains and prairies of the Midwest and West. This opens up the possibility of bringing some wireless broadband to places like Appalachia – with the caveat that heavy woods are still going to slow down data speeds. It’s worth noting that this is still a line-of-sight technology, and fixed LTE will be blocked by hills or other physical impediments.

The other positive aspect of the announced product is the price and lack of a data cap. Contrast this to the AT&T fixed LTE product that has a price as high as $70 along with a stingy 160 GB monthly cap, and with overage charges that can bring the AT&T price up to $200 per month.

I am skeptical of a number of the claims made or implied by the announcement. The primary concern is download speeds. Fixed LTE will behave the same as any other fixed wireless product, and speeds will decrease with a customer’s distance from the serving tower. In rural America distances can mount up quickly. LTE broadband is similar to rural cellular voice and works best where customers can get 4 or 5 bars. Anybody living in rural America understands that there are a lot more places with 1 or 2 bars of signal strength than with 4 or 5 bars.
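
The falloff of speed with distance follows from basic radio physics. As a rough illustration (this is not from the T-Mobile announcement), the sketch below applies the standard free-space path loss formula to a couple of assumed example frequencies; foliage, terrain and walls only add further loss on top of this best case.

```python
# Free-space path loss illustrates why fixed-LTE speeds drop with distance.
# FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44  (best-case, line of sight)
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

for d in (1, 3, 8):            # km from the serving tower
    for f in (600, 1900):      # assumed low-band and mid-band LTE frequencies
        print(f"{d} km @ {f} MHz: {fspl_db(d, f):.0f} dB of path loss")

# Each doubling of distance adds about 6 dB of loss, which pushes the radio to a
# lower-order modulation and therefore a lower data rate for distant customers.
```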

The 50 Mbps advertised speed is clearly an ‘up-to’ speed and in rural America it’s doubtful that anybody other than those who live under a tower could actually get that much speed. This is one of the few times when I’ve seen AT&T advertise truthfully and they market their LTE product as delivering at least 10 Mbps speed. I’ve read numerous online reviews of the AT&T product and the typical speeds reported by customers range between 10 Mbps and 25 Mbps, with only a few lucky customers claiming speeds faster than that.

The online reviews of the AT&T LTE product also indicate that signal strength is heavily influenced by rain and can completely disappear during a downpour. Perhaps even more concerning are reports that in some cases speeds remain slow after a rain due to wet leaves on trees that must be scattering the signal.

Another concern is that T-Mobile is touting this as a solution for underserved rural America.  T-Mobile has far less presence in rural America than AT&T and Verizon and is on fewer rural cellular towers. This is evidenced by their claim that even after a merger with Sprint they’d only be seeing 9.5 million passings – that’s really small coverage for a nationwide cellular network. I’m a bit skeptical that T-Mobile will invest in connecting to more rural towers just to offer this product – the cost of backhaul to rural towers often makes for a lousy business case.

The claim also says that the product will have some aspects of both 4G and 5G. I’ve talked to several wireless engineers who have told me that they can’t see any particular advantage for 5G over 4G when deploying as fixed wireless. A carrier already opens up the available data path fully with 4G to reach a customer and 5G can’t make the spectrum perform any better. I’d love to hear from anybody who can tell me how 5G would enhance this particular application. This might be a case where the 5G term is tossed in for the benefit of politicians and marketing.

Finally, this is clearly a ploy to keep pushing for the merger with Sprint. The claim of the combined companies being able to offer 100 Mbps rural broadband has even more holes than the arguments for achieving 50 Mbps. However, Sprint does have a larger rural presence on rural towers today than T-Mobile, although I think the Sprint towers are already counted in the 9.5 million passings claim.

But putting aside all my skepticism, it would be great if T-Mobile can bring broadband to any rural customers that otherwise wouldn’t have it. Even should they not achieve the full 50 Mbps claim, many rural homes would be thrilled to get speeds at half that level. A wireless product with no data caps would also be a welcomed product. The timing of the announcement is clearly aimed at promoting the merger process with Sprint and I hope the company’s deployment plans don’t evaporate if the merger doesn’t happen.

What Are Small Cells?

By far the most confusing industry term that is widely used today is ‘small cell’. I see at least a couple of different articles every day talking about some aspect of small cell deployment. What becomes quickly clear after reading a few such articles is that the small cell terminology is being used to describe a number of different technologies.

A lot of the blame for this confusion comes from the CTIA, the industry group that represents the large cellular carriers. While lobbying the FCC last year for the ruling that allows carriers to deploy devices in the public rights-of-way, the CTIA constantly characterized small cell devices as being about the size of pizza boxes. In reality, the devices range from the size of a pizza box up to the size of a dorm refrigerator.

There are a number of different kinds of deployments all being referred to as small cells. The term small cell brings to mind the idea of devices hung on poles that perform the same functions as the big cellular towers. Fully functional pole-mounted cellular sites are not small devices. The FCC set a limit for a pole-mounted small cell to be no larger than 28 cubic feet, and a cell tower replacement device will use most of that allotted space. Additionally, a full cell tower replacement device generally requires a sizable box of electronics and power supply that sits on the ground – often in cabinets the size of the traditional corner mailbox.

These cell-tower replacements are the devices that nobody wants in front of their house. They are large and can be an eyesore. The cabinets on the ground can block the sidewalk – although lately the carriers have been getting smarter and are putting the electronics in an underground vault. These are the big ‘small cell’ devices that are causing safety concerns for line technicians from other utilities that have to worry about working around the devices to fix storm damage.

Then there are the devices that actually are the size of pizza boxes. While they are being called small cells just like the giant boxes, I would classify these smaller devices as cellular repeaters. They re-originate cellular signals to boost coverage in cellular dead spots. I happen to live in a hilly city and would love to see more of these devices. Cellular coverage here varies widely block by block according to line-of-sight to the big cellular towers. Carriers can boost coverage in a neighborhood by placing one of these devices within sight of a large tower and then beaming from there to cover the dead spots.

If you look at industry vendor web sites they claim shipments of millions of small cell sites last year. It turns out that 95% of these ‘small cell’ devices are indoor cellular boosters. Landlords deploy these in office buildings, apartment buildings and other places where cellular coverage is poor. Perhaps the best terminology for these devices is cellular offload devices that relieve traffic on cell sites. The indoor units use cellular frequencies to communicate with cellphones but then dump cellular data and voice traffic onto the landlord’s broadband connection. It turns out that in urban downtowns more than 90% of cellular usage happens indoors, and these devices help to meet urban cellular demand without the hassle of trying to communicate through the walls of larger buildings.

The next use of the term small cell is for the devices that Verizon recently used to test wireless broadband in a few test markets. These devices have nothing to do with cellular traffic and would best be described as wireless broadband loops. Verizon is using millimeter wave spectrum to beam broadband connections for a thousand feet or so from the pole-mounted devices.

The general public doesn’t understand the wide array of different wireless devices that are being deployed. The truly cellular devices, for now, are all 4G devices that are being used by the cellular carriers to meet the rapidly-growing demand for cellular data. The industry term for this is densification and the carriers are deploying full cell-tower substitute devices or neighborhood repeaters to try to relieve the pressure on the big cellular towers. These purely-cellular devices will eventually handle 5G when it is rolled out over the next decade.

The real confusion I see is that most people now equate ‘small cell’ with fast data. I’ve talked to several cities recently who thought that requests for small cell attachments mean they are going to get gigabit broadband. Instead, almost every request for a small cell site today is for the purpose of beefing up the 4G networks. These extra devices aren’t going to increase 4G data speeds, aren’t bringing 5G and are definitely not intended to beam broadband into people’s homes. These small cells are being deployed to divvy up the cellular traffic to relieve overloaded cellular networks.

Reality Pricing Coming for Online Video

I’ve been a cord cutter for many years and over the last few years, I’ve tried the various vMVPDs that offer channel line-ups that somewhat mimic traditional cable TV. I’ve tried Sling TV, DirecTV Now and Playstation Vue. In every case I’ve always scratched my head wondering how these products could offer prices that are lower than the wholesale price of the content from programmers. There are only two possibilities – either these companies have been setting low prices to gain market share or they had been able to negotiate far better deals for content than the rest of the industry.

Of course, the answer is that they’ve been subsidizing these products. And Wall Street is now pressuring these companies to end the subsidies and become profitable. There is probably no better example of this than AT&T’s DirecTV Now service. When DirecTV Now launched it carried a price tag of $35 per month for about a hundred channels of programming. The low price was clearly set as a reaction to a similarly low price from Sling TV which was the first big successful vMVPD.

Both companies offered line-ups including the channels that most households watch. This included the high-price programming from ESPN and numerous other quality networks. The initial pricing was crazy – a similar package on traditional cable was priced at $60 – $70.

The low pricing has worked for DirecTV Now. They are getting close to surpassing Sling TV in subscribers. AT&T has featured DirecTV Now in its advertising and has been shuttling customers from the satellite-based DirecTV to the online product.

But AT&T just got realistic with the product. They have collapsed the lineup from four options down to two, now priced at $50 and $70 per month. The company got ready for this shift by eliminating special promotional prices in the fourth quarter of last year. They had roughly half a million customers who were paying even less than the published low prices. When AT&T raised the rates they immediately lost over half of those promotional customers.

Not only are prices rising, but the company has significantly trimmed the channel counts. The new $50 package will have only about 40 channels while the $70 package will have 50 channels. It’s worth noting that both packages now include HBO, which is the flagship AT&T product. HBO is by far the most expensive programming in the industry and AT&T has now reconfigured DirecTV Now to be HBO plus other premium channels.

The new prices are realistic and also include a profit margin. It will be interesting to see how the DirecTV Now customer base reacts to such a drastic change. I’m sure many of them will flee to cheaper alternatives. But the company may also entice customers who already subscribe directly to HBO to upgrade to the larger package.

The big question is whether there will be cheaper alternatives. The online industry has been around long enough to be out of its infancy, and investors are starting to expect profits from any company in this space. The new realistic pricing by AT&T is likely to drive the other online programmers to get more realistic as well.

These price increases have ramifications for cord-cutting. It’s been easy to justify cutting the cord when you could ditch a $70 per month traditional cable product for a $35 online one that has the channels you most watch. But there is less allure from going online when the alternative choice is just as expensive as the traditional one. There is always going to be some savings from jumping online – if nothing else customers can escape the exorbitant fees for renting a settop box.

It’s clear that AT&T is counting on HBO as the allure for its online offering. That product is available in a number of places on the web for a monthly rate of $15, so including that in the $50 and $70 product still distinguishes DirecTV Now from the other vMVPD providers.

What is clear from this move is that we are nearing the end of the period when companies are willing to eat huge losses to gain online market share. That market share is worthless if customers leave in droves when there is a rate increase. These big companies don’t seem to have fully grasped that there is zero customer loyalty online. Viewers don’t really care which underlying company carries their favorite programming – it’s the content they care about. The big cable companies have to break their long history of making decisions like near-monopolies.

Where’s the CAF II Success?

If you’ve read this blog you know I’ve been a big critic of the FCC’s CAF II program that gave over $10 billion in federal subsidies to the biggest telcos to improve rural broadband. My complaint is that the program set the embarrassingly low goal of improving rural broadband to speeds of at least 10/1 Mbps. This money could have done a huge amount of good had it been put up for reverse auction, as was done with the leftover customers from this program last year – many ISPs would have used this funding to help build rural fiber. Instead, the telcos are using the money mostly to upgrade DSL.

While I think the program was ill-conceived and was a giveaway to the big telco lobbyists, I am at least glad that it is improving rural broadband. For a household with no broadband, a 10 Mbps product might provide basic access to broadband services for the first time. We are now into the fifth year of the six-year program, so we ought to be seeing the results of these upgrades. USTelecom just published a blog saying that deployments are ahead of schedule and that CAF II is a quiet success.

The telcos have told the FCC they are largely on track – by the end of 2018 they should have upgraded broadband for at least 60% of the required households. AT&T and Windstream report that they have made at least 60% of the needed upgrades everywhere. Frontier says they are on track in 27 of the 29 states needing upgrades. CenturyLink says they are on track in only 23 of 33 states that are getting CAF II upgrades. According to USTelecom, over 2.1 million households should now be seeing faster speeds.

It’s also worth noting that the CAF II program should improve broadband for many more households that are not covered directly by the program. For example, when upgrading DSL for a CAF II area that surrounds a town, those living in the town should also see better broadband. The secondary benefit of the CAF program is that rural towns should be seeing speeds increasing from 6 Mbps or slower to as fast as 25 Mbps. By now many more millions of households should be seeing faster broadband due to CAF II.

What I find puzzling is that I would expect to see an upward burst of broadband customers for the big telcos because of CAF II. But the numbers aren’t showing that. There were four telcos that accepted more than $1 billion from the program, as follows, and three of them lost broadband customers in 2018:

                 CAF II Funding   Households   Per Household   2018 Broadband Customers
CenturyLink      $3.09 B           1,190,016   $2,593          (262,000)
AT&T             $2.96 B           1,265,036   $2,342          (18,000)
Frontier         $1.7 B              659,587   $2,578          (203,000)
Windstream       $1.07 B             413,345   $2,595          8,400
Total CAF II     $10.05 B          4,075,840   $2,467
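
The per-household column is simply each telco’s CAF II award divided by the number of households it is required to upgrade. A quick sanity check of the figures in the table (small differences reflect rounding in the published award totals):

```python
# Sanity check of the per-household column: CAF II funding / supported households.
# Dollar figures and household counts are the ones reported in the table above.
awards = {
    "CenturyLink": (3.09e9, 1_190_016),
    "AT&T":        (2.96e9, 1_265_036),
    "Frontier":    (1.70e9,   659_587),
    "Windstream":  (1.07e9,   413_345),
}

for telco, (funding, households) in awards.items():
    print(f"{telco:12s} ${funding / households:,.0f} per household")

# Prints roughly $2,597, $2,340, $2,577 and $2,589 - within rounding of the
# $2,593, $2,342, $2,578 and $2,595 shown in the table.
```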

Windstream is the only telco of the four that gained customers last year. Windstream’s footprint is probably the most rural of the four telcos. We know that every telco is losing the battle for customers in towns where cable companies are increasing speeds on coaxial networks. Windstream seems to be offsetting those losses, and I can conjecture it’s because they have been selling more rural broadband.

AT&T is in a category all by itself. It’s impossible to know how AT&T is faring with CAF II. They are largely implementing CAF II using their cellular network (with the goal of tearing down rural copper). The company has also been deploying fiber past millions of homes and businesses in urban areas. They are clearly losing the residential broadband battle in urban markets to companies like Comcast and Charter. However, I can tell you anecdotally that AT&T hasn’t given up on urban copper. They have knocked on my door in Asheville, NC at least three times in the last year trying to sell DSL. I have to assume that they are also marketing broadband improvements in rural areas.

CenturyLink and Frontier are clearly bleeding broadband customers and each lost over 200,000 customers just in the last year. I have to wonder how hard these companies are marketing improved rural broadband. Both companies work in urban and suburban markets but also in numerous county seats situated in rural counties. Like every telco they are losing DSL customers in these markets to the cable company competitors.

Just like I have anecdotal evidence that AT&T is still pushing copper I hear stories that say the opposite for CenturyLink and Frontier. I worked in a few rural counties last year where the CAF II upgrades were reported as complete. And yet the communities seemed unaware of the improvements. Local politicians who bear the brunt of complaints from households that want better broadband weren’t aware of any upgrades – which tells me their rural constituents weren’t aware of upgrades.

I honestly don’t know what this all means. I really expected to find more positive evidence of the impact of CAF II. From what I know of rural America, households ought to leap at the opportunity to buy 10/1 Mbps DSL if they’ve had no broadband in the past. Are the upgrades being done but not being followed up with a marketing and public awareness campaign? Are the actual upgraded speeds not meeting the 10/1 Mbps goal? Are the upgrades really being made as reported to the FCC? We’re perhaps a year and a half away from the completion of CAF II, so I guess we’ll find out soon enough.

Cord Cutting is For Real

It’s obvious in looking at the performance of cable companies in 2018 that cord cutting is now for real. The fourth quarter count of cable customers for the largest providers was recently reported by the Leichtman Research Group. These companies represent roughly 95% of the national cable market.

               4Q 2018       4Q 2017       Change         % Change
Comcast        21,986,000    22,357,000    (371,000)      -1.7%
DirecTV        19,222,000    20,458,000    (1,236,000)    -6.0%
Charter        16,606,000    16,850,000    (244,000)      -1.4%
Dish            9,905,000    11,030,000    (1,125,000)    -10.2%
Verizon         4,451,000     4,619,000    (168,000)      -3.6%
Cox             4,015,000     4,130,000    (115,000)      -2.8%
AT&T            3,704,000     3,657,000    47,000         1.3%
Altice          3,307,500     3,405,500    (98,000)       -2.9%
Frontier          838,000       961,000    (123,000)      -12.8%
Mediacom          776,000       821,000    (45,000)       -5.5%
Cable ONE         326,423       363,888    (37,465)       -10.3%
Total          85,136,923    88,652,388    (3,515,465)    -4.0%
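
The change and percentage columns follow directly from the two quarterly counts. A minimal check using a few of the rows above:

```python
# Change = 4Q2018 minus 4Q2017; percent = change divided by the 4Q2017 base.
# Subscriber counts are the ones reported in the table.
rows = {
    "Comcast": (21_986_000, 22_357_000),
    "Dish":    ( 9_905_000, 11_030_000),
    "Total":   (85_136_923, 88_652_388),
}

for name, (q4_2018, q4_2017) in rows.items():
    change = q4_2018 - q4_2017
    print(f"{name:8s} {change:+,} ({change / q4_2017:+.1%})")

# Comcast -371,000 (-1.7%), Dish -1,125,000 (-10.2%), Total -3,515,465 (-4.0%)
```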

I’m thinking back to 2017 when most analysts were predicting perhaps a 2% drop in 2018 in total market share due to cord cutting. Since 2018 is only the second year with real evidence of cord cutting, the 4% loss of total market share demonstrates big changes in customer sentiment.

The big losers are the satellite companies which lost 2,361,000 customers in 2018. These losses are offset a little bit since the satellite companies also have the largest online video services. Dish’s Sling TV added 205,000 customers in 2018 and AT&T’s DirecTV Now added 436,000 – but the net customer loss for these companies is still 1.7 million for the year.

In 2018 Comcast and Charter didn’t fare as poorly as the rest of the industry. However, their smaller loss of cable customers is probably due to the fact that both companies saw more than 5% growth in broadband customers (2.6 million in total) in 2018, and those new customers undoubtedly are masking cord-cutting losses among longer-standing subscribers.

It’s still too early to make any real predictions about the future trajectory for cord cutting. We know that price is a large factor in cord cutting and cable providers are still facing huge price increases in buying programming. That will continue to drive cable prices higher. The big cable companies have done their best to disguise recent price increases by shoving rate increases into local programming or sports programming ‘fees’. However, the public is catching onto that scheme and also can still see that their overall monthly payments are increasing.

It’s starting to look like online programming might cost as much as traditional cable TV. For the last few years there have been alternatives like DirecTV Now, Playstation Vue and Sling TV that have offered the most-watched networks for bargain prices. But the recent big rate increase from DirecTV Now is probably signaling that the days of subsidized online programming are over.

Further, the online programming world continues to splinter as each owner of programming rolls out their own online products. The cost of replacing what people most want to watch online might soon be higher even than traditional cable TV if it requires separate subscriptions to Disney, CBS, NBC and the many other new standalone packages that a cord cutter must cobble together. A family that really wants to save money on TV has to settle for some subset of the online alternatives, and the big question will be if households are willing to do that.

But at least for now it looks like cord cutting is roaring ahead. The average loss of traditional cable customers in 2018 was almost 300,000 per month, and the rate of loss is accelerating. At least for now, the industry is seeing a rout, and that has to be scaring board rooms everywhere.

Predicting 5G CAPEX

If you ever want a bad headache, spend a few hours researching predictions about the future trajectory of capital spending by the big players in the telecom industry. It’s a topic worth following since the big ISPs all said that eliminating net neutrality and other regulation would unleash them to spend lavishly on new networks.

I was looking through projections over the past year that were forecasting capital spending for 2019. My main motivation in looking at these projections was to see if the big companies are actually planning on spending money yet on 5G. I figured the best way to get past all of the 5G hype is to follow the advice from the movie All the President’s Men, and “Follow the money”.

I started by looking at projections from the beginning of 2018. The headlines at that time centered around the big benefits to the industry from the Tax Cuts and Jobs Act passed in December 2017. That legislation created an annual benefit of $2.2 billion for AT&T and $4 billion for Verizon. At the beginning of 2018 industry analysts predicted the companies would roll those savings into increased capital spending. However, as with most big corporations, those savings were not rolled back into the business.

There were rosy predictions at the start of last year about 2018 and 2019 capital spending. For example, the analysts at Deutsche Bank Research said in February 2018 that capital spending by the wireless carriers would increase by 14% in 2018 and even more into the future. It’s not hard to understand the enthusiasm of the analysts because the carriers were fueling this story. Early in 2018 Verizon said they would be investing $35 billion in 5G and AT&T said they would invest $40 billion.

This enthusiasm was fueled all last year by promises from AT&T and Verizon to roll out 5G by the end of 2018. In June the analysts at Oppenheimer raised their forecasts of 2019 capital spending to $18 billion for Verizon and $25 billion for AT&T. However, at the end of last year we saw that the 5G announcements had been nothing but hype, when AT&T announced an imaginary 5G product and Verizon installed fixed wireless in only a few hundred homes.

As recently as the fourth quarter of last year a number of analysts were still predicting greater capital spending for wireless this year compared to 2018. For example, MoffettNathanson LLC predicted capital spending would be up 3.3% in 2019. Most other analysts made similar projections.

As the books closed for 2018 it became obvious that the big wireless companies hadn’t spent nearly as much as expected for the year. For example, Verizon’s actual spending was $1.5 billion less than their own initial projections. AT&T came in $3 billion less than projected. When real spending materializes you start to understand the complexity of these budgets. For example, AT&T said that part of the reason for lower capital spending was delays in the deployment of FirstNet, the nationwide public safety network. I sit here wondering why FirstNet was even included in their capital budget since it’s not being funded from AT&T’s own revenues, but 100% by taxpayers.

By the time I got to 2019 the picture became incredibly muddled. There are still those predicting 2% to 3% more capital spending for 2019. But we also see the big carriers admitting to their investors that there will be little spending on 5G this year. The first big capital expenditure for 5G will be for the core electronics for 5G cell sites, called the RAN (radio access network). It doesn’t look like a 5G RAN will be available for a few more years. Both cellular carriers admit that they are not spending much on 4G LTE infrastructure other than working on cell site densification in urban areas through the deployment of small cell sites aimed at relieving pressure on the big tower cell sites.

I’ll be honest that all I got out of this reading was the headache, because I still have no idea how much money these big carriers will spend this year. This is probably not abnormal for an industry in so much flux, and I imagine decisions are still being made inside these companies every day that will change capital spending even this year. The one thing I came away with was a clear picture that there will be very little spending on 5G in 2019, which means the carriers’ announcements that they will have 5G cellular products by 2020 are clearly still hype.

But there are still those in the industry with rosy predictions. The research firm IDC predicts that spending on 5G core equipment will increase worldwide from $500 million this year to $26 billion per year in 2022. Ericsson is predicting that 5G will account for 50% of mobile subscriptions in the US by 2023 along with a worldwide penetration at 20% that year. That seems to be in conflict with Cisco which recently predicted worldwide 5G penetration of 3% by the end of 2022. I have no idea which of these predictions is right, but I now know that we can’t put any faith in predictions about 5G spending or deployment, so perhaps all of this reading was not in vain.