Immersive Virtual Reality

In case you haven’t noticed, virtual reality has moved from headsets to the mall. At least two companies now offer an immersive virtual reality experience that goes far beyond what can be experienced with only a VR headset at home.

The newest company is Dreamscape Immersive, which has launched virtual reality studios in Los Angeles and Dallas, with more outlets planned. The virtual reality experience is enhanced by the use of a headset, hand and foot trackers, and a backpack holding the computer. The action occurs within a 16-foot by 16-foot room with a vibrating haptic floor that responds to the actions of the participant. This all adds up to an experience where a user can reach out and touch objects or can walk around all sides of a virtual object in the environment.

The company has launched with three separate adventures, each lasting roughly 15 minutes. In Alien Zoo the user visits a zoo populated by exotic and endangered animals from around the galaxy. In The Blu: Deep Rescue users try to help reunite a lost whale with its family. The Curse of the Lost Pearl feels like an Indiana Jones adventure where the user tries to find a lost pearl.

More established is The Void, which has launched virtual reality adventure sites in sixteen cities, with more planned. The company is creating virtual reality settings based upon familiar content. The company’s first VR experience was based on Ghostbusters. The current theme is Star Wars: Secrets of the Empire.

The Void lets users wander through a virtual reality world. The company constructs elaborate sets where the walls and locations of objects in the real-life set correspond to what is being seen in the virtual reality world. This provides users with real tactile feedback that enhances the virtual reality experience.

You might be wondering what these two companies and their virtual reality worlds have to do with broadband. I think they provide a peek at what virtual reality in the home might become in a decade. Anybody who’s followed the growth of video games can remember how the games started in arcades before they were shrunk to a format that would work in homes. I think the virtual reality experiences of these two companies are a precursor to the virtual reality we’ll be having at home in the not-too-distant future.

There is already a robust virtual reality gaming industry, but it relies entirely on providing a virtual reality experience through the use of goggles. There are now many brands of headsets on the market, ranging from the simple cardboard headset from Google to more expensive headsets like the Oculus Rift and models from Nintendo, Sony, HTC, and Lenovo. If you want to spend an interesting half hour, you can see the current most popular virtual reality games in this review from PCGamer. To a large degree, virtual reality gaming has been modeled on existing traditional video games, although there are some interesting VR games that are now offering content that only makes sense in 3D.

The whole video game market is in the process of moving content online, with the core processing of the gaming experience done in data centers. While most games are still available in more traditional formats, gamers are increasingly connecting to a gaming cloud and need a broadband connection akin in size to a 4K video stream. Historically, many games have been downloaded, causing headaches for gamers with data caps. Playing the games in the cloud can still chew up a lot of bandwidth for active gamers but avoids the giant gigabyte downloads.
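To put the cloud gaming bandwidth discussion in rough numbers, here's a back-of-envelope sketch. The 25 Mbps figure for a 4K-like stream and the two hours of daily play are my assumptions for illustration, not numbers from the article:

```python
# Back-of-envelope: data consumed by cloud gaming at a 4K-like bitrate.
# The 25 Mbps stream rate and 2 hours/day of play are assumptions.
def gb_per_hour(mbps: float) -> float:
    """Convert a sustained stream rate in megabits/second to gigabytes/hour."""
    return mbps * 3600 / 8 / 1000  # megabits -> megabytes -> gigabytes

hourly = gb_per_hour(25)     # ~11.25 GB for each hour of play
monthly = hourly * 2 * 30    # two hours a day over a month
print(f"{hourly:.2f} GB/hour, ~{monthly:.0f} GB/month")
```

At those assumed rates, an active cloud gamer would blow through a typical 1 TB data cap in a couple of months of heavy play, which is why the shift matters for broadband networks.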

If history is a teacher, the technologies used by these two companies will eventually migrate to homes. We saw this migration occur with first-generation video games – there were video arcades in nearly every town, but within a decade those arcades got displaced by the gaming boxes in the home that delivered the same content.

When the kind of games offered by The Void and Dreamscape Immersive reach the home they will ramp up the need for home broadband. It’s not hard to imagine immersive virtual reality needing 100 Mbps speeds or greater for one data stream. These games are the first step towards eventually having something resembling a home holodeck – each new generation of gaming is growing in sophistication and the need for more bandwidth.

Aesthetics and 5G

A recent news article by CBS4 in Denver shows a power supply unit for 5G that was recently installed in Aurora, CO. It's roughly five feet tall, and I venture to guess that most homeowners would not want this device at the front of their home.

The cellular companies have convinced the FCC that they need carte blanche authority to place small cell sites where they are needed, and the FCC gave them this authority in September 2018. The FCC order reversed the historic process where cell site placement was under local control. In asking for a national preemption of local rules the cellular carriers argued that they needed blanket authority to put cell sites anywhere in public rights-of-way if the US is to win the 5G war.

Communities all over the country have pushed back hard against the FCC ruling. Numerous cities and states have filed lawsuits against the FCC ruling. Courts have chipped away at that ruling and in August of this year, the US Court of Appeals for the D.C. Circuit ruled that the FCC couldn’t preempt local ordinances concerning environmental and historic preservation reviews of cell site placement. A few cities have passed ordinances that would stop deployment of small cells due to concerns about health, property values, or aesthetics.

When the wireless companies first started deploying pole-mounted small cell sites, some of the deployments were major eyesores. Deployments included large boxes, antennas, and power supplies hung in the air and connected by a maze of live wires. The wireless carriers quickly cleaned up their act in terms of hideous deployments, but looking at the deployment in Aurora, they still have a way to go. One interesting thing about this deployment is that the device sits on the ground. When the FCC order was issued, the press covered it as an order about placing devices on poles and missed that the FCC gave the big carriers the right to put devices anywhere in the public right-of-way.

Historically, carriers would seek homeowner permission to install cabinet-sized boxes. More often than not they would find a place in a neighborhood where the cabinets and boxes were somewhat hidden from sight. Even though the process required voluntary participation by homeowners, it worked well. Sometimes carriers had to go to the city when they were unable to find a location for a needed cabinet, but in most cases, the carriers and the public worked out a solution.

It seems unfair that the first time that a homeowner finds they are getting a large cabinet in their yard is during the installation process. Just because carriers have the right to place anything related to small cells in the right-of-way doesn’t mean they should callously do so without communicating with the public. In this case, the wireless carrier probably had alternatives like placing the needed electronics in an underground vault instead of the large cabinet. That solution would cost more but would eliminate animosity with residents.

That raises an interesting regulatory question. In the long run, regulations are driven by what the public finds acceptable or unacceptable. The public in Aurora is not likely to be upset by this one small cell deployment, but imagine if there are 200, 500, or 1,000 identical cabinets placed around the city. When carriers deploy solutions that the public doesn't like, a city is going to fight back against the unpopular practices. New ordinances for small cells are likely to end up in court, and at some point, a judge will decide if the Aurora small cell device somehow crosses the line.

The FCC 5G order is interesting in that it swings to the far extreme of the regulatory pendulum by ruling that the wireless carriers have blanket authority to place any device anywhere they want. Over time, whether done by a future FCC, by the courts, or by Congress, rulings at the extreme fringe of the regulatory pendulum inevitably swing back towards the center. It’s almost inevitable over time that cities will get back more say about the aesthetics of small cell placement.

Counting Gigabit Households

I ran across a website called the Gigabit Monitor that is tracking the population worldwide that has access to gigabit broadband. The website is sponsored by VIAVI Solutions, a manufacturer of network test equipment.

The website claims that in the US over 68.5 million people have access to gigabit broadband, or 21% of the population. That number gets sketchy when you look at the details. The claimed 68.5 million people includes 40.3 million served by fiber, 27.2 million served by cable company HFC networks, 822,000 served by cellular and 233,000 served by WiFi.
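As a quick sanity check, the component numbers do at least sum to roughly the claimed total; the 2019 US population of about 328 million is my assumption for checking the percentage:

```python
# Sanity check on the Gigabit Monitor components. The component figures are
# from the article; the ~328M US population estimate is my assumption.
components = {
    "fiber": 40_300_000,
    "cable HFC": 27_200_000,
    "cellular": 822_000,
    "WiFi": 233_000,
}
total = sum(components.values())       # ~68.5 million, as claimed
us_population = 328_000_000
print(total, f"{total / us_population:.1%}")
```

So the arithmetic holds together internally; it's the underlying component numbers that are suspect, as discussed below.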

Each of those numbers is highly suspect. For example, the fiber numbers don't include Verizon FiOS or the FiOS properties sold to Frontier. Technically that's correct since most FiOS customers can buy maximum broadband speeds in the range of 800-900 Mbps. But there can't be 40 million people outside of FiOS who can buy gigabit broadband from other fiber providers. I'm also puzzled by the cellular and WiFi categories and can't imagine there is anybody who can buy gigabit products of either type.

VIAVI makes similar odd claims for the rest of the world. For example, they say that China has 61.5 million people that can get gigabit service. But that number includes 12.3 million on cellular and 6.2 million on WiFi.

Finally, the website lists the carriers that they believe offer gigabit speeds. I have numerous clients that own FTTH networks that are not listed, and I stopped counting when I counted 15 of my clients that are not on the list.

It's clear this website is flawed and doesn't accurately count gigabit-capable people. However, it raises the question of how to count the number of people who have access to gigabit service. Unfortunately, the only way to do that today is by accepting claims made by ISPs. We've already seen with the FCC broadband maps how unreliable the ISPs are when reporting broadband capabilities.

As I think about each broadband technology there are challenges in defining gigabit-capable customers. The Verizon situation is a great example. It’s not a gigabit product if an ISP caps broadband speeds at something lower than a gigabit – even if the technology can support a gigabit.

There are challenges in counting gigabit-capable customers on cable company networks as well. The cable companies are smart to market all of their products as ‘up to’ speeds because of the shared nature of their networks. The customers in a given neighborhood node share bandwidth and the speeds can drop when the network gets busy. Can you count a household as gigabit-capable if they can only get gigabit speeds at 4:00 AM but get something slower during the evening hours?
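The shared-node arithmetic behind 'up to' speeds can be sketched as below; the node capacity and subscriber counts are hypothetical, purely to illustrate why speeds vary by time of day:

```python
# Illustrative only: why an 'up to' gigabit product on a shared HFC node
# depends on the time of day. The ~2 Gbps downstream node capacity and
# the active-user counts are hypothetical numbers, not from the article.
def per_user_mbps(node_capacity_mbps: float, active_users: int) -> float:
    """Even split of a node's downstream capacity among active users."""
    return node_capacity_mbps / max(active_users, 1)

node_capacity = 2_000  # assume a ~2 Gbps downstream node
print(per_user_mbps(node_capacity, 2))   # 4:00 AM, few users active
print(per_user_mbps(node_capacity, 40))  # busy evening hour
```

Real cable networks don't split capacity this evenly, but the basic trade-off is the same: the advertised speed is only achievable when the node is lightly loaded.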

It's going to get even harder to count gigabit capability when there are reliable cellular networks using millimeter wave spectrum. That spectrum is only going to be able to achieve gigabit speeds outdoors when in direct line-of-sight from a nearby cell site. Can you count a technology as gigabit-capable when the service only works outdoors and drops when walking into a building or walking a few hundred feet away from a cell site?

It’s also hard to know how to count apartment buildings. There are a few technologies being used today in the US that bring gigabit speeds to the front of an apartment building. However, by the time that the broadband suffers packet losses due to inside wiring and is diluted by sharing among multiple apartments, nobody gets a true gigabit product. But ISPs routinely count them as gigabit customers.

There is also the issue of how to not double-count households that can get gigabit speeds from multiple ISPs. There are urban markets with fiber providers like Google Fiber, Sonic, US Internet, EPB Chattanooga, and others where customers can buy gigabit broadband on fiber and also from the cable company. There are even a few lucky customers in places like Austin, Texas and the research triangle in North Carolina where some homes have three choices of gigabit networks after the telco (AT&T) also built fiber.

I’m not sure we need to put much energy into accurately counting gigabit-capable customers. I think everybody would agree an 850 to 950 Mbps connection on Verizon FiOS is blazingly fast. Certainly, a customer getting over 800 Mbps from a cable company has tremendous broadband capability. Technically such connections are not gigabit connections, but the difference between a gigabit connection and a near-gigabit connection for a household is so negligible as to not practically matter.

It’s All About the Collateral

I’m often asked if a business plan is solid enough to take to the bank for financing. I disappoint a lot of folks when I tell them that, while a solid business plan is important, getting loans is all about the collateral.

Banks are not in the business of understanding your business. They don't know how to evaluate a broadband business plan. It's important to understand that in a given week a bank might be offered your broadband business plan, a plan to roll out a dozen yogurt stores, a plan to combine several farms, and a plan to start a new brewery. They can't begin to be able to understand the nuances of the many business plans they see.

It’s very easy to become too invested in your business plan. I often hear people describing their business plan as ‘can’t fail’. I can usually demonstrate that this is not so by changing a few of their key assumptions. It’s the rare broadband business plan that can’t be worsened by lowering the customer penetration rate, slowing down the speed of sales, or increasing the interest rate on debt.

Banks understand this. Every bank has a portfolio of failed projects where the bank lost a lot of their loan investment even though a project looked solid. Banks are skeptics by nature because they deal all day with prospective borrowers who are convinced that they are bringing a no-fail project. If a loan is large enough, a bank might hire an expert like me to check the assumptions in a business plan to help to identify the most sensitive variables. However, even with expert advice, a bank is still going to assume that a business can fail.

That’s why I say that the most important thing is collateral. Collateral represents the ability of the bank to recover some of their funds should a project fail. The stronger the collateral, the easier it is for banks to make the loan.

There are various types of collateral. The best collateral is a payment guarantee that kicks in even should a project be a total bust. This is the reason why municipal bonds that are backed by tax revenues can get lower interest rates. If a city builds a fiber network, a golf course, or an arena and the expected revenues don’t materialize, a tax-backed loan requires the city to raise taxes to make the bond payments.

Many new ISPs become familiar with the idea of collateral when banks ask them for a personal guarantee, meaning a borrower must pledge their home and savings as back-up for a project. That guarantee is rarely as powerful as tax-collateral, but it improves a borrower’s chance of getting the loan.

Established ISPs also face loan guarantees. If a telco wants to undertake a large new fiber project, they generally end up pledging their entire existing company to get the new loan. Communities often wonder why existing ISPs don't expand faster, and more often than not it's because they've already used up all of their collateral on existing loans. Just like households, every business has a natural lending cap beyond which no bank will lend them more.

Banks do consider issues other than collateral. For example, a bank might consider track record when lending to an ISP that has been successful many times in the past – and that track record might lower the needed collateral. Banks love grants, but love owner equity even more since it means the owner has skin in the game.

Occasionally I see a new fiber venture that gets funded when it probably shouldn’t. There are local banks that lend to a local fiber project because they think their community needs fiber to thrive and survive. A bank in that situation is putting themselves on the line since they see their survival tied to the survival of the community.

The bottom line is that a project without collateral is not easily bankable. Unsophisticated borrowers think the numbers in their business plan tell a bank all they need to know. The truth is that the business plan is several items down the checklist for a bank, with collateral at the top of the list.

Our Digital Illiteracy

Pew Research Center recently surveyed 4,272 adults and tested their knowledge of basic computer topics. The results showed that there was a lack of general knowledge about a few of the terms that are important for how people use the Internet.

For example, the survey showed that only 30% of survey takers knew that a website address starting with https:// means that the information exchanged with that site is encrypted.

Only 28% of respondents understood the concept of two-factor authentication – something that Google and Microsoft say can block nearly 100% of attempts to hack an account.
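For the technically curious, the most common form of two-factor code – the six-digit code from an authenticator app – is a time-based HMAC, standardized as TOTP in RFC 6238. A minimal sketch using only the Python standard library:

```python
import base64
import hmac
import struct

def totp(secret_b32: str, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) using HMAC-SHA1."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", unix_time // step)  # 8-byte big-endian counter
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T = 59
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", 59, digits=8))  # 94287082
```

Because the code is derived from a shared secret plus the current time, an attacker who steals only the password still can't log in – which is why the providers claim such high effectiveness.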

Only 24% understood the purpose of private browsing.

The respondents fared better on a few topics. For example, two-thirds of respondents understood the danger of phishing, but it's a bit scary that one out of three users didn't. 63% understood that cookies allow websites to track user visits and other activities on those sites.

48% of respondents understood the concept of net neutrality – the technology topic that has gotten the most press over the last four years.

A few of the questions were a bit smug. Only 15% of people could identify a picture of Jack Dorsey, the founder of Twitter. I have to admit that this is a question I would also have failed because I don’t much care about the personalities of the people behind web companies – even though I follow the issues involving these companies closely.

It's probably not surprising that younger users did better on the survey questions than older users. It's still a bit shocking, though, that only 1% of survey takers got every question right.

The bottom line of this survey is that the general public probably has a much lower knowledge of the Internet than many web companies and ISPs assume. I think this survey highlights an opportunity for small ISPs to educate customers by passing on safety tips or important knowledge about the web.

ISPs communicate with users on log-in pages, when billing, and on their websites. It wouldn't be hard to add some recurring messages such as: "Did you know that websites that start with https use an encrypted connection with users and provide for a safer connection?" Experienced web users will blow past such messages, but we know that repeated messages eventually make an impression on most people.

It's easy for technical folks to assume that the public understands basic concepts about the web – but surveys like this one remind us that's not necessarily true.

Why I am Thankful – 2019

It’s Thanksgiving again and I pause every year to look at the positive events and trends for the small ISP industry. I found a number of things to be thankful for at the end of 2019.

FCC Finally Admits Its Maps Suck. The FCC has begrudgingly admitted that its broadband mapping sucks and is considering several proposals for improving the mapping. It looks like the proposals will fix the ‘edge’ problem, where today rural customers that live close to cities and towns are lumped in with the broadband available in those places. Sadly, I don’t believe there will ever be a good way to measure and map rural DSL and fixed wireless. But fixing the edge problem will be a great improvement.

FCC Released the CBRS Spectrum. The 3.5 GHz CBRS (Citizens Broadband Radio Service) spectrum should provide a boost to rural fixed broadband. There are some restrictions where there is existing government use, and there will be frequency-sharing rules, so the spectrum is not fully unrestricted. Still, the 80 MHz of free spectrum should prove to be powerful in many parts of the country. The FCC is considering other frequencies like white space, C-Band, and 6 GHz that will also be a benefit to rural broadband.

States Are Reversing a Few Draconian Laws. Several states have removed barriers for electric cooperatives to get into the broadband business. Arkansas softened a prohibition against municipal broadband. Local politicians are now telling state legislators that broadband is the top priority in communities that don’t have access to good broadband. It’s been obvious for a long time that the best solutions to fix rural broadband are local – it makes no sense to restrict any entity that wants to invest in rural broadband.

The FCC Has Made it Easier for Indian Tribes to Provide Broadband. Various rule changes have streamlined the process of building and owning broadband infrastructure on tribal lands. Many tribes are exploring their options.

Local Broadband Activists Make a Difference. It seems like every community I visit now has a local broadband committee or group that is pushing local politicians to find a solution for poor broadband coverage. These folks make a difference and are prodding local governments to get serious about finding broadband solutions.

The FCC Announces a Monstrous Grant Program. I hope the RDOF grants that will award over $16 billion next year will make a real dent in the rural digital divide. Ideally, a lot of the grants will fund rural fiber, since any community with fiber has achieved a long-term broadband solution. However, I worry that much of the funding could go to slower technologies, or even to the satellite companies – so we’ll have to wait and see what happens in a massive reverse auction.

States Take the Lead on Net Neutrality. When the US Appeals Court ruled that the FCC had the authority to undo net neutrality, the court also ruled that states have the authority to step into that regulatory void. Numerous states have enacted some version of net neutrality, and California and Washington have enacted laws as comprehensive as the old FCC rules. My guess is that at some point the big ISPs will decide that they would rather have one set of federal net neutrality rules than a host of different state ones.

The Proliferation of Online Programming. The riches of programming available online is amazing. I’m a Maryland sports fan and there are only three basketball or football games that I can’t watch this season even though I don’t live in the Maryland market. I don’t understand why there aren’t more cord cutters because there is far more entertainment available online than anybody can possibly watch. A decade ago, I didn’t even own a TV because there was nothing worth watching – today I keep a wish list of programming to watch later.

NC Broadband Matters. Finally, I’m thankful for NC Broadband Matters. This is a non-profit in North Carolina that is working to bring broadband to communities that don’t have it today. The group invited me to join their Board this year and I look forward to working with this talented group of dedicated folks to help find rural broadband solutions in the state.

T-Mobile Offering Broadband Solutions

As part of the push to get approval for the proposed merger with Sprint, T-Mobile pledged that it will offer low-cost data plans, give free 5G to first responders and provide free broadband access to underserved households with school students. These offers are all dependent upon regulators and the states approving the merger.

The low-price broadband plans might be attractive to those who don't use a lot of cellular data. The lowest-price plan offers 2 GB of data for $15 monthly. The price is guaranteed for 5 years and the data cap grows by 500 MB per year to reach 4 GB in the fifth year. The second plan offers 5 GB for $25 and also grows by 500 MB per year to reach 7 GB by the fifth year. I assume adding voice and texting is extra.
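The growth schedule, as I read the announcement, is simple enough to check with a few lines of arithmetic:

```python
# Quick check of the T-Mobile plan growth schedule as described above:
# the cap grows by 500 MB (0.5 GB) each year after the first.
def data_cap_gb(starting_gb: float, year: int, growth_gb: float = 0.5) -> float:
    """Data cap in year N (1-based) of the five-year price guarantee."""
    return starting_gb + growth_gb * (year - 1)

for year in range(1, 6):
    print(year, data_cap_gb(2, year), data_cap_gb(5, year))
```

The $15 plan lands at 4 GB and the $25 plan at 7 GB in year five, matching the company's figures.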

The offer of free service for first responders is just that: T-Mobile will offer free voice, texting, and data to first responders for 10 years. There will be no throttling of data, and first responder data will always get priority. The company estimates that this would save first responders $7.7 billion nationwide over the ten years if they all switch to T-Mobile. Not surprisingly, the other carriers are already unhappy with this offer, particularly AT&T, which is busy building the nationwide FirstNet first responder network. This may be a somewhat hollow offer since the FirstNet network has some major advantages, such as automatically interconnecting responders from different jurisdictions. But at least some local governments are going to be attracted to free cellular service.

The offer for school students is intriguing. For the next five years, the company is offering 100 GB per month of downloaded data to eligible student households. The company will also provide a free WiFi hotspot that converts the cellular data into WiFi for home use. T-Mobile estimates that roughly 10 million households would be eligible. Studies have shown that cost is the reason that many homes with students don’t have home broadband. In urban areas, the T-Mobile effort could largely eliminate the homework gap, at least for five years. That would give the country five years to find a more permanent solution. While T-Mobile would also help in rural America, many rural homes are not in range of a T-Mobile tower capable of delivering enough broadband to be meaningful. However, in many cases, this offer would be bringing broadband for homework to homes with no other broadband alternatives.

If the merger goes through, T-Mobile plans to deploy the big inventory of 2.5 GHz spectrum owned by Sprint as well as activating 600 MHz spectrum. These are interesting spectrum bands, particularly the 600 MHz. This spectrum is great at penetrating buildings and can reach deep into most buildings. The spectrum also carries far, up to 10 miles from a transmitter. However, compared to higher frequencies, the 600 MHz spectrum won't carry as much data. Further, data speeds decrease with distance from a cell site, and the data speeds past a few miles are likely to be pretty slow.

This plan makes me wonder how allowing millions of students onto the cellular network for homework will affect cell sites. Will some cell sites bog down when kids are all connected to the school networks to do homework?

I further wonder if the promise to offer free broadband to students also comes with a promise to supply enough backhaul bandwidth to poor neighborhoods to support the busy networks. Without good backhaul, the free bandwidth might be unusable at peak hours. I don’t mean to denigrate an offer that might mean a broadband solution for millions of kids – but I’ve also learned over the years that free doesn’t always mean good.

I’ve seen where a few states like New York are still against the merger, so there is no guarantee it’s going to happen. It sounds like the courts will have to decide. I suspect these offers will be withdrawn if the decision is made by courts rather than by the states.

C-Band Announcement Silent on Rural Wireless

On November 18, FCC Chairman Ajit Pai told several members of Congress that he had decided there should be a public auction for the C-Band spectrum that sits between 3.7 GHz and 4.2 GHz. The spectrum has historically been used by satellite companies for communication between satellites and earth stations. This is prime spectrum for 5G cellular broadband, but also could provide a huge benefit to fixed wireless providers in rural America. Chairman Pai will be asking the rest of the FCC commissioners to approve an order sometime after the first of next year. Making an early announcement is a bit unusual since major orders like this are usually announced by releasing a written order that comes after a vote of the Commission.

The letters from Chairman Pai describe four reasons behind the decision: "First, we must make available a significant amount of C-Band spectrum for 5G. Second, we must make spectrum available for 5G quickly. Third, we must generate revenue for the federal government. And fourth, we must protect the services that are currently delivered using the C-Band so that they can continue to be delivered to the American people."

Missing from Chairman Pai's letter was any mention of making the C-Band spectrum available for rural fixed wireless. WISPA and other rural proponents have been lobbying for sharing the spectrum so that the C-Band could be used for urban 5G while also enabling faster rural broadband.

This has been an unusual docket from the start because the satellite providers, under the name of the C-Band Alliance (CBA), offered to relocate to the higher part of the spectrum if they could hold a private auction to sell the vacated spectrum to the cellular carriers. There were several problems with that offer. First, the satellite providers would make billions of dollars of windfall profits through selling spectrum that they don't own. Federal law makes it clear that the FCC has the right to award or take back spectrum, and it would have been a major precedent for license holders to be able to sell spectrum for a huge profit. There were also obvious concerns about transparency, and it was feared that backroom deals would be struck to give spectrum to the big cellular carriers for bargain prices while still benefitting the satellite companies.

There was also a political nuance. The CBA proposed to give some of the proceeds of the private auction to the federal government, similar to what happens in an FCC auction. However, money contributed that way would go towards paying down the federal deficit, while the proceeds of FCC auctions can be earmarked for specific uses. Legislators all wanted to see the spectrum sold by FCC auction so that they could direct some of the money.

The rural spectrum-sharing idea might not be dead since the announcement was made in a short letter rather than a formal order. However, the Chairman could easily have mentioned rural broadband in the letters to legislators and didn't. The Chairman has made numerous speeches where he said that solving the rural digital divide is his primary goal. It's clear from his actions during the last few years that deregulation and giveaways to the big carriers under the guise of promoting 5G are the real priority of this FCC.

The C-Band spectrum sits next to the recently released CBRS spectrum at 3.5 GHz. Just as additional spectrum benefits 5G, fixed wireless technology improves significantly by combining multiple bands of frequency. Rural carriers have been arguing for years that the FCC should allow for the sharing of spectrum. Proponents of rural broadband argue that urban and rural use of spectrum can coexist since most 5G spectrum is only going to be needed in urban areas. They believe that such spectrum can be used in a point-to-point or point-to-multipoint configuration in rural America without interfering with urban 5G. The big cellular carriers are reluctant to share spectrum because it causes them extra effort, so only the FCC can make it happen.

If the final order doesn’t require frequency sharing, it will be another slap in the face for rural broadband. Since there is not yet a written order, proponents of rural broadband still have an opportunity to be heard at the FCC on the topic. However, I fear that the issue has already been decided and that rural broadband will again be ignored by the FCC.

Broadband Still Growing – 3Q 2019

Leichtman Research Group recently released the broadband customer statistics for the third quarter of 2019 for the largest cable and telephone companies. Leichtman compiles most of these numbers from the statistics the companies provide to stockholders, except for Cox, which is estimated.

The numbers provided to investors are lower than the broadband customer counts these same companies report to the FCC, and I think most of the difference is due to the way many of these companies count broadband to apartment buildings. If they provide a gigabit pipe to serve an apartment building, they might count that as one customer, whereas for FCC reporting they likely count the number of apartment units served.

Following are the broadband customer counts for the third quarter and a comparison to the second quarter of this year.

                       3Q 2019      Added    % Change
Comcast             28,186,000    379,000       1.4%
Charter             26,325,000    380,000       1.5%
AT&T                15,575,000   (123,000)     -0.8%
Verizon              6,961,000     (7,000)     -0.1%
Cox                  5,145,000     25,000       0.5%
CenturyLink          4,714,000    (36,000)     -0.8%
Altice               4,180,300     14,900       0.4%
Frontier             3,555,000    (71,000)     -2.0%
Mediacom             1,316,000     13,000       1.0%
Windstream           1,040,000      5,700       0.6%
Consolidated           784,151      1,143       0.1%
WOW                    773,900     10,420       1.3%
Cable ONE              689,138      7,376       1.1%
Atlantic Broadband     446,137      2,441       0.6%
TDS                    437,700      4,300       1.0%
Cincinnati Bell        425,100       (400)     -0.1%
Total              100,553,426    605,660       0.6%

Leichtman says this group of companies represents 96% of all US broadband customers. I’m not sure how they calculated that percentage. That implies that there are only about 4 million broadband customers for companies not on this list, and that feels a little low to me.
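The arithmetic behind that implied remainder is easy to check. In this sketch, the customer total and the 96% figure come from the text above; everything else is derived:

```python
# Leichtman's listed companies and their claimed share of the US market.
listed = 100_553_426
share = 0.96

# Implied size of the whole US broadband market, and the customers
# belonging to companies that didn't make the list.
total_market = listed / share
unlisted = total_market - listed

print(f"Implied US total: {total_market:,.0f}")
print(f"Customers at unlisted companies: {unlisted:,.0f}")
```

The remainder works out to roughly 4.2 million customers, which is the "about 4 million" figure in the text.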

For the quarter, these companies collectively saw growth that annualizes to 2.4%. This is a significant uptick over the second quarter of 2019 that saw an annualized growth rate of 1.7%.
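As a sketch of how the annualized figure falls out of the quarterly numbers: the net adds and end-of-quarter total come from the table above, while the compounding convention is my assumption (Leichtman may simply multiply the quarterly rate by four, which lands in nearly the same place):

```python
# Net customer adds and end-of-quarter total from the 3Q 2019 table.
added = 605_660
end_of_quarter = 100_553_426

# Growth rate for the single quarter, measured against the starting base.
start_of_quarter = end_of_quarter - added
quarterly_rate = added / start_of_quarter

# Compound that rate over four quarters to annualize it.
annualized = (1 + quarterly_rate) ** 4 - 1

print(f"Quarterly growth: {quarterly_rate:.2%}")
print(f"Annualized growth: {annualized:.2%}")
```

Either convention yields an annualized rate in the neighborhood of 2.4%, matching the figure in the text.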

On an annualized basis, the third quarter of 2019 added about the same number of customers as the full calendar year of 2018. However, the cable companies are performing better this year while losses continue to accelerate for the big telcos. The big telco losers for the quarter are Frontier, which lost 2% of its customer base, and AT&T and CenturyLink, which each lost 0.8% of their customer bases. Following are the annualized changes in customers in 2018 and 2019:

                       2018         2019
Cable Companies     2,987,721    3,317,904
Telcos               (472,124)    (895,564)
Total               2,425,597    2,422,640

Both Comcast and Charter had spectacular quarters and continue to account for most of the growth in broadband, as each company added around 380,000 customers for the quarter. It would be interesting to understand what is driving that growth. Some of that comes from providing broadband to new homes. Some comes from customers converting away from DSL. And some comes from expansion – I know of examples where both companies are building new network around the fringes of their service areas.

Auditing the Universal Service Fund

I recently heard FCC Commissioner Geoffrey Starks speak to the Broadband Communities meeting in Alexandria, Virginia. He expressed support for finding broadband solutions and cited several examples of communities that don’t have good broadband access today – both due to lack of connectivity and due to the lack of affordable broadband.

One of his more interesting comments is that he wants the FCC to undertake a ‘data-driven’ analysis of the effectiveness of the Universal Service Fund over the last ten years. He wants to understand where the fund has succeeded and where it has failed. Trying to somehow measure the effectiveness of the USF sounds challenging. I can think of numerous successes and failures of USF funding, but I also know of a lot of situations that I would have a hard time classifying as a success or failure.

Consider some of the challenges of looking backward. Over the last decade, the definition of broadband has changed from 4/1 Mbps to 25/3 Mbps. Any network built with USF funds to support the older speeds looks obsolete and inadequate today. Was using USF funding nine years ago to support speeds that are slow by today's standards a success or a failure?

One of the biggest challenges of undertaking a data-driven analysis is that the FCC didn't gather the needed data over time. For example, the FCC has done only a limited amount of speed testing on the networks built with USF funding. A more rigorous testing regime begins over the next few years, but I think even the new testing won't tell the FCC what it needs to know. For example, the FCC just let the big telcos off the hook by deciding that USF recipients can help choose which customers get tested. The big telcos aren't going to test where they didn't build upgrades or where they know they can't meet the FCC speed requirements.

The FCC will find many successes from USF funding. I’m aware of many rural communities that have gotten fiber that was partially funded by the ACAM program. These communities will have world-class broadband for the rest of this century. But ACAM money was also used in other places to build 25/3 DSL. I’m sure the rural homes that got this DSL are thankful because it’s far better than what they had before. But will they be happy in a decade or two as their copper networks approach being a century old? Are the areas that got the DSL a success or a failure?

Unfortunately, there are obvious failures with USF funding. Many of the failures come from the inadequate mapping that influenced USF funding decisions. Carriers have been denied USF funding for millions of households because the homes were improperly classified as already having broadband when they do not. Commissioner Starks said he was worried about using these same maps for the upcoming RDOF grants – and he should be.

Possibly the biggest failures come from what I call a lack of vision by the FCC. The biggest example is the award of $11 billion in CAF II funding to the big telcos, requiring only 10/1 Mbps speeds at a time when the FCC had already defined broadband as 25/3 Mbps. That program was such a failure that the CAF II areas will be eligible for overbuilding using the RDOF grants almost immediately after the upgrades are slated to be completed. The Universal Service Fund should only support building broadband that meets future speed needs, not today's needs. This FCC is likely to repeat the mistake if it awards the coming RDOF grants for 25/3 Mbps speeds – a speed that's arguably inadequate today and that clearly will be inadequate by the time the RDOF networks are completed seven years from now.

I hope the data-driven analysis asks the right questions. Again, consider CAF II. I think there are huge numbers of homes in the CAF II service areas where the big telcos made no upgrades, or upgraded to speeds far below 10/1 Mbps. I know that some of the big telcos didn’t even spend much of their CAF II funding and pocketed it as revenue. Is the audit going to look deep at such failures and take an honest look at what went wrong?

Commissioner Starks also mentioned the Lifeline program as a failure due to massive fraud. I've followed the Lifeline topic closely for years, and the fraud has been nowhere near the magnitude claimed by some politicians. Much of the blame for the program's problems lies with the FCC, because there was never an easy way for telcos to check whether customers remained eligible for the program. The FCC is in the process of launching such a database – something that should have been done twenty years ago. The real travesty of the Lifeline program is that the big telcos have walked away. For example, AT&T has stopped offering Lifeline in much of its footprint. The FCC has also made it exceedingly difficult for ISPs to join the program, and I know of numerous ISPs that would love to participate.

I try not to be cynical, and I hope an 'audit' isn't just another way to try to kill the Lifeline program but is instead an honest effort to understand what has and hasn't worked in the past. An honest evaluation will lay the blame for many of the fund's problems at the feet of the FCC itself, and ideally that would stop the current FCC from repeating the mistakes of the past.