Categories
Regulation - What is it Good For?

Revisiting the Definition of Broadband

There has been talk for over a year that the new FCC under Chairwoman Jessica Rosenworcel is planning to raise the definition of broadband to 100/20 Mbps. It looks like that probably won't happen until Congress approves a fifth Commissioner.

As much of a welcome relief as that would be, I think we also need to understand that a 100/20 Mbps definition of broadband is not forward-looking and will start being obsolete and too slow from the day it is approved. I’ve always argued that we need a mechanism to change the definition of broadband annually, or at least more often than we have been doing.

Consider a few facts that ought to be part of the discussion of the definition of broadband. The first is that the need for faster speeds has been growing since the 1980s, and there is no reason to think it will stop growing today. If we accept that 25 Mbps download was a decent definition of broadband when it was adopted in 2015 and that 100 Mbps is a decent definition in 2022, then that is an acknowledgment that the public's need for speed has been increasing at roughly 21% annually during those years.

As it turns out, if we look back at history, the demand for broadband speed has been growing at the same pace for a long time. The FCC set the definition of broadband at 200 kbps/200 kbps in 1996 and upped the definition to 4/1 Mbps in 2010. Plot those on a growth curve, and we can see the steady and inexorable growth of broadband demand since the dial-up days. You'd have to be a fool to think that we've reached the end of that growth curve. We're finding new ways to use broadband in our homes every year, and the demand for better broadband keeps growing.

We have other evidence that the public demand for faster broadband continues to grow: the subscriber statistics reported by OpenVault for December 31, 2021.

Purchased Speed Tier     % of Subscribers (Dec 2021)
Under 50 Mbps            9.4%
50 – 99 Mbps             7.6%
100 – 199 Mbps           36.9%
200 – 499 Mbps           28.5%
500 – 999 Mbps           5.5%
1 Gbps                   12.2%

According to OpenVault, only 17% of broadband subscribers are buying broadband products with advertised speeds under 100 Mbps. 46% of all households are buying broadband of 200 Mbps or faster – and that’s going to climb quickly as the big cable companies push faster speeds on all of their customers.

What do these statistics say about using 100 Mbps download as the definition of broadband? First, I think the market has already told the FCC that 100 Mbps is quickly becoming last year’s news. Within a year, when 60% or 70% of the public is buying broadband speeds of at least 200 Mbps, it will be obvious that 100 Mbps broadband is already in the rearview mirror for most Americans.

But we can also go back to the historic growth curve. If the demand for broadband keeps growing at the rate it’s grown since 1996, then the future demand for download speeds will be as follows:

Download Speeds in Megabits / Second

2022 2023 2024 2025 2026 2027 2028
100 121 146 177 214 259 314
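
For anyone who wants to check the arithmetic, here is a minimal Python sketch. It derives the growth rate from the 2015 and 2022 definitions and then reproduces the projection table above (the 21% figure used in the table is the implied rate, rounded down):

```python
# Compound annual growth rate implied by the FCC's past definitions:
# 25 Mbps in 2015 -> 100 Mbps in 2022.
rate = (100 / 25) ** (1 / (2022 - 2015)) - 1
print(f"implied annual growth: {rate:.1%}")  # ~21.9%, rounded to 21% above

# Project the definition forward at a flat 21% per year.
speed = 100.0
for year in range(2022, 2029):
    print(year, round(speed))  # 100, 121, 146, 177, 214, 259, 314
    speed *= 1.21
```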

Hopefully, the FCC doesn't change the definition and then rest on its laurels. Even by the time of the next presidential election the definition ought to be around 150 Mbps, and by 2028 it would be expected to grow to over 300 Mbps.

Unfortunately, the definition of broadband has political and financial overtones. It determines who can win grants. A higher definition of broadband can mean that certain technologies are no longer considered to be broadband. In a perfect world, directed by the public demand for broadband, the definition of broadband would increase every year, along the lines of the table above.

 

Categories
Technology The Industry

Farms Need Broadband Today

I recently saw a presentation by Professor Nicholas Uilk of South Dakota State University. He is the head of the first bachelor's degree program in the country in precision agriculture. That program does just what the name suggests – it teaches budding farmers how to use technology in farming to increase crop yields – and those technologies depend upon broadband.

Precision agriculture is investigating many different aspects of farming. Consider the following:

  • There has been a lot of progress in creating self-driving farm implements. These machines have been tested for a few years, but not many farmers are yet willing to set machines loose in a field without a driver in the cab. But the industry is heading towards the day when driverless farming will be an easily achievable reality.
  • Smart devices have moved past tractors and now include things like automated planters, fertilizer spreaders, manure applicators, lime applicators, and tillage machines.
  • The most data-intensive farming need is the creation of real-time variable-rate maps of fields. Farmers can use smart tractors or drones to measure and map important variables that affect a current crop, like the relative amounts of key nutrients, moisture content, and the amount of organic matter in the soil. This mapping creates massive data files that are sent off-farm. Expert agronomists review the data and prepare a detailed plan to get the best yields from each part of the field. The problem farms have today is getting the data to and from the experts promptly. Without fast broadband, the delay can render the data useless – the crop grows too large for machines to make the suggested changes.
  • Farmers are measuring yields as they harvest so they can record exactly which parts of their fields produced the best results.
  • SDSU is working with manufacturers to develop and test soil sensors that could wirelessly transmit real-time data on pH, soil moisture, soil temperature, and transpiration. These sensors are too expensive today to be practical – but the cost of sensors should drop over time.
  • Research is being done to create low-cost sensors that can measure the health of individual plants.
  • Using sensors for livestock is the most technologically advanced area and there are now dairy farms that measure almost everything imaginable about every milking cow. The sensors for monitoring pigs, chickens, and other food animals are also advanced.
  • The smart farm today measures an immense amount of data on all aspects of running the business. This includes gathering data for non-crop parts of the business such as the performance of vehicles, buildings, and employees. The envisioned future is that sensors will be able to sense a problem in equipment and order a replacement part before a working machine fails.
  • One of the more interesting trends in farming is to record and report on every aspect of the food chain. After the whole country stopped eating romaine last year because of contamination at one farm, the industry started to develop a process where each step in the production of a crop is recorded, with the goal of reporting the history of food to the consumer. In the not-too-distant future, a consumer will be able to scan a package of lettuce and know where the crop was grown, how it was grown (organically, for example), and when it was picked, shipped, and brought to the store. This all requires creating a blockchain with an immutable history of each crop, from farm to store – a minimal sketch of such a record chain follows this list.
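
To make the farm-to-store record idea concrete, here is a minimal, hypothetical Python sketch of a hash-chained crop history. The field names are invented for illustration and don't come from any real produce-tracking system; the point is just that each record embeds the hash of the previous one, so any tampering with history is detectable:

```python
import hashlib
import json

def add_record(chain, data):
    """Append a record whose hash covers the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"data": data, "prev_hash": prev_hash}
    # Hash is computed over the data plus the previous hash, then stored.
    body["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    chain.append(body)

# Hypothetical steps in the life of one lot of romaine.
chain = []
add_record(chain, {"step": "planted", "field": "Field 12", "organic": True})
add_record(chain, {"step": "harvested", "date": "2019-04-02"})
add_record(chain, {"step": "shipped", "carrier": "ReeferCo"})

# Any edit to an earlier record breaks every later hash link.
print(all(chain[i]["prev_hash"] == chain[i - 1]["hash"]
          for i in range(1, len(chain))))  # True
```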

The common thread of all of these developments in precision agriculture is the need for good broadband. Professor Uilk says that transmitting the detailed map scans for crop fields realistically requires 100 Mbps upload to get the files to and from the experts in a timely manner. That means fiber to the farm.
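
The arithmetic behind that claim is easy to sketch. Assuming, purely for illustration, a 50 GB field scan (the presentation doesn't give actual file sizes, only that they are massive), upload times at common rural upload speeds look like this:

```python
# Upload time for a field-map file at various upload speeds.
# The 50 GB file size is an illustrative assumption.
file_gb = 50
file_megabits = file_gb * 8 * 1000  # 400,000 megabits

for technology, mbps in [("rural DSL", 1), ("fixed wireless", 10), ("fiber", 100)]:
    hours = file_megabits / mbps / 3600
    print(f"{technology:>14} at {mbps:>3} Mbps up: {hours:7.1f} hours")

# rural DSL      at   1 Mbps up:   111.1 hours
# fixed wireless at  10 Mbps up:    11.1 hours
# fiber          at 100 Mbps up:     1.1 hours
```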

A lot of the other applications require reliable wireless connections around the farm, and that implies a much better use of rural spectrum. Today the big cellular carriers buy the rights to most spectrum and then let it lie fallow in rural areas. We need to find a way to bring spectrum to the farm to take advantage of measuring sensors everywhere and for directing self-driving farm equipment.

Categories
Regulation - What is it Good For?

Setting the Right Goals for Grants

Most past federal broadband grant programs had very specific goals. For example, the USDA Community Connect grants that have been around for many years target grants to the poorest parts of the country – the awards are weighted towards communities with the highest levels of poverty. For any grant program to be effective the goals of the program need to be clearly defined, and then the award process needs to be aligned with those goals.

The FCC needs to define the goals of the upcoming $20.4 billion grant program. If the goals are poorly defined, then the resulting grant awards are likely to be all over the map in terms of effectiveness. What are the ideal goals for a grant program of this magnitude?

The first goal to be decided is the scope of the coverage – will the goal be to bring somewhat better broadband to as many households as possible, or will it be to bring a long-term broadband solution to a smaller number of households? If the goal is to serve the most households possible, then the grants are going to favor lower-cost technologies and the grants will likely go to the wireless providers and satellite providers – as we saw happen in the recent CAF II reverse auction.

If the grants are aimed at a more permanent solution, then the grants will favor fiber. Perhaps the grants could also go towards anybody willing to extend a hybrid fiber-coaxial cable network into rural areas – but no other technology can be considered a permanent solution.

There are huge consequences for choosing the first option of serving as many households as possible. These new grants are mostly going to be awarded in the geographic areas covered by the original CAF II program. That program awarded over $11 billion to the big telcos to beef up broadband to speeds of at least 10/1 Mbps. Now, before that program is even finished, the FCC is talking about overbuilding those same areas with another $20 billion grant program. If this grant program is used to upgrade homes to fixed wireless, it doesn't take a crystal ball to understand that ten years from now we'll be talking about overbuilding these areas again with fiber. It would be incredibly wasteful to use multiple rounds of grants to upgrade the same geographic areas several times.

The other big issue for these grants to deal with is defining which parts of the country are eligible for the grants. What should be the criteria to decide which homes can be upgraded?

If the test is going to be related to existing speeds, the FCC is going to have to deal with the existing broadband coverage maps that everybody in the industry knows to be badly flawed. The FCC is talking about tackling a new mapping effort – but it’s highly likely that the new maps will just swap old mapping errors for new mapping errors. The reality on the ground is that it’s virtually impossible to map the real speeds on copper or fixed wireless networks. In real life, two rural neighbors can have drastically different speeds due to something as simple as being on different copper pairs. It’s impossible to accurately map DSL or wireless broadband coverage.

To make matters even worse, the current Re-Connect grants are saddled with a rule that says that no more than 10% of grant-covered homes can have existing broadband of more than 10/1 Mbps. Layering that kind of rule on top of terrible maps creates an environment where an ISP is largely unable to define a believable grant footprint.

The FCC must figure out some way to rectify the mapping problem. One of the easiest ways is what I call the technology test – anybody that wants to overbuild copper with fiber should automatically be eligible without trying to figure out the current speeds on the copper. Perhaps the easiest rule could be that any place where there is telco copper and no cable company network should be grant-eligible for fiber overbuilders.

Assuming the grants won’t all go to fiber, then there has to be an alternate way for an ISP or a community to challenge poor maps. Perhaps the FCC needs to provide a realistic time frame to allow local governments to demonstrate the actual speeds in an area, much like what was done in the recent Mobility II grant process.

This blog is part of a series on Designing the Ideal Federal Grant Program.

Categories
The Industry

Going Wireless-only for Broadband

According to New Street Research (NSR), up to 14% of homes in the US could go all-wireless for broadband. They estimate that there are 17 million homes that use little enough bandwidth to justify satisfying their broadband needs strictly with a cellular connection. NSR says that only about 6.6 million homes have elected to go all-wireless today, meaning there is a sizable gap of around 10 million more homes for which wireless might be a reasonable alternative.

The number of households that are going wireless-only has been growing. Surveys by Nielsen and others have shown that the trend to go wireless-only is driven mostly by economics, helped by the ability of many people to satisfy their broadband demands using WiFi at work, school or other public places.

NSR also predicts that the number of homes that can benefit by going wireless-only will continue to shrink. They estimate that only 14 million homes will benefit by going all-wireless within five years – with the decrease due to the growing demand of households for more broadband.

There are factors that make going wireless an attractive alternative for those that don’t use much broadband. Cellular data speeds have been getting faster as cellular carriers continue to implement full 4G technology. The first fully compliant 4G cell site was activated in 2017 and full 4G is now being deployed in many urban locations. As speeds get faster it becomes easier to justify using a cellphone for broadband.

Of course, cellular data speeds need to be put into context. A good 4G connection might be in the range of 15 Mbps. That speed feels glacial when compared to the latest speeds offered by cable companies. Both Comcast and Charter are in the process of increasing data speeds for their basic product to between 100 Mbps and 200 Mbps depending upon the market. Cellphones also tend to have sluggish operating systems that are tailored for video and that can make regular web viewing feel slow and clunky.

Cellular data speeds will continue to improve as we see the slow introduction of 5G into the cellular network. The 5G specification calls for cellular data speeds of 100 Mbps download when 5G is fully implemented. That transition is likely to take another decade, and even when implemented it isn't going to mean fast cellular speeds everywhere. The only way to achieve 100 Mbps speeds is by combining multiple spectrum paths to a given cellphone user, probably from multiple cell sites. Most of the country, including most urban and suburban neighborhoods, is not going to be saturated with small cell sites – the cellular companies are going to deploy faster cellular speeds in areas that justify the expenditure. The major cellular providers have all said that they will be relying on 4G LTE for a long time to come.

One of the factors that is making it easier to go wireless-only is that people have access throughout the day to WiFi, which is powered from landline broadband. Most teenagers would claim that they use their cellphones for data, but most of them have access to WiFi at home and school and at other places they frequent.

The number one factor that drives people to go all-wireless for data is price. Home broadband is expensive by the time you add up all of the fees from a cable company. Since most people in the country already have a cellphone, dropping the home broadband connection is a good way for the budget-conscious to control their expenses.

The wireless carriers are also making it easier to go all wireless by including some level of video programming with some cellular plans. These are known as zero-rating plans that let a customer watch some video for free outside of their data usage plan. T-Mobile has had these plans for a few years and they are now becoming widely available on many cellular plans throughout the industry.

The monthly data caps on most wireless plans are getting larger. For the careful shopper who lives in an urban area there are usually a handful of truly unlimited data plans. Users have learned, though, that many such plans heavily restrict tethering to laptops and other devices. But data caps have crept higher across the board in the industry compared to a few years ago. Users who are willing to pay more for data can now buy the supposedly unlimited data plans from the major carriers that are actually capped at between 20 and 25 GB per month.

There are always other factors to consider like cellular coverage. I happen to live in a hilly wooded town where coverage for all of the carriers varies block by block. There are so many dead spots in my town that it’s challenging to use cellular even for voice calls. I happen to ride Uber a lot and it’s frustrating to see Uber drivers get close to my neighborhood and get lost when they lose their Verizon signal. This city would be a hard place to rely only on a cellphone. Rural America has the same problem and regardless of the coverage maps published by the cellular companies there are still huge areas where rural cellular coverage is spotty or non-existent.

Another factor that makes it harder to go all-wireless is working from home. Cellphones are not always adequate when trying to log onto corporate WANs or for downloading and working on documents, spreadsheets and PowerPoints. While tethering to a computer can solve this problem, it doesn’t take a lot of working from home to surpass the data caps on most cellular plans.

I’ve seen a number of articles in the last few years claiming that the future is wireless and that we eventually won’t need landline broadband. This claim ignores the fact that the amount of data demanded by the average household is doubling every three years. The average home uses ten times or more data on its landline connection today than on its cellphones. It’s hard to foresee cellular networks closing that gap when landline data use keeps growing so rapidly.

Categories
Regulation - What is it Good For?

Winners of the CAF II Auction

The FCC CAF II reverse auction recently closed with an award of $1.488 billion to build broadband in rural America. This funding was awarded to 103 recipients that will collect the money over ten years. The funded projects must be 40% complete by the end of three years and 100% complete by the end of six years. The original money slated for the auction was almost $2 billion, but the reverse auction reduced the amount of awards and some census blocks got no bidders.

The FCC claims that 713,176 rural homes will be getting better broadband, but the real number of homes that benefit from the auction is closer to 523,000, since the auction funded Viasat to provide already-existing satellite broadband to roughly 190,000 homes.

The FCC claims that 19% of the homes covered by the grants will be offered gigabit speeds, 53% will be offered speeds of at least 100 Mbps, and 99.75% will be offered speeds of at least 25 Mbps. These statistics have me scratching my head. The 19% of homes that will be offered gigabit speeds are obviously going to be getting fiber. I know a number of the winners who will be using the funds to help pay for fiber expansion. But I can’t figure out what technology accounts for the rest of the 53% of homes that will supposedly be able to get 100 Mbps speeds.

As I look through the filings, I note that many of the fixed wireless providers claim that they can serve speeds over 100 Mbps. It’s true that fixed wireless can be used to deliver 100 Mbps speeds. But to achieve that speed, customers either need to be close to the tower or a wireless carrier has to dedicate extra resources to that customer – meaning less of the tower’s capacity is available to serve other customers. I’m not aware of any WISPs that offer ubiquitous 100 Mbps speeds, because doing so means serving a relatively small number of customers from a given tower. To be fair to the WISPs, their CAF II filings also say they will be offering slower speeds like 25 Mbps and 50 Mbps. The FCC exaggerated the results of the auction by claiming that any recipient capable of delivering 100 Mbps to a few customers will be delivering it to all customers – something that isn’t true. The fact is that not many of the households beyond the 19% getting fiber will ever buy 100 Mbps broadband. I know the FCC wants to get credit for improving rural broadband, but there is no reason to hype the results to be better than they are.
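
A back-of-the-envelope model illustrates the tradeoff this paragraph describes – the faster the advertised plan, the fewer subscribers one tower sector can carry. All of the numbers below are assumptions for illustration only, not figures from any CAF II filing:

```python
# Rough subscribers-per-sector estimate for fixed wireless at a given
# advertised speed. Both inputs are illustrative assumptions.
sector_capacity_mbps = 500   # shared air-link capacity of one sector
oversubscription = 8         # ISPs commonly sell around 4:1 to 10:1

for plan_mbps in (25, 50, 100):
    subscribers = sector_capacity_mbps * oversubscription // plan_mbps
    print(f"{plan_mbps:>3} Mbps plan: ~{subscribers} subscribers per sector")

#  25 Mbps plan: ~160 subscribers per sector
#  50 Mbps plan: ~80 subscribers per sector
# 100 Mbps plan: ~40 subscribers per sector
```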

I also scratch my head wondering why Viasat was awarded $122 million in the auction. The company won funding for 190,595 households, or 26.7% of the households covered by the entire auction. Satellite broadband is every rural customer’s last choice for broadband. The latency is so poor on satellite broadband that it can’t be used for real-time applications like watching live video, making a Skype call, connecting to school networks to do homework, or connecting to a corporate WAN to work from home. Why does satellite broadband even qualify for CAF II funding? Viasat had to fight to get into the auction, and its entry was opposed by groups like the American Cable Association. The Viasat satellites are already available to all of the households in the awarded footprint, so this seems like a huge government giveaway that won’t bring any new broadband option to those 190,000 homes.

Overall the outcome of the auction was positive. Over 135,000 rural households will be getting fiber. Another 387,000 homes will be getting broadband of at least 25 Mbps, mostly using fixed wireless, with the remaining 190,000 homes getting the same satellite option they already have today.

It’s easy to compare this to the original CAF II program that gave billions to the big telcos and only required speeds of 10/1 Mbps. That original CAF II program was originally intended to be a reverse auction open to anybody, but at the last minute the FCC gave all of the money to the big telcos. One has to imagine there was a huge amount of lobbying done to achieve that giant giveaway.

Most of the areas covered by the first CAF II program had higher household density than this auction pool, and a reverse auction would have attracted a lot of ISPs willing to invest in faster technologies than the telcos. The results of this auction show that most of those millions of homes would have gotten broadband of at least 25 Mbps instead of the beefed-up DSL or cellular broadband they are getting through the big telcos.

Categories
The Industry

Upgrading Broadband Speeds

A few weeks ago Charter increased my home broadband speeds from 60 Mbps to 130 Mbps with no change in price. My upload speed seems to be unchanged at 10 Mbps. Comcast is in the process of speed upgrades and is increasing base speeds to between 100 Mbps and 200 Mbps download speeds in various markets.

I find it interesting that while the FCC is having discussions about keeping the definition of broadband at 25 Mbps, the big cable companies – these two alone have over 55 million broadband customers – are unilaterally increasing broadband speeds.

These companies aren’t doing this out of the goodness of their hearts, but for business reasons. First, I imagine that this is a push to sharpen the contrast with DSL. There are a number of urban markets where customers can buy 50 Mbps DSL from AT&T and others and this upgrade opens up a clear speed difference between cable broadband and DSL.

However, I think the main reason they are increasing speeds is to keep customers happy. This change was done quietly, so I suspect that most people had no idea that the change was coming. I also suspect that most people don’t regularly do speed tests and won’t know about the speed increase – but many of them will notice better performance.

One of the biggest home broadband issues is inadequate WiFi, with out-of-date routers or poor router placement degrading broadband performance. Pushing faster speeds into the house can overcome some of these WiFi issues.

This should be a wake-up call to everybody else in the industry to raise their speeds. There are ISPs and overbuilders all across the country competing against the giant cable companies, and they need to immediately upgrade speeds or lose the public relations battle in the marketplace. Even those who are not competing against these companies need to take heed, because any web search is going to show consumers that 100 Mbps broadband or greater is now the new standard.

These unilateral changes make a mockery of the FCC. It’s ridiculous to be having discussions about setting the definition of broadband at 25 Mbps when the two biggest ISPs in the country have base product speeds 5 to 8 times faster than that. States with broadband grant programs are having the same speed conversation, and this will hopefully alert them that the new goal for broadband needs to be at least 100 Mbps.

These speed increases were inevitable. We’ve known for decades that the home demand for broadband has been doubling every three years. When the FCC first started talking about 25 Mbps as the definition of acceptable broadband, the math said that within six years we’d be having the same discussion about 100 Mbps broadband – and here we are having that discussion.
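
The arithmetic is simple compounding: demand that doubles every three years quadruples in six, and 25 Mbps × 2^(6/3) = 25 × 4 = 100 Mbps.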

The FCC doesn’t want to recognize the speed realities in the world because it is required by law to try to bring rural speeds to par with urban speeds. But this can’t be ignored, because these speed increases are not just for bragging rights. We know that consumers find ways to fill faster data pipes. Just two years ago I saw articles wondering if there was going to be any market for 4K video. Today, it’s the first thing offered to me on both Amazon Prime and Netflix. They shoot all new programming in 4K and offer it at the top of their menus. It’s been reported that at the next CES electronics show there will be several companies pushing commercially available 8K televisions. This technology is going to require a broadband connection between 60 Mbps and 100 Mbps depending upon the level of screen action. People are going to buy these sets and then demand programming to use them – and somebody will create the programming.

8K video is not the end game. Numerous companies are working on virtual presence, where we will finally be able to converse with a hologram of somebody as if they were in the same room. Early versions of this technology, which ought to be available soon, will probably use the same range of bandwidth as 8K video, but I’ve been reading about near-future technologies that would produce realistic holograms and might require as much as a 700 Mbps connection – perhaps the first real need for gigabit broadband.

While improving urban data speeds is great, every increase in urban broadband speeds highlights the poor condition of rural broadband. While urban homes are getting 130 – 200 Mbps for decent prices, there are still millions of homes with either no broadband or with broadband at speeds of 10 Mbps or less. The gap between urban and rural broadband is growing wider every year.

If you’ve been reading this blog you know I don’t say a lot of good things about the big cable companies. But kudos to Comcast and Charter for unilaterally increasing broadband speeds. Their actions speak louder than anything that we can expect out of the FCC.

Categories
Regulation - What is it Good For?

The Definition of Broadband

The FCC recently issued the Notice of Inquiry (NOI) seeking input on next year’s broadband progress report. As usual, and perhaps every year into the future, this annual exercise stirs up the industry as we fight over the regulatory definition of broadband speed. That definition matters because Congress has tasked the FCC with making sure that everybody in the country has access to broadband. Today broadband is defined as 25 Mbps downstream and 3 Mbps upstream; households that can’t buy that speed are considered underserved if they can get some broadband and unserved if they have no broadband options.

The NOI proposes keeping the 25/3 Mbps definition of broadband for another year. The FCC knows that if it raises the definition, millions of homes will suddenly be considered underserved. However, the FCC is bowing to pressure and this year will gather data to see how many households have access to 50/5 Mbps broadband.

It was only a year ago that this FCC set off a firestorm by suggesting a reversion to the old definition of 10/1 Mbps. That change would have instantly classified millions of rural homes as having adequate broadband. The public outcry was immediate, and the FCC dropped the idea. For last year’s report the FCC also considered counting mobile broadband as a substitute for landline broadband – another move that would have reclassified millions into the served category. The FCC is not making that same recommendation this year – but it is gathering data on the number of people who have access to cellular data speeds of 5/1 Mbps and 10/3 Mbps.

The FCC has also been tasked by Congress with getting faster broadband to schools. This year’s NOI recommends keeping the current FCC goal for all schools to immediately have access of 100 Mbps per 1,000 students, with a longer-term goal of 1 Gbps per 1,000 students.

Commissioner Jessica Rosenworcel has suggested in the current NOI that the official definition of broadband be increased to 100 Mbps download. She argues that our low target for defining broadband is why “the United States is not even close to leading the world” in broadband.

I think Commissioner Rosenworcel is on to something. The gap between the fastest and slowest broadband speeds is widening. This year both Comcast and Charter are unilaterally raising broadband speeds to customers. Charter kicked up the speed at my house from 60 Mbps to 130 Mbps a few weeks ago. AT&T is building fiber to millions of customers. Other fiber overbuilders continue to invest in new fiber construction.

The cable companies decided a decade ago that their best strategy was to stay ahead of the speed curve. This is at least the third round of unilateral speed increases that I can remember. A customer who purchased and kept a 20 Mbps connection a decade ago is probably now receiving over 100 Mbps for that same connection. One way to interpret Commissioner Rosenworcel’s suggestion is that the definition of broadband should grow over time to meet the market reality. If Charter and Comcast both think that their 50 million urban customers need speeds of at least 100 Mbps, then that ought to become the definition of broadband.

However, a definition of broadband at 100 Mbps creates a major dilemma for the FCC. The only two widely deployed technologies that can achieve that kind of speed today are fiber and the cable companies’ hybrid fiber/coaxial networks. As I wrote just a few days ago, there are new DSL upgrades available that can deliver up to 300 Mbps at 3,000 to 4,000 feet from a DSL hub – but none of the US telcos are pursuing the technology. Fixed wireless technology can deliver 100 Mbps – but only to customers living close to a wireless tower.

If the FCC were to adopt a definition of broadband at 100 Mbps, they would finally be recognizing that the fixes for rural broadband they have been funding are totally inadequate. They spent billions in the CAF II program to bring rural broadband up to 10/1 Mbps. They are getting ready to give out a few more billion in the CAF II reverse auction, which will do the same, except for a few grant recipients that will use the money to help fund fiber.

By law, the FCC would have to undertake programs to bring rural broadband up to a newly adopted 100 Mbps standard. That would mean finding many billions of dollars somewhere. I don’t see this FCC being bold enough to do that – they seem determined to ignore the issue hoping it will go away.

This issue can only be delayed for a few more years. The country is still on the curve where household demand for broadband doubles every three or so years. As broadband usage in urban homes grows to fill the faster pipes being supplied by the cable companies, it will become more apparent each year that the real definition of broadband is a lot faster than the FCC wants to acknowledge.

Categories
The Industry

The Lack of Broadband Competition

There is one statistic from the FCC annual report on the state of broadband that I’ve been meaning to write about. There is still a massive lack of broadband competition at speeds that most households are coming to think of as broadband.

Here are the key statistics from that report:

  • 13% of all households can’t get broadband that meets the FCC’s definition of 25/3 Mbps
  • 31% of homes have access to 25/3 Mbps, but not speeds of 100 Mbps
  • 15% have access to 100 Mbps from more than one provider
  • 41% have access to 100 Mbps from only one provider

It’s the last statistic that I find astounding. The current FCC declared with this report that the state of broadband in the country is healthy and that the market is taking care of the country’s broadband needs. I’ve written a number of blogs about the households in the bottom 13% that have little or no broadband, but I want to look closer at the top two categories.

Households in the 15% category are in markets where there is a fiber provider in addition to the incumbent cable company. The biggest fiber provider is still Verizon FiOS, but there are numerous others building fiber like AT&T, CenturyLink, Google Fiber, smaller telcos, small fiber overbuilders and municipalities.

This means that 41% of households (51 million homes) only have one option for fast broadband – the cable company. I see numerous problems related to this huge monopoly that has been won by the big cable companies. Consider the following:

  • The US already has some of the most expensive broadband in the developed world. The high prices are directly the result of the lack of competition.
  • This lack of competition is likely the driving factor for why most of the big ISPs in the US are rated at the bottom of all US corporations in terms of customer service. We know that customer service improves in markets where there is broadband competition, but the big ISPs don’t make the same effort elsewhere.
  • We also know that competition between a cable company and a smaller fiber overbuilder lowers broadband prices. For example, there are markets where competitors like Google have set the price of a gigabit connection at $70, and the cable companies generally come close to matching the lower price. But preliminary pricing from Comcast and Charter for their new gigabit products where there are no competitors will be significantly north of $100 per month.
  • Even where there are competing networks, if both networks are owned by large ISPs we see duopoly competition where the big ISPs don’t push each other on price. For example, Comcast is largely able to offer the same prices when competing against Verizon FiOS as it does in markets where there is no fiber provider.
  • Industry analysts expect the big ISPs to start raising broadband rates for various reasons. The ISPs continue to lose telephone and cable customers and the national penetration rate for broadband is nearing a market saturation point. In order to satisfy Wall Street the big ISPs will have little choice other than raising broadband prices to maintain earnings growth.

I’m sure that the households in the bottom 13% of the market that can’t get good broadband are not sympathetic to those who can only buy fast broadband from one provider. But these statistics say that 41% of the whole market is dealing with a monopoly situation for fast broadband. Telecom is supposed to be a competitive business – but for the majority of the country the competitors have never shown up. For the FCC to declare that we have a healthy broadband market astounds me when so many households are hostage to a broadband monopoly.

There is always the chance that over the next decade fixed 5G will bring more broadband competition. My guess, however, is that for at least a few years this is going to be a lot more competition by press release than real competition. Deploying gigabit 5G like the big ISPs are all touting is going to require a lot more fiber than we have in place today. Deploying 5G without fiber backhaul might still result in decent broadband, but it’s not going to be the robust gigabit product that the ISPs are touting. Even poorly deployed 5G networks, though, might bring 100+ Mbps broadband to a lot more homes after the technology gets a little more mature.

Unfortunately, there is also the risk that 5G might just result in a lot more duopoly competition instead of real competition. If 5G is mostly deployed by big ISPs like Verizon and AT&T, there is no reason to think that they will compete on price. Our only hope for real market competition is to see multiple non-traditional ISPs who will compete on price. However, it’s so tempting for ISPs to ride the coattails of the big ISPs in terms of pricing that 5G might bring more of the same high prices rather than real competition.

Categories
The Industry

Google Fiber and the Triple Play

There is some interesting news from Google Fiber lately about new product offerings. It was reported at the end of January that Google is testing a voice product for its fiber customers. And in early February Google announced that it was adding a 100 Mbps data product in the Atlanta roll-out.

News leaked out that Google is experimenting with Fiber Phone with members of its Trusted Tester Program. Google offered phone service to those customers and wrote the following:

With Fiber Phone, you can use the right phone for your needs, whether it’s your mobile device on the go or your landline at home. No more worrying about cell reception or your battery life when you’re home… Spam filtering, call screening and do-not-disturb make sure the right people can get in touch with you at the right time.

Google is installing the needed equipment for test customers and is at the beta stage of testing. There has been no news about possible pricing or about when this might be made available to all customers.

In early February Google announced it is now offering a 100 Mbps data product for $50 to go along with the $70 gigabit offering. In Atlanta the company has eliminated the ‘free’ Internet product where customers paid a one-time fee of $300 and got a 5 Mbps product for 7 years with no additional fees.

With these changes Google is looking more and more like a typical triple-play provider. It’s not hard to understand why they would make these changes. It’s very expensive to build a fiber network, and the best way to pay for it is to get as many high-margin customers as possible onto the network.

As exciting as the $70 gigabit product is, there are a huge number of households that just can’t afford that price. So by adding a $50 product that is still blazingly fast, Google will make its broadband affordable to a lot more people in each market.

There is one interesting market dynamic that Google is probably going to see soon. In looking at the customer penetration rates for many of my client ISPs, I’ve almost always seen that the fastest Internet product (assuming it isn’t priced too high) will get 10% to 15% of the customers in a given market. Given a choice, the rest of the customers will take something slower if it saves them money. This is not true only for fast fiber networks; I’ve seen the same relationship hold for cable companies with HFC networks and for DSL networks. There are only a few markets where a higher percentage of customers buy the premium data product.

If Google goes back and introduces the 100 Mbps product in their older markets they will probably see two things. First, they will add customers who find the $50 price affordable. But they are also going to see gigabit customers downgrade to 100 Mbps to save $20 per month. Overall, I would guess this change will produce a significant net increase in total revenues in Google’s older markets. In Atlanta I predict they will get a lot more 100 Mbps customers than gigabit customers.
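
As a sketch of why the net effect can still be positive even with downgrades, here is a toy calculation. Every number in it is a made-up assumption used only to show the mechanics; none of these figures come from Google:

```python
# Hypothetical market, before and after adding a $50/100 Mbps tier.
before = 3000 * 70                   # 3,000 gigabit subs at $70/month

gig_downgrades = 1500                # half the gig subs drop to $50
new_subs = 2000                      # households drawn in by the $50 price
after = (3000 - gig_downgrades) * 70 + (gig_downgrades + new_subs) * 50

print(f"before: ${before:,}/month, after: ${after:,}/month")
# before: $210,000/month, after: $280,000/month
```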

And Google ought to do okay with voice. My experience is that they will have a hard time selling voice to existing customers but will do okay with new customers as they add them. The FCC reported that landline voice penetration just fell under 50% nationwide – and that is still a lot of potential customers. I see clients still doing surprisingly well with residential voice and extremely well with business voice.

It’s interesting to see that after a few years in the market Google is morphing into a more normal triple-play provider. I’ve expected this from the start, because my take is that a large majority of households still want a double-play or triple-play bundle, and if you want to get a lot of customers you have to provide what customers want to buy. Anybody who expects customers to buy from more than one vendor to get what they want is going to drive away a lot of potential customers.
