The Zero-rating Strategy

The cable companies are increasingly likely to take a page from the cellular carriers by offering zero-rating for video. That’s the practice of providing video content that doesn’t count against monthly data caps.

Zero-rating has been around for a while. T-Mobile first used it in 2014 with its ‘Music Freedom’ plan, which offered streaming music that didn’t count against cellular data caps. This highlights how fast broadband needs have grown in a short time – when data caps were at 1 GB per month, music streaming mattered.

T-Mobile then expanded zero-rating in November 2015 to include access to several popular video services like Netflix and Hulu. Verizon quickly followed with the first ‘for-pay’ zero-rating product, called FreeBee Data, which let content providers pay to zero-rate their traffic to customers. That plan was prominent in the net neutrality discussions since it’s a textbook example of Internet fast lanes using sponsored data, where some video traffic is given preferential treatment over other data.

A few of the largest cable companies have also introduced a form of zero-rating. Comcast started offering what it calls Stream TV in late 2015, a service that allows customers to view video content that doesn’t count against the monthly data cap. This was a pretty big deal at the time because Comcast was then in the process of implementing a 300 GB monthly data cap, and video can easily push households over that small cap. There was huge consumer pushback against the paltry data caps and Comcast quickly reset the cap to 1 terabyte. But the Stream TV plan is still in effect today.

What’s interesting about the Comcast plan is that the company had agreed not to use zero-rating as part of the terms of its 2011 merger with NBC Universal. The company claims that the Stream TV plan is not zero-rating since it uses cable TV bandwidth instead of data bandwidth – but anybody who understands a cable hybrid fiber-coaxial network knows that this argument is sleight-of-hand, since all data uses some portion of the Comcast data connection to customers. The prior FCC started to look into the issue, but the current FCC dropped it when it decided to eliminate net neutrality.

The big cable companies have to be concerned about the pending competition from last-mile 5G. Verizon will begin a slow roll-out of its new 5G technology in four markets in October, and T-Mobile has announced plans to begin offering the technology next year. Verizon has already announced that it will not have data caps, and T-Mobile is also unlikely to impose them.

The pressure will be on the cable companies not to charge for exceeding data caps in competitive markets. They could do this by eliminating data caps or by pushing more video through zero-rating plans. Comcast, though, won’t want to eliminate data caps in markets that are not competitive – the company views data caps as a potential source of revenue. OpenVault says that 2.5% of homes currently exceed 1 TB in monthly data usage, up from 1.5% in 2017 – and within a few years this could be a lucrative source of extra revenue.

Comcast and the other big cable companies are under tremendous pressure to maintain earnings and they are not likely to give up on data caps as a revenue source. They are also likely to pursue sponsored video plans where the video services pay them to provide video outside of data caps.

Zero-rating is the one net neutrality issue where many customers like the practice. Even if net neutrality is imposed again – through something like the California legislation or by a future FCC – it will be interesting to see how firmly regulators are willing to clamp down on a practice the public likes.

More FCC Mapping Woes

The FCC has another new multi-billion dollar grant program, this one aimed at improving rural cellular coverage. Labeled the Mobility Fund II, the program will conduct a reverse auction sometime next year to give $4.53 billion to cellular carriers to extend wireless coverage to the most remote parts of the country. In exchange for the funding, a cellular carrier must bring 4G LTE coverage to the funded areas and achieve cellular download speeds of at least 10 Mbps. Funding will be distributed over 10 years, with buildout requirements that come due sooner than that.

Just like with the CAF II program, the areas eligible for funding are based upon the FCC’s broadband maps using data collected by the existing cellular carriers. As you might expect, the maps show that the parts of the country with the worst coverage – those eligible for funding – are mostly in the mountains and deserts of the west and in Appalachia.

The release of the Mobility Fund II maps instantly set off an uproar as citizens everywhere complained about lack of cellular coverage and politicians from all over the country asked the FCC why there wasn’t more funding coming to their states. The FCC received letters from senators in Mississippi, Missouri, Maine and a number of other states complaining that their states have areas with poor or non-existent cellular coverage that were not covered by the new fund.

If you’ve traveled anywhere in rural America you know that there are big cellular dead spots everywhere. I’ve been to dozens of rural counties all across America in the last few years and every one of them has areas without good cellular coverage. Everybody living in rural America can point to places where cellphones don’t work.

The issue boils down to the FCC mapping used to define cellular and broadband coverage. The maps for this program were compiled from a one-time data request to the cellular carriers asking for existing 4G coverage. It’s obvious from the protests that the carriers claimed cellular coverage where it doesn’t exist.

In August, the Rural Wireless Association (RWA) – the association of smaller wireless companies (they still exist!) – filed a complaint with the FCC claiming that Verizon lied about its cellular coverage by claiming coverage in many areas that don’t have it. The RWA says that Verizon’s exaggerated coverage claims will block funding to many areas that should be eligible.

The Mobility Fund II program allows carriers to challenge the FCC’s maps by conducting tests to identify areas that don’t have good cellular coverage. The smaller carriers in the RWA have been filing these challenges and the FCC just added 90 additional days for the challenge process. Those challenges will surely add new eligible coverage areas for this program.

But the challenge program isn’t going to uncover many of these areas, because there are large parts of the country that are not close to an RWA carrier and won’t be challenged. People with no cellular coverage who are not covered by this grant program might never get good cellular coverage – something that’s scary as the big telcos plan to tear down copper in rural America.

The extent of the challenges against the Verizon data is good evidence that Verizon overstated its 4G LTE coverage. The RWA members I know think Verizon did this purposefully, either to block others from expanding cellular networks into areas already served by Verizon or perhaps to direct more of this new fund to areas where Verizon might more easily claim some of the $4.53 billion.

To give Verizon a tiny amount of credit, mapping cellular coverage is hard. If you’ve ever seen a coverage map from a single cell tower you’ll instantly notice that it looks like a many-armed starfish. There are parts of the coverage area where good signal extends outward for many miles, but there are other areas where the signal is blocked by a hill or other impediments. You can’t draw circles on a map around a cell tower to show coverage, because coverage only works that way on the Bonneville Salt Flats. There can be dead spots even close to the cell tower.

The FCC fund is laudable in that it’s trying to bring cellular coverage to those areas that clearly don’t have it. But there are countless other holes in cellular coverage that cannot be solved with this kind of fund, and people living in the many smaller cellular holes won’t get any relief from this kind of funding mechanism. Oddly, this fund will bring cellular coverage to areas where almost nobody lives while not addressing cellular holes in more populated areas.

Verizon’s Residential 5G Broadband

We finally got a look at the details of Verizon’s 5G residential wireless product. Verizon has announced that it will be available to some customers in Houston, Indianapolis, Los Angeles and Sacramento starting on October 1.

Verizon promises average download speeds of around 300 Mbps. The company has been touting a gigabit wireless product for the last year, but the realities of wireless in the wild seem to have made that unrealistic. Still, 300 Mbps is a competitive broadband product, and in many markets Verizon will become the fastest competitor to the cable companies. As we’ve seen across the country, a decent competitor to the big cable companies is almost assured of a 20% or higher market penetration just for showing up.

The product will be $50 per month for customers who use Verizon wireless and $70 for those who don’t. These prices will supposedly include all taxes, fees and equipment – although it’s possible that there are add-ons like using a Verizon WiFi router. That pricing is going to be attractive to anybody who already has Verizon cellular – and I’m sure the company hopes to use this to attract more cellular customers. This is the kind of bundle that can make cellular stickier and is exactly what Comcast and Charter have in mind as they also offer cellular. Verizon is also offering marketing inducements for the roll-out: three months of free YouTube TV, or else a free Apple TV 4K or Google Chromecast Ultra.

Theoretically this should set off a bit of a price war in cities where Comcast and Charter are the incumbent cable providers. It wouldn’t be hard for those companies to meet or beat the Verizon offer since they are already selling cellular at a discount. We’re going to get a fresh look at oligopoly competition – will the cable companies really battle it out? The cable companies have to be worried about losing significant market share in major urban markets.

We’re also going to have to wait a while to see the extent of the Verizon coverage areas. I’ve been speculating about this for a while and I suspect that Verizon will continue its history of being conservative and disciplined. It will deploy 5G where there is fiber that can affordably support it – but it is unlikely to undertake any expensive fiber builds just for this product. Verizon’s recently announced ‘One Fiber’ policy says just that – the company wants to capitalize on the huge amount of network it has already constructed for other purposes. This means it’s likely that in any given market coverage will depend upon a customer’s proximity to Verizon fiber.

There is one twist to this deployment that suggests Verizon might not be in a hurry. The company has been working with Ericsson, Qualcomm, Intel and Samsung to create proprietary equipment based upon the 5GTF standard. But the rest of the industry has adopted the 3GPP standard for 5G, and Verizon admits it will have to replace any equipment installed using its current standard.

Verizon also said over the last year that it wanted the product to be self-installed by customers. At least for now the installations are going to require a truck roll, which will add to the cost and slow the pace of deployment of the new technology.

Interestingly, these first markets are outside of Verizon’s telco footprint. This means that Verizon will not only be taking on cable companies, but might also be putting the final nail in the coffin of DSL offered by AT&T and other telcos in the new markets. Verizon is unlikely to roll this out to compete with its own FiOS product unless deployments are incredibly inexpensive. But this might finally bring a Verizon broadband product to neighborhoods in the northeast that never got FiOS.

It’s going to be a while until we understand the costs of this deployment. Verizon has been mum about the specific network elements and the reliance on fiber needed to support the product. And it has been even quieter about the all-in cost of deployment.

Cities all over the country are going to get excited about this deployment in the hope of getting a second competitor to their cable company, which is often a near-monopoly. It appears that the product is going to work best where there is already a fiber-rich environment. Most urban areas, while having little last-mile fiber, are crisscrossed with fiber used to reach large businesses, governments, schools, etc.

The same is not necessarily true in the suburbs, and it’s definitely not true of smaller communities and rural America. The technology depends upon local last-mile fiber for backhaul. Verizon says it believes its potential market will eventually pass 30 million households, or a little less than 25% of the US market. I’d have to think that the maps for other providers, except perhaps AT&T, largely coincide with Verizon’s. It seems that Verizon wants to be first to market to potentially dissuade other entrants. We’ll have to wait and see if a market can reasonably support more than one last-mile 5G provider – because companies like T-Mobile also have plans for wide deployment.

Massive MIMO

One of the technologies that will bolster 5G cellular is the use of massive MIMO (multiple-input, multiple-output) antenna arrays. Massive MIMO is an extension of the smaller MIMO antennas that have been in use for several years. For example, home WiFi routers now routinely use multiple antennas to allow for easier connections to multiple devices. Basic forms of the MIMO technology have been deployed in LTE cell sites for several years.

Massive MIMO differs from current technology in its use of big arrays of antennas. For example, Sprint, along with Nokia, demonstrated a massive MIMO transmitter in 2017 that used 128 antennas – 64 for receive and 64 for transmit. Sprint is in the process of deploying a much smaller array in cell sites using its 2.5 GHz spectrum.

Massive MIMO can be used in two different ways. First, multiple transmitter antennas can be focused together to reach a single customer (who also needs multiple receivers) to increase throughput. In the trial mentioned above, Sprint and Nokia were able to achieve a 300 Mbps connection to a beefed-up cellphone. That’s a lot more bandwidth than can be achieved from one transmitter, which at most could deliver whatever bandwidth is possible on the channel of spectrum being used.

The extra bandwidth is achieved in two ways. First, using multiple transmitters means that multiple channels of the same frequency can be sent simultaneously to the same receiving device. Both the transmitter and receiver must have sophisticated computing power to coordinate and combine the multiple signals.

The bandwidth is also boosted by what’s called precoding or beamforming. This technology coordinates the signals from multiple transmitters to maximize the received signal gain and to reduce what is called the multipath fading effect. In simple terms, the beamforming technology sets the power level and gain for each separate antenna to maximize the data throughput. Every frequency and channel operates a little differently, and beamforming favors the channels and frequencies with the best operating characteristics in a given environment. Beamforming also allows the cellular signal to be concentrated in a portion of the receiving area – to create a ‘beam’. This is not the same kind of highly concentrated beam used in microwave transmitters, but concentrating the radio signals into the general area of the customer means a more efficient delivery of data packets.
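
To make the precoding idea concrete, here’s a toy numpy sketch of its simplest form – conjugate (matched-filter) beamforming – where each antenna’s signal is phase-aligned so everything adds up at one receiver. The 64-antenna array and the random fading channel are illustrative assumptions, not any carrier’s actual implementation:

```python
# Toy conjugate beamforming: weight each transmit antenna by the conjugate
# of its channel so all signals arrive in phase at the receiver.
import numpy as np

rng = np.random.default_rng(seed=1)
n_antennas = 64   # illustrative array size

# Complex channel gains from each antenna to one user (Rayleigh fading model).
h = (rng.standard_normal(n_antennas) + 1j * rng.standard_normal(n_antennas)) / np.sqrt(2)

# Unit-power precoding vector: the conjugate of the channel, normalized.
w = h.conj() / np.linalg.norm(h)

# Received power with beamforming vs. transmitting from a single antenna.
beamformed = abs(np.dot(h, w)) ** 2    # ~64x the average single-antenna power
single = abs(h[0]) ** 2

print(f"beamformed power: {beamformed:.1f}, single antenna: {single:.2f}")
```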

The cellular companies, though, are focused on the second use of MIMO – the ability to connect to more devices simultaneously. One of the key parameters of the 5G cellular specification is the ability of a cell site to make up to 100,000 simultaneous connections. The carriers envision 5G as the platform for the Internet of Things and want to use cellular bandwidth to connect to the many sensors envisioned in our near-future world. This first generation of massive MIMO won’t bump cell sites to 100,000 connections, but it’s a first step toward increasing the number of connections.

Massive MIMO is also going to facilitate the coordination of signals from multiple cell sites. Today’s cellular networks are based upon a roaming architecture. That means that a cellphone or any other device that wants a cellular connection will grab the strongest available cellular signal. That’s normally the closest cell site but could be a more distant one if the nearest site is busy. With roaming a cellular connection is handed from one cell site to the next for a customer that is moving through cellular coverage areas.

One of the key aspects of 5G is that it will allow multiple cell sites to connect to a single customer when necessary. That might mean combining the signal from a MIMO antenna in two neighboring cell sites. In most places this is not particularly useful today since cell sites tend to be fairly far apart. But as we migrate to smaller cells the chance of a customer being in range of multiple cell sites increases. Combining cell sites could be useful when a customer wants a big burst of data, and coordinating the MIMO signals between neighboring cell sites can temporarily give a customer the extra needed bandwidth. That kind of coordination will require sophisticated operating systems at cell sites and is certainly an area that the cellular manufacturers are now working on in their labs.

More Crowding in the OTT Market

It seems like I’ve been seeing news almost weekly about new online video providers. This will put even more pressure on cable companies as more people find an online programming option that suits them. It also means a likely shakeout of the OTT industry, with such a crowded field of competitors all vying for the same pool of cord-cutters.

NewTV. This is an interesting new OTT venture founded by Jeffrey Katzenberg, former chairman of Walt Disney Studios, and headed by Meg Whitman, former CEO of Hewlett Packard Enterprise and also a Disney alumna. The company has raised $1 billion and has support from every major Hollywood studio including 21st Century Fox, Disney, NBCUniversal, Sony Pictures Entertainment, and Viacom.

Rather than take on Netflix and other OTT providers directly, the company plans to develop short 10-minute shows aimed exclusively at cellphone users. It plans both free content supported by advertising and a subscription plan that would use the ‘advertising-light’ option used by Hulu.

AT&T. The company already owns a successful OTT product in HBO Now, which has over 5 million customers. John Stankey, the head of WarnerMedia, says the plan is to create additional bundles of content centered around HBO that bring in other WarnerMedia content and selected external content. He admits that HBO alone does not represent enough content to be a full-scale OTT alternative for customers.

AT&T’s goal is to take advantage of HBO’s current reputation and to position their content in the market as premium and high quality as a way to differentiate themselves from other OTT providers.

Apple. The company has been talking about getting into the content business for a decade and has finally pulled the trigger. It invested $1 billion this year and now has 24 original series in production as the beginning of a new content platform. Among the new shows is a series about a morning TV show starring Reese Witherspoon and Jennifer Aniston.

The company hired Jamie Erlicht and Zack Van Amburg from Sony Pictures Television to operate the new business and has since hired other experienced television executives. They also are working on other new content and just signed a multiyear deal with Oprah Winfrey. The company has not announced any specific plans for airing and using the new content, but that will be coming soon since the first new series will probably be ready by March of 2019.

T-Mobile. As part of the proposed merger with Sprint, T-Mobile says it plans to launch a new ‘wireless first’ TV platform that will deliver 4K video using its cellular network. In January T-Mobile purchased Layer3, which has been offering a 275-channel HD line-up in a few major markets.

The T-Mobile offering will differ from other OTT products in that the company is shooting for what it calls the quad play: a bundle of video, in-home broadband (delivered using cellular frequencies), mobile broadband and voice. The company says the content will only be made available to T-Mobile customers, and it views the offering as a way to reduce churn and gain cellular market share.

The Layer3 subsidiary will also continue to pursue partnerships to gain access to customers through fiber networks, such as the arrangement it currently has with the municipal fiber network in Longmont, Colorado.

Disney. Earlier this year the company announced the creation of a direct-to-consumer video service based upon the company’s huge library of popular content. Disney gained the needed technology by purchasing BAMTech, the company that supports Major League Baseball online. Disney also is bolstering its content portfolio through the purchase of Twenty-First Century Fox.

Disney plans to launch an ESPN-based sports bundle in early 2019. They have not announced specific plans on how and when to launch the rest of their content, but they canceled an agreement with Netflix for carrying Disney content.

FCC Speed Tests for ISPs

ISPs awarded CAF II funding in the recent auction need to be aware that they will be subject to compliance testing for both latency and speed on their new broadband networks, with financial penalties for those that fail the tests. The FCC revised the testing standards in July in Docket DA 18-710. The new standards become effective with testing that starts in the third quarter of 2019, and they will replace the standards already in place for ISPs that received funding from earlier rounds of the CAF program, as well as for ISPs getting A-CAM or other rate-of-return USF funding.

ISPs can choose between three testing methods. First, they may elect what the FCC calls the MBA program, which uses an external vendor, approved by the FCC, to perform the testing – the same firm that has been testing speeds on the large telcos’ networks for many years. ISPs can instead use existing network tools, if they are built into the customer CPE, that allow test pinging and other testing methodologies. Finally, an ISP can install ‘white boxes’ that provide the ability to perform the tests.

The households to be tested are chosen at random by the ISP every two years. The FCC doesn’t describe a specific method for ensuring that the selections are truly random, but the ISP must describe to the FCC how this is done. It wouldn’t be hard for an ISP to fudge the results of the testing by making sure that customers from slow parts of the network are not in the testing sample.

The number of tests to be conducted varies with the number of customers for which a recipient is getting CAF support: if the number of CAF households is 50 or fewer, the ISP must test 5 customers; for 51 to 500 CAF households it must test 10% of them; for more than 500 CAF households it must test 50. ISPs that declare a high latency must test more locations, up to a maximum of 370.
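
Expressed as code, the sampling tiers work out as below. This is just my reading of the tiers described above – the exact rounding at the boundaries is an assumption:

```python
def required_test_locations(caf_households: int) -> int:
    """Number of locations an ISP must test, per the tiers described above."""
    if caf_households <= 50:
        return 5
    if caf_households <= 500:
        return round(caf_households * 0.10)   # 10% of CAF households
    return 50                                  # more than 500 CAF households

# e.g. required_test_locations(40) -> 5, (300) -> 30, (2000) -> 50
```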

ISPs must conduct the tests for a solid week, including the weekend, in every quarter to account for seasonality. Tests must be conducted in the evenings between 6:00 PM and midnight. Latency tests must be done every minute during the six-hour testing window. Speed tests – run separately for upload and download – must be done once per hour during the six-hour window.
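
To put that cadence in perspective, here’s my quick arithmetic on what a single week-long quarterly window produces per tested location, assuming the full 6:00 PM to midnight window every day:

```python
# Tests generated per location in one week-long quarterly test window.
hours_per_evening = 6                            # 6:00 PM to midnight
days = 7                                         # a solid week, weekend included

latency_tests = 60 * hours_per_evening * days    # one per minute -> 2,520 tests
speed_tests = hours_per_evening * days           # one per hour -> 42 up + 42 down

print(latency_tests, speed_tests)
```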

The FCC has set expected standards for the speed tests. These standards are based upon the required speeds of a specific program – such as the first CAF II program that required speeds of at least 10/1 Mbps. In the latest CAF program the testing will be based upon the speeds the ISP declared it could meet when entering the auction – speeds that can be as fast as 1 Gbps.

ISPs are expected to meet the latency standards 95% of the time. Speed tests must achieve 80% of the expected upload and download speed 80% of the time. This might surprise people living in the original CAF II areas, because it means the big telcos only need to achieve download speeds of 8 Mbps for 80% of customers to meet the CAF standard. The 10/1 Mbps standard was low enough already, and this lets the ISPs off the hook for underperforming even against that incredibly slow speed. The requirement also means that an ISP guaranteeing gigabit download speeds needs to achieve 800 Mbps 80% of the time. ISPs that meet the speeds and latencies for 100% of customers are excused from quarterly testing and only have to test once per year.
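
A hypothetical compliance check makes the 80/80 speed rule concrete. The function and its inputs are my own illustration, not the FCC’s actual methodology:

```python
def meets_80_80(measured_mbps, committed_mbps):
    """True if at least 80% of speed tests reach 80% of the committed speed."""
    threshold = 0.8 * committed_mbps                 # e.g. 800 Mbps for a gigabit tier
    passing = sum(speed >= threshold for speed in measured_mbps)
    return passing >= 0.8 * len(measured_mbps)

# A gigabit ISP whose tests mostly land above 800 Mbps passes;
# one that routinely measures below 800 Mbps does not.
```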

There are financial penalties for ISPs that don’t meet these tests (see the sketch in code after this list).

  • ISPs that have between 85% and 100% of households that meet the test standards lose 5% of their FCC support.
  • ISPs that have between 70% and 85% of households that meet the test standards lose 10% of their FCC support.
  • ISPs that have between 55% and 70% of households that meet the test standards lose 15% of their FCC support.
  • ISPs with less than 55% of compliant households lose 25% of their support.
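
Put as code, the withholding tiers look roughly like this – a sketch of the list above, with the handling of the exact boundary percentages being my assumption:

```python
def support_withheld(pct_compliant: float) -> float:
    """Fraction of FCC support withheld for a given share of compliant households."""
    if pct_compliant >= 100:
        return 0.00    # fully compliant ISPs are not docked
    if pct_compliant >= 85:
        return 0.05
    if pct_compliant >= 70:
        return 0.10
    if pct_compliant >= 55:
        return 0.15
    return 0.25

# e.g. support_withheld(90) -> 0.05, support_withheld(50) -> 0.25
```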

For CAF II auction winners these reductions in funding would only be applied to the time periods remaining after they fail the tests. This particular auction covers a 10-year period, and the testing will start once the new networks are operational, which is required to happen between years 3 and 6 after funding.

This will have the biggest impact on ISPs that overstated their network capability. For instance, there were numerous ISPs that claimed the ability in the CAF auction to deliver 100 Mbps and they are going to lose 25% of the funding if they deliver speeds slower than 80 Mbps.

The Continued Growth of Data Traffic

Every one of my clients continues to see explosive growth of data traffic on their broadband networks. For several years I’ve been citing a long-standing Cisco statistic that household data use has doubled every three years since 1980. In its last Visual Networking Index, published in 2017, Cisco predicted a slight slowdown in data growth, with usage now doubling about every 3.5 years.

I searched the web for other predictions of data growth and found a report published by Seagate, also in 2017, titled Data Age 2025: The Evolution of Data to Life-Critical. This report was authored for Seagate by the consulting firm IDC.

The IDC report predicts that annual worldwide web data will grow from the 16 zettabytes of data used in 2016 to 163 zettabytes in 2025 – a tenfold increase in nine years. A zettabyte is a mind-numbingly large number that equals a trillion gigabytes. That increase means an annual compounded growth rate of 29.5%, which more than doubles web traffic every three years.
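
That math is easy to check with a quick back-of-the-envelope calculation of the IDC figures:

```python
# Back-of-the-envelope check of the IDC growth figures cited above.
start_zb, end_zb, years = 16, 163, 9            # 16 ZB in 2016 -> 163 ZB in 2025

cagr = (end_zb / start_zb) ** (1 / years) - 1   # ~29.4% compound annual growth
three_year_multiple = (1 + cagr) ** 3           # ~2.17x, i.e. more than doubling

print(f"CAGR: {cagr:.1%}, three-year growth: {three_year_multiple:.2f}x")
```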

The most recent burst of overall data growth has come from the migration of video online. IDC expects online video to keep growing rapidly, but also foresees a number of other web uses that are going to increase data traffic by 2025. These include:

  • The continued evolution of data from business background to “life-critical”. IDC predicts that as much as 20% of all future data will be life-critical, meaning it will directly impact our daily lives, with nearly half of that data being hypercritical. As an example, they note that a computer crash today might cause us to lose a spreadsheet, but the data used to communicate with a self-driving car must be delivered accurately. They believe that the software needed to ensure such accuracy will vastly increase the volume of traffic on the web.
  • The proliferation of embedded systems and the IoT. Today most IoT devices generate tiny amounts of data. The big growth in IoT data will not come directly from the IoT devices and sensors in the world, but from the background systems that interpret this data and make it instantly usable.
  • The increasing use of mobile and real-time data. Again, using the self-driving car as an example, IDC predicts that more than 25% of data will be required in real-time, and the systems necessary to deliver real-time data will explode usage on networks.
  • Data usage from cognitive computing and artificial intelligence systems. IDC predicts that data generated by cognitive systems – machine learning, natural language processing and artificial intelligence – will generate more than 5 zettabytes by 2025.
  • Security systems. As we have more critical data being transmitted, the security systems needed to protect the data will generate big volumes of additional web traffic.

Interestingly, this predicted growth all comes from machine-to-machine communications that result from us moving more daily functions onto the web. Computers will be working in the background, exchanging and interpreting data to support activities such as traveling in a self-driving car or chatting with somebody in another country through a real-time interpreter. We are already seeing the beginning stages of numerous technologies that will require big real-time data.

Data growth of this magnitude is going to require our data networks to grow in capacity. I don’t know of any client network that is ready to handle a ten-fold increase in data traffic, and carriers will have to beef up backbone networks significantly over time. I have often seen clients invest in new backbone electronics that they hoped to be good for a decade, only to find the upgraded networks swamped within only a few years. It’s hard for network engineers and CEOs to fully grasp the impact of continued rapid data growth on our networks and it’s more common than not to underestimate future traffic growth.

This kind of data growth will also increase the pressure for faster end-user data speeds and more robust last-mile networks. If a rural 10 Mbps DSL line feels slow today, imagine how slow it will feel when urban connections are far faster than today’s. If the trends IDC foresees hold true, by 2025 there will be many homes needing and using gigabit connections. It’s common, even in the industry, to scoff at the usefulness of residential gigabit connections, but when household data needs keep doubling it’s inevitable that we will need gigabit speeds and beyond.

Going Wireless-only for Broadband

According to New Street Research (NSR), up to 14% of homes in the US could go all-wireless for broadband. They estimate that there are 17 million homes whose bandwidth needs are small enough to be satisfied strictly with a cellular connection. NSR says that only about 6.6 million homes have elected to go all-wireless today, meaning there is a sizable gap of around 10 million more homes for which wireless might be a reasonable alternative.

The number of households that are going wireless-only has been growing. Surveys by Nielsen and others have shown that the trend to go wireless-only is driven mostly by economics, helped by the ability of many people to satisfy their broadband demands using WiFi at work, school or other public places.

NSR also predicts that the number of homes that can benefit by going wireless-only will continue to shrink. They estimate that only 14 million homes will benefit by going all-wireless within five years – with the decrease due to the growing demand of households for more broadband.

There are factors that make going wireless an attractive alternative for those that don’t use much broadband. Cellular data speeds have been getting faster as cellular carriers continue to implement full 4G technology. The first fully compliant 4G cell site was activated in 2017 and full 4G is now being deployed in many urban locations. As speeds get faster it becomes easier to justify using a cellphone for broadband.

Of course, cellular data speeds need to be put into context. A good 4G connection might be in the range of 15 Mbps. That speed feels glacial when compared to the latest speeds offered by cable companies. Both Comcast and Charter are in the process of increasing data speeds for their basic product to between 100 Mbps and 200 Mbps depending upon the market. Cellphones also tend to have sluggish operating systems that are tailored for video and that can make regular web viewing feel slow and clunky.

Cellular data speeds will continue to improve as we see the slow introduction of 5G into the cellular network. The 5G specification calls for cellular download speeds of 100 Mbps when 5G is fully implemented. That transition is likely to take another decade, and even then it isn’t going to mean fast cellular speeds everywhere. The only way to achieve 100 Mbps speeds is by combining multiple spectrum paths to a given cellphone user, probably from multiple cell sites. Most of the country, including most urban and suburban neighborhoods, is not going to be saturated with small cell sites – the cellular companies are going to deploy faster cellular speeds only in areas that justify the expenditure. The major cellular providers have all said that they will be relying on 4G LTE for a long time to come.

One of the factors that is making it easier to go wireless-only is that people have access throughout the day to WiFi, which is powered from landline broadband. Most teenagers would claim that they use their cellphones for data, but most of them have access to WiFi at home and school and at other places they frequent.

The number one factor that drives people to go all-wireless for data is price. Home broadband is expensive by the time you add up all of the fees from a cable company. Since most people in the country already have a cellphone, dropping the home broadband connection is a good way for the budget-conscious to control their expenses.

The wireless carriers are also making it easier to go all-wireless by including some level of video programming with some cellular plans. These are known as zero-rating plans that let a customer watch some video for free, outside of their data usage plan. T-Mobile has had these plans for a few years and they are now becoming widely available on many cellular plans throughout the industry.

The monthly data caps on most wireless plans are getting larger. For the careful shopper who lives in an urban area there are usually a handful of truly unlimited data plans, though users have learned that many such plans heavily restrict tethering to laptops and other devices. Data caps have crept higher across the board in the industry compared to a few years ago, and users who are willing to pay more can now buy the supposedly unlimited plans from the major carriers that are actually capped at between 20 and 25 GB per month.

There are always other factors to consider like cellular coverage. I happen to live in a hilly wooded town where coverage for all of the carriers varies block by block. There are so many dead spots in my town that it’s challenging to use cellular even for voice calls. I happen to ride Uber a lot and it’s frustrating to see Uber drivers get close to my neighborhood and get lost when they lose their Verizon signal. This city would be a hard place to rely only on a cellphone. Rural America has the same problem and regardless of the coverage maps published by the cellular companies there are still huge areas where rural cellular coverage is spotty or non-existent.

Another factor that makes it harder to go all-wireless is working from home. Cellphones are not always adequate when trying to log onto corporate WANs or for downloading and working on documents, spreadsheets and PowerPoints. While tethering to a computer can solve this problem, it doesn’t take a lot of working from home to surpass the data caps on most cellular plans.

I’ve seen a number of articles in the last few years claiming that the future is wireless and that we eventually won’t need landline broadband. This claim ignores the fact that the amount of data demanded by the average household is doubling every three years. The average home uses ten times or more data on its landline connection today than on its cellphones. It’s hard to foresee cellular networks closing that gap when landline data use keeps growing so rapidly.

Winners of the CAF II Auction

The FCC CAF II reverse auction recently closed with an award of $1.488 billion to build broadband in rural America. This funding was awarded to 103 recipients that will collect the money over ten years. The funded projects must be 40% complete by the end of three years and 100% complete by the end of six years. The original money slated for the auction was almost $2 billion, but the reverse auction reduced the amount of awards and some census blocks got no bidders.

The FCC claims that 713,176 rural homes will be getting better broadband, but the real number of homes that benefit from the auction is closer to 523,000, since the auction funded Viasat to provide already-existing satellite broadband to 190,000 homes.

The FCC claims that 19% of the homes covered by the grants will be offered gigabit speeds, 53% will be offered speeds of at least 100 Mbps and 99.75% will be offered speeds of at least 25 Mbps. These statistics have me scratching my head. The 19% of homes that will be offered gigabit speeds are obviously getting fiber – I know a number of the winners who will be using the funds to help pay for fiber expansion. But I can’t figure out what technology accounts for the rest of the 53% of homes that will supposedly be able to get 100 Mbps speeds.

As I look through the filings I note that many of the fixed wireless providers claim they can deliver speeds over 100 Mbps. It’s true that fixed wireless can deliver 100 Mbps, but to achieve that speed customers either need to be close to the tower or else the wireless carrier has to dedicate extra resources to them – meaning less of the tower’s capacity can be used to serve other customers. I’m not aware of any WISPs that offer ubiquitous 100 Mbps speeds, because doing so means serving a relatively small number of customers from a given tower. To be fair to the WISPs, their CAF II filings also say they will be offering slower speeds like 25 Mbps and 50 Mbps. The FCC exaggerated the results of the auction by assuming that any recipient capable of delivering 100 Mbps to a few customers will be delivering it to all customers – something that isn’t true. The fact is that not many of the households beyond the 19% getting fiber will ever buy 100 Mbps broadband. I know the FCC wants credit for improving rural broadband, but there is no reason to hype the results to be better than they are.

I also scratch my head wondering why Viasat was awarded $122 million in the auction. The company is the winner of funding for 190,595 households, or 26.7% of the households covered by the entire auction. Satellite broadband is every rural customer’s last choice for broadband. The latency is so poor on satellite broadband that it can’t be used for any real time applications like watching live video, making a Skype call, connecting to school networks to do homework or for connecting to a corporate WAN to work from home. Why does satellite broadband even qualify for the CAF II funding? Viasat had to fight to get into the auction and their entry was opposed by groups like the American Cable Association. The Viasat satellites are already available to all of the households in the awarded footprint, so this seems like a huge government giveaway that won’t bring any new broadband option to the 190,000 homes.

Overall the outcome of the auction was positive. Over 135,000 rural households will be getting fiber. Another 387,000 homes will be getting broadband of at least 25 Mbps, mostly using fixed wireless, with the remaining 190,000 homes getting the same satellite option they already have today.

It’s easy to compare this to the original CAF II program that gave billions to the big telcos and only required speeds of 10/1 Mbps. That original CAF II program was originally intended to be a reverse auction open to anybody, but at the last minute the FCC gave all of the money to the big telcos. One has to imagine there was a huge amount of lobbying done to achieve that giant giveaway.

Most of the areas covered by the first CAF II program had higher household density than this auction pool, and a reverse auction would have attracted a lot of ISPs willing to invest in faster technologies than the telcos. The results of this auction show that most of those millions of homes would have gotten broadband of at least 25 Mbps instead of the beefed-up DSL or cellular broadband they are getting through the big telcos.

Upgrading Broadband Speeds

A few weeks ago Charter increased my home broadband speeds from 60 Mbps to 130 Mbps with no change in price. My upload speed seems to be unchanged at 10 Mbps. Comcast is in the process of speed upgrades and is increasing base speeds to between 100 Mbps and 200 Mbps download speeds in various markets.

I find it interesting that while the FCC is still debating whether to keep the definition of broadband at 25 Mbps, the big cable companies – these two alone have over 55 million broadband customers – are unilaterally increasing broadband speeds.

These companies aren’t doing this out of the goodness of their hearts, but for business reasons. First, I imagine that this is a push to sharpen the contrast with DSL. There are a number of urban markets where customers can buy 50 Mbps DSL from AT&T and others and this upgrade opens up a clear speed difference between cable broadband and DSL.

However, I think the main reason they are increasing speeds is to keep customers happy. This change was done quietly, so I suspect that most people had no idea that the change was coming. I also suspect that most people don’t regularly do speed tests and won’t know about the speed increase – but many of them will notice better performance.

One of the biggest home broadband issues is inadequate WiFi, with out-of-date routers or poor router placement degrading broadband performance. Pushing faster speeds into the house can overcome some of these WiFi issues.

This should be a wake-up call to everybody else in the industry to raise their speeds. There are ISPs and overbuilders all across the country competing against the giant cable companies and they need to upgrade speeds immediately or lose the public relations battle in the marketplace. Even those who are not competing against these companies need to take heed, because any web search will show consumers that 100 Mbps or faster broadband is now the new standard.

These unilateral changes make a mockery of the FCC’s deliberations. It’s ridiculous to be debating a 25 Mbps definition of broadband when the two biggest ISPs in the country have base products 5 to 8 times faster than that. States with broadband grant programs are having the same speed conversation, and this will hopefully alert them that the new goal for broadband needs to be at least 100 Mbps.

These speed increases were inevitable. We’ve known for decades that the home demand for broadband has been doubling every three years. When the FCC first started talking about 25 Mbps as the definition of acceptable broadband, the math said that within six years we’d be having the same discussion about 100 Mbps broadband – and here we are having that discussion.

The FCC doesn’t want to recognize the speed realities in the world because it is required by law to try to bring rural speeds to par with urban speeds. But this can’t be ignored, because these speed increases are not just for bragging rights. We know that consumers find ways to fill faster data pipes. Just two years ago I saw articles wondering if there would be any market for 4K video. Today, that’s the first thing offered to me on both Amazon Prime and Netflix – they shoot all new programming in 4K and offer it at the top of their menus. It’s been reported that at the next CES electronics show several companies will be pushing commercially available 8K televisions. This technology is going to require a broadband connection of between 60 Mbps and 100 Mbps depending upon the level of screen action. People are going to buy these sets and then demand programming to use them – and somebody will create the programming.

8K video is not the end game. Numerous companies are working on virtual presence, where we will finally be able to converse with a hologram of somebody as if they were in the same room. Early versions of this technology, which ought to be available soon, will probably use the same range of bandwidth as 8K video, but I’ve been reading about near-future technologies that will produce realistic holograms and might require as much as a 700 Mbps connection – perhaps the first real need for gigabit broadband.

While improving urban data speeds is great, every increase in urban broadband speeds highlights the poor condition of rural broadband. While urban homes are getting 130 – 200 Mbps for decent prices there are still millions of homes with either no broadband or with broadband at speeds of 10 Mbps or less. The gap between urban and rural broadband is growing wider every year.

If you’ve been reading this blog you know I don’t say a lot of good things about the big cable companies. But kudos to Comcast and Charter for unilaterally increasing broadband speeds. Their actions speak louder than anything that we can expect out of the FCC.