How Safe is that New Toy?

Every Christmas season the Mozilla Foundation publishes reviews of IoT devices, flagging those that do not protect privacy. It seems like almost anything we buy today that includes electronics also connects to the Internet. We are filling our homes with devices that provide feedback to manufacturers or others, often without our knowledge or approval.

The Mozilla Foundation has established a wish list of security standards that it thinks all devices sold to the public should meet. The Mozilla standards include:

  • Transmissions between the devices and the Internet should be encrypted.
  • Devices should be able to receive and implement security updates.
  • Devices should allow users to create strong passwords.
  • Devices should include a system for managing security vulnerabilities.
  • Manufacturers should provide a clear and accessible privacy policy.
  • Newly added to Mozilla’s list is a requirement that manufacturers disclose when they are deploying AI in interactions with users or with their data.

The Foundation reviewed 136 devices this year that it thinks are likely presents this Christmas season. The reviews cover a wide range of categories: smart home, home office, toys and games, entertainment, wearables, health and exercise, and devices for pets. Mozilla applies a “Privacy Not Included” rating to any device that badly fails the Mozilla privacy standards. Mozilla has been rating devices annually since 2017 and says that a number of manufacturers have beefed up security measures on devices in response to a poor rating by Mozilla.

Some of the reviewed devices got a good rating from Mozilla, but even most of the best devices have some security issues. An example is the Nintendo Switch handheld gaming device. Mozilla says that Nintendo does a good job with privacy, security, and parental controls, but the company had a data breach in 2020 in which the personal data from 300,000 accounts was compromised, revealing names, dates of birth, and email addresses – enough for somebody to then hack individuals. Mozilla also notes that while Nintendo doesn’t share user data with third parties, there is no guarantee that the gaming companies using the Nintendo platform aren’t selling and sharing user data. This review is a good example of the kinds of security risks the public is taking with devices we routinely bring into our homes. Every device that was reviewed had at least a few security concerns.

Some devices have almost no security features. An example is the Ubtech Jimu Robot Kits. These are coding robot kits where kids can program robots to navigate around the home using infrared sensors and sonar. The robots come in a variety of shapes like a dragon, a truck, or a combat robot. The robots meet almost none of the protections on Mozilla’s wish list. The app associated with the robots can track what is seen and heard in the home and can convey user location – not great traits for a kid’s toy. The company provides no privacy policy for users and doesn’t disclose if or how it collects or uses data from the robots. Like many new electronic devices, the robots can only be used when connected to the cloud – so using the device automatically transmits a wide range of data to the company’s servers.

This annual list provides a good reminder that many of the devices we buy are unsafe. Most of us turn on new electronic devices without reading any instructions or privacy warnings. We type our personal data into apps to enable the device (or let the device gather that data automatically when we sign up through Facebook or some other app). Even when privacy settings are available, we rarely enable them. Device manufacturers largely count on the public not taking even basic precautions. This enables them to say they are safe while still gathering information from most users of their devices.

’Tis the Season

In a year that has been tough on everybody, the biggest ISPs have decided to give us all a holiday present this year with a new round of rate increases.

Charter usually starts the rate increase season and raised broadband rates again this year on December 1. This year the rate increase is a large one as Charter raised rates on all standalone broadband products by $5 per month. That raises the rate for the basic broadband tier to $74.99, an increase of 7.1%. Charter raised cable rates in August.

Charter has also petitioned the FCC for permission to begin enforcing data caps in May 2021. As a condition of its merger with Time Warner Cable, the company is prohibited from imposing data caps until May 2023 – but it is asking to be released from that condition two years early.

Comcast is raising broadband, cable, and other rates effective January 1, 2021. Broadband rates are increasing by $3 per month for all packages. The company’s primary product, Performance broadband, is increasing to $76, an increase of 4.1%.

Comcast is also increasing cable rates. For example, the cost of its basic TV product, Choice TV, is increasing from $25 to $30. That increase carries through to larger cable plans. In addition to increasing the rate of basic cable, Comcast is increasing the hidden fees it charges. These are rates that are not advertised to new subscribers but which all cable customers pay. Depending upon the market, Comcast is raising the Broadcast Fee by $4.50 per month. For many markets, this means an increase from $11.70 per month to $16.70 per month. Comcast is also increasing the Regional Sports Network fee by $2 per month. Altogether, for a customer that is not locked into a promotional special, this is an increase of as much as $11.50. Even customers on promotional plans will see the increases in the hidden fees. As usual, Comcast blames the increase in cable fees on programmers (without mentioning that it owns the NBC family of broadcast channels).
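The arithmetic behind the “$11.50” figure can be tallied in a quick sketch; the per-item increases are the ones quoted in this post for a customer not on a promotional plan.

```python
# Tally of the Comcast monthly increases quoted above (non-promotional customer).
increases = {
    "Choice TV base rate": 5.00,           # $25 -> $30
    "Broadcast Fee": 4.50,                 # varies by market
    "Regional Sports Network fee": 2.00,
}

total = sum(increases.values())
print(f"Total monthly increase: ${total:.2f}")  # prints "Total monthly increase: $11.50"
```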

Comcast is not stopping there and is increasing other rates as well. Installation and home-visit fees are increasing from $70 to $100. Comcast is raising the cost of the first settop box from $5.00 to $7.50 but is decreasing the cost of additional boxes from $9.95 to $7.50. Comcast is increasing the cost of its home security and smart home packages by $10 per month.

AT&T is increasing the cost of its cable products for DirecTV and U-verse TV effective January 17. AT&T is increasing the price of the 160-channel Entertainment package from $97 to $102. The price of the 185-channel Choice package is going from $115 per month to $122. AT&T is raising the rates on the 250-channel Ultimate package from $142 to $151 and on the 330-channel Premier package from $197 to $208. AT&T is increasing U-verse TV by similar amounts, between $5 and $9. Similar to Comcast, AT&T blames the cost increases on programmers, although it owns the suite of programming it acquired with Warner Media.

AT&T is also adding a ‘Federal Cost Recovery’ fee of $0.19 per month that it says covers expenses that DirecTV pays to the FCC. However, there seems to be no basis for this fee, and it’s not being charged elsewhere in the industry.

One thing is clear from this round of rate increases – the big cable companies are feeling their monopoly power and feel free to raise broadband fees by 4% to 7% annually. You don’t have to do a lot of math to foresee basic broadband rates reaching $90 per month within five years. Unfortunately, higher broadband rates are the only way for the big cable companies to keep meeting Wall Street earnings expectations.
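The five-year projection above is simple compound growth. This sketch starts from the two base rates quoted in this post and assumes each company repeats its latest percentage increase every year – an assumption for illustration, since future increases could differ.

```python
def project(monthly_rate: float, annual_pct: float, years: int) -> float:
    """Compound a monthly rate by annual_pct percent per year for `years` years."""
    return monthly_rate * (1 + annual_pct / 100) ** years

# Charter basic broadband ($74.99, +7.1%) and Comcast Performance ($76, +4.1%)
for name, start, pct in [("Charter", 74.99, 7.1), ("Comcast", 76.00, 4.1)]:
    print(f"{name}: ${start:.2f} -> ${project(start, pct, 5):.2f} after 5 years")
```

At these growth rates, both products pass $90 per month within five years.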

It’s worth noting that AT&T and Verizon are not raising rates for fiber broadband. The cable companies don’t face competition from fiber in the majority of their footprint and have largely won the monopoly broadband battle. AT&T recently announced it will no longer sell DSL products and it won’t be surprising to see Verizon follow suit.

Funding Copper Again? Really?

One of the oddest things in an extremely odd RDOF auction is that the FCC allowed bidders to seek funding with promises of upgrading telephone copper. Both CenturyLink and Windstream won funding in some places for improving rural DSL. This is disturbing on several levels.

First, these companies took money in the past to make these same upgrades, in some instances to the same properties. The FCC awarded over $11 billion in 2015 through the CAF II grants for the big telcos to upgrade rural DSL to speeds of at least 10/1 Mbps. That grant program finishes this month, since the telcos had six years to make the upgrades. Anybody working in the CAF II areas will tell you that the speed increases never happened. We conducted speed tests in rural counties without ever seeing a single DSL reading that even approached the 10/1 goal. We’ve had engineers crawl around the rural DSL deployments in counties and could find no evidence of upgrades.

I can’t say that the big telcos didn’t make any upgrades in CAF II. I’ve seen evidence of DSL getting better in county seats, and by definition, such improvements might stretch a mile or so into the areas surrounding towns. But such upgrades don’t help the rural areas covered by CAF II that are farther outside of towns. There may be entire counties where the big telcos legitimately made the rural upgrades, but I haven’t seen or heard about any of them.

To add insult to injury for the rural people who never saw faster broadband, the FCC is allowing the big telcos to get a seventh year of CAF II funding, to be paid in 2021. The telcos have zero obligations to make any improvements with the nearly $2.4 billion that will be coming their way. That money would have been better spent on a program to expand rural broadband.

Or could it? The FCC has not learned any lessons from the CAF II debacle and allowed big telcos to again receive funding to improve rural DSL in the RDOF auction. This time the funding will be paid out over ten years and the winners of the RDOF grant have six years to make the upgrades. And this time the speed requirements are a lot higher, depending upon what the telcos promised in the RDOF short forms. The minimum speeds required from RDOF are 25/3 Mbps, although I’m hearing the telcos promised even faster speeds from DSL in some cases.

The real situation on the ground is that the big telcos can’t make these kinds of upgrades in rural America even if they wanted to. The big telcos have ignored maintenance on rural copper for decades, and the copper is dead or dying. Month after month, additional pairs of rural copper go permanently dark and can’t be used.

Any big telco that says it can bring rural copper to speeds of 25/3 Mbps or faster is lying – ask any of the big telco copper technicians. That would be a challenge in a rural area even if somebody built brand-new copper plant. The smaller telcos made upgrades to DSL on rural copper twenty years ago, and even then, it was a big challenge. The expected life of a copper network is between 40 and 50 years, and most of the rural copper plant was built in the 1960s and 1970s, with some even older. We’re entering a new decade where most of our rural copper networks are between 50 and 70 years old.

We need to stop pretending that allowing big telcos to take money to upgrade copper is anything other than a regulatory gift. AT&T finally told the truth about copper networks and is no longer adding any new customers onto DSL. It’s now only a matter of time before the company starts tearing down copper.

Most people don’t realize how remote and rural the areas are that are being funded by the RDOF grants. These are not areas close to county seats – they are instead the most remote parts of counties. These networks probably could not be upgraded to 25/3 Mbps speeds on copper with twice the money being awarded by the RDOF grants.

And the real killer of all of this is that any such upgrades would be a giant waste of federal money even if the upgrades could be made. The same FCC that is still allowing telcos to get grant money for DSL also has its head in the sand pretending that 25/3 Mbps is adequate broadband.

The big losers in all of this are the people who live in the areas where grant awards will go to copper. They were told they would get better broadband in 2015 with CAF II, and that never happened. They will now be strung out until 2027 with no improvement in broadband. When you reduce this debacle to homes where people can’t work from home and where students can’t do homework, this is a tragedy. But obviously not enough of a tragedy for the FCC to stop funding upgrades to telephone copper.

I’m Still Confused by the RDOF Grants

On the day after the RDOF awards were announced I wrote a blog lamenting that it looked like the FCC had allowed wireless carriers to walk away with huge amounts of the grant funding while promising gigabit speeds. As I started digging into the details of the awards, it’s a whole lot more complicated than that.

For example, it turns out that some of the WISPs that won big awards actually bid to build fiber. That sounds wonderful if they actually build fiber, but when you look at the details of the bidding that looks unlikely.

Let me provide one example. I helped a client estimate the cost of building fiber in one of the RDOF areas that extended over two counties. We had determined in this specific set of Census blocks that it didn’t make sense to take anything less than 50% of the RDOF reserve price. The math showed there were not enough customers to support the debt that would be incurred by accepting less than that. This particular area was finally awarded to a bidder claiming to bid on fiber at 10% of the reserve price – only 20% of the grant that made sense for my client. I can envision operators who might have been satisfied at a 40% grant, but it’s inconceivable to me that somebody could seriously be proposing to build fiber with the small amount of grant they accepted for this one area. Only a bidder with a huge amount of equity could make this work, because no bank will loan money for a fiber project that doesn’t generate enough cash to cover the debt payments.
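The bid math in that example works out like this. The reserve price below is a hypothetical round number chosen for illustration; the 50% and 10% figures come from the example above.

```python
reserve = 10_000_000              # hypothetical RDOF reserve price for the area
viable_floor = 0.50 * reserve     # lowest award the client's business case supported
winning_award = 0.10 * reserve    # level at which the area was actually won

# The winner accepted only a fifth of the minimum grant the model could support.
ratio = winning_award / viable_floor
print(f"Winning award is {ratio:.0%} of the viable floor")  # prints "20%"
```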

If this were an isolated example I wouldn’t be bothered by it, but there are similar examples all around the country. This explains why the total RDOF grant closed at only $9 billion instead of the anticipated $16 billion – some grant bidders are promising to build fiber for absurdly low amounts of grant funding. And frankly, this is not financially possible.

I can only think of one possible explanation for this. I can only suppose that such bidders think the FCC will let them swap out to a lower-cost technology such as fixed wireless during the long-form process. These bidders have to be betting that the FCC will work with them rather than be embarrassed by reversing huge amounts of grant awards. We all have to hope that the FCC doesn’t let that happen. Any such agreements would screw ISPs that were willing to build fiber in these same areas. The FCC would be punishing WISPs that bid fixed wireless honestly at the 100 Mbps tier. It’s hard not to use the word fraud to describe somebody that accepted grant awards to build fiber if there was no intention of actually building fiber.

The other thing that confuses me is how many of the grant winners will get financed. The RDOF awards are paid over ten years, but the networks must be built in six years or less. Even should somebody win an RDOF award that covers 100% of the cost of construction, they have to finance roughly 50% of the project to get it built. Since most RDOF grants cover less than 100% of the cost of construction (particularly where bidders bid down the size of the awards), the average winner will have to raise a lot of money to build the required networks. When I see that some of the big grant winners are small companies, I have serious doubts about their ability to borrow the needed funding. There are winners of the grants that will likely need to borrow hundreds of millions or even more than a billion dollars to make these grants work. You can count the participants in this auction that are capable of borrowing hundreds of millions on one hand.
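The timing squeeze can be sketched as follows. The award amount is hypothetical, the grant is assumed to arrive in ten equal annual payments, and construction spending is assumed to be spread evenly over the six build years – real construction is usually front-loaded, which would widen the gap toward the “roughly 50%” figure above.

```python
award = 50_000_000           # hypothetical RDOF award covering 100% of construction cost
annual_payment = award / 10  # grant dollars arrive in ten equal annual payments
annual_build = award / 6     # build spend, assumed even across the six build years

cash = 0.0
peak_gap = 0.0
for year in range(1, 7):     # the six years of construction
    cash += annual_payment - annual_build
    peak_gap = max(peak_gap, -cash)

print(f"Peak financing need: ${peak_gap:,.0f} ({peak_gap / award:.0%} of the award)")
```

Even under this gentle assumption the winner must finance 40% of the project; a front-loaded build pushes the need higher.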

I am really disheartened by the FCC’s imposed quiet period for the grant winners. I understand why there is a quiet period before the grant bidding so that parties don’t collude on the bids. I’d really like to see the press asking the same questions I have to some of the grant winners. Right now a grant winner can hide behind the quiet period.

I’m really hoping the long forms are made public long before the FCC accepts or rejects a grant winner. I think the people who live in the areas covered by these grants ought to have an opportunity to see what the grant applicant intends in terms of technology and time frame. I think all of the ISPs that didn’t win the grants deserve to see that applicants aren’t being allowed to negotiate to use a different technology than the one they bid to win the grant. I also hope the FCC requires iron-clad proof of the needed financing to make the grants work. I also hope the FCC levies the maximum penalties possible against bidders that made bids that can’t be fulfilled.

I still think this grant is likely to be a disaster before it’s over. If the FCC does their job on the long forms, I foresee billions of dollars being handed back to the FCC – something that will be a major embarrassment for the agency. It will be even worse if the FCC does a poor job on the long forms since unqualified ISPs will eventually fail to deliver the promised networks. I also fear that the big telcos will pull another CAF II and take the awards and not build the promised broadband. I still haven’t stopped scratching my head wondering why a satellite company should get a grant to cover places it will already cover automatically. I bet that the FCC staff is looking at the details of the grant awards and already wondering how they can come close to making this work. The more I dig into the details, the more messes I find, and that is disheartening.

AT&T Aids the Shift to Cable Monopolies

If you live in a city where AT&T is the incumbent telephone company, the chances are high that the cable company is now a broadband monopoly. Unless some other ISP is building fiber, you no longer have a choice of broadband provider – it’s the cable company or nobody. When AT&T announced that it would no longer connect new DSL customers as of October 1, the company fully ceded its historic telephone properties to its cable company competitors.

Broadband customers are not going to like what having a monopoly provider will mean. The changes won’t happen overnight, but when the cable company becomes a monopoly provider in a market they will eventually act like a monopoly.

  • The cable company will get slower on repairs because there is no threat of losing customers. Technicians will miss more appointments.
  • Customer service will deteriorate. Call waiting times will increase. Customer service reps will become less interested in fixing problems. It will become harder to win billing disputes.
  • Network performance will deteriorate because the cable company will be able to save money by shaving on maintenance budgets. Outages that used to last an hour might last for a day. Outages that used to last for a day might stretch to a week.
  • Financial incentives for new customers will disappear when it becomes clear that the cable company gets every broadband customer without trying.
  • Prices will increase. Data caps will be more strictly enforced in monopoly markets. Customers will no longer be able to negotiate lower rates and will be forced to pay full list prices.

If you don’t think monopoly abuse is real, talk to any rural DSL customer. They can describe in detail how service slowly deteriorated over time once it was clear that the big telcos had a rural broadband monopoly. The abuses that have been heaped upon rural DSL customers are almost unbelievable.

The slide into monopoly behavior won’t happen immediately – but it is inevitable. Most monopoly behavior originates in the local market as regional managers come to understand that they can improve financial performance (and bonuses) by cutting corners. There’s no reason to pay overtime to put a customer back in service – waiting until the next day is fine. There’s no reason to give a complaining customer a discount because they can’t leave. Slowly, bit by bit, the abuses that are to be expected in a monopoly market will become the new normal.

AT&T walking away from DSL will accelerate the shift to monopoly, but it’s a shift that’s already underway everywhere there is no fiber alternative. The telco DSL market has been sinking for a decade as cable broadband became faster. Quarter after quarter, hundreds of thousands of DSL customers are making the shift to cable company broadband. My firm does broadband surveys and we’re surprised when we find a city with a DSL market share over 30%, and in most markets, DSL penetration is now under 15%.

For the most part, urban DSL providers have already given in to the inevitable. The big telcos have continued to cut technicians who understand copper technology. The copper wires continue to age and every year more pairs of copper wire go dark and can’t be used. There are no longer manufacturers supporting some of the older versions of DSL and it’s nearly impossible to get replacement electronics. Long before AT&T formally announced they won’t connect new DSL customers, local telco technicians have been regularly telling that same story to customers.

AT&T is the first big telco to announce the end of DSL support, but they won’t be the last. I find it hard to think that Verizon won’t soon follow now that AT&T has taken a public stance. CenturyLink management has made it clear that they would love to get out of the copper business. Frontier will continue to try to make copper work because the company has no other revenue stream to fall back upon. But within the next decade, the copper wires are finally going to stop working for all of the telcos.

It will be interesting to see how long it takes the FCC to acknowledge this new reality. When AT&T announced the end of DSL, it took away the second broadband option from millions of households. But AT&T will continue to report its dwindling number of DSL customers, and my bet is that the FCC won’t recognize or admit that millions of homes now have only one broadband choice.

Microsoft Looking at Broadband and Agriculture

This past summer Microsoft announced a strategic alliance with the giant farm cooperative Land O’Lakes. The company is one of the country’s largest farmer-owned cooperatives and a huge dairy producer, and it controls over 150 million acres of cropland.

The partnership intends to explore ways that Microsoft can leverage technology to improve farm production. Land O’Lakes has created a portfolio of software tools for members and Microsoft will work to unify the software in its Azure cloud platform. The hope is that a large and standardized agricultural software platform will be the best way to bring technology improvements to individual farmers in the cooperative.

One example of this initiative is a tool for dealing with early mitigation of plant stress. Crops are most susceptible to problems at the beginning of the growing cycle, and the software platform will help farmers to survey their fields with sensors and to take actions to optimize growth conditions. The software will suggest optimum fertilizer applications that will lower the amount of fertilizer used by applying the right kind of fertilizer only where needed. Over time the goal is to identify the right seed varieties for each farm to maximize output.

The biggest challenge in the Microsoft initiative is something that readers of this blog are well aware of – many farms have inadequate broadband. Rather than be stopped by lack of broadband, the partnership will be exploring solutions that work for well-connected farms as well as for those with poor broadband.

An example of a solution for areas with poor broadband is the Digital Dairy solution. This will use edge computing located at the farm that is powerful enough to process data without having to send it to the cloud. The initiative starts with tracking herd health through innovative practices that will tailor the feed to each cow to maximize health and milk production. Microsoft will also concentrate on the supply chain to find strategies to make sure that milk doesn’t go bad during storage and transit. The ultimate goal is to provide traceability, so that stores in the supply chain, and ultimately consumers, will be able to know the source, quality, and freshness of dairy products.

One of the most exciting parts of the partnership is to use software tools to help with sustainability. This means studying the local soil to develop strategies to improve soil conditions since healthier soil ultimately means better crops and healthier food. One important component of sustainability is developing strategies for carbon sequestration, which is the process of permanently storing excess carbon in the soil. That’s good for the planet but also good for the soil.

Microsoft will also be working to bring better broadband to some of the Land O’Lakes farms. For several years, the company has been engaged with rural ISPs in its Airband program to use white space spectrum to bring better broadband to rural areas. That program got a huge shot in the arm a few months ago when the FCC finally agreed to free up more rural white space spectrum – something for which Microsoft has been lobbying for almost a decade.

This initiative is emblematic of the new approach that Microsoft is taking in the market. Rather than only developing generic software products, the company wants to work with individual industries to develop new and improved software tools specific for each industry. There is probably no better place for Microsoft to start than our farms.

The FCC Drops the Ball on RDOF

My Twitter feed is full of self-congratulations from FCC and other federal officials about the success of the recently completed RDOF grant. But I look at the results and I just see another big FCC failure. I see a grant where billions of federal dollars were misallocated due to another giant gaffe by the FCC.

How did the FCC fail? They allowed fixed wireless technology to bid as a gigabit technology. This means the FCC believes that fixed wireless technology is the functional equivalent of fiber. This is such an easily disprovable concept that it would be laughable if the FCC hadn’t just awarded billions of dollars to an imaginary gigabit wireless technology.

I don’t dislike fixed wireless technology; in fact, I spent most of a decade as a customer of a WISP that did a great job bringing broadband to a place that otherwise would have had none. Fixed wireless is a decent technology. There are many places today where it’s routinely being used to deliver 50 Mbps to 75 Mbps broadband across large areas. In unique cases where a customer is near a tower, I’ve seen speeds approaching 150 Mbps, and I wouldn’t be surprised if WISPs can point to customers getting 200 Mbps. But as nice as that is, those are not gigabit speeds – fixed wireless is not a gigabit technology, nor is it a functional equivalent of fiber.

This technology can be deployed in two ways – in a point-to-point configuration or as point-to-multipoint. Point-to-point wireless shoots bandwidth from a transmitter on a tower to a single endpoint. This is the technology used to beam backhaul between towers, or in urban areas to reach between high-rise buildings – the technology we’ve always referred to in the industry as microwave backhaul. It can deliver gigabit speeds, but it is not a practical technology for residential broadband because there is only room for a small number of transmitters on any given tower or rooftop.

The technology used to provide WISP broadband is point-to-multipoint technology, where a single antenna on a tower connects to multiple customers. This technology is aimed at delivering modest broadband to lots of customers. It can’t be used to deliver giant bandwidth to more than a few customers – it’s not really designed to deliver a gigabit download, and certainly not a symmetrical gigabit.

By allowing WISPS to claim gigabit capabilities, the FCC cheated huge numbers of people out of getting fiber. There were numerous electric cooperatives, small telcos, CLECs, fiber overbuilders, and public/private partnerships in the auction hoping to bring fiber to entire rural counties. In looking at the footprints won due to this fiction, I’m guessing the FCC’s decision to allow fixed wireless to falsely bid as gigabit technology killed fiber construction to at least a few hundred rural counties.

Six of the top ten winners of the auction will be deploying wireless technology and together account for over $3.2 billion – more than a third of the entire auction award. That list includes four wireless companies along with Windstream and Frontier.

Interestingly, WISPs that didn’t exaggerate the capability of the technology got clobbered in this grant. For example, Midcontinent won grant money in the CAF II reverse auction bidding fixed wireless as capable of 100 Mbps. They did the same in this auction and got steamrolled by the WISPs that won by bidding with the identical technology but falsely claiming gigabit capability.

I’m sure the people that get the networks built from this auction will be glad to get better broadband. But a few million of them could have instead gotten fiber that would have future-proofed them for the rest of the century. And sadly, some of the people in these grant areas won’t get broadband because they’re located in a hollow or behind a hill, out of reach of the wireless towers.

I don’t understand why the FCC couldn’t get this right. The FCC could have talked to any one of a hundred telecom engineers I know who would have laughed at the idea that fixed wireless can deliver gigabit speeds across big tracts of extremely rural America. A huge portion of this auction was based upon this lie, and that never bodes well for the long run.

There is an easy fix for this going forward. If the FCC is going to let WISPs exaggerate the technical ability to deliver gigabit speeds, then fiber providers should be allowed to bid in a 10-gigabit tier. That’s something that any fiber winner could easily guarantee, and which wouldn’t be a false claim. I also hope for severe penalties, up to having to return all of the grant money, for any grant winners in this auction that claimed gigabit speeds but then deliver 50 Mbps networks.

My Wish List for a New FCC

A change of administration will bring a change at the FCC as the majority swings from Republican to Democratic. I’ve always maintained a regulatory wish list, and the following are my hopes for what we’ll see out of the new FCC. Note that these aren’t predictions – just my own hopes.

Keep the Politics Out. This was added to the blog as it went to press. There is talk of a new Congress refusing to seat a new Chairman and a fifth commissioner in an attempt to thwart any attempt to re-regulate broadband. That would be a disaster for the industry (as it would be for any other regulatory agency). A partisan FCC with no voting majority is going to accomplish very little and will deadlock on most issues.

Kill the Seventh Year of CAF II. The CAF II program that handed over $10 billion to the big telcos to upgrade rural America to 10/1 Mbps broadband was a total bust. To rub salt into the wounds for the failed program, the current FCC just awarded the big telcos an additional $2.5 billion in a seventh year of subsidy – a payment for which the ISPs have no expected performance obligations. It’s just free money. I hope a new FCC kills that funding and uses that money to support new rural broadband.

Adopt a Realistic Definition of Broadband. It’s unbelievable that the current FCC is sticking to 25/3 Mbps as the definition of broadband. The FCC proudly claims in the 2020 report to Congress that 85% of homes in the country can buy broadband of at least 250/25 Mbps (a claim I think is overstated). If the FCC thinks that claim is true, then how can they think that the remaining 15% of homes deserve only one-tenth of the broadband speeds available to everybody else?

Fix the Damned Maps. The FCC has dallied for a few years on pulling the trigger for new mapping, always with some excuse. They need to make this happen and make it happen right. The current FCC plans still don't penalize ISPs for reporting marketing speeds instead of actual speeds. Unless that problem is fixed, any new mapping will be just as dreadful as the existing maps. And please, don't hand out any more giant grants based upon badly flawed maps.

Stop Funding Slow Broadband Technologies. It’s mind-boggling that the current RDOF grants allow technologies as slow as 25/3 Mbps to claim grant funds – for a program that allows six years to implement the funded solution. That means that not only does the FCC pretend that 25/3 Mbps is adequate broadband today, but they are willing to saddle parts of rural America with those speeds for the next decade.

Bring Back Broadband Regulation. This FCC gutted broadband regulation. It probably raises eyebrows to see me ask for the return of regulation, but the FCC can’t currently even scold big ISPs for abusing customers. It’s highly unlikely that any FCC would go so far as to implement rate regulation, but one of the most important industries in the country needs a cop at the top to protect citizens against monopoly abuses.

Drop the 5G Rhetoric. The FCC has no business pushing 5G as the solution to everything broadband. The FCC is an independent agency. While the administration and Congress have every right to climb aboard the 5G bus, the FCC is supposed to be a neutral regulator and has no business supporting 5G over other technologies. The cellular companies behind 5G are extremely well-funded and we should let 5G play out as the market sees fit.

Don’t Sponsor a New National Broadband Plan. That’s what the government does when it wants to kick an issue down the road. We don’t need another panel of experts telling us what is wrong with rural broadband.

Say No to a Big ISP Once in a While. The current FCC seems to have decided every issue in favor of the biggest ISPs. I understand that AT&T, Comcast, Charter, and Verizon serve 73% of the broadband and cable customers and most of the cellular customers in the country. The current FCC approved everything on the big ISPs' regulatory wish lists. The role of a regulator is to strike a balance between the companies it regulates and the public – we need to get back to a balance between those two interests.

Understanding Oversubscription

It’s common to hear that oversubscription is the cause of slow broadband – but what does that mean? Oversubscription comes into play in any network when the aggregate subscribed customer demand is greater than the available bandwidth.

The easiest way to understand the concept is with an example. Consider a passive optical fiber network where up to 32 homes share the same neighborhood fiber. In the most common GPON technology, the customers on one of these neighborhood nodes (called a PON) share a total of 2.4 gigabits of download data.

If an ISP sells a 100 Mbps download connection to 20 customers on a PON, then in aggregate those customers could use as much as 2 gigabits of bandwidth, meaning there is still unsold capacity – each customer is guaranteed the full 100 Mbps connection inside the PON. However, if an ISP sells a gigabit connection to 20 customers, then there are 20 gigabits of potential customer usage pledged over the same 2.4-gigabit physical path. The ISP has sold more than 8 times the capacity that is physically available, and this particular PON has an oversubscription ratio of just over 8 (20 ÷ 2.4 ≈ 8.3).
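The arithmetic above is simple enough to put into a few lines of Python. This sketch just restates the example from this post – an oversubscription ratio is the total bandwidth sold to customers divided by the physical capacity they share:

```python
def oversubscription_ratio(num_customers: int,
                           speed_per_customer_gbps: float,
                           capacity_gbps: float) -> float:
    """Ratio of aggregate sold bandwidth to the actual shared capacity."""
    sold_gbps = num_customers * speed_per_customer_gbps
    return sold_gbps / capacity_gbps

# 20 customers at 100 Mbps (0.1 Gbps) on a 2.4 Gbps GPON node:
# the ratio is below 1, so every customer is guaranteed full speed.
print(round(oversubscription_ratio(20, 0.1, 2.4), 2))  # 0.83

# 20 customers at 1 Gbps on the same node: oversubscribed by just over 8.
print(round(oversubscription_ratio(20, 1.0, 2.4), 2))  # 8.33
```

A ratio at or below 1 means the segment is fully guaranteed; anything above 1 means the ISP is counting on customers not all demanding their full speed at once.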

When people first hear about oversubscription, they are often aghast – they think an ISP has done something shady and is selling people more bandwidth than can be delivered. But in reality, an oversubscription ratio recognizes how people use bandwidth. It’s highly likely in the example of selling gigabit connections that customers will always have access to their bandwidth.

ISPs understand how customers use bandwidth, and they can take advantage of the real behavior of customers when deciding on oversubscription ratios. In this example, it's highly unlikely that any residential customer ever uses a full gigabit of bandwidth – because there is almost no place on the web where a residential customer can connect at that speed.

But more importantly, a home subscribing to a gigabit connection rarely uses more than a small fraction of the bandwidth it has purchased. A home isn't using much bandwidth when people are asleep or away from home. The residents of a gigabit home might spend the evening watching a few simultaneous video streams and still barely dent their connection. The ISP is banking on the normal behavior of its customers in determining a safe oversubscription ratio. ISPs have come to learn that households buying gigabit connections often don't use any more bandwidth than homes buying 100 Mbps connections – they just complete web transactions faster.

Even if the bandwidth in this example PON ever gets too busy, the issue is likely temporary. For example, if a few doctors lived in this neighborhood and were downloading big MRI files at the same time, the neighborhood might temporarily cross the 2.4-gigabit available bandwidth limit. Since transactions happen quickly for a gigabit customer, such an event would not likely last very long, and even while it was occurring, most residents in the PON wouldn't see a perceptible difference.
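This "banking on normal behavior" is what network engineers call statistical multiplexing, and a rough simulation shows why it works. Every number below is a made-up assumption for illustration – real ISPs use measured traffic data, not guesses like these:

```python
import random

random.seed(1)  # fixed seed so the sketch is repeatable

# Hypothetical evening demand for one gigabit home, in Gbps:
# most homes are streaming or idle; an occasional home runs a big download.
def evening_demand_gbps() -> float:
    if random.random() < 0.05:            # assumed 5% chance of a big download
        return random.uniform(0.3, 1.0)   # assumed download size range
    return random.uniform(0.0, 0.05)      # streaming/browsing uses little

PON_CAPACITY_GBPS = 2.4
HOMES = 20
TRIALS = 10_000

# Count how often 20 gigabit homes together exceed the shared 2.4 Gbps.
congested = sum(
    1 for _ in range(TRIALS)
    if sum(evening_demand_gbps() for _ in range(HOMES)) > PON_CAPACITY_GBPS
)
print(f"congested evenings: {congested / TRIALS:.2%}")
```

Under these invented assumptions the congestion rate comes out low even though the PON is oversubscribed by more than 8 to 1 – which is exactly the bet the ISP is making.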

It is possible to badly oversubscribe a neighborhood. Anybody who uses a cable company for broadband can remember a decade back when broadband slowed to a crawl as homes started watching Netflix in the evening. The cable company networks were not designed for steady video streaming and were oversubscribing bandwidth by factors of 200 to one or higher. It became routine for the bandwidth demand in a neighborhood to significantly surpass network capacity, and the whole neighborhood experienced a slowdown. Since then, the cable companies have largely eliminated the problem by decreasing the number of households per node.

As an aside, ISPs know they have to treat business neighborhoods differently. Businesses might engage in steady large bandwidth uses like connecting to multiple branches, using software platforms in the cloud, using cloud-based VoIP, etc. An oversubscription ratio that works in a residential neighborhood is likely to be far too high in some business neighborhoods.

To make the issue even more confusing, the sharing of bandwidth at the neighborhood level is only one place in a network where oversubscription comes into play. Any other place inside the ISP network where customer data is aggregated and combined will face the same oversubscription issue. The industry uses the term chokepoint to describe a place in a network where bandwidth can become a constraint. There is a minimum of three chokepoints in every ISP network, and there can be many more. Bandwidth can be choked in the neighborhood as described above, can be choked in the primary network routers that direct traffic, or can be choked on the path between the ISP and the Internet. If any chokepoint in an ISP network gets over-busy, then the ISP has oversubscribed the portion of the network feeding into the chokepoint.
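The chokepoint chain described above has a simple consequence: a customer's achievable speed is capped by the most constrained link between their home and the Internet. Here is a toy sketch of that idea – the chain, capacities, and demand figures are all hypothetical:

```python
# A customer's traffic crosses several shared chokepoints; the deliverable
# speed is bounded by the spare capacity at the tightest point in the chain.

def spare_gbps(chokepoints: list[tuple[float, float]]) -> float:
    """chokepoints: (capacity_gbps, current_demand_gbps) pairs along the path.
    Returns the spare capacity at the most constrained chokepoint."""
    return min(capacity - demand for capacity, demand in chokepoints)

# Hypothetical path: neighborhood PON, core router, Internet transit link.
chain = [
    (2.4, 1.9),    # neighborhood PON: 0.5 Gbps spare
    (40.0, 30.0),  # core router: 10 Gbps spare
    (10.0, 9.8),   # transit to the Internet: only 0.2 Gbps spare
]
print(round(spare_gbps(chain), 2))  # 0.2
```

In this invented example the neighborhood isn't the problem at all – the over-busy transit link is, which is why an ISP has to watch oversubscription at every aggregation point, not just in the last mile.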

Are There any Cable Companies Left?

Are there any companies left that we can still call cable companies? Everything in the business press still refers to Comcast and Charter as cable companies and AT&T and Verizon as telephone companies. It’s getting harder to justify using these traditional labels and maybe the time is finally here to just start calling them all ISPs.

After all, these four companies collectively have 80 million broadband customers, meaning these four ISPs now have around 73% of all broadband customers in the country. They also have about 73% of all traditional cable customers, at 58 million, but that number has been tumbling and is down from 64 million just a year ago. It was only a few years ago when the broadband and cable TV markets crossed and broadband became the predominant product for these companies – but since then, the gap between the two product lines has been growing quickly.

FierceVideo published an article in September interviewing the CEOs of Comcast, Charter, and AT&T about their views on the future of cable TV. Their responses are not surprising in an industry where traditional cable subscriptions are shrinking quickly.

Brian Roberts of Comcast said he is “indifferent” between having customers on traditional cable TV and on Comcast’s Flex product, which is free and ad-supported. And that doesn’t even count the 14 million people who are now watching Comcast’s online Peacock service. Comcast sees all video products as important in making Comcast’s broadband customers stickier. AT&T’s John Stankey said something similar. He said he values the traditional cable TV product, but that the company is betting on online offerings like AT&T TV and HBO Max. Charter is the only large company still on the traditional track, and the company added cable customers in the second quarter of this year. But Charter CEO Tom Rutledge foresees growth coming to an end since the company feels obligated to pass video content rate increases on to cable customers.

Both Comcast and Charter have made up some of the loss in cable customers by launching a successful cellular product. At the end of the third quarter this year, Comcast had 2.6 million cellular customers and Charter had grown to 2 million. Both companies will be working to increase the profit margins of the cellular product by shifting traffic from resold cellular to company-owned small cell sites. Both companies have a built-in advantage in that they already own fiber deep into neighborhoods, so both should be able to deploy cellular small cells without having to lease transport. I find it interesting that these two traditional cable companies seem to be doing a better job of bundling in cellular service than AT&T and Verizon ever did – those two companies never seemed to find a way to do that.

In my writing about the industry, I have lately been referring to these big companies as ISPs or incumbents because the terms cable company and telephone company seem to have lost relevance. It’s becoming hard to distinguish between Comcast and AT&T in markets where AT&T is competing against Comcast using gigabit fiber.

I’m at a loss to explain why the industry continues to call Comcast a cable company. The percentage of revenue that comes from cable TV is dropping quickly, and the share of margin from cable is dropping even faster. The amount of money that AT&T makes from traditional telephone service is so small that it’s a challenge to even find the word telephone in the company’s financial report. But I guess old habits are hard to break. We instantly know who is being referred to when somebody says “large cable companies” or “large telcos”. But I’m still looking forward to a time when these monikers are so rare that we’ll have to explain what they mean to children.