Disclosing Hidden Fees

The FCC recently proposed a requirement that companies that sell traditional cable TV must disclose the full cost of video to customers. I’ve written about hidden fees many times over the years, and the fees have grown to become a big issue for customers.

Hidden fees are those that a cable provider doesn’t disclose when advertising to attract new customers. At best, these fees are mentioned vaguely in the fine print, where they are often difficult or impossible to find.

Consider the hidden fees that Comcast charges (I can make a similar list for any of the big cable companies). Comcast’s hidden fees differ by market, and the following is from a market we recently studied.

  • The broadcast fee is $28.70 per month. This is a fee where Comcast accumulates the annual increases in programming costs instead of building those increases into the advertised price of cable TV.
  • The regional sports fee in the market we looked at is $6.10 per month – the fee varies depending upon the local sports networks carried. This fee accumulates increases in sports programming fees that Comcast has chosen not to include in the advertised price of cable TV.
  • Comcast charges $9.00 extra for each settop box – a fee that is not mentioned in advertised prices.

For Comcast, these fees total $43.80 per month in this market for a customer with one settop box. A customer who signs up on the web for a $40 special video promotion will be shocked when the first bill shows up at $83.80.
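The sticker shock is easy to sketch in a few lines. This is a minimal illustration using the fee amounts from the market discussed above; the variable names are mine, not anything from Comcast’s billing:

```python
# Hidden fees from the Comcast market discussed above, per month
hidden_fees = {
    "broadcast_fee": 28.70,
    "regional_sports_fee": 6.10,
    "settop_box": 9.00,  # one box; each additional box adds another $9.00
}

advertised_promo = 40.00  # the advertised special price for video

total_hidden = sum(hidden_fees.values())
first_bill = advertised_promo + total_hidden

print(f"Hidden fees: ${total_hidden:.2f}")  # total_hidden == 43.80
print(f"First bill:  ${first_bill:.2f}")    # first_bill == 83.80
```

The hidden fees alone exceed the advertised price of the service, which is the heart of the FCC’s complaint.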

The FCC is the federal regulator for cable TV, and it has always had the full authority to require cable companies to disclose honest rates. What I find disappointing is that they’ve done nothing until now. This announcement is clearly in response to President Biden’s criticism of hidden fees in many industries, including airline and online ticket prices. Why hasn’t the FCC tackled hidden fees for cable TV until now? In this example, the hidden fees are greater than the advertised special price for the cable service. I think the average person would assume that hidden fees mean paying a few extra bucks – not a fee that is more than the advertised price of the purchased service.

This topic has been taken on a few times at the state level. In 2018, Lori Swanson, the Attorney General of Minnesota, sued Comcast and asked for refunds for all cable customers who were billed hidden fees, retroactive to 2014, in violation of the state’s Prevention of Consumer Fraud Act and Uniform Deceptive Trade Practices Act. The suit concentrated on the broadcast TV fee and the regional sports fee. In January 2020, Comcast settled with the Minnesota Attorney General’s Office and agreed to pay $1.4 million in refunds to 15,600 Minnesota customers. That’s a pretty small penalty for a practice that must net the company a huge cash flow nationwide.

To be fair to Comcast and the other cable providers, there are underlying costs that are covered by these fees, so the fees are not extra profit. Local television stations, nationwide TV networks, and sports programming have continued to increase the cost of programming at a much faster pace than inflation. Comcast has no choice but to pass on these costs to subscribers. What the FCC is finally criticizing is that cable companies sign new customers to a year-long contract based on advertised low rates and then surprise them with these giant hidden fees.

It’s troubling that the FCC could have done something about this at any time and never acted until now. But even more annoying, at least to me, is that FCC has this authority for cable TV but can’t ask similar questions about broadband rates. The price for broadband from the big ISPs has been rising at a rapid pace. As an example, I looked recently at some rate research I did in 2016, and the broadband prices for Charter have more than doubled since then. Unlike cable TV programming, there are no big underlying costs that have driven the big cable companies to increase broadband rates, and those increases were far in excess of inflation.

The FCC under Chairman Ajit Pai killed the agency’s ability to do anything about broadband prices when it killed Title II regulation. The FCC recently opened an inquiry into data caps, which might be a hopeful sign that the FCC is thinking about getting back into broadband regulation. The history of the FCC tells me to be cautious with any optimism when it comes to regulating the biggest companies in the industry. We’ve watched the FCC do nothing for years about hidden cable fees while it also killed its own ability to regulate broadband – two moves that clearly favor the big monopoly providers over the public.

Revisiting the Impact of Killing Net Neutrality

Ajit Pai recently wrote an article in the National Review where he talks about how his decision as head of the FCC to repeal net neutrality was the right one. He goes on to claim that repealing net neutrality was the driver behind the current boom in building fiber and upgrading other broadband technologies. He contrasts the progress of broadband in the U.S. with Europe and says that the FCC’s action is the primary reason we are seeing a fiber boom in the U.S.

He points out that his opponents who wanted to keep net neutrality made all sorts of crazy claims about how killing net neutrality would mean killing most of what people like about the Internet. He’s right that the arguments for keeping net neutrality got wrapped into politics, and most of the predicted consequences of ending net neutrality were exaggerated by those in favor of net neutrality. But the claims of the benefits for killing net neutrality were also badly exaggerated by the big carriers.

Why is he writing this now? With the possibility of seating a fifth Commissioner, he knows that the issue of reinstating net neutrality and Title II authority is going to be raised at the FCC. Killing net neutrality was his crowning achievement at the FCC, and he’s defending it as a way to lobby against bringing back net neutrality. I think we’re going to see a lot of this kind of rhetoric this year about how repealing net neutrality was the right thing to do. The big ISPs will be repeating the same rhetoric being told by Pai.

But Pai is not telling the real story. Industry insiders and experts didn’t expect much change to come from repealing net neutrality. The CEOs of all of the big cable companies admitted that keeping or killing net neutrality would have almost no impact on their businesses.

The real purpose of killing net neutrality was to kill Title II authority over broadband. That is an esoteric policy-wonk issue and rarely got discussed during the debate. The Ajit Pai FCC gave up all rights of the agency to regulate broadband except for a few rules that are mandated by Congress. While there was huge noise on both sides of the argument about killing net neutrality, the big ISPs only cared about killing regulation. That was the number one agenda item for Ajit Pai, and he handed the big ISPs everything on their wish list. If you want to understand the net neutrality issue from the big ISP perspective, substitute the word “regulation” for “net neutrality” every time they talk about the topic.

Pai cannot say with a straight face that there have been no repercussions from the end of broadband regulation. Consider Comcast and Charter, the two largest ISPs, which together have over half of the broadband market. Since the end of Title II regulation, Comcast has raised rates for basic broadband to around $100, and Charter is over $90 and in the process of catching up to Comcast’s rates.

At the same time, the FCC dropped all semblance of representing the public. The FCC complaint process for broadband customers might as well not even exist since nothing happens when a customer complains about mistreatment by an ISP.

Pai is taking credit for the boom in broadband competition. I’ve been advising ISPs on their expansion plans for decades, both before and after the death of Title II regulation, and I’ve never heard an ISP consider regulation as part of any discussion of expanding to a new market. Perhaps Pai can take credit for making it easier for others to compete against big cable companies since they have been free to raise rates at will – but I don’t think that’s something he wants to claim out loud. The real impetus for broadband competition came from the pandemic when many millions of customers found out that their broadband was inadequate. That experience has convinced people that they need fiber broadband and faster speeds, and fiber overbuilders are reacting to that market demand. The cable companies are rushing to upgrade speeds in response to the pressure from fiber competition.

None of the fiber boom is due to killing regulation. All that killing regulation did was allow big ISPs to run roughshod over customers without consequences. The FCC can’t even pull ISPs in to talk about their bad broadband behavior.

Ajit Pai’s accomplishment was not killing net neutrality – it was handing the reins of the broadband business to the big ISPs by allowing the ultimate regulatory capture of having the FCC walk away from its regulatory responsibilities. I’m sure that Pai is quite happy with that outcome, but you’ll never see Pai talking about what really happened.

Those Crazy Critters

I don’t expect a lot of readers on the day before the Fourth of July, so I’m updating my most popular blog ever.

Most people don’t realize the damage done every year to fiber and other wired networks by animals.

Squirrels. These cute rodents are the number one culprit for animal damage to aerial fiber. To a lesser degree, fiber owners report similar damage by rats and mice. Squirrels mainly chew on cables as a way to sharpen their teeth. Squirrel teeth grow up to 8 inches per year, and if squirrels aren’t wearing their teeth down from their diet, they look for other things to chew. There has been speculation that squirrels prefer fiber to other cables due to some oil or compound used in the fiber manufacturing process that attracts them.

Level 3, before they were part of CenturyLink, reported that 17% of their aerial fiber outages were caused by squirrels. A Google search turns up numerous outages caused by squirrels.

Companies use a wide variety of techniques to try to protect against squirrel damage – but anybody who has ever put out a bird feeder knows how persistent squirrels can be. One deterrent is to use hardened cables that are a challenge for squirrels to chew through. However, there have been cases where squirrels partially chewed through such cables, which still lets in water and can cause future damage.

A more common solution is adding a barrier to keep squirrels away from the cable. There are barrier devices that can be mounted on the pole to block squirrels from moving higher. There are also barriers that are mounted where cables meet a pole to keep the squirrels away from the fiber. There are companies that have tried more exotic solutions, like deploying ultrasonic blasters to drive squirrels away from fiber. In other countries, the fiber providers sometimes deploy poisonous or obnoxious chemicals to keep squirrels away from the fiber. These techniques are frowned upon or illegal in the US.

Gophers. Buried fiber also has a gnawing pest in the pocket gopher. There are thirteen species of pocket gophers that range from 5 to 13 inches in length. The two parts of the country with the most pocket gophers are the Midwest plains and the Southwest. Gophers live on plants and either eat roots or pull plants down through the soil.

Pocket gophers can cause considerable damage to fiber. These rodents will chew almost anything, and there have been reported outages from gophers that chewed through buried gas, water, and electric lines. Gophers typically dig between 6 and 12 inches below the surface and are a particular threat to buried drops.

There are several ways to protect against gophers. The best protection is to bury fiber deep enough to be out of gopher range, but that can add a lot of cost to buried drops. I have a few clients that bore drops rather than trench or vibrate them for this reason. Another protection is to enclose the fiber in a sheath that is over 3 inches in diameter – a tube of that size is too big for a gopher to bite. Again, this is an expensive solution for buried drops. Another solution is to surround the buried fiber with 6 to 8 inches of gravel of at least 1-inch size – anything smaller gets pushed aside by gophers.

There are examples of even more exotic animal damage to fiber. Large birds of prey have sharp talons that can create small cuts in the sheath and let in water. Flocks of birds repeatedly sitting on a fiber can cause sag and stretching of the cable. I can remember living in Florida and seeing birds sitting end-to-end on the wires – that has to add a lot of weight to a 200-foot span of fiber between poles. Over the last year I’ve seen several reports of sharks chewing on undersea fibers.

Finally, although not directly animal related, a common cause of rural fiber cuts happens when farmers hit fiber with a backhoe when burying dead livestock. They typically bury wherever an animal died, including in the buried fiber right-of-way.

Interesting Science Summer 2023

I ran across two interesting new technologies.

Power Out of Humidity. Jun Yao, a professor of engineering at the University of Massachusetts at Amherst, published a paper in Advanced Materials showing that energy can be pulled from the moisture in the air using material harvested from bacteria. The study says that almost any material can be used for this purpose as long as it can be smashed into tiny particles and then reformed to include microscopic pores that are less than 100 nanometers in size. The tiny holes work by allowing water vapor to pass through the device in a way that creates an electric charge imbalance between the top and bottom of the device. Each pore is effectively a tiny battery that can continuously produce electricity.

The test devices created by the Jun Yao team have been labeled as Air-gen. One Air-gen unit is about the size of a fingernail and as thin as a hair. One unit can produce continuous energy that is strong enough to power a dot on a computer screen.

Jun Yao refers to the Air-gen as creating a tiny man-made cloud. The next step for the team is to determine the materials that produce the most electricity. There are also challenges to efficiently harvesting and storing the power from multiple Air-gen units. Making this into a useful technology will mean somehow stacking large numbers of these units together.

The potential for the technology is immense if it can ever be scaled. This would enable power to be generated locally in a way that produces no waste or byproducts. Since humidity drives the electric power generation, this would best be used in places with high humidity instead of in deserts. The ideal clean energy technology has always been described as one that can pull power out of nature – and this technology might become the ideal source of power if it can pull electricity out of the water vapor in the air.

The Anti-Laser. Physicists at the Hebrew University of Jerusalem and the Vienna University of Technology have developed what is being dubbed the anti-laser. This is a device that traps light until it is fully absorbed.

There are a lot of uses for technologies that can absorb light. Photovoltaic cells could be far more efficient if all incoming light could be absorbed and turned into electricity. Light sensors could be far more precise by eliminating stray light signals. The ability to capture faint images with a telescope could be enhanced by eliminating spurious light.

The technology takes advantage of the quantum properties of electromagnetic waves, where waveforms can undergo destructive interference if combined in an exact way. The scientists created a device that pulses light in a way that enhances the interference. This is paired with a set of mirrors and lenses that trap light inside a cavity and bounce it over and over until it is absorbed by light-absorbing materials.

Interestingly, this mimics a phenomenon in nature. When a flashlight is shined in the eyes of animals with night vision, like owls, the light appears to be reflected back. This is due to a reflective layer that sits behind the retina of such animals. Reflecting the light back out allows two chances for the retina to capture what is being seen in the near-dark.

When the researchers started this experiment, they found that light entering the trap from different angles was not fully absorbed, and some light escaped. They solved the problem by arranging the mirrors in a way to force all light into a circular path until it is fully absorbed.

The FCC’s 12 GHz Decision

One of the hardest things that the FCC does is to decide spectrum policy. The agency has full authority to determine the details of how we use each slice of available spectrum. Most importantly, the agency can determine who can use spectrum – and that’s why the task is challenging.

In the last decade, it’s hard to think of any spectrum deliberation and decision that didn’t have to weigh the interests of multiple spectrum users. There is almost always somebody using spectrum that must be considered. The FCC must decide if there is more national benefit in allowing others to use the spectrum, and in doing so, the FCC has to decide if the current users can somehow stay in place. If not, the FCC has to find existing users a new slice of spectrum and cover the cost of moving existing users to the new frequencies.

There are multiple users of spectrum that want more spectrum than they have today. Probably first on this list are the cellular carriers who say they need scads more spectrum to keep up with the demands of our connected world. Satellite carriers are now clamoring for spectrum as they continue to add more users onto satellite broadband – and as they contemplate launching IoT and cellular services. The U.S. government and the military insist on having bands of spectrum for a wide variety of uses. WISPs want more spectrum for rural broadband. The companies that make WiFi equipment want more free spectrum for public use. Then there are the important niche players like connected automobiles, GPS, weather satellites, etc.

Finally, as odd as it sounds, there are also investors who have purchased spectrum in the past and who lobby the FCC to increase the value of their ownership – only in America would this be one of the underlying reasons to deliberate on the use of spectrum.

The recent FCC decision on the use of the lower 12 GHz spectrum is a good example of the FCC deliberation process on spectrum. This spectrum sits in the middle of the range of spectrum that the FCC recently dubbed as 6G. This spectrum has great characteristics – it can carry a lot of data while still being transmitted for decent distances. In general, the higher the frequency, the shorter the effective distance of a broadcast transmission.
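The claim that higher frequencies carry for shorter distances can be illustrated with the standard free-space path loss formula. This is a generic textbook sketch, not anything from the FCC docket, and it ignores real-world factors like antenna gain and obstructions:

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB for a distance in km and a frequency in GHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

# At the same distance, higher frequencies lose more signal strength.
# Doubling the frequency adds about 6 dB of loss.
for f in (3.0, 6.0, 12.0):
    print(f"{f:4.1f} GHz over 1 km: {fspl_db(1.0, f):.1f} dB loss")
```

The extra loss at higher frequencies is why the 12 GHz band, which still reaches decent distances while carrying a lot of data, is so hotly contested.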

This spectrum has been used for satellite broadband connections. At the prompting of others in the industry, the FCC decided to investigate if there are other ways to use this spectrum to satisfy more national needs.

  • Dell owned a lot of the 12 GHz spectrum and was lobbying to expand the use of the spectrum to improve its value.
  • DISH was hoping to use the 12 GHz spectrum as part of its nationwide roll-out of a new cellular network.
  • The other big cell companies jumped in with the suggestion that the spectrum be sold at auction for FWA broadband.
  • WISPs jumped in and suggested they could coexist with the other users and use the spectrum for rural broadband.
  • The WiFi coalition asked that the spectrum be allowed for free indoor usage.

As is usual in FCC spectrum proceedings, the various parties all filed testimony from experts that demonstrated that their proposed use could work. In this case, many of the proposals tried to show that the FCC could order terrestrial use of the spectrum without interfering with satellite base stations. The experts on both sides of the argument said that the arguments on the other side were incorrect.

The spectrum engineers at the FCC are left to somehow glean the truth from the conflicting arguments. Meanwhile, the FCC commissioners have to wrangle with the policy and lobbying aspects of the issue since all of the players do their best to bring pressure to bear on such FCC decisions.

The FCC decision was that the lower 12 GHz spectrum should continue to be used for satellite backhaul. The big winner in the decision was Starlink, and the biggest loser was DISH.

But the FCC left the door open to other uses and will continue its investigation. The FCC is still interested in hearing more about the use for point-to-point and point-to-multipoint wireless connections. That would serve as backhaul between towers and could be used to connect FWA and WISP customers. The FCC is also willing to consider the free unlicensed use of the spectrum for indoor use. So, as is often the case, the debate continues.

Another Lumen Reinvention?

Lumen has been the hardest big telco to figure out. Verizon, AT&T, Frontier, Windstream, and others have clearly decided that building fiber is the future path to survival. Consequently, those telcos are far ahead of Lumen in terms of fiber passings. In a recent investor webcast, CEO Kate Johnson talked about Lumen’s upcoming fiber plans. In doing so, she mentioned that Lumen covers only 12% of its passings with fiber – far behind the other telcos.

CenturyLink was one of the first big telcos after Verizon to embrace fiber. In 2017, under CEO Glen Post, the company had plans to pass 900,000 homes and businesses with fiber, with similar plans in upcoming years. Post, with a long telco background, had a clear vision of CenturyLink becoming a fiber-based ISP, at least in the many large cities it served.

However, at the end of 2017, the company took a sharp turn when it acquired Level 3. To nobody’s surprise, Jeff Storey from Level 3 took over as CEO, and the company changed its focus from residential fiber expansion to a focus on large business customers and small cell sites – the bread and butter of Level 3. By 2019, new fiber construction had dropped to 300,000 passings, with many of those coming from connecting large buildings to the network.

In 2020, the company’s stated focus was on adding large buildings to the network, and the company added 18,000 buildings to its fiber network while adding only 400,000 total new fiber passings. That’s a low number of new passings for a company that had 4.5 million broadband customers that year. 2020 was also the year when the company rebranded as Lumen, a move to distance itself from identification as a copper telco.

Instead of expanding fiber, Lumen decided to ditch copper assets and announced the sale of its last-mile networks in twenty states to Apollo Global Management in 2021. This brought a cash infusion needed for expansion and got rid of deteriorating copper networks.

Last year the company announced that its major expansion thrust was to beef up its large intercity fiber network across the country, with the goal of adding over 6 million miles of fiber strand by 2026. The original CenturyLink fiber network was starting to show some age, with many routes built forty years earlier. The company planned to upgrade to the newest fiber from Corning that can support 400-gigabit electronics. The new long-haul networks have fiber bundles between 432 and 864 fiber strands – much larger than the historical networks that had 96 to 144 fibers.

Lumen has been penalized for the many changes in its direction by seeing its stock price go into the tank. CenturyLink stock peaked at almost $49 in 2007. By 2017, the stock had slipped to the mid-$20 range. Since then, the stock has dropped steadily and recently hit $1.80 per share after sitting at $10 per share a year earlier.

CEO Kate Johnson admitted that the company needs to do something different. The company eliminated its dividends to shareholders in the fourth quarter of 2022 and is instead going to reinvest that money into building new fiber passings, with plans to connect 500,000 homes and businesses in 2023. It plans to build deeply into six major metro areas this year.

The company needs to reinvent itself. Lumen lost 253,000 broadband customers in 2022 – 7.7% of its broadband base. The company lost another 56,000 broadband customers in the first quarter of this year, dropping below 3 million broadband customers and falling to eighth place among U.S. ISPs after being surpassed by T-Mobile.

It’s not hard to understand, in retrospect, why the company has lost value. The company has seemingly reinvented itself every year since 2017 by changing its primary focus each year. Some of the changes, like adding more business buildings and beefing up the long-haul fiber network, will likely generate a lot of cash and value in the long run. But Wall Street has clearly told the company to pick a future and stick to it.

Urban and Rural Speed Parity

I wrote a blog recently about a trend where over 81% of U.S. households are now subscribed to a broadband speed of at least 200 Mbps. I got a lot of comments about that post, mostly from ISPs who think that we are too fixated on speed and that consumers don’t need faster speeds – they think that the marketing departments of the big ISPs have simply convinced folks that faster speeds are important.

But when talking about rural versus urban broadband speeds, the discussion can’t only be about what people need or don’t need. There was an edict from Congress in the Telecommunications Act of 1996 that directed the FCC to have parity between urban and rural broadband. There has been no change of law that has softened this mandate, so it’s still something that the FCC should be following:

ACCESS IN RURAL AND HIGH COST AREAS.—Consumers in all regions of the Nation, including low-income consumers and those in rural, insular, and high cost areas, should have access to telecommunications and information services, including interexchange services and advanced telecommunications and information services, that are reasonably comparable to those services provided in urban areas and that are available at rates that are reasonably comparable to rates charged for similar services in urban areas.

The FCC has repeatedly ignored this mandate. Probably the most extreme example is when they gave over $11 billion to the biggest telephone companies with CAF II to supposedly upgrade DSL to 10/1 Mbps. This was done at a time when cable companies had mostly upgraded to DOCSIS 3.0, and most urban areas had access to speeds between 100 Mbps and 250 Mbps. By the time the CAF II subsidy ended, cable companies had mostly upgraded to DOCSIS 3.1, and urban speed capabilities in most places had reached 1 gigabit.

If anything surpasses the absurdity of CAF II, it’s the national definition of broadband that still sits at a ridiculous 25/3 Mbps. According to OpenVault’s latest statistics, only 4.7% of households with broadband are subscribed to speeds under 50 Mbps. That number doesn’t include rural households who can’t buy broadband because there is no reasonable option where they live – but still, the number of households using slow speeds has become a small fraction of broadband users.

Over the last two years, FCC Chairperson Jessica Rosenworcel has suggested that the definition of broadband should be updated to 100 Mbps download. The OpenVault statistics now put that speed in the rearview mirror. Any federal definition of broadband has to be at least 200 Mbps. I don’t need to put forth any elegant argument why this is so – the statistics make the point for me. Over 80% of U.S. households are now subscribed to speeds of at least 200 Mbps download. The language in the 1996 Act makes it clear that rural residents ought to have access to broadband that is reasonably comparable to the speeds offered in urban areas. Any interpretation of the phrase “reasonably comparable” would conclude that rural speeds ought to at least be at the low end of subscribed urban broadband speeds – and 200 Mbps is the minimum speed for over 80% of households.

The fact that over 80% of households are already subscribed to speeds of 200 Mbps or faster (40% of households are subscribed to speeds of 500 Mbps or faster) means that all of our hand-wringing over counting homes with speeds of at least 100/20 Mbps is largely a joke. Under any reasonable current definition of broadband, those homes shouldn’t be counted as having broadband if the FCC were doing its job.

I really hate the numbers game with broadband, and no matter how we define broadband or set a cutoff for grant eligibility, there will be ISPs that will exaggerate the speeds of their current or planned technology to try to game the system. ISPs naturally work to try to protect their service areas from grant funding competition. Other ISPs want to be given grants for technologies that don’t reliably deliver broadband.

But the one thing we should stop doing is measuring broadband by standards that the real world has already left behind. All of the angst, arguments, and fighting about whether areas are underserved with 100/20 Mbps broadband or slower ought to be scrapped – but unfortunately, the momentum of existing grant rules will keep us squabbling about the wrong things for years to come.
Another Twist in The BEAD Grant Process?

Word has been circulating that the NTIA recently informed State Broadband Offices that they must submit a final BEAD plan to the NTIA one year after receiving approval of the Initial Proposal of grant rules. That’s not a surprise since this language is straight out of the legislation and the NOFO for BEAD: “An Eligible Entity may initiate its competitive subgrantee selection process upon approval of its Initial Proposal and will have up to one year to conduct additional local coordination, complete the selection process, and submit a Final Proposal to NTIA.”

The ugly twist is that the NTIA is expecting the Final Proposal to include a final list of all BEAD grant winners. Everybody has always assumed that the Final Proposal would be just that – a proposal that describes and fine-tunes the rules being used to award grants. Most State Grant Offices have assumed that they would have multiple years to pick BEAD grant winners.

Consider what has to happen once a state gets approval of its Initial Proposal:

  • A State Broadband Office must finalize the rules for awarding grants through attorneys and state leadership. Some states are going to be required to get the Legislature involved to approve grant rules. This will likely take 3-4 months for most states, but a few will take much longer.
  • The Grant Office would then be ready to announce the date for the first round of grant applications. They would typically give applicants 60-90 days to submit grant applications.
  • A Grant Office will need at least 30 days for the initial review of applications and to provide time to ask for clarifications from applicants.
  • Next, the detailed grant scoring must be done. The BEAD grants are complex, and it’s hard to see a state scoring and ranking grant applications in less than 60 days. There is a lot of complicated due diligence needed by grant offices that are often manned by first-time grant reviewers.
  • The State is then going to have to allow 15-30 days to post the grant applications and allow for protests and challenges. There would be another 30-60 days to resolve protests.
  • Finally, grant awards are announced, and it can easily take three months to negotiate contracts with grant winners. Inevitably, some winners will back out during this process.
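The steps above can be tallied to see where the timeline lands. The best- and worst-case month figures below are my reading of the ranges in the list (converting 60-90 days to 2-3 months, and so on):

```python
# (best case, worst case) duration in months for each step in the list above
steps = {
    "finalize grant rules":       (3, 4),
    "grant application window":   (2, 3),    # 60-90 days
    "initial review":             (1, 1),    # ~30 days
    "detailed scoring":           (2, 2),    # ~60 days
    "posting and protest window": (0.5, 1),  # 15-30 days
    "resolving protests":         (1, 2),    # 30-60 days
    "negotiating contracts":      (3, 3),
}

best = sum(lo for lo, hi in steps.values())
worst = sum(hi for lo, hi in steps.values())
print(f"Best case:  {best} months")   # best == 12.5
print(f"Worst case: {worst} months")  # worst == 16
```

Even the best case lands over a year out, and that is before accounting for overlapping applications, protests that drag on, or the BEAD-specific rules discussed below.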

The timeline above totals 16 months – and that’s if everything goes smoothly. The BEAD grants are complex, and reviewing and resolving grants that ask to serve overlapping areas is going to add a lot of complication to the process. To put this timeline into perspective, my state of North Carolina is 18 months into the $350 million ARPA grant process and still has not finished identifying all of the grant winners. And that’s with a capable and experienced Grant Office – some states are new to the grant process. The BEAD grants are for more dollars, are more complicated, and will take more time to review than ARPA grants.

The above timeline doesn’t reflect the added rules that are specific to BEAD. State Broadband offices have a mandate to bring broadband to every unserved location. They also must contend with special handling of high-cost areas. Both of these processes will require a lot more time than listed above for Broadband Offices to reach out to and negotiate with ISPs. States that are lucky enough to fund all unserved and underserved areas will need more time to figure out what comes next.

I’m fairly certain that any pressure to speed up the grant time frame comes from the recent White House emphasis on getting infrastructure money out the door quickly. Everybody in the industry agrees that the BEAD grant process should have gone faster. The process has been glacially slow – it’s been 19 months since the IIJA legislation was signed, and it’s absurd that we are just now announcing the amount of money that states will get.

But we can’t make up for the glacial process of launching the BEAD grants by rushing at the end so that the money is shoved out the door without taking time to make sure that each State is getting the best long-term solution. States have been having a lot of internal debates about the technologies and types of ISPs they hope will win funding – any deliberation and chance of directing the funds responsibly will be cut short if the process is hurried. One of the most important parts of any grant process is to give worthy applicants a chance to refine and amend a grant request in a subsequent round. The BEAD grants are the first grants in my memory where the States had to reach out to stakeholders to get public feedback. If we rush, all that was learned in that process will be tossed aside.

If the NTIA really insists on a speedy timeline, it will be creating an RDOF-type disaster. The only way to get this process done in a year (or even 18 months) would be through a single round of grants – done hastily. With a tight time frame, the grants won’t be reviewed closely and grants that include errors will be pushed through. ISPs that aren’t really qualified will sneak through.

Having only one round of grants will feel a lot like the RDOF reverse auction. A giant pile of grants will be shoved into the funnel, and it’s likely that the grants will go to ISPs that ask for the lowest percentage of grant funding. A friend of mine has jokingly been saying that 95% of BEAD money will go to the large incumbent providers, and if there is a single-round grant process, he might not be far from the truth.

I’m hoping that this is just a trial balloon being circulated by the NTIA to get feedback, and if so, every State Broadband Office needs to push back hard. If the grants are going to be hurried, we’re going to end up with yet another disastrous federal grant program. I was hopeful that BEAD would avoid the mistakes of the past since the money was given to the States. But if the NTIA forces State Broadband Offices to rush the grant process, we’ll be watching a slow-motion train wreck over the next year.

Should Grant-funded Networks be Open-Access?

There was an interesting political effort in the Washington legislature recently to expand the use of open-access networks. Language included in Substitute House Bill 1147 would have required that any network funded by BEAD grants become open-access and available to other ISPs.

Open-access has been a topic in Washington for many years. For a long time, Public Utility Districts (PUDs) were prohibited from offering retail broadband. These county-wide government-owned utilities wanted to bring better broadband and settled for building open-access networks instead. Over the last few decades, a number of PUDs have launched open-access fiber networks, some of them with tens of thousands of fiber customers.

For those not familiar with open-access, it is a network where multiple ISPs can buy access to reach customers. This gives customers the choice between multiple ISPs on the same network. All reports are that customers like the extra choices they get. Every broadband survey my firm has ever conducted has shown a huge public preference for choice.

The legislature finally relaxed the prohibition for PUDs last year, but in a bizarre fashion. The legislature passed two conflicting bills that allow PUDs to provide retail broadband services. Rather than choose between the two bills, the Governor signed both simultaneously (a pen in each hand) so that both bills went into effect. As might be imagined, this created as much confusion as clarity over the issue.

I doubt that anybody will be surprised that the biggest ISPs in the state vehemently opposed this legislation. The big cable companies have always immediately fought any suggestion that they allow other ISPs to use their networks. The big telcos were forced to sell unbundled copper loops starting with the Telecommunications Act of 1996, but that requirement continues to wane as the amount of copper keeps shrinking. The telcos started fighting the unbundling rules as soon as they were enacted, and over the years they succeeded in greatly weakening the ability of outsiders to use their copper.

I don’t think anybody will be surprised to find out that the big ISPs in Washington succeeded in killing this idea. The big ISPs threatened not to pursue any grant funding if the proposal became law. Some even made veiled threats to stop investing in the state altogether.

But it’s an interesting concept. The BEAD grant rules have a clear preference for open-access networks, and any carrier promising an open network will get extra points on a grant application. But the open-access preference is only a suggestion and not a requirement – something the big ISPs in Washington all pointed out.

Requiring open-access is not a far-fetched idea because open-access is required on all of the middle-mile networks that were announced this week as recipients of NTIA grants. But the whole point of the NTIA middle-mile networks is to build networks to places where backbone connections are unavailable or unaffordable. Requiring the grant recipient to sell affordable connections to everybody is a good use of federal grant dollars.

But this raises a much larger question. I know there are a lot of open-access proponents in the country who think that any network funded with government dollars ought to be made open-access to provide the most value to the taxpayers who are funding it. That is exactly what was suggested in Washington, but it didn’t take very long for the big ISPs to kill the idea.

Many industry folks want to take this idea even further. I don’t think I’ve seen a thread on this topic that doesn’t include somebody who thinks the government should own all grant-funded fiber infrastructure, which should then be made available to all ISPs that want to use it. Obviously, the BEAD grant rules weren’t written that way, and with the sway that big ISPs hold in D.C., they probably never will be. But it is something that Congress could do if it ever has the will to enact it. We’re starting to see cities that are adopting this idea, so we’re going to keep seeing new open-access networks coming to life. I have to think that the citizens in every city close to an open-access network are going to be asking why they can’t have the same thing.

The FCC to Look at Data Caps

FCC Chairwoman Jessica Rosenworcel asked the other Commissioners to join her in opening an investigation into broadband data caps. According to FCC rules, a majority of Commissioners must agree to open any official proceeding. For those not familiar with the data cap concept, it’s where an ISP bills extra for using more than a defined amount of broadband in a month.

Not all ISPs use data caps. The ISP that gets the worst press about data caps is Comcast, but it doesn’t bill data caps in all markets – seemingly only where it doesn’t have a lot of competition. Charter would love to bill data caps, but it has been prohibited from doing so because of an arrangement reached with the FCC when it got approval to buy Time Warner. That agreement just lapsed on May 18 of this year. We’ll have to wait to see if Charter will impose data caps – but it seems likely it will do so since the company asked permission from the FCC to impose data caps in 2021.

AT&T imposes data caps on DSL and on some fiber connections. Astound Broadband charges data caps in Washington, Oregon, and California. Cox has data caps that kick in after a user exceeds 1.25 terabytes per month. Mediacom imposes data caps on many of its plans. All of the products of the high-orbit satellite companies, HughesNet and Viasat, have severe data caps. So do cellular hot spot data plans.
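To make the overage billing concrete, here is a minimal sketch of how data-cap charges typically accrue. The default numbers are modeled on terms Comcast has published (a 1.2 terabyte cap, $10 per extra 50 GB block, and a $100 monthly maximum) – exact terms vary by ISP and market, so treat them as illustrative:

```python
import math

def overage_charge(usage_gb, cap_gb=1200, block_gb=50,
                   block_fee=10.0, max_fee=100.0):
    """Illustrative data-cap overage math. Defaults mirror terms
    Comcast has published (1.2 TB cap, $10 per 50 GB block, $100
    monthly maximum); real terms vary by ISP and market."""
    if usage_gb <= cap_gb:
        return 0.0
    # ISPs typically bill in whole blocks, rounded up
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)
    return min(blocks * block_fee, max_fee)

print(overage_charge(1100))  # under the cap -> 0.0
print(overage_charge(1400))  # 200 GB over = 4 blocks -> 40.0
print(overage_charge(3000))  # hits the monthly maximum -> 100.0
```

Note how quickly a heavy-streaming household blows past the cap: just 200 GB of overage adds $40 to the bill.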

It’s an interesting request by the Chairwoman. Under current FCC rules, the FCC has no authority to do anything about data caps. This authority went away when the previous FCC under Chairman Ajit Pai eliminated the regulation of broadband by killing Title II authority. Chairman Pai went even further and pushed remaining vestiges of any broadband regulation to the Federal Trade Commission.

This makes me wonder why Chairwoman Rosenworcel would try to open this docket. I can see several possibilities. First, this could just be done to show that the FCC cares about an unpopular ISP practice. It’s clear that the public hates data caps, and the press that covers the FCC flooded the news with stories as soon as this was announced. But I would hope the Chairwoman is not so cynical as to investigate something the FCC is powerless to change.

That leads to the second possibility that Chairwoman Rosenworcel believes that adding a fifth Commissioner will provide the votes needed to reinstate Title II authority or some updated version of it. Starting the investigation into data caps now might sync up well with renewed FCC regulatory authority and let the FCC make a popular change in the future to ban or modify data caps.

I’ve written several blogs over the years that make the argument that data caps are nothing more than a way for ISPs to extract extra payments from customers. There is zero justification from a cost perspective – residential customers that use more data than average don’t cause any significant incremental cost for an ISP. ISPs buy wholesale broadband based on the busiest usage period in a month. Within the capacity purchased to meet that peak need, it doesn’t matter how much broadband customers use as long as they don’t push up the busy hour for the month.
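That peak-based cost structure is easy to illustrate. Wholesale transit is commonly billed on peak utilization (often a 95th-percentile measurement); the sketch below uses a simple peak and made-up traffic and price numbers, but the point holds: two traffic patterns with the same busy-hour peak cost the ISP the same to carry, no matter how many total bytes they move:

```python
def transit_cost(hourly_mbps, price_per_mbps=0.50):
    # Wholesale transit is typically billed on peak utilization
    # (often 95th percentile); simple peak used here for clarity.
    # The $0.50/Mbps price is a made-up illustrative number.
    return max(hourly_mbps) * price_per_mbps

light_total = [200, 300, 900, 400]   # low total usage, 900 Mbps peak
heavy_total = [800, 850, 900, 880]   # far more total bytes, same peak

print(transit_cost(light_total))  # 450.0
print(transit_cost(heavy_total))  # 450.0 -- same peak, same cost
```

The heavy pattern moves more than double the total traffic, yet the ISP’s wholesale bill is identical – which is why total monthly bytes are a poor proxy for an ISP’s costs.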

Additionally, the big ISPs that use data caps also engage in peering arrangements where they directly hand off broadband traffic to the largest web services like Google, Netflix, Facebook, and Microsoft. While there is a cost to create the peering points, once established, the amount of data sent through peering arrangements saves a huge amount of money compared to routing the same traffic through paid Internet transit.

It’s harder each year for affected homes to avoid data caps. Data caps accumulate both download and upload usage, and homes are increasingly using upload bandwidth in ways that most folks don’t realize. According to OpenVault, the average home in the U.S. now uses over 560 gigabytes of data per month, an amount that keeps climbing. The household average as recently as 2017 was only 273 gigabytes per month.
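Using the two OpenVault averages above, the implied annual growth rate works out to roughly 15% per year. The calculation below assumes the 560 figure is about five years after the 2017 figure – an assumption for illustration, since OpenVault reports quarterly:

```python
# Implied compound annual growth in average household data usage,
# from the OpenVault figures cited above. The 5-year span is an
# assumption for illustration.
start_usage, end_usage, years = 273, 560, 5
cagr = (end_usage / start_usage) ** (1 / years) - 1
print(f"~{cagr:.1%} per year")  # prints: ~15.5% per year
```

At that pace, average usage doubles about every five years – which means a fixed cap snares a larger share of households every year without the ISP changing anything.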

This will be an interesting process to watch. Chairwoman Rosenworcel has created a form where folks can describe their data cap stories. I’m sure that everybody who does so will be hoping that the FCC can help them – but that remains to be seen. Help would require a fifth Commissioner who is willing to reintroduce broadband regulation – something that is going to face a lot of opposition.