Categories
Regulation - What is it Good For? Technology

A 10-Gigabit Tier for Grants

One of the biggest flaws in the recent RDOF reverse auction was allowing fixed wireless technology to claim the same gigabit tier as fiber. The FCC should never have allowed this to happen. While there is a wireless technology that can deliver up to a gigabit of speed to a few customers under specific circumstances, fiber can deliver gigabit speeds to every customer in a network. This is particularly true in a rural setting, where the short reach of gigabit wireless, at perhaps a quarter mile, is a huge limiting factor for using the technology.

But rather than continue to fight this issue for grant programs, there is a much easier solution. It’s now easy to buy residential fiber technology that can deliver 10 gigabits of speed. There have been active Ethernet lasers capable of 10-gigabit speeds for many years. In the last year, XGS-PON has finally come into a price range that makes it a good choice for a new passive fiber network – and the technology can deliver 10-gigabit download speeds.

The FCC can eliminate the question of technology equivalency by putting fiber overbuilders into a new 10-gigabit tier. This would give fiber funding priority over all other technologies. Fixed wireless will likely never be capable of 10-gigabit speeds. Even if that ever becomes possible decades from now, by then fiber will have moved on to the next faster generation. Manufacturers are already looking at 40-gigabit speeds for the next generation of PON technology.

Cable company hybrid-fiber coaxial networks are not capable today of 10-gigabit speeds. These networks could possibly deliver speeds of around 6 or 7 gigabits, but only by removing all of the television signals and delivering only broadband.

I don’t know why it was so hard for the FCC to say no to gigabit fixed wireless technology. When the industry lobbied to allow fixed wireless into the gigabit tier, all the FCC had to do was ask to see a working demo of gigabit wireless speeds in a rural farm environment where farms are far apart. The FCC should have insisted that the wireless industry demonstrate how every rural household in a typical RDOF area can receive gigabit speeds, and show how the technology overcomes distance and line-of-sight issues. There is no such demo because the wireless technology can’t do this – at least not without building fiber and establishing a base transmitter at each farm. The FCC really got suckered by slick PowerPoints and whitepapers when it should have instead asked to see a working demo.

Don’t get me wrong – I don’t hate the new wireless technologies. There are small towns and neighborhoods in rural county seats that could really benefit from the technology. The new meshed networks, if fed by fiber, can deliver superfast bandwidth to small pockets of households and businesses. This can be a really attractive and competitive technology.

But this is not fiber. Every rural community in America knows it wants fiber. They understand that once the wires are in place, fiber is going to be providing solutions for many decades into the future. I think that if fiber is built right, it’s a hundred-year investment. Nobody believes this to be true of fixed wireless. The radios are all going to be replaced many times over the next hundred years, and communities worry about whether an ISP will make that continual reinvestment.

But since there is such an easy way to fix this going forward, these arguments about gigabit wireless can be largely moot. If the FCC creates a 10-gigabit tier for grants, then only fiber will qualify. The fixed wireless folks can occupy the gigabit tier and leave most other technologies like low-orbit satellite to some even lower tier. The FCC made a mistake with RDOF that they can’t repeat going forward – the agency declared that other technologies are functionally equivalent to fiber – and it’s just not true.

Categories
Regulation - What is it Good For? The Industry

Reporting the Broadband Floor

I want to start by giving a big thanks to Deb Socia for today’s blog. I wrote a recent blog about the upcoming public reporting process for the FCC maps. In that blog, I noted that ISPs are going to be able to continue to report marketing speeds in the new FCC mapping. An ISP that may be delivering only 3 Mbps download will be able to continue to report broadband speeds of 25/3 Mbps as long as that speed is marketed to the public. This practice of allowing marketing speeds that are far faster than actual speeds has resulted in a massive overstatement of broadband availability. This is the number one reason why the FCC badly undercounts the number of homes that can’t get broadband. The FCC literally encourages ISPs to overstate the broadband product being delivered.

In my Twitter feed for this blog, Deb posted a brilliant suggestion, “ISPs need to identify the floor instead of the potential ceiling. Instead of ‘up to’ speeds, how about we say ‘at least’”.

This simple change would force some honesty into FCC reporting. This idea makes sense for many reasons. We have to stop pretending that every home receives the same broadband speed. The speed delivered to customers by many broadband technologies varies by distance. Telco DSL speeds get noticeably slower the further they are transmitted. The fixed wireless broadband delivered by WISPs loses speed with distance from the transmitting tower. The fixed cellular broadband that the big cellular companies are now pushing has the same characteristic – speeds drop quickly with the distance from the cellular tower.

It’s a real challenge for an ISP using any of these technologies to pick a representative speed to advertise to customers – but customers want to know a speed number. DSL may be able to deliver 25/3 Mbps for a home that’s within a quarter-mile of a rural DSLAM. But a customer eight miles away might be lucky to see 1 Mbps. A WISP might be able to deliver 100 Mbps download speeds within the first mile from a tower, but the WISP might be willing to sell to a home that’s 10 miles away and deliver 3 Mbps for the same price. The same is true for the fixed cellular data plans recently being pushed by AT&T, Verizon, and T-Mobile. Customers who live close to a cell tower might see 50 Mbps broadband, but customers further away are going to see a tiny fraction of that number.
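The distance falloff described above can be sketched with a toy model. The starting speed and the halving rate here are invented for illustration only – real behavior depends on spectrum, terrain, and line of sight:

```python
# Hypothetical illustration: delivered speed falls off with distance from
# the transmitter. The coefficients are invented, not measured data.

def wisp_speed_mbps(miles_from_tower: float) -> float:
    """Toy model: 100 Mbps near the tower, roughly halving every 2 miles."""
    return 100 * 0.5 ** (miles_from_tower / 2)

for miles in (1, 5, 10):
    print(f"{miles:2d} miles: ~{wisp_speed_mbps(miles):.0f} Mbps")
```

Even this crude sketch reproduces the blog’s point: a customer near the tower sees a respectable speed, while the customer ten miles out sees only a few Mbps for the same advertised product.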

The ISPs all know the limitations of their technology, but the FCC has never tried to acknowledge how technologies behave in real markets. The FCC mapping rules treat each of these technologies as if the speed is the same for every customer. Any mapping system that doesn’t recognize the distance issue is going to mostly be a huge fiction.

Deb suggests that ISPs must report the slowest speed they are likely to deliver. I want to be fair to ISPs and I suggest they report both the minimum “at least” speed and the maximum “up to” speed. Those two numbers will tell the right story to the public because together they provide the range of speeds being delivered in a given Census block. With the FCC’s new portal for customer input, the public could weigh in on the “at least” speeds. If a customer is receiving speeds slower than the “at least” speeds, then, after investigation, the ISP would be required to lower that number in its reporting.
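The dual-reporting idea can be sketched as a simple record. The field names and adjustment rule here are my own invention, not an FCC schema – the point is that a verified customer report below the floor forces the floor down:

```python
# Sketch of the proposed dual-speed report (hypothetical field names).
from dataclasses import dataclass

@dataclass
class SpeedReport:
    at_least_mbps: float   # floor: slowest speed the ISP delivers
    up_to_mbps: float      # ceiling: advertised maximum

    def adjust_for_verified_reports(self, verified_speeds):
        """Lower the floor to the slowest verified customer speed."""
        slowest = min(verified_speeds, default=self.at_least_mbps)
        self.at_least_mbps = min(self.at_least_mbps, slowest)

r = SpeedReport(at_least_mbps=25, up_to_mbps=100)
r.adjust_for_verified_reports([40, 12, 30])  # one customer sees only 12 Mbps
print(r.at_least_mbps)  # 12
```

A wide gap between the two numbers would then be visible to everyone, which is exactly what distinguishes a conservative ISP from one selling far past its technology’s reach.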

This dual reporting will also allow quality ISPs to distinguish themselves from ISPs that cut corners. If a WISP only sells service to customers within 5 or 6 miles of a transmitter, then the difference between its “at least” speeds and its “up to” speeds would be small. But if another WISP is willing to sell a crappy broadband product a dozen miles from the transmitter, there would be a big difference between its two numbers. If this is reported honestly, the public will be able to distinguish between these two WISPs.

This dual reporting of speeds would also highlight the great technologies – a fiber network is going to have a gigabit “at least” and “up to” speed. This dual reporting will end the argument that fixed wireless is a pure substitute for fiber – which it clearly is not. Let the two speeds tell the real story for every ISP in the place of marketing hype.

I’ve been trying for years to find a way to make the FCC broadband maps meaningful. I think this is it. I’ve never asked this before, but everybody should forward this blog to the FCC Commissioners and politicians. This is an idea that can bring some meaningful honesty into the FCC broadband maps.

Categories
Regulation - What is it Good For?

Public Reporting of Broadband Speeds

The FCC’s Acting Chairman Jessica Rosenworcel wrote a recent blog that talks about the progress the FCC is making towards revising the FCC mapping system. The blog concentrates on the upcoming consumer portal to provide input into the FCC maps.

It’s good to see progress finally being made on the maps – this has been discussed but not implemented for over two years. And it’s good that the public will have a way to provide input to the FCC database. Hopefully, the FCC will change the rules before the new mapping tools are implemented because the current rules don’t let the public provide any worthwhile input to the mapping data.

The current mapping rules were implemented in Docket FCC 21-20 on January 13 of this year – one of the last acts of outgoing Chairman Ajit Pai. Those rules outline a consumer input process to the mapping that is going to be a lot less impactful than what the public is hoping for.

The new FCC maps will require that ISPs draw ‘polygons’ around the areas where there is existing broadband coverage, or where the ISP can install broadband within 10 days of a consumer request. A consumer can challenge the availability of broadband at their home. If a consumer claims that broadband is not available at an address, the ISP is required to respond. If there is no broadband available at the address, the likely response of the ISP will be to amend the polygon to exclude the challenged address. I guess that consumers who can’t buy broadband from a given ISP can gain some satisfaction from having that ISP fix the maps to set the record straight. But the complaint is unlikely to get broadband to the home where broadband is not available.

Unfortunately, the challenge process is not going to help in the much more common situation where a household has dreadfully slow broadband. The ISP might be advertising speeds of ‘up to 25/3 Mbps’ but delivering only a tiny fraction of that speed. This is the normal situation for rural DSL and many fixed wireless connections – speeds customers see are much slower than what ISPs claim on the FCC maps.

Unless the FCC changes the rules established in this Docket, a consumer claiming slow broadband will see no change to the FCC map. The January rules allow ISPs to continue to claim marketing speeds in the new FCC mapping system. A rural ISP can continue to claim ‘up to 25/3 Mbps’ for an area with barely functioning broadband as long as the ISP advertises the faster up-to speed.

The FCC needs to change the rules established in the January Docket or they are going to witness a rural revolt. Consumers that are seeing broadband speeds that are barely faster than dial-up are going to flock to the new FCC reporting portal hoping for some change. Under the current rules, the FCC is going to side with the ISP that advertises speeds faster than it delivers.

The FCC has a real dilemma on how to change the public reporting process. The FCC can’t automatically side with each consumer. Any given consumer that reports slow speeds might be seeing the impact of an old and outdated WiFi router, or have some other issue inside the home that is killing the speed delivered by the ISP. But when multiple homes in a neighborhood report slow speeds, then the ISP is almost certainly delivering slow speeds.

Unfortunately, there is no way to report ‘actual’ speeds on an FCC map. If you’ve ever run speed tests multiple times during the day and night, you know that the broadband speed at your home likely varies significantly during a day. What’s the ‘actual’ broadband speed for a home that sees download speeds vary from 5 Mbps to 15 Mbps at different times of the day?
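A quick calculation shows why no single number captures a day of speed tests. The readings below are invented for illustration; the minimum, median, and maximum together tell a story that any one of them alone would hide:

```python
# Hypothetical speed-test readings taken over one day at a single home.
from statistics import median

readings_mbps = [5, 7, 12, 15, 9, 6, 14, 11]

print(f"min {min(readings_mbps)} Mbps, "
      f"median {median(readings_mbps)} Mbps, "
      f"max {max(readings_mbps)} Mbps")
```

Reporting only the maximum makes this home look like a 15 Mbps connection; reporting only the minimum makes it look like 5 Mbps. Both are ‘actual’ measurements, which is the dilemma the FCC faces.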

The consumer challenge of FCC data was dreamed up to allow the public to tell a broadband story different than what the ISPs have been reporting to the FCC. Unfortunately, it’s not going to work to anybody’s satisfaction. The real culprit in this story is the idea that we can define broadband somehow by speed – that there is a functional difference between a broadband connection that delivers 5 Mbps or 15 Mbps. The fact is that both connections are dreadfully slow and should not be considered as broadband. But as long as we have grant programs that fund areas that have speeds under 10/1 Mbps or 25/3 Mbps, we’ll keep having these dumb processes that pretend that we know the actual speed on even a single rural broadband connection. The fact is – we don’t and we can’t.

Categories
Regulation - What is it Good For? The Industry

AT&T Says No to Symmetrical Broadband

Since it seems obvious that the new FCC will take a hard look at the definition of broadband, we can expect big ISPs to start the lobbying effort to persuade the FCC to make any increase in the definition as painless as possible. The large ISPs seem to have abandoned any support for the existing definition of 25/3 Mbps because they know sticking with it gets them laughed out of the room. But many ISPs are worried that a fast definition of broadband will bypass their technologies – any technology that can’t meet a revised definition of broadband will not be eligible for future federal grants, and even more importantly can be overbuilt by federal grant recipients.

AT&T recently took the first shot I’ve seen in the speed definition battle. Joan Marsh, the Executive VP of Federal Regulatory Relations, wrote a recent blog that argues against using symmetrical speeds in the definition of broadband. AT&T is an interesting ISP because the company operates three different technologies. In urban and suburban areas AT&T has built fiber to pass over 14 million homes and businesses and says they are going to pass up to 3 million more over the next year or two. The fiber technology offers at least a symmetrical gigabit product. AT&T is also still a huge provider of DSL, but the company stopped installing DSL customers in October of last year. AT&T’s rural DSL has speeds far south of the FCC’s 25/3 definition of broadband, although U-verse DSL in larger towns has download speeds as fast as 50 Mbps.

The broadband product that prompted the blog is AT&T’s rural cellular product. This is the company’s replacement for DSL, and AT&T doesn’t want the FCC to declare the product as something less than broadband. AT&T rightfully needs to worry about this product not meeting the FCC definition of broadband – because in a lot of places it is slower than 25/3 Mbps.

Reviews.org looks at over one million cellular data connections per year and calculates the average data speeds for the 3 big cellular carriers. The report for early 2021 shows the following nationwide average speeds for cellular data. These speeds just barely qualify as broadband with the current 25/3 definition.

AT&T – 29.9 Mbps download, 9.4 Mbps upload

T-Mobile – 32.7 Mbps download, 12.9 Mbps upload

Verizon – 32.2 Mbps download, 10.0 Mbps upload

PC Magazine tests cellular speeds in 26 major cities each summer. In the summer of 2020, they showed the following speeds:

AT&T – 103.1 Mbps download, 19.3 Mbps upload

T-Mobile – 74.0 Mbps download, 25.8 Mbps upload

Verizon – 105.1 Mbps download, 21.6 Mbps upload

Cellular data speeds are faster in cities for several reasons. First, there are more cell sites in cities. The data speed a customer receives on cellular is largely a function of how far the customer is from a cell site, and in cities, most customers are within a mile of the closest cell site. The cellular carriers have also introduced additional bands of spectrum in urban areas that are not being used outside cities. The biggest boost to the AT&T and Verizon urban speeds comes from the deployment of millimeter-wave cellular hotspots in small areas of the downtowns in big cities – a product that doesn’t use traditional cell sites, but which helps to increase the average speeds.

Comparing the urban speeds to the average speeds tells us that rural speeds are even slower than the averages. In rural areas, cellular customers are generally a lot more than one mile from a cell tower, which really reduces speeds. My firm does speed tests, and I’ve never seen a rural fixed cellular broadband product with a download speed greater than 20 Mbps, and many are a lot slower.

The AT&T blog never makes a specific recommendation of what the speeds ought to be. But Marsh hints at a new definition at 50/10 or 100/20. My firm has also done a lot of surveys during the pandemic and we routinely see about one-third of households or more that are unhappy with the upload speeds on urban cable company networks – which have typical upload speeds between 15 Mbps and 20 Mbps. AT&T is hoping that the FCC defines broadband with an upload speed of 10-20 Mbps – a speed that many homes already find inadequate today. That’s the only way that rural fixed cellular can qualify as broadband.

Categories
Regulation - What is it Good For?

Is it Time to Kill Retransmission Rules?

Rep. Anna Eshoo (D-Calif.) and Rep. Steve Scalise (R-La.) recently introduced a bill in Congress, the Modern Television Act of 2021, that would largely do away with the retransmission consent rules for cable companies. They’ve introduced similar bills in recent years.

Retransmission rules require that cable operators must carry local TV stations that are within over-the-air transmission range of a given area. That rule sounds benign enough but has been used by local stations to extract huge fees from cable companies for carrying local content. The fees paid to local stations are one of the primary reasons that cable TV rates have escalated so quickly over the last decade.

Fifteen years ago, it was rare for local stations to charge anything for carrying their signal. They were happy to be able to claim cable viewers of their content when calculating advertising rates based upon ‘eyeballs’ for ads placed on their stations. But a handful of consultants convinced a few stations that the retransmission requirements were a valuable commodity, and stations started insisting on payments from cable companies to carry the content. Since that time, the payments have climbed from zero to rates in the range of $4 or more per cable customer, per local station, per month. For a cable company, carrying even the four basic networks of ABC, CBS, FOX, and NBC means shelling out $16 or more per month to local stations for each cable subscriber.
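The back-of-the-envelope math in the paragraph above is simple but worth making explicit, using the blog’s estimate of roughly $4 per station, per subscriber, per month:

```python
# Retransmission cost per subscriber, using the blog's estimated rate.
fee_per_station = 4.00  # dollars per subscriber per month (estimate)
stations = ["ABC", "CBS", "FOX", "NBC"]

monthly_per_subscriber = fee_per_station * len(stations)
print(monthly_per_subscriber)  # 16.0
```

On a system carrying more local stations than the four major networks, or where stations charge above $4, the per-subscriber total climbs even faster – which is how these fees became a primary driver of cable rate increases.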

It was these fees that led the big cable companies to create the local programming fees that are not part of basic rates. Cable companies may advertise a basic rate for a cable package at $50 but then tack on large hidden fees of $20 or more to cover local station fees along with some sports network fees.

The bill sponsors also blame high retransmission fees for the increasing blackouts of content that we’ve seen in recent years. When cable companies balk at paying increasing rates each year for local content, the local stations have adopted the tactic of shutting off access to their content until the cable company finally agrees to pay the ever-increasing rates.

Following are a few of the key provisions of the bill:

  • Eliminates the retransmission consent, mandatory copyright fees, and other provisions of current FCC rules (which were dictated by Congress). This should allow for real negotiations of rates – today the stations demand rates and there is little room for negotiation.
  • Adds a 60-day period where blackouts of content aren’t allowed when the local station and a cable operator are negotiating rates.
  • Gives the FCC the right to push a programming dispute into binding arbitration. Blackouts would be prohibited during the arbitration period.
  • Preempts federal, state, and local governments from regulating cable rates. This is an odd requirement since there is little or no rate regulation that I know of, but it must exist somewhere in the country.
  • Keeps the rule that cable networks and satellite providers must continue to carry local content.

As would be expected, local TV stations and the major networks are against these changes. Most of the money charged for retransmission consent ends up in the pockets of the major networks. Cable companies are obviously in favor of the proposed changes since it would give them an opportunity for real negotiations for content.

Congress created this original mess by mandating that cable companies must carry local content without allowing for safeguards like the arbitration in negotiations that this bill brings to the process. But the runaway rates in the cable industry can be pinned on the greed of programmers who have raised programming charges far more than inflation for two decades. The industry has driven cable rates so high that millions of households are cutting the cord annually and abandoning paying for content that includes local stations. If you were asked to imagine a scenario where an industry would self-destruct over time, it would be hard to think of a better example than the retransmission fees in the TV industry.

Categories
Regulation - What is it Good For?

Focus on Sustainability

There are a few glaring holes in all federal broadband grants that have to do with how a grant recipient uses the network that was constructed with grant dollars. I wrote a recent blog that talks about the fact that most grants surprisingly don’t have any mandate that the grant recipient serve any customers in the grant area. For example, Starlink could take a grant for western North Carolina but never sign a customer in the grant areas.

Even more amazingly, there is no proof required that the grant money was all spent for the intended purposes in the grant areas. Consider the CAF II grants, where the telcos self-report that they have completed the upgrades in each grant area – the telcos were not required to show any proof of the capital spending. A lot of people, including me, think that the big telcos didn’t make many of the required CAF II upgrades. The FCC has no idea if grant upgrades were really done. It would have been easy for the FCC to demand proof of capital expenditures showing the labor and specific equipment used in each of the grant areas. Such a requirement would have forced the telcos to do the needed work because it would be extremely easy for an FCC auditor to show up and ask to see some of the specific equipment that was claimed as installed.

Today’s blog talks about the third missing element of federal grants – grant recipients don’t have to make any promise to maintain the networks after they are constructed. There is nothing to stop a grant recipient from taking the grant money, building the network, and then milking revenues for years without spending any future capital.

All of the industry experts will tell you that a new fiber network will likely be relatively problem-free after you shake out any initial problems. Unless fiber is cut, or unless customer electronics go bad, there is not a lot of maintenance capital required for the first decade after building a new fiber network. There will still be fiber cuts and storm damage and the inevitable things that happen in the real world, but fiber technology is so tried and true right now that it largely works well out of the box.

I wrote a blog recently that conjectured that a fiber network can be a hundred-year investment. But the key to longevity is maintenance. If a grant recipient treats a fiber network the way that the big telcos have treated copper networks, then new fiber networks will start deteriorating in ten years and will be dead in thirty years. Good maintenance means properly fixing fiber cuts with quality splices. It may mean replacing stretches of fiber that demonstrate ongoing problems that might have come from the factory or from improper handling during installation. But most importantly, maintenance means upgrading and replacing electronics.

Fiber electronics don’t last forever. Manufacturers talk about a 7-year life on electronics, but they are in the business of selling replacements. There is no physical reason to replace a customer ONT as long as it keeps working, and we’ve already seen some fiber ONTs last as long as fifteen years. But my guess is that, on average, electronics are going to require upgrades every ten or twelve years.
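The scale of the reinvestment obligation is easy to quantify with the blog’s own guesses – a hundred-year network life and an electronics upgrade every twelve years or so:

```python
# Rough illustration of reinvestment cycles; both numbers are the blog's
# estimates, not engineering data.
network_life_years = 100
replacement_cycle_years = 12

cycles = network_life_years // replacement_cycle_years
print(f"{cycles} replacement cycles")
```

Eight full electronics replacements over the life of the network is exactly the ongoing capital commitment that a grant recipient intent on milking revenues would skip.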

Luckily, it looks like many of the FTTP upgrades already on the market involve what we call an overlay. This means introducing a new core that can provide new customer electronics while still being able to support the old equipment, as long as it’s working well. This is the sane way to do upgrades because a company can phase customers from old electronics to new over many years rather than going through the chaotic process of trying to change technology for a lot of customers at the same time.

But back to the grants. Federal grants are going to turn out to be a total disaster if the companies receiving the grants don’t build what they are supposed to build or don’t maintain the network to keep it running for a hundred years. This won’t become apparent for fifteen or twenty years, but then we’ll start hearing about big problems in rural areas where customers on poorly maintained fiber networks go out of service and can’t get repairs.

It really bothers me to know that there are bad ISPs in the industry who are likely to take the grant money with the intention of milking the revenues and not reinvesting in the networks. We know that cooperatives, small telcos, and municipal network owners will be happily operating grant-funded fiber networks a century from now. But amazingly, sustainability isn’t part of the discussion or criteria in deciding which ISPs deserve grant funding. We continue to pretend that all ISPs are good corporate citizens even after some have proved repeatedly that they are not.

Categories
Regulation - What is it Good For?

Cost Models and Grants

Possibly the least understood aspect of the recent FCC RDOF grants is that the FCC established the base grant amount for every Census block using a cost model. These cost models estimate the cost of building a new broadband network in every part of the country – and unfortunately, the FCC accepts the results of the cost models without question.

The FCC contracts with CostQuest Associates to create and maintain the cost estimation models. The cost models have been used in the past in establishing FCC subsidies, such as Universal Service Fund payments made to small telephone companies under the ACAM program. For a peek into how the cost models work, this link is from an FCC docket in 2013 when the small telcos challenged some aspects of the cost models. The docket explains some of the basics of how the cost models function.

This blog is not meant to criticize CostQuest, because no generic nationwide cost model can capture the local nuances that impact the cost of building fiber in a given community. It’s an impossible task. Consider the kinds of unexpected things that engineers encounter all of the time when designing fiber networks:

  • We worked in one county where the rural utility poles were in relatively good shape, but the local electric company hadn’t trimmed trees in decades. We found the pole lines were now 15 feet inside heavy woods in much of the fiber construction area.
  • We worked in another county where 95% of the county was farmland with deep soil where it was inexpensive to bury fiber. However, a large percentage of homes were along a river in the center of the county that consisted of steep, rocky hills with old crumbling poles.
  • We worked in another county where many of the rural roads were packed dirt roads with wide water drainage ditches on both sides. However, the county wouldn’t allow any construction in the ditches and insisted that fiber be placed in the public right-of-way which was almost entirely in the woods.

Every fiber construction company can make a long list of similar situations where fiber construction costs came in higher than expected. But there are also cases where fiber construction costs are lower than expected. We’ve worked in farm counties where road shoulders are wide, the soil is soft, and there are long stretches between driveways. We see electric cooperatives that are putting ADSS fiber in the power space for some spectacular savings.

Generic cost models can’t keep up with the fluctuations in the marketplace. For example, I saw a few projects where the costs went higher than expected because Verizon fiber construction had lured away all local work crews for several years running.

Cost models can’t possibly account for cases where fiber construction costs are higher or lower than what might be expected in a nearby county with seemingly similar conditions. No cost model can keep up with the ebb and flow of the availability of construction crews or the impact on costs from backlogs in the supply chain.

Unfortunately, the FCC determines the amount to be awarded for some grants using these cost models, such as the recently completed RDOF grants. The starting bid for each Census block in the RDOF auction was determined using the results of the cost models – and the results make little sense to people who understand the cost of building fiber.

One might expect fiber construction costs to easily be three or four times higher per mile in parts of Appalachia compared to the open farmland plains of the Midwest. However, the opening bids for RDOF were not proportionately higher for Appalachia. The net result is that the grants offered a higher percentage of expected construction cost in the open plains than in the mountains of Appalachia.
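The distortion described above can be illustrated with invented numbers. If construction in the mountains costs three times as much per mile but the opening bid is only modestly higher, the grant covers a much smaller share of the real cost there:

```python
# Hypothetical per-mile costs and opening bids, chosen only to illustrate
# the proportion problem -- not actual RDOF figures.
areas = {
    "open plains": {"cost_per_mile": 30_000, "opening_bid_per_mile": 24_000},
    "Appalachia":  {"cost_per_mile": 90_000, "opening_bid_per_mile": 36_000},
}

for name, a in areas.items():
    share = a["opening_bid_per_mile"] / a["cost_per_mile"]
    print(f"{name}: grant covers {share:.0%} of construction cost")
```

In this sketch the plains builder starts the auction with 80% of its cost covered while the Appalachian builder starts with 40% – the kind of imbalance that a flat percentage-of-engineered-cost grant, as used by many states, avoids.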

There is an alternative to using the cost models – a method that is used by many state grants. Professional engineers estimate construction costs, and many state grants then fund some percentage of the project cost based upon factors like the technology to be constructed. This kind of grant would offer the same percentage of grant assistance in all different geographies of a state. Generic cost models end up advantaging or disadvantaging grant areas, without those accepting the grants even realizing it. The RDOF grants offered drastically different proportions of the cost of construction – which is unfair and impossible to defend. This is another reason to not use reverse auctions where the government goofs up the fairness of the grants before they are even open for bidding.

Categories
Regulation - What is it Good For? The Industry

The White House Broadband Plan

Reading the White House $100 billion broadband plan was a bit eerie because it felt like I could have written it. The plan espouses the same policies that I’ve been recommending in this blog. This plan is 180 degrees different than the Congressional plan that would fund broadband using a giant federal reverse auction and a series of state reverse auctions.

The plan starts by citing the 1936 Rural Electrification Act which brought electricity to nearly every home and farm in America. It clearly states that “broadband internet is the new electricity” and is “necessary for Americans to do their jobs, to participate equally in school learning, health care, and to stay connected”.

The plan proposes to fund building “future-proof” broadband infrastructure to reach 100 percent broadband coverage. It’s not hard to interpret future-proof to mean fiber networks that will last for the rest of the century versus technologies that might not last for more than a decade. It means technologies that can provide gigabit or faster speeds that will still support broadband needs many decades from now.

The plan wants to remove all barriers so that local governments, non-profits, and cooperatives can provide broadband – entities without the motive to jack up prices to earn a profit. The reference to electrification implies that much of the funding might come in the form of low-interest federal loans to community-based organizations. The same approach during electrification spurred the formation of electric cooperatives and could do something similar now. I favor this as the best use of federal money because building the infrastructure with federal loans means the federal coffers eventually get repaid.

The plan also proposes giving tribal nations a say in the broadband build on tribal lands. This is the third recent funding mechanism that talks about tribal broadband. Most Americans would be aghast at the incredibly poor telecom infrastructure that has been provided on tribal lands. We all decry the state of rural networks, but tribal areas have been provided with the worst of the worst in both wired and wireless networks.

The plan promotes price transparency so that ISPs must disclose the real prices they will charge. This means no more hidden fees and deceptive sales and billing practices. This likely means writing legislation that gives the FCC and FTC some real teeth for ending deceptive billing practices of the big ISPs.

The plan also proposes to tackle broadband prices. It notes that millions of households that have access to good broadband networks today can’t use broadband because “the United States has some of the highest broadband prices among OECD countries”. The White House plan proposes temporary subsidies to help low-income homes but wants to find a solution to keep prices affordable without subsidy. Part of that solution might be the creation of urban municipal, non-profit, and cooperative ISPs that aren’t driven by profits or Wall Street earnings. This goal also might imply some sort of federal price controls on urban broadband – an idea that is anathema to the giant ISPs. Practically every big ISP regulatory policy for the last decade has been aimed at keeping the government from thinking about regulating prices.

This is a plan that will sanely solve the rural broadband gap. It means giving communities time to form cooperatives or non-profits to build broadband networks rather than shoving the money out the door in a hurry in a big reverse auction. This essentially means allowing the public to build and operate its own rural broadband – the only solution I can think of that is sustainable over the long-term in rural markets. Big commercial ISPs invariably are going to overcharge while cutting services to improve margins.

Giving the money to local governments and cooperatives also implies giving these entities the time to do this right. We can’t forget that the electrification of America didn’t happen overnight – it took some communities more than a decade to finally build rural electric networks. The whole White House infrastructure plan stretches over 8 – 10 years – it’s an infrastructure plan, not an immediate stimulus plan.

It’s probably obvious that I love this plan. Unfortunately, this plan has a long way to go to be realized. There is already proposed Congressional legislation that takes nearly the opposite approach, and which would shove broadband funding out of the door within 18 months in a gigantic reverse auction. We already got a glimpse of how poorly reverse auctions can go in the recently completed RDOF auction. I hope Congress thinks about the White House plan that would put the power back into the hands of local governments and cooperatives to solve the broadband gaps. This plan is what the public needs because it creates broadband networks and ISPs that will still be serving the public well a century from now.

The Accessible, Affordable Internet Act for All – Part 2

This is the second look at the Accessible, Affordable Internet Act for All sponsored by Rep. James E. Clyburn from South Carolina and Sen. Amy Klobuchar from Minnesota. The first blog looked at the problems I perceive from awarding most of the funding in a giant reverse auction.

In a nutshell, the bill provides $94 billion for broadband expansion. A huge chunk of the money would be spent in 2022, with 20% of the biggest fund deferred for four years. There are other aspects of the legislation worth highlighting.

One of the interesting things about the bill is the requirements that are missing. I was surprised to see no ‘buy American’ requirement. While this is a broadband bill, it’s also an infrastructure bill and we should make sure that infrastructure funding is spent as much as possible on American components and American work crews.

While the bill has feel-good language about hoping that ISPs offer good prices, there is no prohibition that I can find against practices like data caps imposed in grant-funded areas that can significantly increase monthly costs for a growing percentage of households.

The most dismaying omission from the bill is any accountability for those accepting the various federal grant funds. Many state grant programs come with significant accountability. ISPs must often submit proof of construction costs to get paid. State grant agencies routinely visit grant projects to verify that ISPs are building the technology they promised. There is no such accountability in the grants awarded by this bill, just as there was none in the recent RDOF auction or the earlier CAF II program. In the original CAF II, the carriers self-certify that the upgrades have been made and provide no backup that the work was done other than the certification. There is a widespread belief that much of the CAF II upgrades were never done, but we’ll likely never know since the telcos that accepted the grants have no reporting requirements to show that the grant money was spent as intended.

There is also no requirement to report the market success of broadband grants. Any ISPs building last-mile infrastructure should have to report the number of households and businesses that use the network for at least five years after construction is complete. Do we really want to spend over $90 billion for grants without asking the basic question of whether the grants actually helped residents and businesses?

This legislation continues a trend I find bothersome. It will require all networks built with grant funding to offer a low-income broadband product – which is great. But it then sets the speed of the low-income service at 50/50 Mbps while ISPs will be required to provide 100/100 Mbps or faster to everybody else. While it’s hard to fault a 50/50 Mbps product today, that’s not always going to be the case as homes continue to need more broadband. I hate the concept that low-income homes get slower broadband than everybody else just because they are poor. We can provide a lower price without cutting speeds. ISPs will all tell legislators that there is no difference in cost in a fiber network between a 50/50 Mbps and a 100/100 Mbps service. This requirement is nothing more than a backhanded way to remind folks that they are poor – there is no other reason for it that I can imagine.

One of the interesting requirements of this legislation is that the FCC must gather consumer prices for broadband. I’m really curious how this will work. I studied a market last year where I gathered hundreds of customer bills, and I found almost no two homes being charged the same rate for the same broadband product. Because of special promotional rates, negotiated rates, bundled discounts, and hidden fees, I wonder how ISPs will honestly answer this question and how the FCC will interpret the results.

The bill allocates a lot of money for ongoing studies and reports. For example, there is a new biennial report that quantifies the number of households where cost is a barrier to buying broadband. I’m curious how that will be done in any meaningful way that will differ from the mountains of demographic data that show that broadband adoption has almost a straight-line relationship to household income. I’m not a big fan of creating permanent report requirements for the government that will never go away.

Taking the Short View

We need to talk about the insidious carryover impact of a national definition of broadband set at 25/3 Mbps. You might think that the FCC’s definition of broadband doesn’t matter – but it’s going to have a huge impact in 2021 on how we spend the sudden flood of broadband funding coming from the federal government.

First, a quick reminder about the history of the 25/3 definition of broadband. The FCC under Tom Wheeler increased the definition of broadband in 2015 from the paltry former definition of 4/1 Mbps – a sorely overdue upgrade. At the time that the new definition was set it seemed like a fair definition. The vast majority of US homes could comfortably function with a 25/3 Mbps broadband connection.

But we live in a world where household usage has been madly compounding at a rate of over 20% per year. More importantly, since 2015 we’ve changed the way we use broadband. Homes routinely use simultaneous broadband streams and a large and growing percentage of homes now find 25 Mbps download to be a major constraint on how they want to use broadband. The cable companies understood this, and to keep customers happy have upgraded their minimum download speeds from 100 Mbps to 200 Mbps.
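A quick back-of-the-envelope calculation shows why that compounding matters so much. Assuming an illustrative 21% annual growth rate (the blog says "over 20%"; the exact rate is an assumption here), household usage roughly triples between 2015 and 2021:

```python
# Back-of-the-envelope sketch: 20%+ annual growth in household broadband
# usage compounds quickly. The 21% rate is an assumed illustration.

def compounded(start: float, rate: float, years: int) -> float:
    """Value after compounding `start` at `rate` per year for `years` years."""
    return start * (1 + rate) ** years

usage_2015 = 1.0                              # normalize 2015 usage to 1.0
usage_2021 = compounded(usage_2015, 0.21, 6)  # six years of 21% growth

print(f"2021 usage vs. 2015: {usage_2021:.1f}x")
# → 2021 usage vs. 2015: 3.1x
```

A connection that felt comfortable in 2015 is serving a household that uses roughly three times as much bandwidth by 2021 – which is why a definition frozen at 25/3 Mbps falls further behind every year.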

Then came the pandemic, which made the whole country focus on upload speeds. Suddenly, every student and every adult who tried to work from home learned that the upload stream of most broadband connections will just barely support one person working at home and is completely inadequate for homes where multiple people are trying to function online at the same time.

Meanwhile, the FCC under Chairman Ajit Pai ignored the reality of the big changes in the way that Americans use broadband. The FCC had multiple opportunities to increase the definition of broadband – including after the evident impact of the pandemic – but Chairman Pai stubbornly stuck with the outdated 25/3 definition. He did not want a legacy of suddenly declaring that many millions of homes didn’t have adequate broadband.

We now have an FCC that is likely to increase the definition of broadband, but the FCC is still waiting for a fifth Commissioner before holding a vote on the issue. Meanwhile, we are poised to start handing out billions of dollars of broadband subsidies from the $1.9 trillion American Rescue Plan Act. This includes $10 billion approved directly as state block grants for broadband, plus some portion of the larger $350 billion approved for more general infrastructure that can include broadband. I can promise you that this money is going to come encumbered in some form or fashion by the old definition of broadband. I can’t predict exactly how this will come into play, but there is no way that the Treasury Department, which is administering these funds, can ignore the official definition of broadband.

As much as federal officials might want to do the right thing, 25/3 Mbps is the current law of the land. The new federal monies are likely to emphasize serving areas that don’t have speeds that meet that 25/3 Mbps definition. Let me rephrase that to be more precise – federal broadband money will be prioritized to give funding to areas where ISPs have told the FCC that households can’t buy a 25/3 broadband product. Unfortunately, there are huge parts of the US where homes don’t get speeds anywhere close to 25/3 Mbps, but where ISPs are safe in reporting marketing speeds to the FCC rather than actual speeds. States like Georgia and North Carolina have estimated that the number of households that can’t buy a 25/3 Mbps broadband product is twice what is reported to the FCC.

What this all means is that we are going to decide where to spend billions in funding from the American Rescue Plan Act based upon the 25/3 Mbps definition of broadband – a definition that will not long survive a fully staffed FCC. The intransigence of Chairman Pai and the big ISPs that strongly supported him will carry over and have a huge impact even after he is gone. The broadband that will be built with the current funding will last for many decades – but unfortunately, some of this funding will be misdirected due to the government taking the short view that we must keep pretending that 25/3 Mbps is a meaningful measurement of broadband.