
The Municipal Broadband Battle

My overall impression of the BEAD NOFO is that the NTIA largely emphasized, and in many cases even strengthened, the requirements for the grants delineated in the federal legislation. But there is one area where I think the NTIA is instead treading carefully.

The federal legislation made it clear that BEAD grants should be made available to everybody – commercial entities, non-profit entities, Tribes, and municipal entities. The day that I read the federal legislation, I knew there would be an eventual showdown since many states have barriers or restrictions against municipal participation in building, owning, or operating commercial broadband networks.

The NTIA is tackling this sticky issue using the following language:

A state must Disclose (1) whether the Eligible Entity (the state) will waive all laws  . . . concerning broadband, utility services, or similar subjects, whether they predate or postdate enactment of the Infrastructure Act, that either (a) preclude certain public sector providers from participation in the subgrant competition or (b) impose specific requirements on public sector entities, such as limitations on the sources of financing, the required imputation of costs not actually incurred by the public sector entity, or restrictions on the service a public sector entity can offer; and (2) if it will not waive all such laws for BEAD Program project selection purposes, identify those that it will not waive and describe how they will be applied in connection with the competition for subgrants.

If there is any one area where politics creep into building broadband, it is with state restrictions on municipalities. There are states, like my state of North Carolina, where the legislature has repeatedly emphasized that it doesn’t want municipalities to be funding or operating fiber networks. This is also a bigger political issue because it creates another skirmish point in the ongoing battle over states’ rights versus federal mandates. There are many states that have drawn hard lines against federal mandates, even when it costs them a lot of money – such as the states that are still turning down billions of dollars per year of federal funding through the Affordable Care Act.

The NTIA could have handled this differently by plainly requiring that any state broadband plan has to make the grant funding available to everybody, including municipalities – as stated in the legislation. States could have reacted to that requirement in one of several ways. A state could file a plan that agrees to waive some or all municipal restrictions only for purposes of the BEAD grant. Broadband is such a hot topic in some states that a legislature might blink rather than take a hard line.

Or a state could react by filing a state plan that clearly sticks to existing barriers against municipal participation in broadband. That would set up a big fight between the state and the NTIA.

A state could react by refusing to file a state broadband plan and forgoing the billion dollars of funding that comes with this legislation. The interesting thing is that the legislation seems to have presupposed that some states might do this since there are provisions where municipalities and ISPs can ask for grants directly from the NTIA if a state refuses to agree with the requirements of the legislation.

I’m just guessing, and I have no insight into the mystery that is D.C. politics, but I think the NTIA chose a path that wouldn’t goad states into dropping out of the BEAD grant program. Perhaps they want to first open up a dialog, and the above language does that. It allows a state to define what it will or won’t do related to municipalities.

I also guess that this language will open up a dialog within each state about restrictions on municipal broadband. Recall that every state has to put its plan out for public comment before submitting it to the NTIA. The above language will force a state to defend its plans to the public if it refuses to allow grants for local governments. Any state taking that approach might expect a lot of public pushback – which is different than setting such policies behind closed doors in the legislature.

The language doesn’t preclude the NTIA from rejecting a state broadband plan if a state won’t budge on the issue, and it may eventually have to do so. But I also guess that the NTIA doesn’t want to directly administer the grant programs in some states, so it is motivated to see the money go to the states. I’m asked daily when I think BEAD grant applications will be available, and this is one more issue that makes it impossible to predict. If your state is going to fight the municipal broadband battle, grants are going to come later than in states that don’t.


An Easier Way to Define Broadband

Our broadband policies always seem to lag the market. If and when the FCC seats the fifth Commissioner, it’s expected that the agency will raise the definition of broadband from 25/3 Mbps to 100/20 Mbps. That change will have big repercussions in the market because it will mean that anybody who can’t buy broadband speeds of at least 100/20 Mbps would not have broadband. That’s how an official broadband definition works – you either have broadband, or you don’t.

The definition of broadband matters for several reasons. First, it makes areas that don’t have broadband eligible for federal grants – although many of the current round of big grants did not wait for the FCC to change the definition of broadband. It also matters in how we count the number of people without broadband. That has supposedly been one of the major purposes of the FCC broadband maps, and they failed badly in identifying homes that can’t buy 25/3 broadband. But on the day that the FCC changes the definition of broadband, millions of homes will be officially declared to not be able to buy real broadband.

I’ve always hated these arbitrary hard lines defined by speeds. Anybody who has ever done speed tests at their home knows that the broadband speed delivered varies from second to second, minute to minute, and hour to hour. It’s not unusual at my desk to see speeds vary by more than 50% during the course of the day.

The original purpose for having a definition of broadband was established by Congress, which directed the FCC to have plans to bring rural broadband into parity with urban broadband. The folks that wrote that law in 1996 could never have envisioned that we’d grow from having dial-up Internet to gigabit capabilities in urban America in 2022.

If the goal is still to create parity between urban and rural broadband, there is a much easier way to define broadband. The cable companies have regularly increased the speeds of their minimum broadband products, and in my mind, when they do so, they set a new standard target for parity between rural and urban areas.

Recently both Charter and Cable One increased the minimum speeds of basic broadband to 200 Mbps (with no mention of upload speeds). Charter is increasing speeds automatically with no rate changes. Cable One’s change seems like more of a quiet rate increase since it will charge customers $5 more per month to automatically move them from 100 Mbps to 200 Mbps.

Charter has always led the industry in this. I think they were a leader in moving to 30 Mbps, 60 Mbps, 100 Mbps, and now 200 Mbps. The rest of the cable industry generally matches Charter’s increases within a year or so.

The one exception is Comcast Xfinity. The company still has a 50 Mbps and a 100 Mbps product. However, its website only pushes a new 300 Mbps product. I expect it’s not easy for a new customer to buy the 50 Mbps product.

When the big cable companies voluntarily raise the speed bar by increasing speeds across the board, they have, by definition, redefined urban broadband. Can parity mean anything other than residents in a rural area should be able to buy broadband as fast as is available to a basic broadband customer in an urban area?

Maybe I’m being too simplistic, but if the FCC finally raises the definition of broadband this year to 100/20 Mbps, it will already be lagging behind the urban broadband market with that definition.

Of course, the download speed question is only half of the speed equation. You have to dig deep on cable company websites to find any mention of upload speeds. The cable companies lobbied extremely hard during the passage of broadband grant legislation to make certain that the upload speed definition for grant purposes didn’t go higher than 20 Mbps. When cable companies talk to customers, they are silent on upload speeds since few urban cable products actually deliver 20 Mbps upload.

I probably have written too many blogs about the definition of broadband. But it’s a topic that keeps having real-life implications. It’s ludicrous that there are still federal grants that award more money for serving areas with broadband speeds under 25/3 Mbps. If the real goal of the federal government is to have parity between rural and urban broadband speeds, then Charter and Cable One just provided us with a new definition of broadband. If somebody uses federal grant money to build a rural market with 100 Mbps download technology, it’s already out of parity in 2022, and it’s hard to imagine how far it will be out of parity by the time the grant-funded network is built and operational.


FCC Investigating Cost of Pole Replacements

The FCC recently issued a Second Notice of Proposed Rulemaking concerning the allocation of costs when replacing poles to accommodate adding fiber or other communications wires or devices to poles. The traditional rule has been that the new attacher must pay for 100% of the cost of make-ready, including the cost of pole replacement if there is not sufficient room to add a new wire or device (like a small cell).

In January 2021, the FCC issued an order that clarified that it is unreasonable for a new attacher to automatically have to pay the full cost of swapping a pole. This current NPRM now asks the industry for comments to clarify some of the stickier situations that arise out of trying to allocate the costs of pole replacement. For example, the NPRM asks for comments on the following situations and questions:

  • How do you determine the extent to which a pole owner will benefit from a pole replacement? There are plenty of cases where poles are clearly at the end of life and should have been replaced by the pole owner as a part of routine maintenance.
  • What standards or formulas should be applied to calculate the amount that a pole owner should be responsible for?
  • How should any new rules handle having to replace a pole that is not near the end of its economic life – what the FCC is calling early pole replacement?
  • Will requiring pole owners to pay a share of pole replacement negatively impact the negotiation between telecom companies and pole owners?
  • Is there any mechanism that can be used to minimize disputes or to expedite the resolution of disputes?
  • Finally, the FCC is looking at the question of pole attachment rates and asks if there should be refunds to pole attachers if a finding is made that pole attachment rates are too high?

The FCC is asking the right questions. Carriers wanting to add wires or devices to poles have had two common complaints – the process takes too long and is too expensive.

The dilemma faced by the FCC is that anything other than an iron-clad, non-debatable formula for allocating costs for pole attachments will make the timelines worse. If there is even a sliver of a chance for the costs to be negotiated, there will be a lot of disputed negotiations that will need to be resolved by regulators – and that will inevitably add even more time to the pole attachment process.

Let’s put this into perspective with a few examples. I know of several rural electric cooperatives where the poles are in dreadful condition. The poles are old, short, and starting to rot and fall down. I know fiber builders who have walked away from bringing fiber in these areas because they were told that practically every pole must be replaced – at a cost of $20,000 or more per pole. A fair finding might be that these cooperatives should pay for 100% of the cost of pole replacement since it is needed for the electric grid – it’s something they should have been doing anyway. Should a new attacher pay a penalty because pole owners had no pole maintenance and replacement plan? But these cooperatives are in poor counties, and the cooperatives don’t feel they can afford to replace the poles. An order that makes the cooperative quickly replace all of the poles could be an unaffordable burden. There may be no winners in this kind of dispute.

Remember that not all pole owners are neutral parties. Some poles are owned by telcos or by a utility that plans to offer broadband. In both cases, the pole owner has a financial incentive to delay or drive away potential pole attachers who are competitors. These owners might avail themselves of every possible delay allowed by any regulatory timeline established to settle disputes on the pole replacement issue.

For every one of the questions that the FCC is asking, there are real-life examples of sticky situations that are hard to resolve. A solution cannot include a dispute resolution process, or bad actor pole owners are going to fight every pole replacement request.

The very questions that the FCC is asking would lead to the grounds for a dispute. For example, which party determines the condition of a given pole and whether a pole owner will benefit from a free replacement? Who is to define what an early pole replacement looks like – in a way that can’t be disputed? Regulators must be cringing when they read this NPRM because they know they will be flooded with individual disputes over single poles filed by a fiber builder that wants a fast resolution.

Even the simplest solution I can think of would lead to disputes. Consider a solution that uses a formula that determines the share of costs allocated to the pole owner by the age of the pole – the older the pole, the more a pole owner pays. I can promise even that will lead to arguments about the age of a given pole.
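To make the point concrete, an age-based formula of the kind described above might look like a simple straight-line proration. This is purely a hypothetical illustration – the 40-year pole life and the function name are my own assumptions, not anything in the NPRM:

```python
def pole_owner_share(pole_age_years: float, expected_life_years: float = 40.0) -> float:
    """Hypothetical cost split: the older the pole, the larger the share
    the pole owner pays, pro-rated against an assumed service life."""
    if expected_life_years <= 0:
        raise ValueError("expected life must be positive")
    # A pole at or past its assumed life is entirely the owner's responsibility.
    return min(max(pole_age_years, 0.0) / expected_life_years, 1.0)

# A 30-year-old pole with an assumed 40-year life: owner pays 75%,
# the new attacher pays the remaining 25%.
print(pole_owner_share(30))   # 0.75
print(pole_owner_share(55))   # 1.0 (past end of life)
```

Even a formula this mechanical leaves the pole’s age and expected life as disputable facts, which is exactly the problem.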


National Broadband Growth is Slowing

Leichtman Research recently released the broadband customer statistics for the end of the fourth quarter of 2021. The numbers show that broadband growth has slowed significantly for the sixteen largest ISPs tracked by the company. LRG compiles these statistics from customer counts provided to stockholders, except for Cox, which is privately held.

Net customer additions sank each quarter during the year.  The first quarter of 2021 saw over 1 million net new broadband customers. That dropped to just under 900,000 in the second quarter, 630,000 in the third quarter, and now 423,000 in the fourth quarter. The statistics for all of 2021 and for the fourth quarter are as follows:

                     4Q 2021   Annual Change   % Change   4Q Change   % Change
Comcast            30,574,000      1,327,000      4.3%      213,000      0.7%
Charter            28,879,000      1,210,000      4.2%      190,000      0.6%
AT&T               15,384,000        120,000      0.8%       (6,000)     0.0%
Verizon             7,129,000        236,000      3.3%       28,000      0.4%
Cox                 5,380,000        150,000      2.8%       20,000      0.4%
CenturyLink         4,767,000       (248,000)    -5.2%      (70,000)    -1.5%
Altice              4,389,600         (3,400)    -0.1%       (1,900)     0.0%
Frontier            2,834,000        (35,000)    -1.2%       10,000      0.4%
Mediacom            1,438,000         25,000      1.7%       (3,000)    -0.2%
Windstream          1,109,300         55,200      5.0%       17,500      1.5%
Cable ONE             992,000         63,000      6.4%       25,000      2.4%
Atlantic Broadband    698,000         18,778      2.7%         (222)     0.0%
WOW!                  498,800         12,900      2.6%        2,200      0.4%
TDS                   493,300         32,700      6.6%        3,200      0.6%
Cincinnati Bell       436,100          3,900      0.9%        1,000      0.2%
Consolidated          401,357        (16,793)    -4.2%       (6,097)    -1.6%
Total             105,403,457      2,951,285      2.8%      422,681      0.4%
Cable              72,849,400      2,803,278      3.8%      445,078      0.6%
Telco              32,554,057        148,007      0.5%      (22,397)    -0.1%

Fixed Wireless        874,000        719,000     82.3%
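As a sanity check, the percentage columns above can be reproduced from the raw counts. The percentages appear to be computed against the year-end subscriber count rather than the prior-year count – that is my inference, but it matches the printed figures:

```python
# Year-end subscriber count and annual net change for three ISPs from the table.
subscribers_4q2021 = {
    "Comcast":     (30_574_000, 1_327_000),
    "Charter":     (28_879_000, 1_210_000),
    "CenturyLink": (4_767_000,   -248_000),
}

for isp, (year_end, change) in subscribers_4q2021.items():
    pct = 100 * change / year_end   # percent change against year-end count
    print(f"{isp}: {pct:+.1f}%")
# Comcast +4.3%, Charter +4.2%, CenturyLink -5.2% – matching the table.
```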

There are a few interesting things to keep an eye on in the future. The growth for Comcast and Charter has slowed significantly, and my prediction is that there will come a quarter within a year when one or both of them will lose net customers. For several years running, Frontier has been bleeding customers but seems to be turning it around. The big loser is now CenturyLink.

For some reason, LRG is leaving out fixed cellular customers. At the end of 2021, T-Mobile reported 646,000 fixed cellular customers, with 546,000 added in 2021. Verizon is up to 228,000 fixed cellular customers, up by 173,000 during 2021. The two companies, along with AT&T, are making a major push in this market and expect to add millions of customers in 2022 – many at the expense of the other ISPs on the list. It’s an odd choice to exclude these customers since the speeds on fixed cellular are faster than the DSL delivered by the telcos on the list. Also missing are other big providers that are probably larger than Consolidated, like a few of the largest WISPs and fiber overbuilders like Google Fiber.

But even after counting the growth of fixed cellular broadband, it’s obvious that the broadband market growth has cooled. The burst of new customers in 2020 and the first half of 2021 was clearly fueled by homes buying broadband during the pandemic.

It’s also worth noting that the numbers for WOW! and Atlantic Broadband (now Breezeline) have been adjusted for the sale of customers by WOW!.


Open Wire Telephony

I saw an article recently that reminded me about the early days of telephone technology. The article talks about the barbed wire fences used to bring the first rudimentary communications links to the remote Texas Panhandle.

Telephony using insulated copper wires started to appear in cities in the U.S. in the decades following the invention of the telephone in 1876 by Alexander Graham Bell. Around the turn of the 20th century, engineers developed a technology that could carry telephone signals for a greater distance using large-gauge bare copper wires. The technology involved installing two side-by-side bare wires – one to communicate in each direction.

Engineers discovered that a properly insulated bare-wire network would have minimal power loss even over great distances. Bell Long Lines undertook the stringing of open wire copper routes on tall poles between cities. The physical network consisted of bare copper wires connected at each pole to glass insulators. When I broke into the industry in the 1970s, I think every telephone technician and engineer had a green glass insulator sitting on their desk.

The cost of building poles and installing copper wires was expensive, and Bell Long Lines recovered its investment by providing expensive long-distance calls that generated enough revenue to justify building the network. Long-distance rates to call from coast to coast at the start of the 20th Century were around $1 per minute – adjusted for inflation, that’s over $33 per minute in today’s dollars. Only corporations, the very rich, or the government could afford to make long-distance calls in the early days of telephony.
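The inflation figure above implies a long-run average inflation rate that is easy to back out. Assuming roughly 122 years between 1900 and 2022 (the span is my assumption), the arithmetic is:

```python
# $1 in ~1900 growing to ~$33 today implies an average annual inflation rate r
# where 1 * (1 + r) ** years = 33.
years = 122
multiple = 33
r = multiple ** (1 / years) - 1
print(f"implied average inflation: {r:.1%}")  # about 2.9% per year
```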

The article talks about how the ranchers at the XIT ranch in Texas got creative and installed the technology using existing strands of barbed wire. The XIT ranch was gigantic and enclosed three million acres of grazing land. To provide context, the northern border of the ranch was 162 miles long. The quality of the connections using barbed wire was predictably dreadful, but the connections were good enough to notify other parts of the ranch about emergency situations like a grass fire. The article says the ranchers tinkered and improved the quality of the barbed wire network over time by adding insulators along the fence to minimize the wire touching any surfaces that added interference. Early in my career as a consultant, I visited the XIT Rural Telephone Cooperative, and I remember seeing insulators along barbed wire fences. They were no longer used for telephony, and I never made the connection at the time to understand that the fences had been rudimentary telephone lines.

The technology for using open wire networks improved over time with the introduction of open wire carrier, which used frequency division multiplexing. First-generation open wire carrier could carry up to 4 simultaneous calls on a pair of wires, and over time the carrier technology improved to carry up to sixteen calls at the same time on a pair of open copper wires.
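Frequency division multiplexing works by stacking each voice call into its own frequency slot on the shared pair. The sketch below is illustrative only – the 4 kHz channel spacing and 12 kHz starting frequency are my assumptions, not the actual open wire carrier frequency plan:

```python
def channel_plan(num_channels: int, base_khz: float = 12.0, spacing_khz: float = 4.0):
    """Assign each voice channel its own frequency slot above the baseband call."""
    return [(base_khz + i * spacing_khz, base_khz + (i + 1) * spacing_khz)
            for i in range(num_channels)]

# A 16-channel carrier system occupies 16 x 4 kHz = 64 kHz of spectrum.
plan = channel_plan(16)
print(plan[0])    # (12.0, 16.0) – first carrier channel
print(plan[-1])   # (72.0, 76.0) – sixteenth channel
```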

The use of open wire technology for long-haul transmission of telephone calls carried into the 1970s. Rural telephone companies built last-mile telephone systems using the traditional twisted-pair copper wires. But open wire technology was still the preferred way to send telephone signals for longer distances, and there were always one or more routes leaving a rural telephone exchange that used open wire technology. These routes were always easy to spot due to the insulators.

The technology lasted even longer for railroads, which strung long-haul copper networks between stations so they could have free calling. I wouldn’t be shocked if there are still a few of these routes in place along rural railroad spurs in the remote west.

Open wire technology was ultimately replaced by better technologies using traditional copper wiring, microwave radios, and ultimately by fiber. But many rural telephone companies kept the old open wire routes in place as an emergency backup for times when other wires went out of service.

I’m prompted to write blogs about older telco technologies when I occasionally ponder how far the world of communications has come in a relatively short time. We went from open wires to satellites and fiber optics in less than a century – largely thanks to Bell Labs, which constantly pushed the frontiers of how we communicate. I wonder what the folks at the XIT Ranch of 1900 would make of the 10-gigabit fiber routes that likely serve the area today?


Video Continues to Drive Broadband Usage

Nielsen recently published some statistics about the way we watch video that show a continuing migration from traditional video to watching video online.

One of the most striking statistics is the total volume of online video. December 2021 saw an aggregate of 183 billion minutes of online video viewing. And even that number is likely small since there are many uses of video on the web that are not likely counted in the total. The prior largest months for video volume were 178 billion minutes in November 2021 and 160 billion minutes in March 2020, the first month of the pandemic.

Here is a comparison of the video usage by category. Cable means video delivered by a wired cable provider or from a satellite service like DirecTV. Streaming is all sources of online content like Netflix. Broadcast is watching video using an antenna. Other is an all-inclusive category that includes things like gaming, DVDs, and video on demand.

             May 2021   Dec 2021
Cable           39%        37%
Streaming       26%        28%
Broadcast       25%        26%
Other            9%         9%

Nielsen has been tracking these numbers for many years. There has been a steady migration from traditional cable viewing to both streaming and broadcast viewing as millions of homes are dropping traditional cable each year. The other category has also been growing, fueled by the explosive growth of online gaming.

Nielsen also reported on the market share of each of the major online video services. The following percentages represent the share that each service has of all online video content. It’s impressive to see that 6.4% of all video content is delivered by Netflix. It’s also impressive to see Disney+ grow so large after having just launched at the end of 2019.

Netflix 6.4%
YouTube 5.8%
Hulu 3.0%
Prime Video 2.1%
Disney+ 1.6%
All Other 8.8%
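The per-service shares can be reconciled against the category table above: the individual streaming services should sum to roughly the 28% streaming share Nielsen reported for December, with the small gap presumably due to rounding:

```python
# Share of all video viewing held by each online service, per Nielsen.
shares = {
    "Netflix": 6.4, "YouTube": 5.8, "Hulu": 3.0,
    "Prime Video": 2.1, "Disney+": 1.6, "All Other": 8.8,
}
total = sum(shares.values())
print(f"streaming total: {total:.1f}%")  # 27.7% vs. the 28% category figure
```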

The reason I’m writing about this topic is as a reminder of how many minutes of video are still carried by traditional cable TV and broadcast TV. In December, there were 242 billion minutes of content delivered by traditional cable TV and another 170 billion minutes delivered by broadcast using antennas. Over time, more and more of these minutes are going to migrate online, and that migration is one of the primary factors behind the continued explosive growth in broadband usage – networks have been seeing overall bandwidth double about every three years. The trends identified by Nielsen mean that ISPs can’t expect any break in that growth for the foreseeable future, likely well into the next decade.
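A doubling every three years is a compound growth rate worth putting in concrete terms. The arithmetic below shows the implied annual growth and what it compounds to over a decade:

```python
# Bandwidth doubling every 3 years implies an annual growth rate of 2^(1/3) - 1.
annual_growth = 2 ** (1 / 3) - 1
print(f"implied annual growth: {annual_growth:.0%}")   # about 26% per year

# Compounded over ten years, usage grows roughly tenfold: 2^(10/3).
ten_year_multiple = 2 ** (10 / 3)
print(f"10-year multiple: {ten_year_multiple:.1f}x")   # about 10.1x
```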


Why I am Thankful for 2021

Every year I write a blog at Thanksgiving talking about the things in our industry for which I am thankful. Most years, this is easy because there are always a lot of great things happening in the broadband industry. But 2021 has been hard for a lot of the folks in the broadband industry. I was deeply touched this year by many of the stories I heard during the pandemic. I heard from a rural high school principal who was upset because 9% of the students in his school disappeared when learning went virtual. I talked to a librarian who was distraught watching students sit outside her library in the snow all day to keep up with virtual schoolwork.

Please feel free to comment at the end of this blog about events this year in the broadband industry for which you are thankful.

ARPA Grants Provide Local Solutions. The ARPA grants given directly to local governments are a breath of fresh air in an industry where only carriers have been able to get government funding for broadband projects. Towns, cities, and counties can use this money to solve the most pressing broadband problems in their community – rural residents with no broadband options, low-income neighborhoods that have been left behind, or retail shopping districts that ISPs have ignored. It’s too bad that it took a pandemic to try the idea of empowering local communities to tackle the problems that are specific to their communities.

Finally, an Emphasis on the Digital Divide. The recently enacted Infrastructure Investment and Jobs Act (IIJA) created two new grant programs to address digital equity and inclusion. Together these provide $2.75 billion in grants to tackle the digital divide. We’ve been talking about the digital divide for at least fifteen years, and this is the first significant effort to try to include everybody in the digital economy. It’s hard to imagine being digitally illiterate in today’s economy since so much of daily life is now online.

Technologies Continue to Improve. As I look around the industry, I can see that every broadband technology is getting better. There are scientists and engineers continuing to improve the performance and speeds of the technology that fuels our broadband. We’re seeing a new generation of fiber PON, better fixed wireless radios, a new generation of DOCSIS, and better cellular technology. These will all fuel better broadband.

Improved Cellular Speeds. This might seem like an odd thing to be thankful for. But much of the country saw big leaps in cellular data speeds in 2021 as the cellular carriers launched the new spectrum bands being labeled as 5G. The immediate impact of the upgrades is a giant leap in bandwidth, which makes life easier for a lot of homes without a landline broadband connection. More importantly, the major cellular carriers are all launching unlimited-usage broadband plans using the new cellular spectrum. The new spectrum will enable functional broadband in rural homes close enough to a cell tower. In cities, the faster cellular offers an affordable broadband alternative to those who can’t afford cable company broadband.

Infrastructure Funding? I was a little hesitant to put this on the list. I think we’re going to have to wait for a decade to find out if throwing the huge sum of $42.5 billion at rural broadband is really going to work. I have no doubt that this will make broadband better for huge parts of the country. But I also worry that much of the money will go to projects that fail or to giant ISPs that will not be good stewards of the funding in rural areas. I’m also doubtful about the FCC being able to improve the mapping quickly enough so that communities are not left behind by this funding. Since this is once-in-a-generation funding, I’m happy for the millions of folks who will get better broadband but worried about those who won’t.

It’s a Good Year to be a Consultant. I’m not going to kid you, it’s a nice year to be a consultant and to be in high demand. But it’s also a frustrating year. I’ve had to say no to many projects where I know I could have provided the solution clients are seeking. There have been recent weeks when a dozen potential projects came to my attention – and it’s frustrating to have to say no.


The Fight Over 12 GHz Spectrum

For an agency that has tried to wash its hands of regulating broadband, the FCC finds itself again trying to decide an issue that is all about broadband. There is a heavyweight battle going on at the FCC over how to use 12 GHz spectrum, and while this may seem like a spectrum issue, it’s all about broadband.

12 GHz spectrum is key to several broadband technologies. First, this is the spectrum that is best suited for transmitting data between the earth and satellite constellations. The only way Starlink is going to be able to grow to serve millions of remote customers in the U.S. is by having enough backhaul to fuel the huge amounts of data that will be passed to serve that many customers. Lack of backhaul bandwidth will significantly limit the total number of customers that can be served and is an obvious major concern of the satellite companies.

It turns out that 12 GHz is also the best spectrum for transmitting large amounts of data with 5G. The carriers have been dabbling with the higher millimeter-wave spectrum, but it’s turning out that there are squirrelly aspects of millimeter-wave spectrum that make it less than ideal in real-world wireless deployments. The 12 GHz spectrum might be the best hope for carriers to be able to deliver gigabit+ wireless drops to homes. Verizon has been deploying fiber-to-the-curb technology using mid-range spectrum and seeing speeds in the range of 300 Mbps. Using the 12 GHz spectrum could provide a reliable path to multi-gigabit wireless drops.

The big question facing the FCC is whether 12 GHz can somehow be used to satisfy both needs, a question that pits the 5G carriers against the satellite carriers. As an aside, before talking more about the issue, I must observe that the satellite companies bring a new tone into FCC proceedings. Their FCC filings do everything except call the other side a bunch of dirty scoundrels. Probably only those who read a lot of FCC documents would notice this, but it’s something new and refreshing.

The current argument before the FCC comes from filings between Starlink and RS Access, which is associated with Michael Dell, who owns a lot of the spectrum in question. But this is part of the larger ongoing battle, and there have been skirmishes that also involved Dish Networks, which is the largest owner of this spectrum.

The FCC will have to somehow untie the Gordian knot on a tough issue. As is to be expected with any use of spectrum, interference is always a major concern. The usefulness of any band of spectrum can be negated by interference, so carriers only want to deploy wireless technologies that have minimal and controllable interference issues. Both sides in the 12 GHz fight have trotted out wireless engineers who support their positions. RS Access says that spectrum can be shared between satellite and terrestrial usage, supporting the idea of not giving more spectrum solely to Starlink. Starlink says the RS Access engineers are lying and wants dedicated spectrum for satellite backhaul. I don’t know how the FCC can sort this out because the only way to really know if spectrum can be shared is to try it.

What I find most unusual about the fight is that the FCC is being dragged into a broadband issue. The last FCC Chairman, Ajit Pai, did his best to wash broadband out of the vocabulary at the FCC. But in today’s world, almost everything the FCC does, other than perhaps chasing robocallers, is ultimately about broadband. While this current 12 GHz fight might look like a spectrum battle to an outsider, it’s all about broadband.


Another Problem with RDOF

I have been critical of the RDOF awards for a number of reasons, but one of the worst problems isn’t being discussed. When the FCC picked the eligible areas for the RDOF awards, there was no thought about whether the grant award areas make any sense as a service area for an ISP. Instead, the FCC picked Census blocks that met a narrow definition of speed eligibility without any consideration of the nearby Census blocks. The result is that RDOF serving areas can best be described as a checkerboard, with RDOF areas scattered among non-RDOF areas.

The easiest way to show this is with an example. Consider the community of Bear Paw in western North Carolina. This is a community of 200 homes, 42 cottages, and 23 condominiums that sticks out on a peninsula in Lake Hiwassee. The community was founded to house the workers who originally built the Tennessee Valley Authority’s Nottley dam on the Hiwassee River. Today’s community has grown from the original cottages. As you might expect for a small town deep in Appalachia, the town has poor broadband, with the only option today being slow DSL offered by Frontier. Residents describe the DSL as barely functional. This is exactly the kind of area where the RDOF awards were supposed to improve broadband.

Below are two maps. The first is printed from the FCC’s RDOF maps – it’s a little hard to read because whoever created the map at the FCC chose a bizarre color combination. The second is a more normal map of the same area. The red areas on the FCC map are the places where RDOF was claimed by an ISP. As you can see, in a community with only 265 households, the FCC awarded RDOF to some parts of the community and not to others.


The checkerboard RDOF award causes several problems. First, any ISP will tell you that the RDOF award areas are ludicrous – it’s impossible for an RDOF winner to build only to the red areas.

And that’s where the second problem kicks in. The RDOF award winner in Bear Paw is Starlink, the satellite company. Starlink is not going to be building any landline broadband. Unfortunately for Bear Paw, giving the award to Starlink makes no sense. All of the lots in Bear Paw are in heavy woods – that’s one of the attractions of living in the community. Everything I’ve read says that satellite broadband from Starlink and others will be sketchy or even impossible in heavily wooded areas.

The obvious solution if Starlink doesn’t work well is for the community to try to find another ISP to build fiber to the community. But getting another ISP to build in Bear Paw won’t be easy. Other federal and state grant programs will not fund the red RDOF areas on the FCC map. Even should Congress pass the infrastructure bill, there might not be enough grant money made available to an ISP to make a coherent business case to build to Bear Paw. The FCC checkerboard awards significantly curtail any future grant funding available to serve the community.

The shame of all of this is that any other grant program would have brought a real solution for Bear Paw. With most grants, an ISP would have proposed to build fiber to the entire community and would have applied for a grant to make that work. But the RDOF awards are going to make it hard, or impossible, to ever find solutions for the parts of the checkerboard that RDOF left behind.

By spraying RDOF awards willy-nilly across the landscape, the FCC has created hundreds of places in the same situation as Bear Paw. The FCC has harmed Bear Paw in several ways. It first allowed a company to win the RDOF using a technology that is not suited to the area. Why wasn’t Starlink banned from bidding in wooded parts of the country? (An even better question might be why Starlink was allowed into the RDOF process at all.) Since no other grants can be given to cover the RDOF areas, there will probably not be enough grant money available from other sources for an ISP to bring fiber to the community. Even if the federal infrastructure funding is enacted and the federal government hands out billions in broadband grant money, towns like Bear Paw are likely going to get left behind. How do you explain to the residents of Bear Paw that the FCC gave out money in a way that might kill their once-in-a-generation chance to get good broadband?


Where are the Gigabit Applications?

I remember that soon after the City of Chattanooga launched its citywide fiber network, the city held a competition seeking web applications that would benefit from gigabit speeds. I don’t recall if anything useful came out of that effort, but I know that even today there are almost no big-bandwidth applications on the web aimed at the average household.

There are always a few people in every community who immediately benefit from gigabit broadband. In many of the cities I’ve worked with, some of the first customers who signed up for residential gigabit service were radiologists and other doctors who appreciate the ability to review medical imagery without having to pop into the hospital in the middle of the night. I know of a few cases where doctors now buy multi-gigabit connections as fast as 10 gigabits because a gigabit isn’t fast enough.

There are always a few others in larger markets that also need the full gigabit capability. This would include scientists and engineers who work with huge data sets. I know people who work from home with movie animation who regularly fill a gigabit connection. But these connections are all work-related and represent people who work with large data at the office and who want the convenience of doing so at home. It’s been eight years since Google Fiber got the whole country talking about gigabit broadband speeds – and yet there still are not any killer gigabit applications.

I think we’re finally on the verge of seeing this change. The demand for faster broadband products leaped upward during the pandemic. According to OpenVault, at the end of the first quarter of 2021, the percentage of homes subscribing to gigabit data products jumped to 9.8% of all homes. This grew from 1.9% of homes in 2018 and 2.8% at the end of 2019. This is a profound market change because having 10% of all households subscribing to gigabit broadband means there is finally a potential market for gigabit applications. A company that develops a high-bandwidth application now can be assured that there are enough possible customers to make it worthwhile.

Another factor that makes us ripe for gigabit applications is the continued growth of gaming. It’s hard for folks of my generation to put gaming into perspective. The gaming industry now dwarfs other entertainment segments like movies or television. In 2020, the gaming industry had revenues of more than $140 billion. That includes $73.8 billion for mobile gaming, $33.1 billion for PC gaming, $19.7 billion for game console gaming, $6.7 billion for extended reality, and even $9.3 billion for paid subscriptions to watch others play games.

The gaming industry made a big change just before the pandemic when the biggest game companies moved games to the cloud. The old phenomenon of kids lining up to buy the latest game release at midnight is a thing of the past as games are now introduced online and played in data centers. I think the early big-bandwidth applications will be related to gaming – which is not unusual, because entertainment has driven the use of bandwidth in the past, and a large percentage of home broadband usage still goes to watching video.

The third factor that I think will drive faster broadband applications is generation Z coming of age. This is the generation that grew up with a smartphone in their hands and good home broadband. This generation adapted immediately to homeschooling – while they badly missed their friends and live activities, they were already living a large percentage of their lives virtually. Generation Z kids are now in college and will soon be out in the world as a major block of consumers. They are likely going to be the first target audience for faster broadband applications and are also likely to be the generation that will create many of the new applications.

It’s likely that the big-bandwidth applications will involve extended reality, which is an umbrella term that covers virtual reality, augmented reality, mixed reality, and other similar technologies. With gigabit-capable homes as customers, we’ll start seeing games that bring telepresence and virtual worlds into the home using big bandwidth. The country is ripe for big-bandwidth applications since 10% of homes are buying gigabit broadband products. We have a huge potential market for innovative gaming since over half of the people in the country now play games at least once a month, with tens of millions who play games regularly. All that is needed now is for a few entrepreneurs to see the potential of developing big-bandwidth games. And as happens with all new technologies, this will grow from a start in gaming to extend to the rest of us.
