California Tackles Middle-Mile Fiber

The California legislature unanimously passed legislation in both the Assembly and Senate to fund $3.2 billion for middle-mile fiber and another $2 billion for last-mile networks. It’s an interesting use of broadband money, and it recognizes something that we don’t talk about enough. There are currently huge federal grants aimed at bringing good last-mile broadband to rural areas, but many rural places in America still lack adequate backhaul connecting them to the Internet.

The California legislation recognizes several things. First, there is a lot of federal money currently aimed at providing last-mile networks, but barely any grant funding currently for middle-mile fiber. Second, I imagine legislators in California have all heard stories about how rural communities today lose all Internet access for hours or days at a time when the single fiber reaching the community gets cut or has an electronics failure.

The best fiber last-mile network in the world can’t function if the fiber that routes traffic to and from the Internet goes out of service. The middle-mile fiber networks in rural America often run on some of the oldest fiber still operating in the country. Many of these routes were built years ago by the big incumbent telephone companies to provide a fiber path to support long-distance traffic. Most of these networks are configured like a wagon wheel, where straight-line fiber paths are built from rural communities to a larger hub city in the region.

Like the rest of the rural networks maintained by the big telcos, many of these fiber routes have been poorly maintained and are likely still using electronics that are past their useful life. I wrote a blog last year about how the communities in northwest Colorado banded together to build an alternative to the inadequate middle-mile network still operated by CenturyLink. The existing fiber would maddeningly go out of service, sometimes for days, and would leave communities and vital institutions like hospitals and public safety with no Internet access. The communities built a new middle-mile network they labeled Project THOR, and almost immediately after activation, the new network saved communities from another big regional middle-mile outage.

https://potsandpansbyccg.com/2020/05/06/funding-middle-mile-fiber/

Rural communities not only need reliable middle-mile networks to deliver traffic to and from the Internet, but these networks must be redundant so that a single fiber cut doesn’t kill broadband for an entire community. That means building fiber rings. Too much of our daily lives now rely on broadband, and it is poisonous to a local economy when broadband access dies for an hour or a day.

The California middle-mile plan anticipates open-access where affordable transport can be provided to any ISP that wants to use the network. The legislation has a dual stated purpose – to first provide reliable broadband access to the rural parts of the state, but secondarily to make it easier for ISPs to serve in rural parts of the state. Buying connectivity on the traditional rural middle-mile networks is often unreasonably expensive and has been a barrier for serving pockets of rural customers.

The new networks will focus on reaching parts of the state where businesses and residents don’t have access to broadband faster than 25/3 Mbps. In most states, these are almost invariably the regions that don’t have adequate middle-mile networks.

It’s always interesting to see a legislature pass something unanimously – it only happens when a topic is indisputably important. Once built, the new middle-mile fiber routes will serve rural California for many decades to come. There won’t be any headlines when this network is functioning and meeting its purpose – instead, there will no longer be news stories about small towns that lost broadband access for a few days.

The Battle Over Grant Rules

There has been a huge battle going on behind the scenes in Washington as Congress wrestles with including broadband grants in the infrastructure bill. Every lobbyist in Washington has been working overtime to try to influence the process. We’ve now seen the Senate’s vision of legislation, but there will still be a big fight when it’s time to reconcile the Senate and House broadband grant rules. Here are some of the key issues being contested in creating final infrastructure legislation:

What is Grant Eligible? This is the biggest area of contention. The big cable companies and telcos only want to see grants being awarded to places that don’t have 25/3 Mbps broadband today. More importantly, they want to see the eligible areas defined by the FCC’s lousy mapping. The big ISPs don’t want to make it easy for communities to make claims that actual broadband speeds are far slower than what has been reported to the FCC.

What the big ISPs definitely don’t want to see is the definitions of unserved and underserved updated to something closer to reality. They do not want to see grant funding available to areas that don’t have 100/20 Mbps speeds today.

The worst possible scenario for the big ISPs is that local communities get to decide which areas need better broadband, as is happening with the ARPA funding that’s been given to cities, counties, and states. They know that cities intend to build fiber to poor neighborhoods, or even to whole cities – and the ISPs want to maintain their monopoly in these areas.

Who Decides the Grant Rules? There is also a lot of arm wrestling about who gets to decide the rules for grants. The big ISPs are not happy that Treasury got to set the rules for ARPA grants because that was a new group of decision-makers who have never been lobbied before. Giving the rulemaking ability to Treasury also meant that the White House could provide input by making its wishes known.

There is no perfect answer to this question from the perspective of the big ISPs. They don’t want Treasury to set the rules again, but there are several alternatives. One is to let the FCC award the funding through another reverse auction, the option with the highest chance of following the FCC mapping. Another is to have the NTIA set the rules; lobbyists know that group well, but it’s likely to be unpredictable if handed tens of billions of dollars. Grant money could be given to the USDA to administer through the ReConnect grant program and the arcane rules at the RUS. Or funding could be given to states to decide, but only if the rules first restrict how freely states can use the money.

Interestingly, I’ve heard credible rumors over the last few months that each of these options has been considered during various permutations of writing the legislation. The key goal for the big ISPs is to be able to influence grant rules, regardless of who dispenses the money. If the rules are set tightly enough, much of the grant money could be unusable – and nothing would please the big ISPs more. For example, if money is divvied up evenly by state, then there are many states that can’t spend a pro rata share of the billions. If grants can strictly be used only in places that can’t get 25/3 Mbps, then it’s probably impossible to spend much of the infrastructure money.

Broadband Speeds for New Infrastructure. A fight over speeds is the same thing as a fight over the technologies that can be built with grant funds. It seems the Senate has accepted a goal for new infrastructure of 100/20 Mbps. This speed enables grant funding to go to cable companies, WISPs, and satellite broadband. If grants can only be used to fund 100 Mbps symmetrical speeds, those technologies are largely eliminated. You may have noted a spate of opinion pieces lately throughout the industry claiming that we don’t need symmetrical speeds. This is what that argument is all about.

Summary. You can quickly see who is winning the lobbying war by skimming through proposed legislation for these critical elements. On one side of the battle are broadband and community advocates who think we should largely use grant money to build fiber. This side argues for symmetrical speeds. They want communities to decide the areas that need better broadband rather than stick with erroneous FCC maps.

On the other side are big ISPs that don’t want to see fiber everywhere. They are pushing for a strict definition of areas that are eligible for grants, and they want technologies that barely meet 100/20 Mbps to be grant eligible. They want the ability to influence the writing of the grant rules.

We are now deep into the sausage-making part of legislation, and all of these issues are still open for debate. The Senate legislation clearly favors the big ISP position. There is still work to be done to get a bill that reconciles the House and Senate plans – and much more infighting to come. But at this point, the big ISPs and their lobbyists are winning the fight – which likely means in ten years, we’ll still be wondering why many parts of the country didn’t get adequate broadband.

A New Fiber Technician Training Program

Back in February I wrote a blog that talked about the need for fiber technicians in the country. Eleven different industry trade associations wrote a letter to the White House and Congress outlining an upcoming crisis due to a shortage of technicians. The group estimated that the industry would need to find 850,000 new technician man-years by 2025. That’s a huge number. Anybody who has been trying to hire technicians lately knows that there is a shortage. I also have heard from many clients who have seen technicians lured away for higher wages.

The Fiber Broadband Association (FBA) has developed a new technician training program being labeled as the Optical Telecom Installation Certification (OpTIC) program. It’s hoping to get vocational schools, community colleges, and veteran training programs to adopt the program. The first launch of the program will be at the Wilson Community College in Wilson, NC.

The training course consists of 144 hours of class and lab work, followed by a 2,000-hour apprenticeship. The key to making this work will be for fiber companies of all types to step up to accept and work with the apprentices. That could be fiber construction companies, ISPs, and anybody else that operates fiber networks. Apprentice programs are great for participants because they get paid during the apprenticeship, with the pay increasing as they progress through the program.

The OpTIC program has been recognized by the U.S. Department of Labor, which means that the program is eligible for state and federal grants. That’s really important right now because communities could use ARPA funding to help establish local training programs. I think every community understands that technical training programs are one of the best paybacks that any community can invest in since this creates higher-paying jobs. The eleven trade groups said in February that the average technician salary is $77,500.

Details of the new training program are on this website. Anybody who completes the program will be certified as an FBA accredited OpTIC technician.

For now, this program is being launched at just the one community college. Hopefully, other institutions around the country will jump on the bandwagon. That’s likely going to take ISPs to step up and partner with local schools to get this started. FBA announced the initiative at its recent Fiber Connect annual meeting in Nashville. Hopefully, a lot of attendees at that meeting carried the idea home for further discussion.

There is no doubt that this is needed. We are already in a superheated industry in terms of the amount of fiber construction that is underway this year. ISPs of all sizes are expanding fiber coverage this year. There is even more fiber construction on the way as the current round of grants kick in, including RDOF, NTIA grants, ReConnect grants, EDA grants, and ARPA grants. The top will really be blown off the industry if Congress adds an infrastructure program to build massive amounts of fiber.

Let’s Not Forget the Lobbyists

Common Cause recently released a report, Broadband Gatekeepers, that describes the influence that lobbyists have on broadband policies. The numbers are staggering – the ISP industry spent $234 million lobbying the 116th Congress (2019 and 2020). That number is likely understated since the rules on reporting lobbying are lax, and enforcement is almost nonexistent. That number doesn’t include the huge amounts of lobbying efforts at State legislatures.

The evidence of lobbying is all around us in the industry, yet we don’t talk about it very much. Consider the massive push for 5G a few years ago by the federal government. Industry lobbyists convinced both Congress and the White House that the country was facing a 5G crisis. The lobbyists injected a sense of urgency under the guise of arguing that we were losing the 5G race to China and that our economy would never recover. It’s hard to fully fault the FCC for passing pro-5G rules when it was being pushed to do so by Congress and the White House, both of which were prodded by lobbyists.

The cellular carriers had some legitimate concerns, and that’s usually the case for most issues that are heavily lobbied. The FCC was dragging its feet in approving new cellular spectrum. Cities were taking a long time to approve the location of small cell sites.

The intense lobbying on 5G paid off. The FCC gave the carriers carte blanche authority to place small cells anywhere and at a low licensing cost. The FCC sped up and pushed through a ton of new spectrum for the industry. But the 5G effort went too far, and there was serious talk about the US Government buying Nokia or Ericsson so that the US could control its 5G future. All of the government reactions to the supposed 5G crisis were crazy when you consider that we still don’t have any phones served with 5G – and the world has not come tumbling down around us.

5G wasn’t the only issue on lobbyists’ plates during the last decade. Intense lobbying got the last FCC to eliminate broadband regulation by killing its Title II authority. State regulatory commissions have largely deregulated the big ISPs over the last decade. The big telcos pocketed most of the $10 billion in CAF II grants with no repercussions. Numerous state legislatures have passed prohibitions that stop municipalities and even electric cooperatives from offering broadband. AT&T decided last year to unilaterally stop selling DSL, with no regulatory pushback that I can see. The big ISPs have regularly redlined poor urban neighborhoods. The big telcos stopped maintaining rural copper networks. The two biggest cable companies are on a trajectory toward a basic $100 broadband product. This list could go on for a few pages. It’s pretty obvious that lobbying has paid off big time for big ISPs and cellular carriers.

The Common Cause report looks at the political spending and lobbying spending of the big ISPs. The report demonstrates how specific lobbying efforts have derailed attempts at broadband regulation. The report specifically focuses on ISP spending during the 116th Congress as an example of how political spending impacts legislation.

Everything detailed in this report is dwarfed by the current lobbying efforts of the big ISPs, which are trying to stave off the real competition that could be funded by the billions in broadband grants raining down on the industry. The big ISPs are genetically opposed to competition in any form, even in the rural markets they wrote off decades ago. There is intense lobbying at every level of government to restrict grant-funded fiber to rural areas only.

The report discusses some commonsense legislation that could put some brakes on lobbying by requiring more openness and disclosures. While this blog looks only at the broadband industry, it’s scary to think that there are many similar lobbying efforts by large corporations throughout the economy.

Federal Broadband Coordination

The White House is now requiring the three agencies involved with broadband funding – the National Telecommunications and Information Administration (NTIA), the Federal Communications Commission (FCC), and the U.S. Department of Agriculture (USDA) – to share information about broadband funding.

The agencies have agreed to share the following information about any location that is receiving or is under consideration for federal broadband funding:

  • Every ISP serving in the area
  • Description of broadband technology and speeds being delivered for each ISP
  • Maps showing the geographic coverage of ISPs
  • Identity and details of any ISP that has received or will receive broadband funds from any of the three agencies.

This kind of coordination seems vital in the current environment, where all three agencies are awarding sizable grants. It’s not hard to imagine different ISPs seeking grants from different federal grant programs to serve the same geographic areas.

But then what happens? Will two agencies collaborate to decide which grant program will make the award? That would add another layer of complexity to grants if a grant application filed with one agency is suddenly conflicting with a grant request at another agency. Will ISPs be informed if discussions are happening behind the scenes between agencies concerning a grant request?

This also raises the issue of different agencies having significantly different grant requirements. We’re already seeing differences among grants in terms of identifying areas that are eligible for grant awards, different definitions of qualifying broadband speeds, different lists of technologies that will or won’t be funded, etc. How can the agencies collaborate if grants trying to serve the same area are following different grant rules? For example, what does collaboration mean when grants at one agency allow for wireless technologies when grants at another agency don’t?

One of the most troublesome aspects of this arrangement is that the agencies are going to share information on existing broadband speeds and coverage. The whole industry understands that the FCC’s database for this data is often badly flawed. Some grant programs today are open to examining facts that prove the errors in the FCC mapping data – but will the FCC be open to having its data challenged by a grant request filed with a different agency? For collaboration to be effective, all three agencies have to be working with the same set of facts.

One of the oddest aspects of this collaborative effort is that it’s only required to last two years and any of the three agencies is free after that to end the collaboration. That makes it sound like somebody doesn’t think this is going to work.

The collaboration sounds like a worthwhile effort if the agencies can work together effectively. But it’s not hard to imagine the collaborative effort adding complexity and possibly even paralysis when considering multiple grants for the same location. How will the three agencies resolve differences between grant programs? My biggest fear is that this effort will add paperwork and time to the grant process without improving the process.

Starry Coming to Columbus, Ohio

I’ve been predicting for several years that wireless broadband is coming to metropolitan areas. There are several factors aligning to support this trend. Most importantly, there have been big advances in millimeter-wave radios that can deliver decent broadband. Second, DSL is clearly on the way out – in fact, AT&T will no longer install DSL or even add new copper voice customers. Finally, the big cable companies feel that they’ve won the broadband battle and are flexing their monopoly power by aggressively raising rates – both Comcast and Charter are on a path to have $100 broadband in just a few years. These factors leave a void for an ISP with low prices and decent broadband.

Perhaps the splashiest wireless company to tackle the market niche is Starry. The company is headed by Chet Kanojia, whom you may remember as the founder of Aereo, which tried to offer a cheap alternative to local programming.

Starry has been operating for several years by beaming gigabit broadband to high-rise apartment buildings in Boston, New York City, Washington D.C., Denver, and Los Angeles. In those markets, Starry has been offering broadband using the 37 GHz spectrum band through a market test license from the FCC.

Starry offers an interesting broadband product. The company always posts its speeds on its website, and as I wrote this article, the average speed for Starry customers is 204 Mbps download and 201 Mbps upload – an attractive product in today’s market. The company offers low prices – in current markets, the price is $50 per month with no contracts, no connect fee, and no gimmicks. The company also provides 24/7 live customer service.

Starry is ready to roll out the next generation of millimeter-wave technology and has chosen Columbus, Ohio as the first market. Rather than offer broadband only to high-rise apartments, the wireless technology will be available to everybody from high-rises to single-family homes and will cover downtown and stretch into nearby suburbs.

Starry is taking advantage of radios that can bounce signals from one customer to the next. This creates a neighborhood mesh network around each base transmitter. Starry is also deploying Time Division Duplex (TDD), which handles upload and download bandwidth differently than most other last-mile technologies. With TDD, both download and upload timeslots are built automatically into the transmission path. This allows a single frequency channel to handle both upload and download functions – the timeslots alternate so rapidly that, from the user’s perspective, one person in a household can be downloading while somebody else uploads at the same time on the same channel. The Starry technology can further vary the number of upload or download timeslots depending upon demand.
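The slot-balancing idea can be sketched in a few lines of code. To be clear, this is purely an illustration of demand-driven TDD slot allocation, not Starry’s actual scheduler – the frame size and demand figures here are made-up assumptions.

```python
# Illustrative sketch (NOT Starry's real scheduler): split a fixed TDD
# frame's timeslots between download and upload in proportion to demand.
def allocate_tdd_slots(dl_demand_mbps, ul_demand_mbps, slots_per_frame=10):
    """Return (download_slots, upload_slots) for one TDD frame."""
    total = dl_demand_mbps + ul_demand_mbps
    if total == 0:
        # Idle link: fall back to an even split.
        dl = slots_per_frame // 2
    else:
        # Proportional split, keeping at least one slot per direction
        # so neither upload nor download ever starves completely.
        dl = round(slots_per_frame * dl_demand_mbps / total)
        dl = max(1, min(slots_per_frame - 1, dl))
    return dl, slots_per_frame - dl

# A download-heavy household gets most of the frame for downloads...
print(allocate_tdd_slots(180, 20))   # -> (9, 1)
# ...while a symmetric Zoom-style load splits the frame evenly.
print(allocate_tdd_slots(50, 50))    # -> (5, 5)
```

This is the contrast with FDD-style systems (and with cable or DSL), where the upload/download split is fixed in the frequency plan and can’t shift with demand.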

Because of the TDD technology, the Starry home receiver is a sophisticated piece of electronics and not a simple receiver. When Starry first launched, the cost of the receiver was nearly $500 but is now approaching $200.

Starry will be launching in Columbus with a $25 introductory price for early adopters but will likely get back soon to a standard $50 rate. Starry has big plans to eventually pass up to 40 million urban households. Starry also won $268.9 million in the RDOF grants to bring broadband to over 100,000 rural homes, with a network that will be a hybrid of both fiber and fixed wireless.

Starry will be the first splashy wireless company to hit urban markets, but it won’t be the last. There is a solid market in every city for reasonably priced broadband with decent speeds. There are folks in every market who want an alternative to the big cable companies, and wireless technology is poised to fill the niche being vacated by DSL. Expect to hear a lot more from Starry and from others that start popping up in urban markets.

Buying Copper Networks

There was a surprise announcement recently that Apollo Global Management is in serious talks to acquire $5 billion of copper assets from Lumen (CenturyLink). This is not a done deal and could fizzle, but it raises the question of why companies would spend anything to buy dying copper networks.

There are some clear downsides to buying an existing copper-based telco. The current DSL technology is obsolete. One has to assume that a buyer plans to walk away from the copper networks as soon as reasonably possible after buying. And that is the major dilemma to overcome. Buying and then upgrading a copper property effectively means paying for the property twice.

Anybody willing to spend $5 billion to buy copper networks and then billions more to upgrade the properties has other options. Why not just go out and overbuild fiber in many dozens of county seats where a fiber competitor could thrive?

I’ve had clients that were faced with the same opportunity over the last decade, and it’s not an easy decision. One of the big upsides of buying is getting the existing revenue stream. But in the case of Lumen, that revenue stream has been dying as households find alternatives to DSL. But the immediate revenue stream, even if small, can help to fund the new property.

Perhaps the big plus to such a transaction is the many county seats and other towns where Lumen operates as an ISP. In most cases, Lumen has already lost the battle against the cable company competitor – but in rural towns it’s not unusual for the cable company network to also be outdated and underperforming. Competing against Charter and Comcast in rural markets is not the same as competing against these companies in upgraded urban markets.

Another interesting upside to buying these properties is that Lumen has some vitally needed fiber. Even where there is no fiber to customers, a fiber network connects all of the small towns in a region together. This backhaul makes it possible to support all of the towns in a region. Another network upside is that new fiber can be overlashed onto old copper wires for a lot lower cost than adding new standalone fiber to poles. Lumen already owns the rights-of-ways, and there would be little or no make-ready costs needed to overlash fiber. From this perspective, buying Lumen almost equates to buying a huge messenger-wire network ready to accept fiber.

Of course, there are big downsides to buying an existing telco. If the buyer is smart, they will want to walk away from much of the rural areas, much like AT&T has recently walked away from DSL. The dilemma is that a buyer is probably going to have to make promises to regulators that it won’t abandon rural areas quickly.

There is also the huge operational challenge of taking over a big-company network. I’ve helped clients do this several times, and it’s a bear. The purchase generally transfers the assets but not the operational systems supporting the assets. Even if those systems somehow come along, the big telcos are using software that is massively out of date and obsolete. Big telcos also are highly decentralized, and many of the functions that support the properties are a challenge to work with during a transition and hard to replace quickly.

We’ve seen big companies fail in the past at consolidating obsolete copper networks. One only has to look at Frontier and Windstream to see how hard it is to make this work. A buyer of this many properties will likely have grandiose plans to overbuild all of the lucrative parts of markets quickly – before somebody else does. But there doesn’t seem to be an easy answer for what to do with the rural copper networks. No big company is ever going to be satisfied with the returns from rural markets, even if it does everything right.

Time to Revisit the Small Cell Order

One of the most controversial orders from the last FCC, chaired by Ajit Pai, was the 2018 order that gave small cell sites deployment priority. That order made two specific rulings. First, the FCC issued a declaratory ruling saying that the FCC has the authority to override local and state regulations concerning small cell deployments. Second, the order created a ‘shot clock’ that requires localities to process small cell applications quickly, while also setting a cap on local fees.

Earlier this year, NATOA (the National Association of Telecommunications Officers and Advisors) and the CWA (Communications Workers of America) released a letter and a report arguing that it’s time to revisit that FCC order. They argued that the timeline set by the order is ridiculously short considering the complexity of some of the installations. They also point out that cellular carriers are not using the FCC order to install ‘pizza boxes’ on poles as they originally promised, but are instead placing devices as large as refrigerators on poles, creating dangerous situations for technicians from other utilities who have to navigate around the large devices. Finally, the letter argues that there is no justifiable reason for setting small cell application fees below cost – cities are being required to subsidize the big cellular companies.

It’s important to put the original FCC order into context before taking a position on the issues raised in this letter. Starting around 2015, the cellular industry declared an emergency and said that the US was falling badly behind China in the race towards 5G. Both the White House and Congress jumped aboard and said that quickly deploying 5G must be a top priority for the US economy. You might recall that the US government went so overboard on the 5G race that there was even talk of the US government buying Nokia or Ericsson so that the US wouldn’t be left behind.

In this environment, where pushing 5G forward was considered a national emergency, it was easy for the FCC to push through this order that gave cellular carriers everything on their wish list concerning small cell deployments. Just six years later, we can see that 5G deployment was not an emergency. None of the big promises made about 5G have materialized, and in fact, the cellular carriers are still struggling to define a business plan that will monetize 5G.

The real reason for the push for 5G was that the 4G cellular networks were getting overloaded – and small cell sites were needed to bolster the existing cellular networks. Everybody relies on our cellular networks, and that was a legitimate reason for the FCC to take action – but the cellular companies never publicly made this argument. The carriers didn’t want the public to know that their 4G networks were in trouble since that would hurt their stock prices. Instead, the cellular companies pulled off one of the biggest public relations scams in history and invented the 5G race to push through regulations that benefitted them.

I agree with the CWA and NATOA that it’s time to put the genie back in the bottle and revisit the small cell order. Like with all regulatory policy disputes, both sides of the issue have some legitimate concerns. The cellular carriers had a legitimate beef when they said that some cities took far too long to process permits for small cell sites. The cities also had legitimate concerns – they wanted some say in the placement and aesthetics of small cell deployments, and they wanted to be able to say no to putting a refrigerator-sized device in the middle of a busy pole line.

It’s time for the FCC to reopen this docket and try again. We now know the kinds of devices that the cellular carriers want to place, and there can be separate rules for placing pizza boxes versus refrigerators on poles. We also now have thousands of examples of the effort required by cities to review and implement small cell requests. A new docket could examine the facts instead of being pushed forward by an imaginary 5G national emergency.

The cellular carriers got everything they wanted, and any regulatory ruling that is this one-sided is almost always a bad one. We now understand that there is no 5G race with China – but we also recognize that cellular carriers have a legitimate need to keep placing small cell sites. It’s time for the FCC to weigh the facts and reissue rules that put a balance between cellular carrier and city interests – because that’s what good regulations are supposed to do.

The Reemergence of Holding Times

There is an interesting phenomenon happening with ISP networks that I don’t see anybody discussing. During the last year, we saw a big change in the nature of our broadband usage in that many of us are connecting to remote work or school servers, or we are connecting to long Zoom calls.

We can already see that these changes have accelerated the average home usage of broadband. OpenVault reports that average broadband usage per home grew from 274 gigabytes per month just before the pandemic to 462 gigabytes per month measured at the end of the first quarter of this year. Since much of the new usage came during the daytime, most ISPs reported that they were able to handle the extra usage. This makes sense because ISP networks in residential neighborhoods were relatively empty during the daytime before the pandemic – adding the additional usage at these non-busy times did not stress networks. Instead, the daytime hours started to become as busy as the evening hours, which have historically been the busiest time for residential networks.

But there is one impact of the way networks are now being used that is affecting ISPs. Before the pandemic, most of the use of the Internet in residential neighborhoods was bursty. People shopped or surfed the web, and each of these events resulted in short bursts to the Internet. Even video streaming is bursty – when you watch Netflix, you’re not downloading a video continuously. Instead, Netflix feeds you short, fast bursts of content that cache on your device and keep you ahead of what you are watching.

But our new network habits are very different. People are connecting to a school or work server with a VPN and keeping the connection for hours. Most Zoom video calls last 30 minutes to an hour. Suddenly, we’re tying up bandwidth resources for long stretches at a time.

In telephone networks, we used to refer to this phenomenon as holding times. Holding times were important because they helped to determine how many trunks, or external connections, were needed to handle all of the demand. A longer holding time for a given kind of traffic meant that more external trunks were needed for that kind of calling. This is pure math – you can fit twice as many calls into an hour if the average holding time is five minutes instead of ten minutes. A telephone company would have multiple kinds of trunks leaving a central office – some trunks for local traffic between nearby exchanges and other trunks for different types of long-distance traffic. Traffic engineers measured average holding times to calculate the right number of trunks for each kind of traffic.
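The trunk-sizing math above can be sketched with the classic Erlang B formula, which traffic engineers used for exactly this purpose. The call volume below is a made-up number for illustration; the recursion itself is the standard textbook method:

```python
# Classic trunk sizing: offered load in Erlangs is
# calls/hour * average holding time (in hours); Erlang B gives the
# probability a call is blocked for a given number of trunks.
def erlang_b(trunks: int, erlangs: float) -> float:
    """Blocking probability via the standard Erlang B recursion."""
    b = 1.0
    for n in range(1, trunks + 1):
        b = (erlangs * b) / (n + erlangs * b)
    return b

def trunks_needed(erlangs: float, target_blocking: float = 0.01) -> int:
    """Smallest trunk count that keeps blocking at or below the target."""
    n = 1
    while erlang_b(n, erlangs) > target_blocking:
        n += 1
    return n

calls_per_hour = 120  # hypothetical demand for one kind of traffic
for holding_minutes in (5, 10):
    load = calls_per_hour * holding_minutes / 60  # offered load in Erlangs
    print(f"{holding_minutes}-min holds: {load:.0f} Erlangs, "
          f"{trunks_needed(load)} trunks for 1% blocking")
```

Doubling the average holding time doubles the offered load, which is why longer holds meant buying more trunks.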

The fact that residents are maintaining Internet connections for hours is having the same kind of impact on broadband networks. The easiest place to understand this is in the neighborhood network. Consider a neighborhood served by DSL that has a DS3 backhaul provided by the telephone company – that’s 45 megabits per second of capacity. Such a connection can support a lot of bursty traffic because requests to use the Internet come and go quickly. But the new, long-duration broadband holding times can quickly kill a DSL neighborhood connection, as we saw during the pandemic. If only 20 homes in the neighborhood (which might consist of 100 homes) connect to a school or work server using a 2 Mbps connection, then 40 of the 45 megabits are fully occupied for that use and can’t be used for anything else. It’s possible for this local network to become totally locked with heavy VPN usage.
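The backhaul arithmetic in that example is simple but stark; here it is as a sketch, using the numbers from the text:

```python
# Backhaul saturation sketch: a DS3 carries roughly 45 Mbps.
ds3_mbps = 45
vpn_homes = 20        # homes holding an always-on work/school VPN
vpn_mbps_each = 2     # steady bandwidth held by each connection

occupied = vpn_homes * vpn_mbps_each
remaining = ds3_mbps - occupied
print(f"{occupied} Mbps locked up by VPNs; "
      f"{remaining} Mbps left for the other 80 homes")
```

Just five megabits remain for everything else the other eighty homes want to do.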

This kind of network stress doesn’t just affect DSL networks, but every broadband technology. The connections inside the networks between homes and the hub have gotten far busier as people lock up Internet links for long periods of time. For technologies like DSL with small backhaul pipes, long holding times have been crippling usage for whole neighborhoods. This is also what swamped the upload backhaul for cable companies. For technologies with larger backhaul bandwidth, this means the backhaul paths are much fuller and will have to be upgraded a lot sooner than anticipated.

This phenomenon will ease somewhat if schools everywhere return to in-person classes. However, it appears that we’re going to continue to have people working from home. And video calling has moved into the mainstream. That means that backhaul connections inside ISP networks are a lot busier than any network engineer would have predicted just two years ago. While some of the extra traffic comes from increased broadband volumes, much of it is related to the much longer customer holding times – a term we’ve never used before with broadband networks.

Hollow Core Fiber

BT, formerly known as British Telecom, has been working with Lumenisity to greatly improve the performance of hollow core fiber. This is fiber that takes advantage of the fact that light travels faster through air than it does through glass. In a hollow core fiber, air fills center tubes surrounded by glass. As can be seen in the picture accompanying this blog, multiple tubes of glass and air are created inside a single fiber, creating a honeycomb effect.

There was news about hollow core fiber a decade ago when a lab at DARPA worked with Honeywell to improve the performance of the fiber. They found then that they could create a single straight path of light in the tubes that was perfect for military applications. The light could carry more bandwidth for greater distances without having to be regenerated. By not bouncing through glass, the signal maintained intensity for longer distances. DARPA found the fixed orientation of light inside the tubes to be of great value for communication with military-grade gyroscopes.

Until the recent breakthrough, hollow core fiber was plagued by periodic high signal loss when the light signal lost its straight-path coherence. Lumenisity has been able to lower signal loss to 1 dB per kilometer, which is still higher than the 0.2 dB loss expected for traditional fiber. However, the lab trials indicate that better manufacturing processes should be able to significantly lower signal loss.
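To put those loss figures in perspective, here is a sketch of how far each fiber type could run inside a fixed optical loss budget. The 30 dB budget is a made-up illustrative number; the per-kilometer losses are the ones quoted above:

```python
# How far a signal can travel before exhausting a loss budget,
# at the per-km attenuation figures quoted above.
loss_budget_db = 30  # hypothetical link budget between regeneration points
reach_km = {name: loss_budget_db / db_per_km
            for name, db_per_km in (("hollow core", 1.0), ("traditional", 0.2))}
for name, km in reach_km.items():
    print(f"{name}: ~{km:.0f} km within a {loss_budget_db} dB budget")
```

At today’s 1 dB per kilometer, hollow core fiber reaches only a fifth as far as traditional glass before needing regeneration, which is why lowering the loss matters so much.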

The Lumenisity breakthrough comes from the ability to combine multiple wavelengths of light while avoiding the phenomenon known as interwave mixing, where different light frequencies interfere with each other. By minimizing signal dispersion, Lumenisity has eliminated the need for the digital signal processors that are used in other fiber to compensate for chromatic dispersion. This means repeater sites can be placed farther apart and can use simpler, cheaper electronics.

Lumenisity doesn’t see hollow core fiber being used as a replacement on most fiber routes. The real benefits come in situations that require low latency along with high bandwidth. For example, the hollow core fiber might be used to feed the trading desks on Wall Street. It might also improve performance for connections leaving big data centers.
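The low-latency appeal comes directly from the speed difference mentioned earlier: light travels at nearly the full speed of light in air, but only about two-thirds that speed in glass. A rough sketch (the refractive index of ~1.47 is a typical value for silica fiber, not a figure from the text):

```python
# Light travels at ~c in air but ~c/n in glass (n ~ 1.47 for silica fiber).
C_KM_PER_MS = 299792.458 / 1000  # speed of light in vacuum, km per millisecond
GLASS_INDEX = 1.47               # typical refractive index of silica fiber

def one_way_latency_ms(km: float, index: float = 1.0) -> float:
    """One-way propagation delay over a route at a given refractive index."""
    return km * index / C_KM_PER_MS

route_km = 100
glass = one_way_latency_ms(route_km, GLASS_INDEX)
air = one_way_latency_ms(route_km)  # hollow core approaches this limit
print(f"{route_km} km route: glass ~{glass:.3f} ms, air ~{air:.3f} ms one way")
```

Shaving roughly a third off propagation delay is meaningless for web browsing but worth real money to high-frequency traders.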

Lumenisity is building a factory in the U.K. to manufacture hollow core fiber and expects to have it in mass production by 2023.