Improvements in Undersea Fiber

We often forget that a lot of things we do on the web rely on broadband traffic that passes through undersea cables. Any web traffic from overseas gets to the US through one of the many underwater fiber routes. As with all fiber technologies, engineers and vendors have been making regular improvements.

The technology involved in undersea cables is quite different from what is used for terrestrial fiber. A long fiber route includes repeater sites where the light signal is refreshed. Without repeaters, the average fiber light signal will die within about sixty miles. Our landline networks rely on powered repeater sites, and for major cross-country fiber routes, multiple carriers often share them.

But an undersea cable has to include the electric power and the repeater sites along with the fiber since the cable may be laid as deep as 8,000 meters beneath the surface. HMN Tech recently announced a big improvement in undersea electronics technology. On a new undersea route between Hainan, China and Hong Kong, the company has been able to deploy 16 fibers with repeaters. This is a huge improvement over past technologies that limited the number of fibers to eight or twelve. With 16 lit fibers, HMN will be able to pass data on this new route at 300 terabits per second.
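
To put those numbers in perspective, here is a quick back-of-the-envelope calculation in Python. The roughly 60-mile repeater spacing and the 300-terabit total come from the figures above; the 500-mile route length is purely a hypothetical illustration.

```python
# Back-of-the-envelope math for an undersea fiber route.
# The ~60-mile repeater spacing and 300 Tbps total come from the
# figures above; the 500-mile route length is a hypothetical example.

REPEATER_SPACING_MILES = 60   # signal needs a refresh before this distance
ROUTE_LENGTH_MILES = 500      # hypothetical route, for illustration only
LIT_FIBERS = 16               # fibers HMN can support with repeaters
TOTAL_CAPACITY_TBPS = 300     # announced capacity for the new route

repeaters_needed = ROUTE_LENGTH_MILES // REPEATER_SPACING_MILES
per_fiber_tbps = TOTAL_CAPACITY_TBPS / LIT_FIBERS

print(f"Repeaters needed for a {ROUTE_LENGTH_MILES}-mile route: ~{repeaters_needed}")
print(f"Capacity per lit fiber: {per_fiber_tbps:.2f} Tbps")
# -> ~8 repeaters, 18.75 Tbps per fiber
```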

Undersea fibers have a rough existence. There is a cut somewhere in the world on undersea fiber every three days. A fleet of ships travels the world fixing undersea fiber cuts and bends. Most undersea fiber problems come from the fiber rubbing against rocks on the seabed. But fibers are sometimes cut by ship anchors, and even occasionally by sharks that seem to like to chew on the fiber – sounds just like squirrels.

Undersea fibers aren’t large. Near shore, the cable is about the width of a soda can, with most of that bulk being tough shielding to protect against the dangers of shallow water. To the extent possible, an undersea fiber will be buried near shore. Farther out to sea, the cable is much smaller, about the size of a pencil – there is no need to protect fibers that sit deep on the ocean floor.

With the explosion in worldwide data usage, it’s vital that the cables can carry as much data as possible. The builders of the undersea routes only count on a given fiber lasting about ten years. The fiber will last longer, but the embedded electronics are usually too slow after a decade to justify continued use of the cable. Upgrading to faster technologies could mean a longer life for the undersea routes, which would be a huge economic benefit.

Technology Neutrality

Christopher Ali, a professor at the University of Virginia, says in his upcoming book Farm Fresh Broadband that technology neutrality is one of the biggest policy failures of our time. I completely agree, and today’s blog explores the concept and the consequences.

Over the last decade, every time a pot of grant money has appeared on the horizon, we’ve heard talk at the FCC about making sure that there is technology neutrality when choosing the winners and losers of federal grants. The phrase had to have been invented by one of the big ISPs because, as is typical of DC politics, technology neutrality means exactly the opposite of what you might think it means.

Technology neutrality is a code word for allowing slower technologies to be funded from grants. The first time I remember hearing the phrase was in 2018, during the lead-up to the CAF II reverse auction. This was a $2 billion reverse auction for locations that hadn’t been claimed in the original FCC CAF II program. Many in the industry thought that federal grant funds ought to only be used to support forward-looking technologies. The term technology neutrality was used to support the argument that all ISPs and technologies should be eligible for grant funding. It was argued (mostly by ISPs that use slower technologies) that the FCC should not be in the game of picking winners and losers.

The technology neutrality proponents won the argument, and the FCC allowed technologies with capabilities as slow as 25/3 Mbps into the reverse auction. The results were what might be expected. Since lower-speed technologies tend to also be the least expensive to build, the slower technologies were able to win in a reverse auction format. It was not surprising at the end of that auction to see that three of the four top winners will collect $580 million to deploy slower technologies. This included fixed wireless providers AMG Technology (Nextlink) and WISPER, as well as high-orbit satellite provider Viasat.

The same argument arose again as the rules were being developed for the RDOF reverse auction. The first auction offered $14 billion in subsidies for ISPs to build last-mile broadband in places that the FCC thought had no broadband with speeds of at least 25/3 Mbps. The FCC heard testimony from the industry about the technologies that should be eligible for the subsidies. In the end, in the name of technology neutrality, the FCC allowed every technology into the reverse auction. The following is a quote from the FCC order that authorized the RDOF funding:

Although we have a preference for higher speeds, we recognize that some sparsely populated areas of the country are extremely costly to serve and providers offering only 25/3 Mbps may be the only viable alternative in the near term. Accordingly, we decline to raise the required speeds in the Minimum tier and we are not persuaded that bidders proposing 25/3 Mbps should be required to build out more quickly or have their support term reduced by half.

Again, it was not surprising to see that the list of RDOF winners included companies that will use the funding to build slower technologies, including fixed wireless and DSL. Only two of the top winners promised to build gigabit-capable broadband everywhere (a consortium of electric cooperatives and Charter). The FCC also decided at the last minute to allow Starlink into the auction – even though nobody knew at that time what speeds it could deliver. The FCC really goofed up the technology issue by allowing some WISPs to bid and grab major winnings in the auction by promising to deliver gigabit speeds with fixed wireless technology – a technology that doesn’t exist for a rural setting.

We recently saw the technology neutrality issue rear its head again in a big way. As the Senate was crafting legislation for a major infrastructure program, the original draft language included a requirement that any technologies built with the money should be able to immediately deliver speeds of 100/100 Mbps. That requirement would have locked out fixed wireless and cable companies from the funding – and likely also satellite companies. In backroom wrangling (meaning pressure from the big ISPs), the final legislation lowered that threshold to 100/20 Mbps.

The reason that Ali says that this is a policy failure is that the broadband policymakers are refusing to acknowledge the well-known fact that the need for broadband speeds continues to increase year after year. We just went through a miserable pandemic year where millions of homes struggled with inadequate upload broadband speeds, and yet the technology neutrality canard was rolled out yet again to justify building technologies that will be inadequate almost as soon as they are built. I would argue that the FCC has an obligation to choose technology winners and losers and shouldn’t waste federal broadband money on technologies that have no long-term legs. The decision by regulators and legislators to allow grant funding for slower technology means that the speed that current ISPs can deliver is being given priority over the speed people need.

The Pandemic and the Internet

Pew Research Center conducted several polls asking people about the importance of the Internet during the pandemic. The Pew survey report is seven pages filled with interesting statistics and a recommended read. This blog covers a few of the highlights.

The Overall Impact of the Internet. 58% of adults said that the Internet was essential during the pandemic – that’s up from 52% in April of 2020. Another 33% of adults say the Internet was important but not essential. Only 9% of adults said the Internet wasn’t important to them. The importance of the Internet varied by race, age, level of education, income, and location.

  • As might be expected, 71% of those under 30 found the Internet to be essential compared to 38% of those over 65.
  • 71% of those with a college degree found the internet to be essential versus 45% of those with a high school degree or less.
  • 66% of those in the upper third of incomes found the Internet to be essential compared to 55% of those in the lower third.
  • 61% of both urban and suburban residents found the Internet to be essential compared to 48% for rural residents.

Video Calling Usage Exploded. Possibly the biggest overall change in Internet usage has been the widespread adoption of video calling. 49% of adults made a video call at least once per week, with 12% doing so several times per day. The usage was most pronounced for those who work from home, with 79% making a video call at least once per week and 35% connecting multiple times per day.

Longing for a Return to Personal Interactions. Only 17% of Americans say that digital interactions have been as good as in-person contacts, while 68% say digital interactions are useful but no replacement for in-person contacts.

Challenges with Online Schooling. Only 18% of households said that online schooling went very well, with 45% saying it went somewhat well. 28% of households reported it was very easy to use the technology associated with online schooling, with another 42% saying it was somewhat easy. Twice as many people in the lower one-third of incomes said online schooling technology was difficult as did those in the upper one-third. Nearly twice as many people in rural areas found online schooling technology to be a challenge compared to suburban residents.

Problems with Internet Connections. 49% of all survey respondents said they had problems with their internet connection during the pandemic, and 12% experienced problems often.

Upgrading Internet. 29% of survey respondents said they did something to improve their Internet connection during the pandemic.

Affordability. 26% of respondents said they are worried about the ability to pay home broadband bills. This was 46% among those in the lower one-third of incomes.

Tech Readiness. 30% of Americans say they are not confident using computers, smartphones, or other connected electronics. This was highest for those over 75 (68%), those with a high school degree or less (42%), and those in the lower one-third of incomes (38%).

Treasury Defines Capital Projects Fund Grants

The U.S. Department of the Treasury finally released the rules for the $10 billion Capital Projects Fund that will be distributed to states for broadband. The full rules are here.

This blog is not going to spit back all of the rules. Those have already been outlined well by others. Here is a great summary from Kevin Taglang from the Benton Institute.

States must apply to Treasury for the funds. The amount that each state can receive is here. A lot of the recently released rules tell states how to go about the process of claiming the money. States must make an application by December 27 and have until a year later to file details of the specific grants made within the state.

It’s hard to think that states won’t pursue this money, although a few small states might have problems finding enough eligible projects. I’m going to concentrate below on a few of the Treasury rules that will carry into state grant rules.

States Will Administer Grants. States will make awards to specific projects. Each state will need a grant program that follows the federal rules for this money. Since these new rules are different from the rules governing many existing state grant programs, states will have to adjust quickly in order to follow these rules for at least this one grant. Some states will need legislative changes if current grant rules were established by the legislature.

Communities and States Can Define Eligible Areas. These grants do not use FCC mapping in determining eligibility. A grant area must only be shown to not have reliable 100/20 Mbps broadband in order to be eligible – and this is a very loose test. Treasury provides wide leeway in defining eligible areas, and almost any reasonable form of proof of poor broadband can suffice to prove an area is eligible. Of course, states will have some say in defining eligible areas, and I foresee a huge tug-of-war over this issue between state grant offices, communities, ISPs, and legislators.

Symmetrical 100 Mbps Speeds. Grant technologies must be able to provide symmetrical 100 Mbps speeds. This is going to cause confusion all over the industry as different grant programs have different speed requirements. It might also require legislative changes in some states. There is a provision that says speeds can be slower where 100/100 Mbps isn’t practical, so expect a lot of challenges by ISPs trying to fund slower technologies.

Eligible Projects. A project must meet all of the following requirements: funds must be spent on capital assets that will enable work, education, and health monitoring; the project must address a critical need that resulted from or was made obvious during the pandemic; and the project must address a critical community need.

Mostly for Infrastructure. Treasury wants a priority for last-mile infrastructure. States can request middle-mile projects, but Treasury must approve. Some money will be allowed for devices, but the state must retain ownership of the devices. Money can go for improvements to government facilities that meet all of the eligibility rules.

No Required Matching. Treasury allows states to fund projects 100%, with no matching. But states might require matching in order to spread the grant benefits to more projects.

Some Prior Costs. Costs back to March 1, 2021 can be included in a grant under some circumstances. This might cover costs like a feasibility or engineering study.

Labor Standards. The rules do not mandate paying Davis-Bacon wages, but they encourage projects to pay a living wage.

Projects Completed by End of 2026. Projects must be completed by then, although Treasury has the ability to grant extensions.

Summary. I expect most states will grab the available funding. This funding should result in a state-administered grant program in every state in 2022 since states have to demonstrate having awarded this money by the end of next year. Since states are likely to put their own twist on these rules, keep an eye out for specific state rules. And start getting projects ready!

You’ve Got Mail

I’ve always been intrigued by the history of technology, and I think a lot of that is due to having almost everything computer-related happen during my lifetime. I missed a tech anniversary earlier this year when email turned 50.

It was April 1971 when software engineer Ray Tomlinson first used the @ symbol as the key to route a message between computers within ARPANET, the birthplace of the Internet. Tomlinson was working on a project at the U.S. Advanced Research Projects Agency that was developing a way to facilitate communications between government computers. There had been transmission of messages between computers starting in 1969, but Tomlinson’s use of the @ symbol has been identified as the birth of network email.

ARPA became DARPA when the military adopted the agency. DARPA kept a key role in the further development of email and created a set of email standards in 1973. These standards include things like having the “To” and “From” fields as headers for emails.

Email largely remained as a government and university protocol until 1989, when CompuServe made email available to its subscribers. CompuServe customers could communicate with each other, but not with the outside world.

In 1993, AOL further promoted email when every AOL customer was automatically given an email address. This led to the “You’ve got mail” slogan, and I can still hear the AOL announcement in my head today.

In 1996, Hotmail made a free email address available to anybody who had an Internet connection. Millions of people got email addresses, and the use of email went mainstream. If you ever used Hotmail, you’ll remember the note at the bottom of every email that said, “P.S. I love you. Get your free email here”. Hotmail was purchased by Microsoft in 1997 and was morphed over time into Outlook. This was one of the first big tech company acquisitions, at $400 million, which showed that huge value could be created by giving away web services for free.

In 1997, Yahoo launched a competing free email service that gave users even more options.

In 2004, Google announced its free Gmail service with the announcement that users could have a full gigabyte of storage, far more than anybody else offered.

Over the years, there have been many communications platforms launched that promised to displace email. This includes Facebook Messenger, WeChat, Slack, Discord, and many others. But with all of these alternate ways for people to communicate, email still reigns supreme and usage has grown every year since inception.

There are over 300 billion emails generated every day. Gmail alone has 1.6 billion email addresses, representing 20% of all people on the planet. In the workplace, the average American employee sends 40 emails each day and receives 121.

The beauty of email is its simplicity. It can work across any technology platform. Messages are plain text with standardized headers, and the MIME standard handles formatting and attachments. Routing is done with SMTP (Simple Mail Transfer Protocol), which allows messages to be sent to anybody else in the world.
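
For anyone curious how simple that machinery still looks in practice, here is a minimal sketch using Python’s standard library. The addresses and the localhost relay are placeholders – a real message would go through your mail provider’s SMTP server.

```python
import smtplib
from email.message import EmailMessage

# Build a message with the standard "To"/"From" headers that date
# back to the 1973 ARPA conventions. MIME handles the formatting.
msg = EmailMessage()
msg["From"] = "alice@example.com"   # placeholder address
msg["To"] = "bob@example.com"       # placeholder address
msg["Subject"] = "You've got mail"
msg.set_content("Happy 50th birthday, email.")

# Hand the message to an SMTP server for routing. "localhost" is a
# stand-in; in practice this would be your provider's relay.
with smtplib.SMTP("localhost") as server:
    server.send_message(msg)
```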

On the downside, the ease of email spawned spam, as marketers found that they could sell even the most bizarre products if they sent enough emails. In recent times, emails have been used to implant malware on a recipient’s computer when the recipient opens a malicious attachment.

One downside for the future of email is that many Americans under 30 hate using it. We’ll have to see over time if email gets displaced, but it would be a slow transition.

But email is still a powerful tool that is ingrained in our daily lives. Email was one of the early features that lured millions into joining the web. So happy birthday, email.

Using Private Rights-of-Way for Fiber

As if the broadband industry didn’t already have enough obstacles, a new issue has arisen in Virginia. A couple in Culpeper County, John and Cynthia Grano, have sued the Rappahannock Electric Cooperative to stop it from putting fiber on existing pole lines that are located on a private easement.

To put this lawsuit into perspective, Virginia law in the past required a utility to negotiate a private easement to gain the right to place utility networks on private land. But in 2020, the legislature passed a new law that allows electric and communications utilities to add fiber along existing aerial and buried rights-of-way without getting additional permission from property owners. This law was passed to make it easier to build fiber in rural Virginia.

The Cooperative was getting ready to embark on a $600 million rural fiber project to bring broadband to rural customers, but this lawsuit has caused the Cooperative to halt plans for now.

As is usual with lawsuits, there are always additional facts to consider. The rights-of-way in question are not along a road in a public right-of-way. Instead, the fiber route cuts across the landowner’s property, which also is the site for one of the Cooperative’s electric substations. Prior to the law being passed, the Cooperative had offered a $5,000 fee to use the rights-of-way on the property.

It might seem logical that the new law would have preempted this kind of lawsuit – because this situation is exactly what legislators had in mind when they passed the law. But I’ve learned in this industry that a new law is only truly secure after the law has been successfully tested in court.

This case has already made it through the first round of the courts, where a U.S. District Court sided with the property owners. The ruling said that the new law stripped the property owners of existing rights that had been established in the 1989 easement agreement with the Cooperative. The court said that landowners had lost property value even without the Cooperative trying to hang new fiber on existing easements.

This lawsuit has to bring a chill to any fiber builder in the country that relies on private rights-of-way and easements to build their project. The right to use public rights-of-way has been long established and cemented by challenges to laws early in the last century. This new Virginia law tried to grant the same status to private easements that have always been given to public rights-of-way – and that is a new area of law.

I have to assume that for this issue to stop the fiber expansion, the Cooperative must have a lot of electric lines that use private rights-of-way. Electric grids routinely cross private land – the large tower transmission grids mostly use private rights-of-way, and utilities rarely build high-voltage routes along public roads. If the issue was only with this one farm, the Cooperative could probably bypass it, but I’m sure the issue applies to many other properties as well.

The lawsuit should raise a red flag for any ISP that has rights-of-way on private land. There are a lot more private easements in place than you might suppose. Many subdivisions own their own roads. Private roads are routine in rural areas. ISPs routinely rent land for huts and cabinets.

None of this will be any comfort to the many households that were slated to get fiber broadband. Electric cooperatives like Rappahannock are leading the way in much of rural America in bringing fiber to areas with little or no current broadband. Virginia has a state goal of solving the rural broadband gap by the end of 2024, and this lawsuit will put a damper on those plans. Just a little side note that will drive broadband advocates crazy: the property owner in this case has subscribed to Starlink and is not impacted by having to wait for better broadband.

Preparing for Storm Damage

After every major hurricane, like the category 4 Ida that recently hit Louisiana, there is talk in the telecom and power industries about ways to better protect our essential power and communication grids. Ida caused major damage to grids and networks in Louisiana from hurricane winds and storm surges, and massive flooding in the mid-Atlantic from western Maryland to New York City.

One thing that we’ve learned over time is that there is no way to stop storm damage. Strong hurricanes, tornados, floods, and ice storms are going to create damage regardless of the steps we might take to try to keep aerial utilities safe. What matters most is the amount of time it takes to make repairs – obviously, the shorter, the better.

A natural impulse is to bury all of the aerial utilities. However, the cost to bury wires in many places is exorbitant. There are also issues during some weather events from buried facilities. It takes a long time and a lot of effort to find and fix problems in flooded areas. Buried electric lines are also sensitive to the saltwater corrosion that comes from storm surges from big coastal storms.

The Electric Power Research Institute (EPRI) has been working on ideas to better protect wires, poles, and towers during big storms. EPRI operates an outdoor laboratory in Lenox, Massachusetts, to create simulations of storm damage. EPRI’s goal is to find techniques that either minimize storm damage or shorten the time needed to make repairs.

EPRI’s research is intriguing to anybody who’s been in the industry. For example, they are exploring ways that towers and poles can be made to collapse in pre-planned ways rather than be destroyed by winds. A controlled collapse could avoid all of the problems of snapped and dangerous power lines. If done properly, EPRI hopes there would be a way to stand up a downed pole in hours instead of days.

They are exploring a similar line of research, looking at ways for aerial wires to disconnect from poles before drastic damage occurs. This would stop the domino effect of multiple poles being broken and dragged down by a heavy tree landing on a pole span. It would be a lot easier to put fallen wires back onto poles than to untangle and splice wire breaks caused by catastrophic damage.

EPRI is also exploring other aspects of effectuating storm damage repair. For example, there is a white paper on its site that looks at the effectiveness of alternate telecommunications channels that would let key players at a utility communicate and coordinate after a storm. All of the normal modes of communication are likely to go dead when the wires and towers come tumbling down. The white paper looked at using GEO satellites, LEO satellites, and HF radios to communicate. The goal was to find a communications medium that would allow for a 3-way call after more conventional communication paths are out of service. The best solution was the high-orbit GEO satellites.

This kind of research is both important and interesting because coordination of repair efforts is one of the biggest challenges after every disaster. A utility can have standby crews ready to work, but nothing gets done until somebody can tell them what most needs to be addressed.

The Beginnings of 8K Video

In 2014 I wrote a blog asking if 4K video was going to become mainstream. At that time, 4K TVs were just hitting the market and cost $3,000 and higher. There was virtually no 4K video content on the web other than a few experimental videos on YouTube. But in seven short years, 4K has become a standard technology. Netflix and Amazon Prime have been shooting all original content in 4K for several years, and the rest of the industry has followed. Anybody who purchased a TV since 2016 almost surely has 4K capabilities, and a quick scan of shopping sites shows 4K TVs as cheap as $300 today.

It’s now time to ask the same question about 8K video. TCL is now selling a basic 8K TV at Best Buy for $2,100. But like with any cutting-edge technology, LG is offering a top-of-the-line 8K TV on Amazon for $30,000. There are a handful of video cameras capable of capturing 8K video. Earlier this year, YouTube provided the ability to upload 8K videos, and a few are now available.

So what is 8K? The 8K designation refers to the number of pixels on a screen. High-definition TV, or 2K, allowed for 1920 x 1080 pixels. 4K grew this to 3840 x 2160 pixels, and the 8K standard increases pixels to 7680 x 4320. An 8K video stream has four pixels in the space where a 4K stream had a single pixel – and sixteen in the space of a single high-definition pixel.
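
The pixel math is easy to verify. A quick sketch using the resolutions above:

```python
# Pixel counts for the standard resolutions discussed above.
resolutions = {
    "HD (2K)": (1920, 1080),
    "4K":      (3840, 2160),
    "8K":      (7680, 4320),
}

hd_pixels = 1920 * 1080
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / hd_pixels:.0f}x HD)")
# -> HD: 2,073,600 (1x), 4K: 8,294,400 (4x), 8K: 33,177,600 (16x)
```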

8K video will bring not only higher clarity but also a much wider range of colors. Video today is captured and transmitted using a narrow range of red, green, blue, and sometimes white pixels that vary inside the limits of the Rec. 709 color specification. The colors our eyes perceive on the screen are basically combinations of these few colors along with current standards that can vary the brightness of each pixel. 8K video will widen the color palette and also the brightness scale to provide a wider range of color nuance.

The reason I’m writing about 8K video is that any transmission of 8K video over the web will be a challenge for almost all current networks. Full HD video requires a video stream between 3 Mbps and 5 Mbps, with the highest bandwidth needs coming from a high-action video where the pixels on the stream are all changing constantly. 4K video requires a video stream between 15 Mbps and 25 Mbps. Theoretically, 8K video will require streams between 200 Mbps and 300 Mbps.

We know that video content providers on the web will find ways to reduce the size of the data stream, meaning they likely won’t transmit pure 8K video. This is done today for all videos, and there are industry tricks used, such as not transmitting background pixels in a scene where the background doesn’t change. But raw 4K or 8K video that is not filtered to be smaller will need the kind of bandwidth listed above.

There are no ISPs, even fiber providers, who would be ready for the large-scale adoption of 8K video on the web. It wouldn’t take many simultaneous 8K subscribers in a neighborhood to exhaust the capacity of a 2.4 Gbps node in a GPON network. We’ve already seen faster video be the death knell of other technologies – people were largely satisfied with DSL until they wanted to use it to view HD video, at which point neighborhood DSL nodes got overwhelmed.
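
Here is a rough sketch of that neighborhood math, using the per-stream estimates above against a 2.4 Gbps GPON node. This ignores protocol overhead and the burstiness of real video streams, so treat the results as optimistic upper bounds.

```python
# How many simultaneous video streams fit in a 2.4 Gbps GPON node?
# Per-stream rates come from the ranges above (upper end for HD and
# 4K, midpoint for the theoretical 8K estimate). This ignores all
# overhead, so real-world numbers would be lower.

NODE_CAPACITY_MBPS = 2_400

streams_mbps = {"HD": 5, "4K": 25, "8K": 250}

for name, mbps in streams_mbps.items():
    count = NODE_CAPACITY_MBPS // mbps
    print(f"{name} @ {mbps} Mbps: ~{count} simultaneous streams")
# -> HD: 480, 4K: 96, 8K: only ~9 streams before the node is exhausted
```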

There were a lot of people in 2014 who said that 4K video was a fad that would never catch on. With 4K TVs at the time priced over $3,000 and a web that was not ready for 4K video streams, this seemed like a reasonable guess. But as 4K TV sets got cheaper and as Netflix and Amazon publicized 4K video capabilities, the 4K format became commonplace. It took about five years for 4K to go from YouTube rarity to mainstream. I’m not predicting that 8K will do the same thing – but it’s possible.

For years I’ve been advising to build networks that are ready for the future. We’re facing a possible explosion over the next decade of broadband demand from applications like 8K video and telepresence – both requiring big bandwidth. If you build a network today that is not contemplating these future needs, you are looking at being obsolete in a decade – likely before you’ve even paid off the debt on the network.

Will Hyper-inflation Kill Broadband Projects?

The broadband industry is facing a crisis. We are poised to build more fiber broadband in the next few years than has been built over the last four decades. Unfortunately, this peak in demand hits a market that was already superheated, and at a time when pandemic-related supply chain issues are driving up the cost of broadband network components.

The numbers I am hearing from clients are truly disturbing. Just in the last few weeks, I’ve heard that the cost of conduit and resin-based components like handholes and pedestals is up 40%. I’ve heard mixed messages on fiber – some say prices are up as much as 20%, while others are seeing little in the way of price increases. I think the increases have to do with the specific kind of fiber being purchased as well as the history of the buyer – the biggest increases are hitting new or casual fiber buyers. I’ve heard the cost of fiber-related components like pigtails is also way up.

The kinds of numbers I’m hearing can only be classified as hyper-inflation. It’s way outside the bounds of normalcy when the cost of something is up 20% to 40% in a year. I’ve been listening to a lot of economists lately who say that many price increases that are due to the pandemic are temporary in nature and that what we are seeing is a price bubble – they predict prices ought to revert to old levels over time in competitive markets where multiple providers of components will be bidding for future business.

But I keep looking at the upcoming plans for the country collectively to build fiber, and it looks like the industry might stay superheated for the rest of this decade. Even when much of the rest of the economy gets back to normal, it’s not hard to envision fiber staying that way.

This leads me to ask: is this hyper-inflation going to kill fiber projects? I start with the RDOF winners – the amount of subsidy they get is fixed and won’t be adjusted for higher costs. At what point do some of the RDOF projects stop making sense? The business modelers for these companies must be working overtime to see if the projects still work with the higher costs. It won’t be shocking to see some of the RDOF winners give up and return Census blocks to the FCC.

But this same thing affects the winners of every other grant. Consider the recent grant filings with the NTIA. Those were some of the most generous grants ever awarded, with the NTIA program picking up as much as 75% of the cost of a project. What happens to the winners of those grants if materials are much more costly when they go to build the project? Any extra cost must be borne by the grant winner, meaning that the real matching could be a lot more than 25%. Some grant winners are going to have a hard time finding extra funding to cover the extra costs. Some of these projects are in extremely rural places, and one has to worry that having to pay extra might make the difference between a project making sense and not. Even with grants, it’s often a fine line between a project being feasible or not.
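
To see how quickly a cost overrun erodes a generous grant, consider a simple worked example. The 75% grant share comes from the NTIA program described above; the project cost and the 30% overrun are hypothetical.

```python
# How a cost overrun changes the effective matching on a fixed grant.
# The 75% grant share comes from the NTIA program described above;
# the $10M project cost and 30% overrun are hypothetical.

original_cost = 10_000_000
grant = 0.75 * original_cost            # fixed at award time: $7.5M

overrun = 0.30                          # hypothetical 30% cost increase
actual_cost = original_cost * (1 + overrun)

winner_share = actual_cost - grant
print(f"Grant winner pays ${winner_share:,.0f} "
      f"({winner_share / actual_cost:.0%} of the real cost, not 25%)")
# -> $5,500,000, about 42% of the real project cost
```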

This same worry has to be spreading through the big ISPs. Companies like Frontier, Windstream, and Lumen are betting their future viability on building more fiber. How do those plans change when fiber is a lot more expensive?

The worst thing is that we have no idea where prices will peak. We’ve not really yet seen any market impact from RDOF and other big grant programs. We’ve seen some impact from CARES Act spending, but that was a drop in the bucket compared to what we’re likely to see from ARPA and federal infrastructure spending.

I have a lot of nervous clients, and I have no good advice for them on this issue. Should they buy as much fiber and components as they can now before prices rise even more, or should they wait and hope for parts of the market to return to normalcy? What we’re experiencing is so far outside the bounds of normal that we have no basis for making decisions. I chatted with a few folks recently who speculated that the best investment they could make this year would be to buy $1 million of fiber reels and sit on them for a year – they might be right.

Video Meetings are the New Normal

One of the big changes that came out of the pandemic will have a permanent impact on broadband networks. Holding online meetings on Zoom, Microsoft Teams, GoToMeeting, and other video platforms has become a daily part of business for many companies.

This article in the New York Times cites predictions that businesses will cut down on travel by 20% to 50%. This will have a huge impact over time on the airline and hotel industries. As a lifelong road warrior, I recall the relief every year when the school year started back in September and airports returned mostly to business travelers. It will be interesting to see whether airports really do get more deserted during the business-only travel months in the future.

But the real boon for businesses from less travel will be lower expenses and increased productivity. I can’t add up the number of times that I traveled somewhere for a one or two-hour meeting – something that has now fallen off my radar. We’re going to replace rushing to make a flight with the use of broadband.

What is interesting is how hard we tried in the past to make video conferencing into an everyday thing. Everybody my age remembers the AT&T commercials from 1993 that predicted that video conferencing, working remotely, digital books, and GPS navigation would become a part of daily life. Most of the predictions made by those commercials became reality much sooner than common video calling did. Whole new industries have been built around digital books, and GPS is seemingly built into everything.

The business world fought against video conferencing. I recall a client from 20 years ago who had invested in an expensive video conference setup and insisted on either meeting in person or holding a video conference. I recall the hassle of having to rent a local video conferencing center to talk to this client – but even then, I could see how that expense was far better than a wasted day in an airport and a night in a hotel.

I don’t know how typical my workday is, but I probably average 3 hours per day on video calls. I always hated long telephone calls, but I like the experience of seeing who I’m talking to. Talking with clients and colleagues multiple times through video chat, rather than at an occasional live meeting, has created real bonds.

A few weeks ago, I wrote about the concept of broadband holding times to account for the fact that we are tying up broadband connections for hours with video chats or connecting to a work or school server. I’m not sure that we’ve fully grasped what this means for broadband networks. Most network engineers had metrics they used for estimating the amount of bandwidth required to serve a hundred or a thousand customers. That math goes out the door when a significant percentage of those customers are spending hours on video chats that use a small but continuous 2-way bandwidth connection.
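
Here is a rough illustration of why the old math breaks down. Every number in this sketch is a hypothetical stand-in, not an engineering standard – the point is the shape of the problem, not the specific values.

```python
# Classic capacity planning assumes bursty usage that can be
# oversubscribed; a continuous video call can't be. All numbers
# below are hypothetical illustrations.

customers = 100
sold_speed_mbps = 100
oversubscription = 20          # old rule of thumb: plan ~1/20 of sold speed

planned_total = customers * sold_speed_mbps / oversubscription

video_stream_mbps = 2          # small but continuous two-way stream
video_share = 0.40             # 40% of customers on calls at the same time

# The video portion is a hard floor - it is there every second of the
# call and can't be statistically multiplexed away like bursty traffic.
video_floor = customers * video_share * video_stream_mbps

print(f"Planned node capacity: {planned_total:.0f} Mbps")
print(f"Un-multiplexable video floor: {video_floor:.0f} Mbps "
      f"({video_floor / planned_total:.0%} of the plan, before any other traffic)")
```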

We’re not likely to fully grasp what this means for another year, until the pandemic is fully behind us and companies settle into a new normal. I know I’m not going to be in airports in the future like I was in the past, and many people I’ve talked to feel the same way.