The Dumb Pipe Question

Every few years I read something that resurrects the old question of whether ISPs should be dumb pipe providers or something more. Some ISPs have fought against the idea of being dumb pipe providers and want to believe they are far more than that. The latest event that raises this question anew is AT&T’s debacle with ditching DirecTV and WarnerMedia. AT&T was clearly not content with being considered as only a dumb pipe provider. The company was lured by the perceived higher earnings of both cable companies and media companies, and AT&T went on a buying spree and purchased both DirecTV and WarnerMedia.

At the time of the DirecTV purchase, when AT&T paid $67 billion for the satellite company, there were already rumblings in the industry about cord-cutting. There hadn’t been any evidence of large numbers of customers dropping traditional cable TV, but the industry was already in a holding pattern of zero net growth, with new customers roughly equaling customers who were ditching traditional TV. Since the DirecTV purchase, cord-cutting materialized with a fury as the traditional cable industry lost over 13 million cable subscribers.

The lure for an ISP to become a media company has hovered over the industry for over twenty years. Those of us who were in the industry in 2000 still remember being flabbergasted by the merger of AOL and Time Warner. The merger was blessed by Wall Street and by the consensus of analysts that the Internet was going to subsume media and that the merger was a defensive move by Time Warner. But it was hard to picture a path where the combined companies could grow to justify the astronomical $350 billion valuation awarded by the stock market at the time of the merger. And sure enough, the wheels quickly came off in what was possibly the worst merger of all time.

AT&T was also lured by the continued growth in the valuation of media companies. The stocks of media companies like Disney climbed in value year after year while AT&T’s value stagnated. AT&T was convinced that the merger with Time Warner would put the company’s stock on an upward trajectory like other media companies.

Underlying AT&T’s decision in both purchases to branch out was dissatisfaction with being viewed by Wall Street as a dumb pipe provider. AT&T is the ultimate dumb pipe provider, with a huge base of cellular and broadband customers – all of whom buy basic connectivity from the company.

AT&T was obviously jealous after watching companies like Apple and Google profit by putting apps on AT&T’s phones. AT&T was equally unhappy to see companies like Disney prosper from sending video signals over AT&T copper and fiber. I believe the entire AT&T debacle boils down to a company that did not want to be perceived as only providing dumb pipes. I think it’s that simple.

But something happened in the industry in recent years while AT&T lost over $90 billion from the two acquisitions in just five years. In recent years, the valuation of fiber-based dumb pipe providers is up significantly. In the last year the industry has seen transactions for fiber-based ISPs getting huge valuations. I honestly can’t fathom some of these high valuations any more than I could understand the AOL / Time Warner valuation. But the current high valuation for fiber networks is real since there are investors willing to pay big prices to get fiber companies.

All of the big ISPs have grasped this fundamental market shift. Most of the big ISPs have announced strategies to build significant amounts of fiber this year and next year. AT&T is building fiber past 3 million more homes this year. Verizon is on a tear and says it will build fiber-to-the-curb past 25 million homes by 2025. We see big fiber expansion plans from Charter, CenturyLink, Altice, Frontier, Windstream, and a long list of others. All of a sudden, everybody wants to be a bigger dumb pipe provider.

It’s going to be interesting to see if this trend continues. For now, investors are betting that fiber companies will beat the cable companies in the broadband market – there is no other way to explain the higher valuations. The cable companies have thrived during a decade of lopsided competition against telephone DSL. Are the cable companies faced with being on the opposite side of the competitive battle and seeing fiber become the consumer choice? As always, this industry continues to provide interesting trends to watch.

A Rural Broadband Laboratory

The National Science Foundation, along with the US Department of Agriculture, is creating a broadband testbed in and around Ames, Iowa. The program is part of NSF’s Platforms for Advanced Wireless Research (PAWR) program. This is the fourth wireless test site in the country and the first to be placed in a rural setting. The PAWR programs are a great example of public/private partnerships that to date have attracted over $100 million in private and government investments in research.

This project will provide an outdoor laboratory for engineers and scientists to explore ways to maximize the benefit of new wireless technologies for agriculture. Additionally, new technologies will be deployed throughout the college community of Ames.

The PAWR projects, to date, have included the participation of over 35 wireless providers and vendors. This project has already attracted several universities in addition to Iowa State University, including the University of California at Irvine and the Ohio State University. John Deere will be participating in the testbed along with U.S. Cellular, the Iowa Regional Utilities Association, and the Iowa Department of Transportation. The experiments will include participation from students from Iowa State as well as from local schools. Also participating will be Woodland Farms and the Meskwaki Tribal Nation.

Formal testbeds are always interesting because the FCC generally grants test licenses for scientists to experiment with radio frequencies in ways that may not be on the radar for the big carriers. The project includes $8 million to construct a wireless network that will cover nearly 600 square miles in and around Ames. One of the concepts to be explored is the collaboration potential and interaction between satellite broadband, existing wireless networks, and new wireless technologies.

Scientists will be experimenting with technologies involved in precision agriculture including drones, self-driving farm machinery, and an array of environmental sensors. One of the first experiments will involve identifying weeds for automatic eradication using high-resolution video. Field sensors will transmit live pictures to the cloud to allow for accurate identification of weeds. Training robots to mechanically eliminate weeds would mean a drastic reduction in the use of herbicides in the food chain.

The project will also step outside of agriculture and look at technologies and applications that can expand wireless coverage in rural areas. This will involve experimenting with hybrid networks that use different frequencies and wireless technologies in unison to bring stronger broadband signals to the fields and areas where it is most needed.

These kinds of experimental sites are always interesting and exciting because ideas tested in programs like this end up as everyday technology a decade from now. Giving scientists and engineers a large outdoor laboratory provides them with a way to test ideas in ways that can’t be explored in the lab or in small testbeds.

Grant Money Should Build for the Future

I filed comments with the Department of Treasury concerning the questions they asked about broadband speeds in the Interim Final Rules for how states, cities, and counties can spend the ARPA grant funds. The following is the gist of my comments.

Question 22: What are the advantages and disadvantages of setting minimum symmetrical download and upload speeds of 100 Mbps? What other minimum standards would be appropriate and why?

Treasury is asking the wrong question when asking about current speed requirements. Federal grant money should only be used to build technology that will be capable of meeting broadband demands at least a decade from now. A technology deployment built to meet today’s speed requirements starts being obsolete almost immediately after it’s constructed.

I think 100 Mbps download is an adequate definition of broadband in 2021, and I doubt that there will be many arguments against a 100 Mbps requirement since most currently deployed technologies can deliver this speed.

There will be a huge outcry against 100 Mbps upload speeds since major technologies like cable company HFC networks, fixed wireless, and fixed cellular can’t deliver fast upload speeds. Treasury can’t be swayed by this argument – grant money should only be used to deploy technology that meets public broadband demand both today and into the future. ISPs are free to use their own money to deploy any technology – but federal money is precious and should be held to a higher standard.

In looking out only a decade, and using a conservative 21% annual growth rate in broadband speeds, the definition of broadband a decade from now should be at least 600/200 Mbps. That’s what I recommend as a reasonable goal for federal grant funding.

Question 24: What are the advantages and disadvantages of setting a minimum level of service at 100 Mbps download and 20 Mbps upload in projects where it is impracticable to set minimum symmetrical download and upload speeds of 100 Mbps? What are the advantages and disadvantages of setting a scalability requirement in these cases? What other minimum standards would be appropriate and why?

As described earlier, I think federal grant funding ought to be used to support a network that will still be viable a decade from now. My best guess of the upload requirement for a family of four today is between 30 Mbps and 40 Mbps. Looking forward a decade, that means the upload speed requirement for a federal grant should be between 200 Mbps and 270 Mbps.
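The projections above are simple compound growth. Here is a minimal sketch of the arithmetic, assuming the 21% annual growth rate mentioned earlier (the `projected_speed` helper is just for illustration):

```python
# Sketch of the compound-growth math behind the speed targets above.
# Assumes a 21% annual growth rate in household broadband demand.
GROWTH = 1.21
YEARS = 10

def projected_speed(current_mbps: float) -> float:
    """Project a speed requirement forward by compounding annual growth."""
    return current_mbps * GROWTH ** YEARS

print(round(projected_speed(100)))  # download: 100 Mbps today -> ~673 Mbps
print(round(projected_speed(30)))   # upload low end: 30 Mbps -> ~202 Mbps
print(round(projected_speed(40)))   # upload high end: 40 Mbps -> ~269 Mbps
```

Rounding the download figure down gets you to the 600/200 Mbps recommendation above.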

I think there are ISPs using major technologies like cable HFC networks, fixed wireless networks, and fixed cellular networks that are going to suggest that the proper upload speed for a grant should be 20 Mbps or less. Our vast experience of conducting broadband surveys all over the country during the pandemic showed us that a 20 Mbps upload path is already not always adequate today for a home with multiple people working or schooling from home at the same time. If Treasury sets the definition as low as major industry players are likely to suggest, then those networks will launch as already inadequate and will become badly obsolete as time goes by.

Question 25: What are the advantages and disadvantages of focusing these investments on those without access to a wireline connection that reliably delivers 25 Mbps download by 3 Mbps upload? Would another threshold be appropriate and why?

I think Treasury has identified one of the biggest problems with previous federal broadband grants by now saying that the test for grant eligibility is that an ISP can “reliably” deliver at least 25/3 Mbps. The reality is that much of the technology that is reported to the FCC today as being capable of 25/3 Mbps delivers far slower speeds.

We need to talk about real-life networks using three different definitions of speed – maximum speed, minimum speed, and marketing speed. The maximum speed is the fastest speed that a given technology can achieve in ideal conditions. But network conditions are rarely ideal, except perhaps when a single home is using a node at 3:00 AM – and even then, there could be slowdowns from node congestion outside of the neighborhood.

Minimum speeds are something we’ve always referred to as actual speeds. These are the speeds we see on speed tests, and they rarely equal the maximum speed.

Marketing speeds are something else altogether, and some ISPs advertise numbers close to actual speeds while others advertise purely fictional speeds that are greater than the maximum speeds. Unfortunately, the FCC allows ISPs to report marketing speeds, and this is one of the big contributors to the lousy FCC mapping data.

Areas should qualify for federal grant funding based upon the minimum speeds actually delivered to customers and should not use the maximum theoretical speed or the advertised marketing speeds.
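As a sketch of what qualifying on delivered speeds might look like, here is a hypothetical eligibility check that works from actual speed-test results rather than marketing claims. The 80th-percentile rule and the sample data are my own illustrative assumptions, not anything Treasury has proposed:

```python
# A minimal sketch (hypothetical rule and data) of testing grant eligibility
# on measured speeds rather than marketing or maximum speeds.
from statistics import quantiles

def eligible_for_grant(measured_mbps: list[float],
                       threshold_mbps: float = 25.0) -> bool:
    """An area qualifies if the speed actually delivered to most customers
    falls below the threshold; here the 80th-percentile speed-test result
    stands in for the speed that is 'reliably' delivered."""
    p80 = quantiles(measured_mbps, n=5)[3]  # 80th percentile of speed tests
    return p80 < threshold_mbps

# The marketing speed might be 25/3, but measured tests tell another story:
tests = [8.2, 11.5, 9.7, 14.1, 12.3, 10.8, 7.9, 13.6]
print(eligible_for_grant(tests))  # True: measured speeds fall short of 25 Mbps
```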

My full comments are here: CCG Consulting Comments on Broadband Speeds.

OpenRoaming WiFi

AT&T recently conducted a test of WBA OpenRoaming WiFi in the historic 6th Street district of Austin, Texas. This is a technology that allows a WiFi user to roam between public hotspots without having to log in to each new hotspot.

WBA stands for the Wireless Broadband Alliance, and the vision is that this technology will work for any ISPs or carriers that join the WBA federation. If the technology works as promised, and if multiple large carriers join the WBA federation, a user might seamlessly bounce between WiFi routers supported by different carriers.

One of the key aspects of the test is the use of Passpoint technology that was developed nearly a decade ago by the Wi-Fi Alliance. This is software that provides and approves credentials for a WiFi user across different networks. Quick authentication is vital to the idea of roaming between hotspots operated by different ISPs and carriers. This allows WBA OpenRoaming to verify users quickly without having to rely on the MAC address of the user’s device.
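For a concrete sense of what this looks like at the access-point level, here is a minimal hostapd-style configuration sketch that turns on 802.11u interworking and Hotspot 2.0 (Passpoint). The SSID, RADIUS address, and roaming consortium identifier are placeholders for illustration, not actual WBA OpenRoaming values:

```ini
# Hypothetical hostapd configuration sketch - enables 802.11u interworking
# and Hotspot 2.0 (Passpoint) so clients can authenticate automatically.
interface=wlan0
ssid=ExampleVenueWiFi              # placeholder SSID
wpa=2
wpa_key_mgmt=WPA-EAP               # enterprise (EAP) authentication
rsn_pairwise=CCMP
ieee8021x=1
auth_server_addr=192.0.2.10        # placeholder RADIUS server
auth_server_port=1812
auth_server_shared_secret=secret   # placeholder shared secret
interworking=1                     # enable 802.11u interworking
access_network_type=2              # chargeable public network
internet=1
hs20=1                             # enable Hotspot 2.0 / Passpoint
roaming_consortium=001122          # placeholder roaming consortium OI
```

A client device provisioned with matching Passpoint credentials can then associate and authenticate without the user ever seeing a captive portal or login page.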

The carriers are touting this as a great feature for WiFi users. A cellphone user could walk around a downtown district and not lose WiFi connectivity while moving between hotspots operated by different ISPs or wireless carriers. The WBA is promoting the WiFi connections as safe since every member of the federation agrees to use end-to-end encryption of the WiFi communication path – which would eliminate the biggest threat from using public WiFi.

What is not being discussed is the bigger motivation of the cellular carriers for implementing the technology. This would offload huge volumes of cellular data traffic to the landline broadband connections of the stores, shops, and restaurants that offer WiFi routers to the public. In a crowded environment like 6th Street in Austin, this would remove a huge amount of traffic from the cellular network in busy times.

This is something that cellular companies have wanted to do for many years. There was an earlier attempt to make this concept work in 2013 and 2014. At that time, the technology was labeled as Hotspot 2.0. The trials of that technology introduced Passpoint validation and had the ultimate goal of relieving data pressure on the cellular networks. Many of you may have sat in presentations at industry conventions that promised big revenues for landline ISPs for WiFi offloading – the idea was that the cellular companies would pay to get traffic off the cellular network. That idea never went much further than trials, and here we are in 2021 trying the idea again under a new name.

I also wonder if cellular companies see this as a revenue opportunity. It’s not hard to envision them selling the WiFi roaming feature to cellular customers for a few dollars per month – effectively getting WiFi users to pay to offload traffic to landline broadband connections.

Regardless of the motivation of the cellular carriers, it’s an idea that is long overdue. Users benefit by being able to stay under data caps on cellular data plans. The carriers benefit in crowded environments by shielding cellular networks from becoming overloaded. I just hope that the merchants supplying the underlying broadband will have unlimited data plans – or they are the ones who will pick up the tab for this.

The Looming Battle over Upload Speeds

By next week we’re going to see the opening shots in the battle for setting an official definition of upload broadband speeds. You might expect that this is a topic that would be debated at the FCC, but this battle is coming as a result of questions asked by the U.S. Department of Treasury as part of defining how to use the grant monies from the American Rescue Plan Act. Treasury has oddly been put in charge of deciding how to use $10 billion of direct broadband grants and some portion of the gigantic $350 billion in funding that is going directly to counties, cities, and towns across the country.

Treasury asked for comments through a series of questions about the broadband speeds of technologies that should be supported with the grant funding. The questions ask for a discussion of the pros and cons of requiring that grant dollars are used to build technologies that can achieve speeds of 100/20 Mbps versus 100/100 Mbps.

Treasury is not likely to see many comments on the requirement that grant deployments must meet 100 Mbps download speeds. All of the major broadband technologies will claim the ability to meet that speed – be that fiber, cable company hybrid-fiber networks, fixed wireless provided by WISPs, or low-orbit satellites. The only industry segment that might take exception to a 100 Mbps download requirement is fixed cellular broadband which can only meet that kind of speed for a short distance from a tower.

But asking about the difference between upload speeds of 20 Mbps versus 100 Mbps has already set off a firestorm of comments around the industry. There are only a few technologies that can reasonably meet 100 Mbps upload speeds – fiber, fiber-to-the-curb as being deployed by Verizon, and wireless mesh networks that bounce millimeter wave spectrum from building to building. Setting the definition of upload bandwidth at 100 Mbps will exclude major technologies such as cable company broadband, fixed wireless, and fixed cellular. Both speed definitions happily put DSL out to pasture for receiving grant funding. We’ve already seen opponents of a 100 Mbps definition of upload speed start publicly making the arguments against symmetrical bandwidth.

A recent blog on the WISPA website argues for upload speeds of 5 Mbps to 10 Mbps. The blog also argues that it costs more to build 100/100 Mbps networks (a way to remind readers that fixed wireless costs a lot less than fiber).

We know the cable industry is going to come out hard against any definition of upload speed greater than 20 Mbps – since that’s what most cable networks are delivering. In a show of solidarity with the rest of the cable industry, Altice recently announced that it will lower current upload speeds of 35 – 50 Mbps down to 5 – 10 Mbps. This is clearly being done to allow the cable industry to present a united front against faster upload speeds. This act is one of the most bizarre reactions to potential regulation that I’ve ever seen from an ISP and a direct poke in the eye to Altice customers.

Back in March, we saw Joan Marsh, the AT&T Executive VP, argue that 21st-century broadband doesn’t need upload speeds greater than 10 Mbps. This argument was clearly meant to support using grant funds for rural fixed cellular technology. It’s an odd position for the second largest fiber provider in the country to take.

This is a discussion that should be held at the FCC. However, that agency has punted several times over the last three years on the topic of redefining the definition of broadband. The Ajit Pai FCC announced in several annual reports to Congress that the definition of broadband of 25/3 Mbps is still adequate. I think it’s clear that the Pai-led FCC did not want to take any blame for reclassifying millions of homes as not having broadband.

It’s interesting that Treasury is even involved in broadband grants. But Congress pushed the biggest pile of grant money to Treasury instead of to the FCC or the NTIA – the two agencies with the biggest historic roles in broadband policy. But since Treasury is the agency asking these questions, I have to think that whatever answer Treasury settles on is going to hold big sway moving forward.

This is a topic that the FCC should be tackling, but oddly, the White House is now a half-year into the new administration without recommending a fifth FCC commissioner. Until that position is filled, it seems unlikely that the FCC will tackle the re-regulation of broadband and issues like the definition of broadband. Treasury is going to need to resolve this question quickly since local governments are itching to make final plans for using the ARPA funding.

Misunderstanding Public Rights-of-Way

I read an article on WBAY.com, the ABC affiliate in Green Bay, Wisconsin, that highlights residents’ dismay over AT&T putting 3-foot pedestals in front yards as part of the process of bringing fiber to a neighborhood. This is a good reminder of the ongoing saga between homeowners and telecom companies over the placement of facilities in public rights-of-way.

Most homeowners don’t realize that their community has granted a right-of-way for utilities directly along streets and roads. It’s typical for a local government to provide 3-5 feet of right-of-way for utility purposes. The chances are that there are several utilities already buried in the disputed right-of-way in this Green Bay neighborhood. Most utilities are installed before houses are first sold to the public, so homeowners often have no clue that utilities already use their property.

Nobody reminds homeowners when they buy a house that utilities have a right to do construction or place devices at the front of their yard. I can’t recall ever being told that, although it might be buried in the thick pile of papers one signs when buying a home. Homeowners routinely plant flower beds or shrubs in the right-of-way, not realizing that a utility may one day rip out what they have done.

In a situation that is all too familiar to any fiber overbuilder, a city will direct a fiber builder to use the public rights-of-way whenever possible. I’m sure that when AT&T went to build fiber in this particular neighborhood, it was understood by both AT&T and the city that any devices were to be put into the public rights-of-way. The Green Bay Public Works Director is quoted as saying that his agency and the city are not responsible for the placement of the pedestals – a disingenuous claim, since the city established the rights-of-way and his department approved the permits for the fiber construction.

Every fiber overbuilder can tell stories of irate homeowners who hated when burying fiber dug up a flower bed or, in this example, placed devices that homeowners don’t want. No fiber builder wants to create animosity with potential customers when they place a new network. In this case, AT&T has a legitimate beef when city employees throw up their hands as if this situation is out of their control. Cities enthusiastically support rights-of-way as a way to keep the placement of utilities under control.

With that said, fiber builders have options. AT&T didn’t have to choose 3-foot-tall pedestals to place in this neighborhood. They could just as easily have used buried flowerpots that would have housed the fiber connections under a metal lid. I have a number of clients that won’t use pedestals for this reason, except perhaps in a few places where there is no other physical choice.

I remember disputes in recent years where cellular companies were placing 12-foot-tall small cell sites in rights-of-way in Colorado – monstrous devices that nobody would want in their front yard. It seems that carriers sometimes go out of their way to antagonize the public. In the long run, utilities and homeowners need to coexist. It’s likely that at least some of the homeowners on this particular street won’t buy broadband from AT&T due to the pedestal issue – or at least not for a few years until people no longer notice the pedestals and the issue is forgotten.

Where are the Gigabit Applications?

I remember that soon after the City of Chattanooga launched its citywide fiber network, the company held a competition seeking web applications that would benefit from gigabit speeds. I don’t recall if anything useful came out of that effort, but I know that there are still today almost no big bandwidth applications on the web aimed at the average household.

There are always a few people in every community that immediately benefit from gigabit broadband. In many of the cities I’ve worked with, some of the first customers who signed up for residential gigabit service are radiologists and other doctors who appreciate the ability to review medical imagery without having to pop into the hospital in the middle of the night. I know of a few cases where doctors now buy multi-gigabit connections as fast as 10 gigabits because a gigabit isn’t fast enough.

There are always a few others in larger markets that also need the full gigabit capability. This would include scientists and engineers who work with huge data sets. I know people who work from home with movie animation who regularly fill a gigabit connection. But these connections are all work-related and represent people who work with large data at the office and who want the convenience of doing so at home. It’s been eight years since Google Fiber got the whole country talking about gigabit broadband speeds – and yet there still are not any killer gigabit applications.

I think we’re finally on the verge of seeing this change. The demand for faster broadband products leaped upward during the pandemic. According to OpenVault, at the end of the first quarter of 2021, the percentage of homes subscribing to gigabit data products jumped to 9.8% of all homes. This grew from 1.9% of homes in 2018 and 2.8% at the end of 2019. This is a profound market change because having 10% of all households subscribing to gigabit broadband means there is finally a potential market for gigabit applications. A company that develops a high-bandwidth application now can be assured that there are enough possible customers to make it worthwhile.

Another factor that makes us ripe for gigabit applications is the continued growth of gaming. It’s hard for folks of my generation to put gaming into perspective. The gaming industry now dwarfs other entertainment segments like movies or television. In 2020, the gaming industry had revenues of over $140 billion. That includes $73.8 billion for mobile gaming, $33.1 billion for PC gaming, $19.7 billion for game console gaming, $6.7 billion for extended reality, and even $9.3 billion for paid subscriptions to watch others play games.

The gaming industry made a big change just before the pandemic when the biggest game companies moved games to the cloud. The old phenomenon of kids lining up to buy the latest game release at midnight is a thing of the past as games are now introduced online and played in data centers. I think the early big bandwidth applications will be related to gaming – and this is not unusual because entertainment has driven the use of bandwidth in the past – a large percentage of home broadband usage is still to watch video.

The third factor that I think will drive faster broadband applications is generation Z coming of age. This is the generation that grew up with a smartphone in their hands and good home broadband. This is a generation that adapted immediately to homeschooling – while they badly missed their friends and live activities, this is a generation that was already living a large percentage of their lives virtually. Generation Z kids are now in college and will soon be out in the world as a major block of consumers. They are likely going to be the first target audience for faster broadband applications and are also likely to be the generation that creates many of the new applications.

It’s likely that the big bandwidth applications will involve extended reality, which is an umbrella term that covers virtual reality, augmented reality, mixed reality, and other similar technologies. With gigabit-capable homes as customers, we’ll start seeing games that bring telepresence and virtual worlds into the home using big bandwidth. The country is ripe for big-bandwidth applications since 10% of homes are buying gigabit broadband products. We have a huge potential market for innovative gaming since over half of the people in the country now play games at least once a month, with tens of millions who play games regularly. All that is needed now is for a few entrepreneurs to see the potential for developing big-bandwidth games. And as happens with all new technologies, this will grow from a start in gaming to extend to the rest of us.

Cellular Data Speeds Much Improved

I’m curious about how many people realize that cellular broadband download speeds have increased dramatically over the last year. I’m not a heavy cellular data user, particularly during the pandemic year when I barely used cellular data outside of the home. But I’ve always run cellular speed tests a few times per year and have definitely noticed faster download speeds.

Following is a comparison of cellular download speeds in the first quarter of this year compared to the first quarter of 2019. In both cases, the speeds are national averages reported by Ookla, based upon millions of cellular speed tests.

Carrier     Q1 2019      Q1 2021
AT&T        34.7 Mbps    76.6 Mbps
T-Mobile    34.1 Mbps    82.4 Mbps
Verizon     33.1 Mbps    67.2 Mbps

There are several reasons for the increase in speeds. First, many cell sites were not fully 4G compliant in the first quarter of 2019. The first fully 4G compliant cell site was completed in the winter of 2018. Since then, the carriers have implemented 4G everywhere.

The carriers have also implemented new spectrum bands. They’ve labeled the new spectrum as 5G, but the new spectrum bands are all still using 4G technology. The new spectrum allows cellular customers to spread out into multiple channels. This means that the older spectrum bands and networks don’t get bogged down during the heaviest usage times of the day, such as during the daily commute.

I also suspect that the pandemic has some role in the difference. During the pandemic the daytime demand for cellular data has been suppressed by far fewer people commuting and spending the day outside the home. A less busy cellular network should translate into faster speeds.

As part of writing this blog I took a speed test on my cellphone in downtown Asheville on AT&T. I got a download speed of 60.5 Mbps, and an upload speed of only 1.8 Mbps.

It’s worth looking at the Ookla article because it shows median broadband speeds by state. Note that the median is different from the average – half of the speed tests were slower than the median and half were faster. The median speeds are significantly lower than the national averages, which indicates that a relatively small number of very fast speed tests is pulling the averages higher.
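A small numeric illustration of that point, using made-up speed tests: a handful of very fast results pulls the average well above the median.

```python
# Illustration of why averages run above medians in skewed speed-test data:
# a few very fast tests pull the mean up while the median barely moves.
from statistics import mean, median

# Hypothetical speed tests (Mbps): most are modest, a few are very fast.
tests = [20, 25, 30, 35, 40, 45, 50, 300, 400, 500]

print(median(tests))  # 42.5 - half the tests are slower, half are faster
print(mean(tests))    # 144.5 - the few very fast tests drive the mean higher
```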

It also seems likely that urban speeds are much faster than rural speeds for a variety of reasons. That conjecture is somewhat verified by the District of Columbia having the fastest median speeds. The top eight fastest speeds are all on the east coast. The ten slowest states are all at half of the median speeds in D.C. – with the slowest speeds being in Mississippi, Wyoming, West Virginia, Iowa, Vermont, and surprisingly Texas.

I’ve still never figured out why faster cellular data speeds would be important to the average cellular customer. The most data-intensive thing most people do on cellphones is watch video, and that only requires a few Mbps of speed. There would be a benefit when updating cellphone software, but I have to imagine that most people do this while on WiFi. I would love for somebody to provide real-life examples of how faster cellular data speeds are making a daily difference.

Free Space Optics

I read an article on the Finley Engineering blog that talks about new research with free-space optics. For those not familiar with the term, this means communication gear that communicates directly using light without any wires.

The article talks about a Chinese team of scientists that has been able to use light to transmit an ultrahigh-definition video signal between high-rise buildings. That’s an interesting feat because light signals in an urban environment must deal with air pollution, vehicle exhaust, and other factors that put small particles in the air that can disrupt a light signal. The scientists found new ways to compensate for attenuation and scattering due to environmental factors.

If perfected, light transmitters could play some interesting roles. It might make sense to use light transmission in places with unusual terrain constraints. The technology could be used to pop up an instant broadband connection between buildings until a more permanent connection can be built. The technology could provide a quick fix for restoring key broadband connections after disaster recovery. The real promise for the technology is in space. It makes sense to use lasers to communicate between satellites or to communicate between manned outposts.

The technology has been around for a long time. Alexander Graham Bell created the photophone in 1880 and considered it one of his most important inventions. He demonstrated the technology by transmitting a call about 700 feet between two buildings using a beam of light. We all still use the concept today – remote controls transmit signals using infrared light.

There have been earlier attempts to use the technology in the telecom industry. Back during the telecom boom of the late 1990s, several well-funded start-ups tried to develop working technology using light instead of radio frequency to transmit broadband. The biggest of these was Terabeam, which attracted over half a billion dollars in start-up funding and was backed by AT&T and Lucent. I recall talking to engineers at Terabeam about the technology. Other well-funded start-ups that explored the technology included AirFiber and LightPointe Communications.

But none of the companies could ever overcome the natural problems that occur in ambient outdoor conditions. It turns out the real killer for the technology is fog, which completely cuts off transmissions. But even in normal weather, the technology was never reliable due to pollution and airborne particles.

The concept resurfaced a decade later under the label Li-Fi. The idea behind this technology was to transmit data by switching LEDs on and off extremely quickly as a way to send the ones and zeros needed for digital transmission. Scientists have achieved transmission speeds as fast as 224 gigabits per second by simultaneously transmitting separate signals over different frequencies of light.
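The on-and-off idea can be sketched in a few lines of Python. This is a hypothetical on-off keying encoder and decoder – not any actual Li-Fi standard – just to show how flashing a light maps to bits:

```python
# A minimal sketch of on-off keying, the core idea behind Li-Fi:
# an LED held "on" for one symbol period is a 1, "off" is a 0.

def encode(data: bytes) -> list[int]:
    """Map each byte to 8 on/off symbols, most significant bit first."""
    return [(byte >> bit) & 1 for byte in data for bit in range(7, -1, -1)]

def decode(symbols: list[int]) -> bytes:
    """Regroup on/off symbols back into bytes."""
    out = bytearray()
    for i in range(0, len(symbols), 8):
        byte = 0
        for s in symbols[i:i + 8]:
            byte = (byte << 1) | s
        out.append(byte)
    return bytes(out)

msg = b"LiFi"
assert decode(encode(msg)) == msg
```

Real systems layer modulation, error correction, and multiple light frequencies on top of this, but the speed of the link ultimately comes down to how fast the LED can be toggled and detected.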

There were several trials of Li-Fi technology in Europe in 2018 and 2019 at a BMW plant, a school, and at the 2019 Paris airshow.

Free space optics is an attractive technology in environments like busy factories, nuclear power plants, or airports that are already crowded with radio frequencies. It’s an interesting way to pass data between smart cars while avoiding all of the issues associated with radio frequencies. The technology is also being considered for transmissions within aircraft to reduce interference with critical onboard devices.

The idea of using light to transmit data is enticing because the visible light spectrum can carry approximately 10,000 times more bandwidth than the entire radio frequency spectrum. It’s a concept that is attractive to carriers because it could mean making short-length data transmissions without having to own spectrum. I doubt that we have heard the end of free space optics.
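The 10,000x figure can be sanity-checked with quick back-of-envelope arithmetic, assuming the radio spectrum tops out around 300 GHz and the optical band (infrared through ultraviolet) extends from there to roughly 3,000 THz:

```python
# Back-of-envelope comparison of raw spectrum widths (assumed bounds,
# not official allocations).
radio_hz = 300e9                # radio spectrum: up to ~300 GHz
optical_hz = 3_000e12 - 300e9   # optical band: ~300 GHz to ~3,000 THz

ratio = optical_hz / radio_hz
print(f"optical/radio bandwidth ratio ≈ {ratio:,.0f}")
# optical/radio bandwidth ratio ≈ 9,999
```

Under those assumed bounds the ratio lands at roughly 10,000, consistent with the claim.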

The Infrastructure Guessing Game

As Congress inches closer to an infrastructure bill, the industry is feverishly speculating about how a broadband infrastructure plan might work. There is still a lot of compromise and wheeling and dealing to be done, so nobody knows how a final broadband program might work, or even definitively if there will be one. But since this is the billion-dollar question for the industry, it’s worth a review of the possibilities.

At this point, I think we know where the White House would come down if it were the decider on the issue. We can see the White House’s influence in shaping the policies concerning ARPA grant monies. The White House has pushed to ignore the FCC’s outdated 25/3 Mbps definition of broadband and has quietly pushed for rules that would let broadband money solve the urban broadband gap and not just the rural gap. The White House’s original proposal was for $100 billion to build state-of-the-art broadband infrastructure. Other than that, the original White House plan avoided talking about the mechanics of how to use broadband funding – and that is the real billion-dollar question.

But the White House does not get to decide the details of how infrastructure spending will be done – that responsibility lies fully with Congress. There have been two radically different broadband plans suggested in Congress.

Last year, an $80 billion infrastructure plan was proposed in the Accessible, Affordable Internet for All Act, sponsored by Rep. James E. Clyburn of South Carolina and Sen. Amy Klobuchar of Minnesota. This bill included a lot of unique ideas. Most of the funding was to be awarded in one gigantic reverse auction of at least $60 billion. We saw the perils of the reverse auction approach last year when many of the FCC’s reverse auction awards went awry. We saw companies with no balance sheets winning huge dollars. We saw grant awardees claiming technologies like gigabit rural wireless that don’t exist. We saw money going to an unproven low-orbit satellite business. We also quietly saw many of the big incumbents claim huge funding.

The Accessible, Affordable Internet for All Act also had a big emphasis on affordability. This was the act that suggested a $50 monthly discount for low-income broadband, which materialized this year as the Emergency Broadband Benefit (EBB) program. The Act also promotes the idea of open access – an idea that is anathema to many ISPs. Altogether, the mechanics of this Act are troublesome.

Just this month, three Senators – Michael Bennet, D-Colorado, Angus King, I-Maine, and Rob Portman, R-Ohio – reintroduced the Broadband Reform and Investment to Drive Growth in the Economy Act (BRIDGE Act). This would allocate $40 billion for broadband infrastructure. The bill promotes gigabit broadband and would require that funding only go to technologies that can deliver symmetrical 100 Mbps speeds. The bill lifts the ban on municipal participation in creating a broadband solution. The BRIDGE Act takes a drastically different approach to allocating funding and would give money to states to choose funding priorities.

We now hear that a bipartisan group of Senators and the White House have hammered out a compromise infrastructure plan that would allocate $65 billion for broadband infrastructure. The plan at this point doesn’t talk about the mechanics of how the funding would be awarded. But it seems likely that a bipartisan compromise bill would be something different from the bills discussed above. For instance, it’s likely that a bipartisan bill would not support municipal broadband. By all news accounts, this compromise is still far from a done deal, and a Democratic-only infrastructure bill is also being considered. But passage of that is far from assured.

I predict that for the rest of the summer we’re going to continue to see different dollar amounts and different ideas on how to spend the infrastructure money – and that’s assuming there even is an infrastructure bill passed. We are now down to the sausage-making, and anybody who thinks they know how this will turn out has a better crystal ball than mine. There is still a lot of time for DC lobbyists to suggest tweaks that benefit their clients. One thing I think we can count on is every Senator telling us how much they care about solving the digital divide. For the broadband industry, this is great theater and there will be continuous headlines – but it’s also serious business since the plans that are already on the table vary significantly in the details that matter.