
Hollow Core Fiber

BT, formerly known as British Telecom, has been working with Lumenisity to greatly improve the performance of hollow core fiber. This is fiber that takes advantage of the fact that light travels faster through air than it does through glass. In a hollow core fiber, air-filled tubes run through the center of the glass. As can be seen in the picture accompanying this blog, multiple tubes of glass and air are created inside a single fiber, creating a honeycomb effect.
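
Since the speed of light in a medium is c divided by the medium’s refractive index, the latency advantage of an air core is easy to sketch. The following back-of-the-envelope Python calculation uses textbook refractive index approximations, not Lumenisity’s figures.

```python
# Rough sketch: propagation delay in solid glass vs. an air-filled core.
# Refractive indices are approximate: ~1.468 for silica glass, ~1.0003 for air.
C_VACUUM_KM_S = 299_792.458  # speed of light in a vacuum, km/s

def delay_us_per_km(refractive_index: float) -> float:
    """One-way propagation delay in microseconds per kilometer."""
    return refractive_index / C_VACUUM_KM_S * 1e6

glass = delay_us_per_km(1.468)   # ~4.90 us/km in solid glass fiber
air = delay_us_per_km(1.0003)    # ~3.34 us/km in a hollow (air) core
print(f"glass: {glass:.2f} us/km, air: {air:.2f} us/km")
print(f"latency reduction: {100 * (1 - air / glass):.0f}%")  # ~32%
```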

There was news about hollow core fiber a decade ago when a lab at DARPA worked with Honeywell to improve the performance of the fiber. They found then that they could create a single straight path of light in the tubes that was perfect for military applications. The light could carry more bandwidth for greater distances without having to be regenerated. By not bouncing through glass, the signal maintained intensity for longer distances. DARPA found the fixed orientation of light inside the tubes to be of great value for communication with military-grade gyroscopes.

Until the recent breakthrough, hollow core fiber was plagued by periodic high signal loss when the light signal lost its straight-path coherence. Lumenisity has been able to lower signal loss to 1 dB per kilometer, which is still higher than the 0.2 dB per kilometer expected for traditional fiber. However, the lab trials indicate that a better manufacturing process should be able to significantly lower signal loss.
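
Here is a minimal sketch of what that attenuation difference means for reach. The 20 dB loss budget is an assumed, illustrative figure, not a number from the trials.

```python
# How far a signal can travel before exhausting a given optical loss budget.
def max_reach_km(loss_budget_db: float, attenuation_db_per_km: float) -> float:
    return loss_budget_db / attenuation_db_per_km

BUDGET_DB = 20.0  # assumed loss budget between regeneration points
print(f"hollow core (1.0 dB/km): {max_reach_km(BUDGET_DB, 1.0):.0f} km")  # 20 km
print(f"traditional (0.2 dB/km): {max_reach_km(BUDGET_DB, 0.2):.0f} km")  # 100 km
```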

The Lumenisity breakthrough comes from the ability to combine multiple wavelengths of light while avoiding the phenomenon known as interwave mixing, where different light frequencies interfere with each other. By minimizing signal dispersion, Lumenisity has eliminated the need for the digital signal processors that are used in other fiber to compensate for chromatic dispersion. This means repeater sites can be placed farther apart and require simpler and cheaper electronics.

Lumenisity doesn’t see hollow core fiber being used as a replacement on most fiber routes. The real benefits come in situations that require low latency along with high bandwidth. For example, hollow core fiber might be used to feed the trading desks on Wall Street. It might also improve performance for fiber routes leaving big data centers.

Lumenisity is building a factory in the U.K. to manufacture hollow core fiber and expects to have it in mass production by 2023.


The Natural Evolution of Technology

I’ve been thinking lately about the future of current broadband technologies. What might the broadband world look like in twenty years?

The future of broadband technology will be driven by the continued growth in broadband demand, both in the amount of bandwidth we use and in the broadband speeds the public will demand. Technologies that can’t evolve to keep up with future demand will fade away – some slowly and some virtually overnight.

I don’t think it’s a big stretch to say that within twenty years fiber will be king. There is a huge national push to build fiber now, funded by federal and state grants as well as unprecedented amounts of commercial investment. Fiber will be built in a lot of rural America through subsidies and in a lot of small and medium towns because it makes financial sense. The big challenge will continue to be urban neighborhoods where fiber construction costs are high. Twenty years from now, we’ll look back on today as the time when we finally embraced fiber, much like we look back twenty years to when DSL and cable modems quickly killed dial-up.

It goes without saying that telephone copper will be dead in twenty years. To the extent copper is still on poles, it will be used to support overlashed fiber. DSL will serve as the textbook poster child for how technologies come and go. DSL is already considered obsolete, a mere twenty years after its introduction to the market. In twenty more years, it will be a distant memory.

I don’t see a big future for rural WISPs. These companies will not fare well in the fierce upcoming competition with fiber, low-orbit satellite, and even fixed cellular. Some stubborn WISPs will hang on with small market penetrations, but research into new and better radios will cease as demand for WISP services fades. The smart WISPs are going to move into towns and cities. WISPs willing to adapt to millimeter-wave radios can grab a decent market share in towns by offering low prices to consumers who value price over big bandwidth. I predict that WISPs will replace DSL as the low-price competitor to the large ISPs in towns and cities.

Low orbit satellites will still serve the most remote customers in twenty years – but this won’t be the technology of choice, due to what will by then be considered very slow bandwidth. Two decades from now, a 150 Mbps download connection is going to feel like today’s DSL. The satellite companies will thrive in the third world, where they will be the ISP of choice for most rural customers. Interestingly, when I look out forty years, I think it’s likely that residential satellite broadband will fade into history. It’s hard to envision this technology having a forty-year shelf life in a world where broadband demand continues to grow.

The technology that is hard to predict is cable broadband. From a technology perspective, it’s hard to see cable companies still wanting to maintain coaxial copper networks. In twenty years, these networks will be 70 years old. We don’t talk about it much, but age affects coaxial networks even more than telephone copper networks. Over the next decade, cable companies face a hard choice – convert to fiber or take one more swing at upgrading to DOCSIS 4.0 and its successors. It’s hard to imagine the giant cable companies like Comcast or Charter making the decision to go all fiber – they will worry too much about how the huge capital outlay will hurt their stock prices.

I expect there will still be plenty of coaxial networks around in twenty years. Unfortunately, I foresee that coaxial copper will stay in the poorest urban neighborhoods and smaller rural towns, while suburbs and more affluent urban neighborhoods will see a conversion to fiber. For anybody who doesn’t think that can happen, I point to AT&T’s history of DSL redlining. Cable companies might even decide to largely abandon poorer neighborhoods to WISPs and municipal fiber overbuilders, similar to the way that AT&T recently walked away from DSL.

It’s easy to think of technologies as being permanent and that any broadband technology used today will be around for a long time. One only has to look at the history of DSL to see that broadband technologies can reach great success only to be obsolete within just a few decades. We’re going to see the evolution of technology for as long as the demand for broadband continues to grow. Much of the technology being touted today as broadband solutions will quietly fade into obscurity over the next twenty years.

This is the biggest reason why I think that only technologies that can still be relevant a decade or two from now should be eligible for federal grant funding. It’s shortsighted to give tax dollars to technologies that are not likely to be relevant in the somewhat near future. We saw a great example of that with the CAF II program that funded already-obsolete DSL. More recently, we saw federal grant money going to Viasat and to rural WISPs in the CAF II reverse auction. There are smarter ways to spend valuable tax dollars.


Charging our Future

One of the hottest areas of scientific research, and one that will peripherally affect every tech industry, is battery research. It seems like every year there are big breakthroughs in battery capability. Today I look at four recent announcements.

New Batteries for Robots. Nicholas Kotov at the University of Michigan announced the development of rechargeable zinc batteries that could power robots of all sizes. The batteries use biomorphic technology: a membrane between anode and cathode, made from Kevlar, that mimics cartilage.

The batteries have huge advantages over current lithium batteries in that the materials in the battery are non-toxic and there is no danger of overheating or fires. The batteries are far more efficient, which reduces the weight of the battery needed to power a robot. These batteries can be scaled to power micro-robots or can save weight for large delivery robots. One of the most interesting properties of the batteries is that they can be woven into the outer cover of a robot – freeing space or decreasing the size of the robot.

Safer Lithium Batteries. Scientists at the Applied Physics Laboratory at Johns Hopkins have developed a safer lithium-ion battery. Traditional lithium batteries are powered by a flammable combination of lithium salts and toxic chemicals used for the electrolyte, and if the membrane between the anode and cathode leaks, the batteries can catch fire. The safer batteries instead use water-based electrolytes, which are non-flammable and non-toxic. The new batteries are also about three times more powerful than traditional lithium batteries. The most interesting characteristic of the batteries is that the material can be manufactured to be clear and placed in a transparent, flexible housing – meaning the battery could be integrated into smart clothing.

Stable Lithium Batteries. Scientists at the John A. Paulson School of Engineering and Applied Science at Harvard have developed a solid-state lithium battery that eliminates the problems with current lithium batteries. They’ve created a solid lithium-metal battery that is stable and won’t overheat. The battery can be recharged up to 10,000 times, meaning it could work in a vehicle for over ten years. The solid battery also charges much faster than today’s car batteries.
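
The cycle-life claim is easy to sanity check. The charging frequency below is my assumption, not a figure from the Harvard research.

```python
# Illustrative arithmetic behind "10,000 recharges could mean over ten years."
CYCLES = 10_000
CHARGES_PER_DAY = 2  # assumed: a vehicle charged twice per day
print(f"{CYCLES / (CHARGES_PER_DAY * 365):.1f} years")  # ~13.7 years
```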

The batteries are constructed with multiple alternating layers of lithium, graphite, and a solid electrolyte. One of the most interesting properties of the batteries is that they are self-healing. All lithium batteries deteriorate over time as lithium gets deposited as dendrites – metallic crystals. The structure of these batteries inhibits, and even reverses, dendrite formation.

Aluminum-based Batteries. Using technology developed at the University of Queensland’s Australian Institute for Bioengineering and Nanotechnology, the Graphene Manufacturing Group plans to start manufacturing aluminum-based batteries. The graphene aluminum-ion batteries are about three times more powerful than traditional lithium batteries and eliminate any possibility of starting fires. The big upside to these batteries is that they can be recharged much faster than lithium batteries.

The key breakthrough was finding a way to make holes in graphene that allow the aluminum ions to be stored closer together. Probably the best characteristic of the battery is that it uses widely available, low-cost aluminum and aluminum chloride rather than the more costly lithium, which mostly comes from China and Chile. The batteries are effective as tiny coin-sized batteries or can scale up as a replacement for car batteries.


A Rural Broadband Laboratory

The National Science Foundation, along with the US Department of Agriculture, is creating a broadband testbed in and around Ames, Iowa. The program is part of NSF’s Platforms for Advanced Wireless Research (PAWR) program. This is the fourth wireless test site in the country and the first to be placed in a rural setting. The PAWR programs are a great example of public/private partnerships, having attracted over $100 million in private and government research investment to date.

This project will provide an outdoor laboratory for engineers and scientists to explore ways to maximize the benefit of new wireless technologies for agriculture. Additionally, new technologies will be deployed throughout the college community of Ames.

The PAWR projects, to date, have included the participation of over 35 wireless providers and vendors. This project has already attracted several universities in addition to Iowa State University, including the University of California at Irvine and Ohio State University. John Deere will be participating in the testbed along with U.S. Cellular, the Iowa Regional Utilities Association, and the Iowa Department of Transportation. The experiments will include participation from Iowa State students as well as from local schools. Also participating will be Woodland Farms and the Meskwaki Tribal Nation.

Formal testbeds are always interesting because the FCC generally grants test licenses for scientists to experiment with radio frequencies in ways that may not be on the radar for the big carriers. The project includes $8 million to construct a wireless network that will cover nearly 600 square miles in and around Ames. One of the concepts to be explored is the collaboration potential and interaction between satellite broadband, existing wireless networks, and new wireless technologies.

Scientists will be experimenting with technologies involved in precision agriculture including drones, self-driving farm machinery, and an array of environmental sensors. One of the first experiments will involve identifying weeds for automatic eradication using high-resolution video. Field sensors will transmit live pictures to the cloud to allow for accurate identification of weeds. Training robots to mechanically eliminate weeds would mean a drastic reduction in the use of herbicides in the food chain.

The project will also step outside of agriculture and look at technologies and applications that can expand wireless coverage in rural areas. This will involve experimenting with hybrid networks that use different frequencies and wireless technologies in unison to bring stronger broadband signals to the fields and areas where it is most needed.

These kinds of experimental sites are always interesting and exciting because ideas tested in programs like this end up as everyday technology a decade from now. Giving scientists and engineers a large outdoor laboratory provides them with a way to test ideas in ways that can’t be explored in the lab or in small testbeds.


Cellular Data Speeds Much Improved

I’m curious about how many people realize that cellular broadband download speeds have increased dramatically over the last year. I’m not a heavy cellular data user, particularly during the pandemic year when I barely used cellular data outside of the home. But I’ve always run cellular speed tests a few times per year and have definitely noticed faster download speeds.

Following is a comparison of cellular download speeds in the first quarter of this year against the first quarter of 2019. In both cases, the speeds are national averages reported by Ookla, based upon millions of cellular speed tests.

Carrier      Q1 2019      Q1 2021
AT&T         34.7 Mbps    76.6 Mbps
T-Mobile     34.1 Mbps    82.4 Mbps
Verizon      33.1 Mbps    67.2 Mbps
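
To put the table in percentage terms, here is a quick calculation of the two-year gains using the Ookla averages above.

```python
# Percentage improvement in average download speed, Q1 2019 to Q1 2021.
speeds_mbps = {
    "AT&T": (34.7, 76.6),
    "T-Mobile": (34.1, 82.4),
    "Verizon": (33.1, 67.2),
}
for carrier, (q1_2019, q1_2021) in speeds_mbps.items():
    gain = 100 * (q1_2021 - q1_2019) / q1_2019
    print(f"{carrier}: +{gain:.0f}%")  # AT&T +121%, T-Mobile +142%, Verizon +103%
```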

There are several reasons for the increase in speeds. First, many cell sites were not fully 4G compliant in the first quarter of 2019. The first fully 4G-compliant cell site was only completed in the winter of 2018. Since then, the carriers have implemented full 4G everywhere.

The carriers have also implemented new spectrum bands. They’ve labeled the new spectrum as 5G, but the new bands are all still using 4G technology. The new spectrum allows cellular customers to spread out across multiple channels. This means that the older spectrum bands and the networks are not getting bogged down during the heaviest usage times of the day, such as during the daily commute.

I also suspect that the pandemic has some role in the difference. During the pandemic the daytime demand for cellular data has been suppressed by far fewer people commuting and spending the day outside the home. A less busy cellular network should translate into faster speeds.

As part of writing this blog, I took a speed test on my cellphone in downtown Asheville on AT&T. I got a download speed of 60.5 Mbps and an upload speed of only 1.8 Mbps.
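
For anyone who wants to collect their own numbers over time, this kind of test can be scripted. The sketch below uses the third-party speedtest-cli package (pip install speedtest-cli); results will vary with the server, signal strength, and time of day.

```python
# Run a quick speed test from Python using speedtest-cli.
import speedtest

st = speedtest.Speedtest()
st.get_best_server()             # pick the lowest-latency test server
down_mbps = st.download() / 1e6  # results are returned in bits per second
up_mbps = st.upload() / 1e6
print(f"download: {down_mbps:.1f} Mbps, upload: {up_mbps:.1f} Mbps")
```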

It’s worth looking at the Ookla article because it shows median broadband speeds by state. Note that the median is different from the average: the median means half the speed tests were slower and half were faster. The median speeds are significantly lower than the national averages, which indicates that a relatively small number of very fast speed tests is driving the averages higher.
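
A small example makes the median-versus-average point concrete. The sample values below are invented purely for illustration.

```python
# A few very fast tests pull the mean well above the median.
from statistics import mean, median

tests_mbps = [12, 25, 30, 35, 40, 44, 55, 60, 350, 900]
print(f"median: {median(tests_mbps)} Mbps")  # 42.0 Mbps
print(f"mean:   {mean(tests_mbps)} Mbps")    # 155.1 Mbps
```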

It also seems likely that urban speeds are much faster than rural speeds for a variety of reasons. That conjecture is somewhat verified by the District of Columbia having the fastest median speeds. The eight fastest states are all on the east coast. The ten slowest states all have median speeds that are half of D.C.’s or lower – with the slowest speeds in Mississippi, Wyoming, West Virginia, Iowa, Vermont, and, surprisingly, Texas.

I’ve still never figured out why faster cellular data speeds would be important to the average cellular customer. The most data-intensive thing most people do on cellphones is watch video, and that only requires a few Mbps of speed. There would be a benefit when updating cellphone software, but I have to imagine that most people do this while on WiFi. I would love for somebody to provide real-life examples of how faster cellular data speeds are making a daily difference.


Free Space Optics

I read an article on the Finley Engineering blog that talks about new research into free-space optics. For those not familiar with the term, this means gear that communicates directly using light, without any wires.

The article talks about a Chinese team of scientists that has been able to use light to transmit an ultrahigh-definition video signal between high-rise buildings. That’s an interesting feat because light signals in an urban environment must deal with air pollution, vehicle exhaust, and other factors that put small particles in the air that can disrupt a light signal. The scientists have found new ways to compensate for attenuation and scattering due to environmental factors.

If perfected, light transmitters could play some interesting roles. It might make sense to use light transmission in places with unusual terrain constraints. The technology could be used to pop up an instant broadband connection between buildings until a more permanent connection can be built. The technology could provide a quick fix for restoring key broadband connections after disaster recovery. The real promise for the technology is in space. It makes sense to use lasers to communicate between satellites or to communicate between manned outposts.

The technology has been around for a long time. Alexander Graham Bell created a photophone in 1880 that he considered one of his most important inventions. He demonstrated the technology by transmitting a call about 700 feet between two buildings using a light signal. We all use remote controls that transmit signals using infrared light.

There have been earlier attempts to use the technology in the telecom industry. Back during the telecom boom of the late 1990s, several well-funded start-ups tried to develop working technology using light instead of radio frequency to transmit broadband. The biggest of these was Terabeam, which attracted over a half-billion dollars in start-up funding and was backed by AT&T and Lucent. I recall talking to engineers at Terabeam about the technology. Other well-funded start-ups that explored the technology included AirFiber and LightPointe Communications.

But none of these companies could ever overcome the natural problems that occur in ambient outdoor conditions. It turns out the real killer for the technology is fog, which completely cuts off transmissions. But the technology was also never reliable in normal weather, due to pollution and airborne particles.

The concept resurfaced a decade later, labeled as Li-Fi. The idea was to transmit data by turning LED diodes on and off extremely quickly as a way to send the ones and zeros needed for digital transmission. Scientists have been able to achieve transmission speeds as fast as 224 gigabits per second by simultaneously transmitting separate signals over different frequencies of light.
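
The on-off keying idea at the heart of Li-Fi is simple enough to sketch in a few lines. This toy version is mine; real Li-Fi systems use far more sophisticated modulation and parallel wavelengths.

```python
# Toy on-off keying: each bit becomes an LED state for one symbol period,
# and the receiver thresholds the measured light level to recover the bits.
def modulate(bits: str) -> list[int]:
    return [1 if b == "1" else 0 for b in bits]  # 1 = LED on, 0 = LED off

def demodulate(levels: list[float], threshold: float = 0.5) -> str:
    return "".join("1" if level > threshold else "0" for level in levels)

message = "1011001"
assert demodulate(modulate(message)) == message
```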

There were several trials of Li-Fi technology in Europe in 2018 and 2019 at a BMW plant, a school, and at the 2019 Paris airshow.

Free space optics is an attractive technology in environments like busy factories, nuclear power plants, or airports that are already crowded with radio frequencies. It’s an interesting way to pass data between smart cars that avoids all of the issues associated with radio frequencies. The technology is also being considered for transmissions within aircraft to reduce interference with existing critical devices.

The idea of using light to transmit data is enticing because the visible light spectrum can carry approximately 10,000 times more bandwidth than the entire radio frequency spectrum. It’s a concept that is attractive to carriers because it could mean making short-length data transmissions without having to own spectrum. I doubt that we have heard the end of free space optics.
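
That multiplier is easy to approximate. The band edges below are rounded, and treating the usable radio spectrum as everything up to 30 GHz is my assumption; extend the radio limit to 300 GHz and the ratio drops to roughly 1,000x.

```python
# Rough sanity check on the bandwidth comparison.
VISIBLE_LIGHT_HZ = 790e12 - 430e12  # ~360 THz of visible spectrum
RADIO_HZ = 30e9                     # assumed radio spectrum, up to 30 GHz
print(f"ratio: {VISIBLE_LIGHT_HZ / RADIO_HZ:,.0f}x")  # ~12,000x
```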


An Attack on WiFi Spectrum

A little over a year ago, the FCC approved the use of 1,200 MHz of spectrum in the 6 GHz band for public use – for new WiFi. WiFi is already the most successful deployment of spectrum ever. A year ago, Cisco predicted that by 2022 WiFi would be carrying more than 50% of global IP traffic.

These are amazing statistics when you consider that WiFi has been limited to 70 MHz of spectrum in the 2.4 GHz band and 500 MHz in the 5 GHz band. The additional 1,200 MHz of spectrum will vastly expand the capabilities of WiFi. WiFi performance was already slated to improve with the introduction of WiFi 6 technology, and adding the new spectrum will drive WiFi performance to a new level. The FCC order adds seven 160 MHz channels to the WiFi environment (or alternately fifty-nine 20 MHz channels). For the typical WiFi environment, such as a home in an urban setting, this is enough new channels that big-bandwidth devices ought to be able to grab a full 160 MHz channel. This is going to increase the performance of WiFi routers significantly by allowing homes or businesses to separate devices by channel to avoid interference.
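
The channel counts fall straight out of the arithmetic, as the sketch below shows. Note that a straight division gives sixty 20 MHz channels; the FCC’s actual channel plan yields the fifty-nine cited above.

```python
# Raw channelization arithmetic for the new 1,200 MHz of 6 GHz spectrum.
BAND_MHZ = 1200
for width_mhz in (20, 40, 80, 160):
    print(f"{width_mhz} MHz channels: {BAND_MHZ // width_mhz}")
# 20 MHz: 60, 40 MHz: 30, 80 MHz: 15, 160 MHz: 7
```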

One minor worry about the 6 GHz band is that it isn’t being treated the same everywhere. China has decided to allocate the entire 6 GHz band to 5G. Europe has allocated only 500 MHz for WiFi, with the rest going to 5G. Other places like Latin America have matched the US allocation and are opting for greatly expanded WiFi. This means that future WiFi devices won’t be compatible everywhere and will vary in the way they handle the 6 GHz spectrum. That’s not the ideal situation for device makers, but this likely can be handled through software in most cases.

The GSMA, the worldwide association of large cellular carriers, is lobbying for the US to allow 6 GHz to be used for 5G. They argue that since the 6 GHz spectrum is available to the public, cellular carriers ought to be able to use it like anybody else. They’d like to use it for License Assisted Access (LAA), which would allow the cellular carriers to use the spectrum for cellular broadband. If allowed, cellular traffic could flood the spectrum in urban areas and kill the benefits of 6 GHz for WiFi.

This is not the first time this issue was raised. The cellular industry lobbied hard to be able to use LAA when the FCC approved 5 GHz spectrum for WiFi. Luckily, the FCC understood the huge benefits of improved WiFi and chose to exclude cellular carriers from using the spectrum.

It would be a huge coup for cellular carriers to get to use the 6 GHz spectrum because they’d get it for free at a time when they’ve paid huge dollars for 5G spectrum. The FCC already heard these same arguments when it made the 6 GHz decision, so hopefully the idea goes nowhere.

I talk to a lot of ISPs that tell me that poor WiFi performance is to blame for many of the perceived problems households have with broadband. Inefficient and out-of-date routers, along with situations where too many devices try to share only a few channels, cause many of the problems attributed to broadband. The 6 GHz WiFi spectrum will bring decades of vastly improved WiFi performance. It’s something that every homeowner will notice immediately when they connect a properly configured WiFi router using the 6 GHz spectrum.

For now, there are not many devices ready to handle the new WiFi spectrum and WiFi 6 together. Some cellphones are now coming with the capability, and as it gets built into chips, it will start working for laptops, tablets, PCs, and smart televisions. But homes will only see the real advantage over time as they upgrade their WiFi routers and various devices.

Interestingly, improved WiFi is a direct competitor for the cellular carriers in the home. The carriers have always dreamed of being able to sell subscriptions for homes to connect our many devices. WiFi allows for the same thing with just the cost of buying a new router. It would be an obvious boon to cellular carriers to both kill off the WiFi competitor while getting their hands on free spectrum.

Hopefully, the FCC will reject this argument as something that has already been decided. The GSMA argues that 5G will bring trillions of dollars in benefits to the world – but it can still do that without this spectrum. The benefit of improved WiFi has a huge value as well.


Fast Polymer Cables

Scientists and engineers are always looking for ways to speed up and more efficiently configure computing devices to maximize the flow of data. There are a lot of applications today that require the exchange of huge volumes of data in real-time.

MIT scientists have created a hair-like plastic polymer cable that can transmit data ten times faster than copper USB cables. The scientists recently reported speeds on the new cables in excess of 100 gigabits per second. The new fibers mimic the best characteristics of copper cable in that electronic signals can be conveyed directly from device to device.

https://spectrum.ieee.org/tech-talk/computing/networks/plastic-polymer-cables-that-rival-fiber-optics

Another interesting characteristic of polymer cables is that it’s possible to measure the flow of electrons through each cable from outside – something that is impossible to do with fiber optic cables. This is a key function that can be used to direct the flow of data at the chip level in fast computing devices.

If brought to market, these cables will solve several problems in the computing industry. The new fibers mimic the best feature of copper wires: like USB cables, they are easily compatible with computer chips and other network devices. A copper Ethernet cable can connect two devices directly with no need to reformat data. Fiber cables are much faster than copper but require an intermediate device at each end to convert light signals back into electronic signals.

There are immediate uses for faster cables in applications like data centers, self-driving cars, manufacturing robots, and devices in space. The new cables would be a benefit anywhere that large amounts of data need to be transferred in real-time from device to device. Since the polymer fibers are thin, they could also be used to speed up data transfer between chips within devices.

The data transmission rates on the polymer cables are currently at 100 gigabits per second for a distance of around 30 centimeters. The MIT scientists believe they will be able to goose speeds to as much as a terabit while increasing transmission distances to a meter and beyond.
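
A quick calculation shows what that speed difference means for moving a large file. The 100 GB file size is arbitrary, chosen only for illustration.

```python
# Transfer time at the reported polymer-cable rate vs. a 10 Gbps copper USB link.
FILE_GB = 100
for name, gbps in (("copper USB, 10 Gbps", 10), ("polymer cable, 100 Gbps", 100)):
    seconds = FILE_GB * 8 / gbps  # gigabytes to gigabits, then divide by rate
    print(f"{name}: {seconds:.0f} seconds")  # 80 s vs. 8 s
```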

There is a long way to go to move a new technology from laboratory to production. Industry standards would first need to be developed and agreed upon by the IEEE. Using new kinds of cables means changing the interfaces into devices and chips. There are also the challenges of mass manufacturing the new cables and of integrating them into the existing supply chain.

I’m always amazed at how modern science seems to find solutions just when we need them. We are just now starting to routinely use computer applications like AI that rely on quickly moving huge amounts of data. Just a decade ago, nobody would have been talking about chips that needed anything close to 100 gigabits of input or output. It’s easy to assume that computing devices somehow get faster when chips are made faster, but these new cables are a reminder that faster computing requires numerous components. Fast chips do no good if we can’t get data in and out of the chip fast enough.


Comcast Tests DOCSIS 4.0

Comcast recently conducted its first test of DOCSIS 4.0 technology and achieved a symmetrical 4-gigabit connection. The test was enabled by a DOCSIS 4.0 chip from Broadcom. The DOCSIS 4.0 standard was released in March 2020, and this is the first test of the new standard. The standard allows for a theoretical transmission of 10 Gbps downstream and 6 Gbps upstream – this first test achieved an impressive percentage of that capability.
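
That percentage is simple to compute from the numbers above.

```python
# Share of the DOCSIS 4.0 theoretical ceiling achieved by the 4 Gbps test.
TEST_GBPS = 4
print(f"downstream: {TEST_GBPS / 10:.0%}")  # 40% of the 10 Gbps ceiling
print(f"upstream:   {TEST_GBPS / 6:.0%}")   # 67% of the 6 Gbps ceiling
```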

Don’t expect this test to mean that cable companies will be offering fast symmetrical broadband any time soon. There is a long way to go from a first lab test to a product deployed in the field. Lab scientists will first work to perfect the DOCSIS 4.0 chip based upon whatever they found during the trial. It typically takes most of a year to create a new chip, and it would not be surprising for Comcast to spend several years and a few iterations solidifying the chip design. Assuming Comcast or some other cable company is ready to buy a significant quantity of the new chips, the chip would then enter the product design cycle at a manufacturer to be integrated into the CMTS core and home cable modems.

That’s the point when cable companies will face the tough choice of whether to pursue the new standard. When the new technology was announced last year, most of the CTOs of the big cable companies were quoted as saying they didn’t foresee implementing the new standard for at least a decade. This is understandable since the cable companies recently made the expensive upgrade to DOCSIS 3.1.

An upgrade to DOCSIS 4.0 isn’t going to be cheap. It means a rip-and-replace upgrade of all of the existing electronics, including the cable modems at every customer premises. DOCSIS 4.0 will also require network capacity to be increased to at least 1.2 GHz, which likely means replacing power taps and network amplifiers throughout the outside plant network.

There is also the bigger issue that the copper plant in cable networks is aging in the same manner as telco copper. There are already portions of many cable company networks that underperform today. Increasing the overall bandwidth of the network might result in the need for a lot of copper replacement. And that is going to create a pause for cable company management. While the upgrade to DOCSIS 3.1 was expensive, it’s going to cost more to upgrade again to DOCSIS 4.0. At what point does it make sense to upgrade instead to fiber rather than tackle another costly upgrade on an aging copper network?

There is then the market issue. The cable companies are enjoying an unprecedented monopoly position. Comcast and Charter together have over half of all broadband customers in the country. While there are households that are unhappy with Comcast or Charter broadband, most don’t have any competitive alternative. The FCC statistics and the screwball websites that claim that Americans have multiple broadband choices are all fiction. For the average urban or suburban family, the only option for functional broadband is the cable company.

This market power means that the cable companies are not going to rush into making upgrades to offer greater speeds just because the technology exists. Monopolists are always slow to introduce technology upgrades. Instead, the cable companies are likely to continue to increase customer speeds across the board. Both Charter and Comcast did this recently and increased download speeds (or at least the download speed they are marketing).

I expect that the early predictions that it would be a decade before we see widespread DOCSIS 4.0 were probably pretty prescient. However, a continued clamor for faster upload speeds or rapid deployment of fiber by competitors could always move up the time when the cable companies have to upgrade to DOCSIS 4.0 or fiber. But don’t let headlines like this make you think this is coming soon.


A 10-Gigabit Tier for Grants

One of the biggest flaws in the recent RDOF reverse auction was allowing fixed wireless technology to claim the same gigabit technology tier as fiber. The FCC should never have allowed this to happen. While there is a wireless technology that can deliver up to a gigabit of speed to a few customers under specific circumstances, fiber can deliver gigabit speeds to every customer in a network. The problem is particularly acute in a rural setting, where the short reach of gigabit wireless, at perhaps a quarter mile, is a huge limiting factor.

But rather than continue to fight this issue for grant programs, there is a much easier solution. It’s now easy to buy residential fiber technology that can deliver 10 gigabits of speed. Active Ethernet lasers capable of 10-gigabit speeds have been available for many years. In the last year, XGS-PON has finally come into a price range that makes it a good choice for a new passive fiber network – and the technology can deliver 10-gigabit speeds.
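
A rough sketch shows why a shared 10-gigabit PON can support gigabit tiers for every customer. The split ratios below are typical of PON deployments, but the worst-case framing is my own; in practice, customers rarely transmit simultaneously, so gigabit tiers hold up even at larger splits.

```python
# Worst-case per-home share of an XGS-PON if every home transmits at once.
PON_GBPS = 10
for split in (16, 32, 64):
    print(f"1:{split} split: {PON_GBPS * 1000 / split:.0f} Mbps per home")
# 1:16 -> 625 Mbps, 1:32 -> 312 Mbps, 1:64 -> 156 Mbps
```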

The FCC can eliminate the question of technology equivalency by putting fiber overbuilders into a new 10-gigabit tier. This would give fiber funding priority over all other technologies. Fixed wireless will likely never be capable of 10-gigabit speeds. Even if that is ever made possible decades from now, by then fiber will have moved on to the next, faster generation. Manufacturers are already looking at 40-gigabit speeds for the next generation of PON technology.

Cable company hybrid-fiber coaxial networks are not capable today of 10-gigabit speeds. These networks could possibly deliver speeds of around 6 or 7 gigabits, but only by removing all of the television signals and delivering only broadband.

I don’t know why it was so hard for the FCC to say no to gigabit fixed wireless technology. When the industry lobbied to allow fixed wireless into the gigabit tier, all the FCC had to do was ask to see a working demo of gigabit wireless in a rural farm environment where farms are far apart. The FCC should have insisted that the wireless industry demonstrate how every rural household in a typical RDOF area can receive gigabit speeds, and show how the technology overcomes distance and line-of-sight issues. There is no such demo because the wireless technology can’t do this – at least not without building fiber and establishing a base transmitter at each farm. The FCC really got suckered by slick PowerPoints and whitepapers when it should have instead asked to see a working demo.

Don’t get me wrong – I don’t hate the new wireless technologies. There are small towns and neighborhoods in rural county seats that could really benefit from them. The new meshed networks, if fed by fiber, can deliver superfast bandwidth to small pockets of households and businesses. This can be a really attractive and competitive technology.

But this is not fiber. Every rural community in America knows they want fiber. They understand that once the wires are in place, fiber is going to provide solutions for many decades into the future. I think fiber built right is a hundred-year investment. Nobody believes this to be true of fixed wireless. The radios are all going to be replaced many times over the next hundred years, and communities worry about having an ISP who will make that continual reinvestment.

But since there is such an easy way to fix this going forward, the arguments about gigabit wireless can be largely moot. If the FCC creates a 10-gigabit tier for grants, then only fiber will qualify. The fixed wireless folks can occupy the gigabit tier and leave most other technologies, like low-orbit satellite, to an even lower tier. The FCC made a mistake with RDOF that it can’t repeat going forward – the agency declared that other technologies are functionally equivalent to fiber, and that’s just not true.