
A 10-Gigabit Tier for Grants

One of the biggest flaws in the recent RDOF reverse auction was allowing fixed wireless technology to claim the same gigabit technology tier as fiber. The FCC should never have allowed this to happen. While there is a wireless technology that can deliver up to a gigabit of speed to a few customers under specific circumstances, fiber can deliver gigabit speeds to every customer in a network. This matters most in a rural setting, where the short reach of gigabit wireless, perhaps a quarter-mile, is a huge limiting factor for the technology.

But rather than continue to fight this issue for grant programs, there is a much easier solution. It’s now easy to buy residential fiber technology that can deliver 10 gigabits of speed. There have been active Ethernet lasers capable of 10-gigabit speeds for many years. In the last year, XGS-PON has finally come into a price range that makes it a good choice for a new passive fiber network – and the technology can deliver 10-gigabit download speeds.

The FCC can eliminate the question of technology equivalency by putting fiber overbuilders into a new 10-gigabit tier. This would give fiber funding priority over all other technologies. Fixed wireless will likely never be capable of 10-gigabit speeds. Even if that ever becomes possible decades from now, by then fiber will have moved on to the next faster generation. Manufacturers are already looking at 40-gigabit speeds for the next generation of PON technology.

Cable company hybrid fiber-coaxial networks are not capable today of 10-gigabit speeds. These networks could possibly deliver speeds of around 6 or 7 gigabits, but only by removing all of the television signals and delivering only broadband.

I don’t know why it was so hard for the FCC to say no to gigabit fixed wireless technology. When the industry lobbied to allow fixed wireless into the gigabit tier, all the FCC had to do was ask to see a working demo of wireless gigabit speeds in a rural farm environment where farms are far apart. The FCC should have insisted that the wireless industry demonstrate how every rural household in a typical RDOF area can receive gigabit speeds. They should have been made to show how the technology overcomes distance and line-of-sight issues. There is no such demo because the wireless technology can’t do this – at least not without building fiber and establishing a base transmitter at each farm. The FCC really got suckered by slick PowerPoints and whitepapers when it should instead have asked to see a working demo.

Don’t get me wrong – I don’t hate the new wireless technologies. There are small towns and neighborhoods in rural county seats that could really benefit from the technology. The new meshed networks, if fed by fiber, can deliver superfast bandwidth to small pockets of households and businesses. This can be a really attractive and competitive technology.

But this is not fiber. Every rural community in America knows they want fiber. They understand that once the wires are in place, fiber is going to be providing solutions for many decades into the future. I think that if fiber is built right, it’s a hundred-year investment. Nobody believes this to be true of fixed wireless. The radios are all going to be replaced many times over the next hundred years, and communities worry about whether they will have an ISP willing to make that continual reinvestment.

But since there is such an easy way to fix this going forward, these arguments about gigabit wireless become largely moot. If the FCC creates a 10-gigabit tier for grants, then only fiber will qualify. The fixed wireless folks can occupy the gigabit tier and leave most other technologies, like low-orbit satellite, to some even lower tier. The FCC made a mistake with RDOF that it can’t repeat going forward – the agency declared that other technologies are functionally equivalent to fiber, and it’s just not true.


Our Evolving Technologies

A client asked me recently for an update on all of the technologies used today to deliver broadband. The last time I talked about this topic with this client was three years ago. As I talked through each technology, it struck me that every technology we use for broadband is better now than it was three years ago. We don’t spend enough time talking about how the vendors in this industry keep improving their technology.

Consider fiber. I have recently been recommending that new fiber builders consider XGS-PON. While this technology was around three years ago, it was too expensive and cutting edge at the time for most ISPs to consider. But AT&T and Vodafone have built enough of the technology that the prices for the hardware have dropped to be comparable to the commonly used GPON technology. This means we now need to start talking about FTTP as a 10-gigabit technology – a huge increase in capacity that blows away every other technology. Some improvements are more subtle. The fiber used for wiring inside buildings is far more flexible and bendable than it was three years ago.

There have been big improvements in fixed wireless technology. Some of this improvement is due to the FCC getting serious about providing more spectrum for rural fixed wireless. During the last three years, the agency has approved CBRS spectrum and white space spectrum that is now being routinely used in rural deployments. The FCC also recently approved the use of 6 GHz WiFi spectrum that will add even more horsepower. There have also been big improvements in the radios. One improvement that rarely gets mentioned is new algorithms that speed up the wireless switching function. Three years ago, we talked about high-quality fixed wireless speeds of 25 Mbps to 50 Mbps, and now we’re talking about speeds over 100 Mbps in ideal conditions.

All three major cellular carriers are in the process of building out a much-improved fixed cellular broadband product. This has also benefited from new bands of frequencies acquired by the cellular carriers during the last three years. Three years ago, any customer with a cellular hotspot product complained about slow speeds and tiny monthly data caps. The new products allow for much greater monthly usage, up to unlimited, and speeds are better than three years ago. Speeds are still largely a function of how far a home is from the closest cell site, so this product is still dreadful for those without good cellular coverage – but it means improved broadband, with speeds up to 50 Mbps, for many rural households.

Three years ago, the low-orbit satellites from Starlink were just hype. Starlink now has over 1,000 satellites in orbit and is in beta test mode, with customers reporting download speeds from 50 Mbps to 150 Mbps. We’re also seeing serious progress from OneWeb and Jeff Bezos’s Project Kuiper, so this industry segment is on the way to finally becoming a reality. There is still a lot of hype, but that will fade when homes can finally buy the satellite broadband products – and when we finally understand speeds and prices.

Three years ago, Verizon was in the early testing stage of fiber-to-the-curb. After an early beta test and a pause to improve the product, Verizon is now talking about offering this product to 25 million homes by 2025. This product uses mostly millimeter-wave spectrum to get from the curb to homes. For now, the speeds are reported to be about 300 Mbps, but Verizon says this will get faster.

We’ve also seen big progress with millimeter-wave mesh networks. Siklu has a wireless product that it touts as an ideal way to bring gigabit speeds to a small shopping district. The technology delivers a gigabit connection to a few customers, and the broadband is then bounced from those locations to others. Oddly, some companies are talking about using this product to satisfy the rural RDOF grants, which is puzzling since the transmission distance is only a quarter-mile and the technology requires excellent line-of-sight. But expect to see this product pop up in small towns and retail districts all over the country.

Cable company technology has also improved over the last three years. During that time, a lot of urban areas saw the upgrade to DOCSIS 3.1, with download speeds now up to a gigabit. CableLabs also recently announced DOCSIS 4.0, which will allow for symmetrical gigabit-plus speeds but won’t be available for 3-5 years.

While you never hear much about it, DSL technology over copper has gotten better. There are new versions of G.Fast being used to distribute broadband inside apartment buildings that are significantly better than what was on the market three years ago.

Interestingly, the product that got the most hype during the last three years is 5G. If you believe the advertising, 5G is now everywhere. The truth is that there is no actual 5G in the market yet, and this continues to be marketing hype. The cellular carriers have improved their 4G networks by overlaying new spectrum, but we’re not going to see 5G improvements for another 3-5 years. Unfortunately, I would bet that the average person on the street would say that the biggest recent telecom breakthrough has been 5G, which I guess shows the power of advertising and hype.


The 6G Hype is Already Starting

Even though 5G hasn’t yet made it onto any cellphone, the wireless vendor industry is already off and running, looking at the next generation of wireless technology that has been dubbed 6G. This recent article describes the European Union Hexa-X project that started in January to develop specifications for next-generation wireless technology using terahertz spectrum. The initiative will be led by Nokia Bell Labs and Ericsson. Similar research is being done elsewhere around the world by companies such as Huawei, NTT, and Samsung.

6G wireless will explore using the high frequencies between 100 GHz and 1 THz (terahertz), which are collectively being referred to as terahertz frequencies. These are radio waves that sit just below the frequencies of infrared light. These frequencies have such short wavelengths that, at the upper end, they could carry as much as 1,000 times more bandwidth than the frequencies used in cellphones today.
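As a rough sanity check on that 1,000-times figure, here is a minimal back-of-the-envelope sketch. It leans on the rule of thumb that the usable channel width of a radio system scales roughly with its carrier frequency, and it assumes a typical 1 GHz mid-band cellular carrier as the comparison point (my assumption, not a figure from any 6G specification).

```python
# Back-of-the-envelope check of the "1,000 times more bandwidth" claim.
# Assumption: usable channel width scales roughly with carrier frequency,
# so the ratio of carrier frequencies gives a crude upper bound.

typical_cellular_carrier_hz = 1e9   # ~1 GHz, a typical mid-band cellular carrier (assumed)
upper_terahertz_carrier_hz = 1e12   # 1 THz, the top of the range 6G research is exploring

print(upper_terahertz_carrier_hz / typical_cellular_carrier_hz)  # -> 1000.0
```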

But there is a big trade-off for all of that bandwidth capacity: these frequencies travel only short distances, measured in feet, before starting to dissipate. These frequencies will not pass through obstacles and need a clear line of sight.

It’s likely that any 6G technology will be used for indoor data transmission, and 6G could become the fastest way to move bandwidth between devices within a room. The bandwidth capabilities of these super-high frequencies could finally fully enable technologies like telepresence (I finally get a holodeck!) or cobots (interactive robots).

Of course, as with any new technology, the hype has already started. Samsung recently released a whitepaper that said that using terahertz waves for cellphones is ‘inevitable’. Long before we try to somehow tame terahertz frequencies in the wild, we need to first figure out millimeter-wave cellular technologies. The current use of millimeter-wave hotspots in downtown metropolitan areas has provided cover for cellular carriers to hype gigabit-speed 5G – but this is a miserable technology in terms of usefulness and reliability. The millimeter-wave spectrum is blocked by everything in the environment, including the body of the user.

More importantly, I’ve never heard anybody make a coherent case for why we need to deliver gigabit or faster speeds to cellphones. If we modify cellphones to process data that quickly, we’ll need to find a way to recharge the phones every hour. While I understand why engineers go gaga over the idea of delivering a hundred or a thousand times more data to a cellphone, we need a reality check to ask why anybody would want to do that. Smartphones might be the most important technology developed in this century, but there seems to be little need to turn a cellphone into a walking data center unless we also want to start carrying around small air-conditioning units to keep the chips cool.

It makes sense that device makers like Nokia and Ericsson would get excited over the next generation of wireless devices. It’s not hard to envision entirely new technologies twenty years from now that take advantage of terahertz frequencies. Seriously, who is not going to want a holodeck in their living room?

Interestingly, the introduction of 6G is likely going to be of less value to the big cellular carriers. These companies have already started to lose the indoor battle for 5G. Verizon and AT&T had once envisioned a world where homeowners would buy monthly 5G data plans for all of the connected devices in the home. But the FCC already gutted that idea by releasing 6 GHz spectrum for free use, which manufacturers are marrying to the new WiFi 6 standard. As is inevitable, a free solution that doesn’t require a monthly subscription is going to capture most of the indoor market. We’re not going to be buying a 5G subscription for our 8K TV when we have WiFi 6 operating from a $100 router.

One has to imagine the same future for terahertz frequencies. The FCC will eventually create at least one band of terahertz frequency that anybody can use, and that’s the frequency that will power the superfast devices in our homes and offices.

One thing that the early 6G hype fails to mention is the fiber networks that will be needed to fuel superfast applications. We aren’t going to be operating a holodeck using a measly 1 Gbps broadband connection. Twenty years from now, techie households will be screaming for the delivery of 100 Gbps bandwidth to support their terahertz gaming applications.


The Gigabit Wireless Controversy

One of the big controversies in the RDOF auction was that the FCC allowed three of the top ten grant winners to bid using gigabit wireless technology. These were Starry (Connect Everyone), Resound Networks, and Nextlink (AMG Technology). By bidding in the gigabit tier, these technologies were given the same technology and dollar weighting as somebody bidding to build fiber-to-the-premises. There was a big outcry from fiber providers, who claim that these bidders gained an unfair advantage because the wireless technology will be unable to deliver gigabit speeds in rural areas.

Fiber providers say that the bidding with gigabit wireless violates the intent of the grants. Bidding in the gigabit tier should mean that an ISP can deliver a gigabit product to every customer in an RDOF grant area. Customers don’t have to buy a gigabit product, but the capability to provide that speed to every customer must be there. This is something that comes baked-in with fiber technology – a fiber network can deliver gigabit speeds (or 10-gigabit speeds these days) to any one customer, or easily give it to all customers.

There is no denying that there is wireless technology that can deliver gigabit speeds. For example, there are point-to-point radios using millimeter-wave spectrum that can deliver a gigabit path for up to two miles or a multi-gigabit path for perhaps a mile. But this technology delivers the bandwidth to only a single point. This is the technology that Starry and others use in downtown areas to beam a signal from rooftop to rooftop to serve apartment buildings, with the bandwidth shared among all of the tenants in the building. This technology delivers up to a gigabit to a building, but something less to tenants. We have a good idea of what this means in real life because Starry publishes the average speed of its customers. In March 2021, the Starry website said that its average customer received 232 Mbps download and 289 Mbps upload. That’s a good bandwidth product, but it is not gigabit broadband.

There is a newer technology that is more suited for areas outside of downtown metropolitan areas. Siklu has a wireless product that uses unlicensed spectrum in the V-band at 60 GHz and around 70 GHz. This uses a Qualcomm chip that was developed for the Facebook Terragraph technology. A wireless base station that is fiber-fed can serve up to 64 customers – but the catch is that the millimeter-wave spectrum used in this application travels only about a quarter of a mile. Further, this spectrum requires a nearly perfect line-of-sight.

The interesting feature of this technology is that each customer receiver can also retransmit broadband to make an additional connection. Siklu envisions a network where four or five hops are made from each customer to extend broadband around the base transmitter. Siklu advertises this product as being ideal for small-town business districts where a single fiber-fed transmitter can reach the whole downtown area through the use of the secondary beams. With a handful of customers on a system, this could deliver a gigabit wireless product. But as you start adding secondary customers, this starts acting a lot like a big urban apartment building, and the shared speeds likely start looking like what Starry delivers in urban areas – fast broadband, but not broadband that meets the definition that every customer can receive a gigabit.

The real catch for this technology comes in the deployment. The broadband strength is pretty decent if every base transmitter is on fiber. But ISPs using the technology are likely going to cut costs by feeding additional base stations with wireless backhaul. That’s when the bandwidth starts to get chopped down. An RDOF winner would likely have to build a lot of fiber and have transmitters every mile to get the best broadband speeds – but if they dilute the backhaul by using wireless connections between transmitters, or by spacing base stations farther apart, then speeds will drop significantly.

The other major issue with this technology is that it’s great for a small-town business district, but how will it fare in the extremely rural RDOF areas? The RDOF grants cover some of the most sparsely populated areas in the country. The Siklu technology will be quickly neutered by the quarter-mile transmission distance when customers live more than a quarter-mile apart. Couple this with line-of-sight issues and it seems extremely challenging to reach a lot of the households in most RDOF areas with this technology.

I come down on the side of the fiber providers in this controversy. In my mind, an ISP doesn’t meet the grant requirements if it can’t reach every customer in an RDOF area. An ISP also doesn’t meet the gigabit grant requirements if only some customers can receive gigabit speeds. That’s the kind of bait-and-switch we’ve had for years, thanks to an FCC that has allowed an ISP to bring fast broadband to one customer in a Census block and declare that everybody has access to fast speeds.

It’s a shame that I feel obligated to come to this conclusion because, deployed well, these wireless technologies can probably bring decent broadband to a lot of homes. But if these technologies can’t deliver a gigabit to everybody, then the ISPs gained an unfair advantage in the RDOF grant bidding. When I look at the widely spaced homes in many RDOF areas, I can’t picture a wireless network that can reach everybody while also delivering gigabit capabilities. The only way to make this work would be to build fiber close to every customer in an RDOF area – and at that point, the wireless technology would be nearly as costly as FTTH and a lot more complicated to maintain. I think the FCC bought the proverbial pig in a poke when it approved rural gigabit wireless.


Next Generation PON is Finally Here

For years, we’ve been checking the prices of next-generation passive optical network (PON) technology as we help clients consider building a new residential fiber network. As recently as last year there was still at least a 15% price penalty for buying 10 Gbps PON technology using the NG-PON2 or XGS-PON standards. But recently we got a quote for XGS-PON that is nearly identical in price to the GPON that’s been the industry standard for over a decade.

New technology is usually more expensive at first for two reasons. Manufacturers hope to reap a premium price from those willing to be early adopters. You’d think it would be just the opposite, since the first buyers of new technology are the guinea pigs who have to help debug all of the inevitable problems that crop up in new technology. But the primary reason that new technology costs more is economy of scale – prices don’t drop until manufacturers start producing a new technology in large quantities.

The XGS-PON standard provides a lot more bandwidth than GPON. The industry standard GPON technology delivers 2.4 Gbps download and 1 Gbps upload speed to a group of customers – most often configured at 32 passings. XGS-PON technology delivers 10 Gbps downstream and 2.5 Gbps upstream to the same group of customers—a big step up in bandwidth.
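To put those numbers in perspective, here is a minimal back-of-the-envelope sketch of the worst-case bandwidth share per home on a single PON, assuming the common 32-home split described above. Real traffic is bursty, so nobody ever sees these worst-case floors, but the comparison shows the headroom that XGS-PON adds.

```python
# Rough comparison of per-home bandwidth share on one PON port under GPON
# and XGS-PON, assuming the common 32-home split. Worst case only; real
# traffic is bursty, so actual speeds are far higher than these floors.

GPON = {"down_gbps": 2.4, "up_gbps": 1.0}
XGS_PON = {"down_gbps": 10.0, "up_gbps": 2.5}
HOMES_PER_PON = 32

def worst_case_share_mbps(pon, homes=HOMES_PER_PON):
    """Bandwidth per home if every home on the PON transmitted flat-out at once."""
    return {
        "down": pon["down_gbps"] * 1000 / homes,
        "up": pon["up_gbps"] * 1000 / homes,
    }

print("GPON   :", worst_case_share_mbps(GPON))     # ~75 Mbps down, ~31 Mbps up
print("XGS-PON:", worst_case_share_mbps(XGS_PON))  # ~313 Mbps down, ~78 Mbps up
```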

The price has dropped for XGS-PON primarily due to its use by AT&T in the US and Vodafone in Europe. These large companies and others have finally purchased enough gear to drive down the cost of manufacturing.

The other next-generation PON technology is not seeing the same price reductions. Verizon has been the only major company pursuing the NG-PON2 standard and is using it in networks to support large and small cell sites. But Verizon has not been building huge amounts of last-mile PON technology and seems to have chosen millimeter-wave wireless technology as the primary technology for reaching into residential neighborhoods. NG-PON2 works by having tunable lasers that can function at several different light frequencies. This would allow more than one PON to be transmitted simultaneously over the same fiber but at different wavelengths. This is a far more complex technology than XGS-PON, which basically has faster lasers than GPON.

One of the best features of XGS-PON is that some manufacturers are offering it as an overlay onto GPON. An overlay means swapping out some cards in a GPON network to provision some customers with 10 Gbps speeds, so anybody using GPON technology ought to be able to ease into the faster technology without a forklift upgrade.

XGS-PON is not a new technology and has been around for about five years. But the price differential stopped most network owners from considering it. Most of my clients tell me that their residential GPON networks average around 40% utilization, so there have been no performance reasons to upgrade to faster technology. But averages are just that, and some PONs (neighborhood nodes) are starting to get a lot busier, meaning that ISPs are having to shuffle customers to maintain performance.

With the price difference finally closing, there is no reason for somebody building a new residential network to not buy the faster technology. Over the next five years as customers start using virtual reality and telepresence technology, there is likely to be a big jump up in bandwidth demand from neighborhoods. This is fueled by the fact that over 9% of homes nationwide are now subscribing to gigabit broadband service – and that’s enough homes for vendors to finally roll out applications that can use gigabit speeds. I guess the next big challenge will be in finding 10 gigabit applications!


Demystifying Fiber Terminology

It’s common when a community is getting fiber to have the engineers tossing around technical terms that a layperson is not going to understand. Today’s blog will try to demystify the more common fiber terminology that you’ll likely hear.

Absorption: This is the phenomenon where the natural impurities in glass absorb some of the light signal inside of a fiber path.

ADSS (all-dielectric self-supporting): This is a hardened fiber cable that can be hung directly without a supporting messenger wire. It is primarily used for hanging fiber near electric wires (this avoids having a metal wire close to the power lines).

Attenuation: This is a term that defines the amount of light that is lost during a fiber transmission due to absorption and scattering. Attenuation is usually measured in decibels (dB) per kilometer (see the worked loss-budget sketch after this glossary).

Attenuator: This is a device that is used to purposefully reduce signal power in a fiber optic link.

Back Reflection (BR): This refers to any situation that causes the light signal inside a fiber to change direction. The most common form of back reflection happens at an interface between a lit fiber and air.

Buffer: This is the protective outer layer of material that is in direct contact with the fiber. Manufacturers offer a wide range of different buffering materials.

Insertion Loss: This describes the phenomenon where the light signal is interrupted and diminished when it encounters a fiber splice point or some electronic component in the network.

Messenger: This refers to a galvanized steel cable that is strung between poles and used to support fiber cable.

Multimode: This is a fiber with a core large enough to carry multiple modes (paths) of light simultaneously. Multimode fibers have a larger core than single-mode fiber and come in two typical sizes – 50µm (microns) or 62.5µm, compared to 8-10µm for single-mode fiber. Multimode fiber is most commonly used for short transmission distances.

Return Loss: This measures the amount of light that is reflected back toward the source, expressed in decibels. The higher the return loss, the less light is reflected, and the better.

Scattering: This is the other primary reason for signal loss in a fiber (along with absorption). Scattering occurs when light collides with small particles in the fiber path.

Single Mode: This is a fiber with a small core size of 8-10 µm (microns) that carries light in a single mode (path). This fiber is used for long transmission distances at high speeds.

Wavelength: This is a measure of the color of light, expressed in nanometers or microns. The most typical wavelengths used in fiber systems are 850nm, 1310nm, and 1550nm.
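As a worked illustration of how the attenuation and insertion-loss terms above combine in practice (the sketch referenced under Attenuation), here is a minimal loss-budget calculation. The per-kilometer, per-splice, and per-connector values are typical planning assumptions of mine, not figures from this glossary or from any vendor data sheet.

```python
# Illustrative fiber loss budget combining attenuation and insertion loss.
# All loss values below are assumed planning numbers, not vendor specifications.

ATTENUATION_DB_PER_KM = 0.35   # assumed single-mode attenuation at 1310nm
SPLICE_LOSS_DB = 0.1           # assumed insertion loss per fusion splice
CONNECTOR_LOSS_DB = 0.5        # assumed insertion loss per connector pair

def link_loss_db(km, splices, connectors):
    """Total end-to-end loss in dB for a simple point-to-point fiber link."""
    return (km * ATTENUATION_DB_PER_KM
            + splices * SPLICE_LOSS_DB
            + connectors * CONNECTOR_LOSS_DB)

# Example: a 10 km route with 4 mid-span splices and 2 connector pairs
print(f"{link_loss_db(10, 4, 2):.1f} dB")  # -> 4.9 dB
```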


High Precision GPS

Sometimes important changes for our industry come from outside the industry. We’re getting close to seeing affordable GPS devices that are accurate to within a centimeter. Higher-precision GPS will be invaluable for ISPs and broadband technologies.

Normal GPS isn’t highly accurate. For example, a GPS-enabled smartphone is only accurate to within 4.9 meters (16 feet) under the open sky. Accuracy is even worse around tall buildings, trees, bridges, or other obstacles that can block or deflect signals from satellites. This is plenty of accuracy for providing driving directions, which is the use of GPS that most people are familiar with – although occasionally you’ll get bad driving directions in a major city center when your mapping software thinks you’re on a different street.

Applications like driving directions use a single frequency, with the GPS device (smartphone or car) relying on that one signal from the satellites. The GPS satellites operated by the government can theoretically provide accuracy to within 2.3 feet. But accuracy is reduced by local factors such as atmospheric conditions, signal blockage, the quality of the receiver, and the position of the satellites in relation to the user. All of these factors contribute to the reduced accuracy of the normal cellphone or car GPS unit.

High-precision GPS has been around for a while. But the current generation of high-precision technology has not been suitable for applications like driving. High-precision GPS devices work by using multiple frequencies and connecting to two GPS satellites. The technology uses complex mathematical models to calculate precise locations. This would normally require three signals (triangulation), but the devices do a good job of determining position based upon two signals. High-precision devices are also expensive to operate, with an annual subscription as high as $1,000 per device.

The new generation of GPS devices will overcome several of the major shortfalls. The new devices do away with the need for two frequencies. That limitation meant that older high-precision devices wouldn’t work while moving in a car – the needed mathematical calculations can’t keep up in a receiver that’s moving.

The new devices instead are using a clever solution. Each device can create a real-time model of the environment that incorporates all of the known factors in the region that affect GPS accuracy. In essence, a single cellphone is preloaded with a simulation of the GPS environment, and the cellphone can then correct for expected distortions in the GPS measurement – meaning much higher accuracy in locations. These regional models are updated during the day to account for changes in temperature and weather and are beamed to any device trying to use GPS.

Higher-precision GPS opens up applications that were unattainable in the past. The simplest application will be precision locations for things like handholes. Technicians will no longer need to search through tall grass along a rural road to find a handhole because with the new GPS they’ll know the exact location.

Better GPS will be invaluable in locating existing utilities and in siting new buried construction. An engineer that first walks a route can define exactly where to dig or where to place a buried fiber.

Better GPS will be invaluable to rural broadband services like precision agriculture. Once a farmer precisely maps a field, they can tell a self-driving tractor or harvester exactly where to drive without needing a driver in the cab of each device.

And better GPS will help daily work functions in numerous ways we will discover once it’s routinely available. Somebody will be able to survey a site for a new hut and precisely define the location of a concrete pad or a fence without having to return to oversee the construction process. Companies will be able to precisely tell a homeowner where to find a buried conduit without having to always wait for a locator. We’ll quickly get used to more precise field measurements just like we all quickly adapted to trusting GPS driving directions.


The WiFi 6 Revolution

We’re edging closer every day to seeing WiFi 6 in our homes. WiFi 6 will be bolstered by the newly approved 6 GHz frequency, and the combination of WiFi 6 and 6 GHz spectrum is going to revolutionize home broadband.

I don’t think many people understand how many of our home broadband woes are caused by current WiFi technology. WiFi has been an awesome technology that freed our homes from long category 5 wires everywhere, but WiFi has a basic flaw that became apparent when homeowners started to buy hordes of WiFi-enabled devices. WiFi routers are lousy at handling multiple requests for simultaneous service. It’s not unusual for 25% or more of the bandwidth in a home to get eaten by WiFi interference issues.

The WiFi standard was designed to give every device an equal opportunity to use a broadband network. What that means in practical use is that a WiFi router is designed to stop and start to give every broadband device in range a chance to use the available spectrum. Most of us have numerous WiFi devices in our homes, including computers, tablets, TVs, cellphones, and a wide range of smart home devices, toys, etc. Behind the scenes, your WiFi router pauses when you’re downloading a big file to see if your smart thermostat or smartphone wants to communicate. Each pause is quick and might seem imperceptible, but during the time that the router is trying to connect to your thermostat, it’s not processing your file download.

To make matters worse, your current WiFi router also pauses for all of your neighbor’s WiFi networks and devices. Assuming your network is password-protected, these nearby devices won’t use your broadband – but they still cause your WiFi router to pause to see if there is a demand for communications.

The major flaw in WiFi is not the specification that allows all devices to use the network, but the fact that we currently try to conduct all of our WiFi communications through only a few channels. The combination of WiFi 6 and 6 GHz is going to fix a lot of the problems. The FCC approved the 6 GHz spectrum for WiFi use in April 2020. This quadruples the amount of bandwidth available for WiFi. More importantly, the new spectrum opens multiple new channels (adds fourteen 80 MHz channels and seven 160 MHz channels). This means homes can dedicate specific uses to a given channel – direct computers to one channel, smart TVs to another, cellphones to yet another channel. You could load all small-bandwidth devices like thermostats and washing machines onto a single channel – it won’t matter if that channel is crowded for devices that use tiny bandwidth. Separating devices by channel will drastically reduce the interference and delays that come from multiple devices trying to use the same channel.
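A quick piece of arithmetic on those channel counts, purely illustrative on my part: the 80 MHz and 160 MHz channel plans are two different ways of carving up the same new 6 GHz band, not additive blocks of spectrum stacked on top of each other.

```python
# The 6 GHz WiFi band runs roughly 5.925-7.125 GHz, about 1200 MHz of new spectrum.
# The fourteen 80 MHz channels and the seven 160 MHz channels each fill essentially
# that whole band (minus guard bands); they are alternative channel widths.

band_mhz = 1200
print(14 * 80)    # -> 1120 MHz covered by the 80 MHz channel plan
print(7 * 160)    # -> 1120 MHz covered by the 160 MHz channel plan
print(band_mhz)   # -> 1200 MHz of total new spectrum
```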

Taking advantage of WiFi 6 is going to require devices that use the WiFi 6 standard and that enable the 6 GHz spectrum. We’re just starting to see devices that take advantage of WiFi 6 and 6 GHz in stores.

It looks like the first readily available use of the new technology is being marketed as WiFi 6E. This application is being aimed at wireless devices. Samsung has released WiFi 6E in the Galaxy S21 Ultra phone. It’s been rumored that WiFi 6E will be in Apple’s iPhone 13 handsets. Any phone using Qualcomm’s FastConnect 6700 or 6900 chips will be able to use the 6 GHz spectrum. That’s likely to include laptop computers in addition to cellphones.

It’s going to take a while to bring the new technology into practical use. You can buy routers today that will handle WiFi 6E from Netgear and a few other vendors, meaning that you could use the new spectrum at home for smartphones and devices with a 6E chip. The advantage of doing so would be to move cellphones off of the spectrum being used for applications like gaming, where WiFi interference is a material issue. The new WiFi 6E chips will also handle bandwidth speeds greater than 1 Gbps, which might benefit a laptop but is largely lost on a smartphone. It’s going to be a while until WiFi 6 is available at work or in public places – but over a few years, it will be coming.

The home WiFi network of the future is going to look drastically different than today’s network. One of the downsides of the 6 GHz spectrum is that it doesn’t travel as well through walls as current WiFi, and most homes are going to have to migrate to meshed networks of routers. Smart homeowners will assign various devices to specific channels and I assume that router software will make this easy to do. Separating WiFi devices to different channels is going to eliminate almost all of the WiFi interference we see today. Big channels of 6 GHz spectrum will mean that devices can grab the bandwidth needed for full performance (assuming the home has good broadband from an ISP).


Is Fiber a Hundred Year Investment?

I think every client who is considering building a fiber network asks me how long the fiber is going to last. Their fear is having to spend the money at some future point to rebuild the network. Recently, my response has been that fiber is a hundred-year investment – and let me explain why I say that.

We’re now seeing fiber built in the 1980s becoming opaque or developing enough microscopic cracks to impede the flow of light. A fiber built just after 1980 is now forty years old, and the fact that some fiber routes are now showing signs of aging has people worried. But fiber cable has improved a lot over the last forty years, and fiber purchased today is going to avoid many of the aging problems experienced by 1980s fiber. Newer glass is clearer and not likely to grow opaque. Newer glass is also a lot less susceptible to forming microcracks. The sheathing surrounding the fiber is vastly improved and helps to keep light transmissions on path.

We’ve also learned a lot about fiber construction since 1980. It turns out that a lot of the problems with older fiber are due to the stress imposed on the fiber during the construction process. Fiber used to be tugged and pulled too hard and the stress from construction created the places that are now cracking. Fiber construction methods have improved, and fiber enters service today with fewer stress points.

Unfortunately, the engineers at fiber manufacturers won’t cite a life for fiber. I imagine their lawyers are worried about future lawsuits. Manufacturers also understand that factors like poor construction methods or suffering constant fiber cuts can reduce the life of a given fiber. But off the record, I’ve had lab scientists at these companies conjecture that today’s fiber cable, if well handled, ought to be good for 75 years or more.

That still doesn’t necessarily get us to one hundred years. It’s important to understand that the cost of updating fiber is far less than the cost of building the initial fiber. The biggest cost of building fiber is labor. For buried fiber, the biggest cost is getting the conduit into the ground. There is no reason to think that conduit won’t last for far more than one hundred years. If a stretch of buried fiber goes bad, a network owner can pull a second fiber through the tube as a replacement – without having to pay again for the conduit.

For aerial fiber, the biggest cost is often the make-ready effort to prepare a route for construction, along with the cost of installing a messenger wire. To replace aerial fiber usually means using the existing messenger wire and no additional make-ready, so replacing aerial fiber is also far less expensive than building new fiber.

Economists define the economic life of any asset to be the number of years before an asset must be replaced, either due to obsolescence or due to loss of functionality. It’s easy to understand the economic life of a truck or a computer – there comes a time when it’s obvious that the asset must be replaced, and replacement means buying a new truck or computer.

But fiber is a bit of an unusual asset where the asset is not ripped out and replaced when it finally starts showing end-of-life symptoms. As described above, it’s much cheaper than the original construction costs to bring a replacement fiber to an aerial or buried fiber route. Upgrading fiber is more akin to upgrading a properly constructed building – with proper care buildings can last for a long time.

Many similar utility assets are not like this. My city is in the process today of upgrading a few major water mains that unbelievably used wooden water pipes a century ago. Upgrading the water system means laying down an entirely new water pipe to replace the old one.

It may sound a bit like a mathematical trick, but the fact that replacement of fiber doesn’t mean a 100% replacement cost means that the economic life is longer than with other assets. To use a simplified example, if fiber needs replacement every sixty years, and the cost of the replacement requires only half of the original cost, then the economic life of the fiber in this example is 120 years – it takes that long to have to spend as much as the original cost to replace the asset.
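Here is that simplified example expressed as a quick calculation. The 60-year replacement interval and the 50% replacement cost are the illustrative assumptions from the paragraph above, not engineering estimates.

```python
# Simplified economic-life arithmetic from the example above: economic life is
# treated as the time it takes for cumulative replacement spending to equal
# the original construction cost.

original_cost = 1.0               # normalize the original construction cost to 1
replacement_cost_fraction = 0.5   # replacing fiber reuses the conduit or messenger, so ~half the cost
replacement_interval_years = 60   # assumed physical life of each fiber generation

economic_life_years = replacement_interval_years * (original_cost / replacement_cost_fraction)
print(economic_life_years)  # -> 120.0
```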

I know that people who build fiber want to know how long it’s going to last, and we just don’t know. We know that if fiber is constructed properly, it’s going to last a lot longer than the 40 years we saw from 1980s fiber. We also know that in most cases replacement doesn’t mean starting from scratch. Hopefully, those facts will give comfort that the average economic life of fiber is something greater than 100 years – we just don’t know how much longer.


Automation and Fiber

We have clearly entered the age of robots, which can be witnessed in new factories where robots excel at repetitive tasks that require precision. I read an interesting blog at Telescent that talks about using robots to perform routine tasks inside large data centers. Modern data centers are mostly rooms full of huge numbers of switches and routers, and those devices require numerous fiber connections.

The blog talks about the solvable challenges of automating the process of performing huge volumes of fiber cross-connects in data centers. Doing cross-connects with robots would allow for fiber connections to be made 24/7 as needed while improving accuracy. Anybody who has ever been in a big data center can appreciate the challenge of negotiating the maze of fibers running between devices. The Telescent blog predicts that we’ll be seeing the accelerated use of robots in data centers over the next few years as robot technology improves.

This raises the interesting question of whether we’ll ever see robots in fiber networks. As an industry, we’ve already done a good job of automating the most repetitive tasks in our telco, cable, and cellular central offices. Most carriers have automated functions like activating new customers, changing products and features, and disconnecting customers. This has been accomplished through software, and the savings for automation software are significant, as described in this article from Cisco.

But is there a future in the telecom industry for physical robot automation? I look around the industry and the most labor-intensive and repetitive processes are done while building new networks. There is probably no more meticulous and repetitive task than splicing fibers during the construction process or when fixing damaged fibers. Splicing fiber is almost the same process used in the past to splice large copper telephone cables. A technician must match the same fiber from both sheaths to create the needed end-to-end connection in the fiber. This isn’t too hard to do when splicing a 12-fiber cable but is challenging when splicing 144-count or 288-count fibers in outdoor conditions. It is even more challenging when making emergency repairs on aerial fiber in a rainstorm or snowstorm in the dark.

This is the kind of task that robots could master and perform perfectly. It’s not hard to imagine feeding both ends of fiber into a robotized box and then just waiting for the robot to make all of the needed connections and splices perfectly, regardless of the time of day or weather conditions.

I had a recent blog that talked about the shortage of experienced telecom technicians, and splicers are already one of the hardest technicians for construction companies to find. As we keep expanding fiber construction, we’re liable to find projects that get bogged down due to a lack of splicers.

I have no idea if any robot company has even thought about automating the splicing function. We are in the infancy of introducing robots into the workplace and there are hundreds of other repetitive tasks that are likely to be automated before fiber splicing. There might be other functions in the industry that can also be automated if robots get smart enough. The whole industry would emit a huge sigh of relief if robots could tackle make-ready work on poles.