An Attack on WiFi Spectrum

A little over a year ago the FCC approved the use of 1,200 MHz of spectrum in the 6 GHz band for public use – for new WiFi. WiFi is already the most successful deployment of spectrum ever. A year ago, Cisco predicted that by 2022 WiFi would be carrying more than 50% of global IP traffic.

These are amazing statistics when you consider that WiFi has been limited to using 70 MHz of spectrum in the 2.4 GHz spectrum band and 500 MHz in the 5 GHz spectrum band. The additional 1,200 MHz of spectrum will vastly expand the capabilities of WiFi. WiFi performance was already slated to improve due to the introduction of WiFi 6 technology. Adding the new spectrum will drive WiFi performance to a new level. The FCC order adds seven 160 MHz channels to the WiFi environment (or alternatively, fifty-nine 20 MHz channels). For the typical WiFi environment, such as a home in an urban setting, this is enough new channels that big-bandwidth devices ought to be able to grab a full 160 MHz channel. This is going to increase the performance of WiFi routers significantly by allowing homes or businesses to separate devices by channel to avoid interference.
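The channel arithmetic above is simple enough to sketch (illustrative straight division only; the FCC's actual channel plan comes out to fifty-nine rather than a clean sixty 20 MHz channels once band edges are accounted for):

```python
# Rough channel counts for the new 1,200 MHz of 6 GHz spectrum
# (straight division; the real FCC channel plan yields 59 rather
# than 60 twenty-MHz channels once band edges are accounted for)
NEW_SPECTRUM_MHZ = 1200

for width_mhz in (20, 40, 80, 160):
    count = NEW_SPECTRUM_MHZ // width_mhz
    print(f"{width_mhz:>3} MHz channels: {count}")
```
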

One minor worry about the 6 GHz band is that it isn’t being treated the same everywhere. China has decided to allocate the entire 6 GHz spectrum band to 5G. Europe has allocated only 500 MHz for WiFi with the rest going to 5G. Other places like Latin America have matched the US allocation and are opting for a greatly expanded WiFi. This means that future WiFi devices won’t be compatible everywhere and will vary in how they handle the 6 GHz spectrum. That’s not the ideal situation for a device maker, but this likely can be handled through software in most cases.

The GSMA, which is the worldwide association for large cellular carriers, is lobbying for the US to allow 6 GHz to be used for 5G. They argue that since the 6 GHz spectrum is available to the public, cellular carriers ought to be able to use it like anybody else. They’d like to use it for License Assisted Access (LAA), which would allow the cellular carriers to use the spectrum for cellular broadband. If allowed, cellular traffic could flood the spectrum in urban areas and kill the benefits of 6 GHz for WiFi.

This is not the first time this issue was raised. The cellular industry lobbied hard to be able to use LAA when the FCC approved 5 GHz spectrum for WiFi. Luckily, the FCC understood the huge benefits of improved WiFi and chose to exclude cellular carriers from using the spectrum.

It would be a huge coup for cellular carriers to get to use the 6 GHz spectrum because they’d get it for free at a time when they’ve paid huge dollars for 5G spectrum. The FCC already heard these same arguments when they made the 6 GHz decision, so hopefully, the idea goes nowhere.

I talk to a lot of ISPs that tell me that poor WiFi performance is to blame for many of the perceived problems households have with broadband. Inefficient and out-of-date routers, along with situations where too many devices are trying to use only a few channels, are causing many of the problems with broadband. The 6 GHz WiFi spectrum will bring decades of vastly improved WiFi performance. It’s something that every homeowner will recognize immediately when they connect a properly configured WiFi router using the 6 GHz spectrum.

For now, there are not many devices that are ready to handle the new WiFi spectrum and WiFi 6 together. Some cellphones are now coming with the capability, and as this starts getting built into chips it will start working for laptops, tablets, PCs, and smart televisions. But homes will only see the real advantage over time as they upgrade WiFi routers and the various devices.

Interestingly, improved WiFi is a direct competitor for the cellular carriers in the home. The carriers have always dreamed of being able to sell subscriptions for homes to connect our many devices. WiFi allows for the same thing with just the cost of buying a new router. It would be an obvious boon to cellular carriers to kill off the WiFi competitor while also getting their hands on free spectrum.

Hopefully, the FCC will reject this argument as something that has already been decided. The GSMA argues that 5G will bring trillions of dollars in benefits to the world – but it can still do that without this spectrum. The benefit of improved WiFi has a huge value as well.

Fast Polymer Cables

Scientists and engineers are always looking for ways to speed up and more efficiently configure computing devices to maximize the flow of data. There are a lot of applications today that require the exchange of huge volumes of data in real-time.

MIT scientists have created a hair-like plastic polymer cable that can transmit data ten times faster than copper USB cables. The scientists recently reported speeds on the new cables in excess of 100 gigabits per second. The new fibers mimic the best characteristics of copper cable in that electronic signals can be conveyed directly from device to device.

Another interesting characteristic of polymer cables is that it’s possible to measure the flow of electrons through each cable from outside – something that is impossible to do with fiber optic cables. This is a key function that can be used to direct the flow of data at the chip level in fast computing devices.

If brought to market, these cables will solve several problems in the computing industry. The new fibers mimic the best feature of copper wires in that, like USB cables, they are easily compatible with computer chips and other network devices. A copper Ethernet cable can connect two devices directly with no need to reformat data. Fiber cables are much faster than copper but require an intermediate device to convert light signals back into electronic signals at each device.

There are immediate uses for faster cables in applications like data centers, self-driving cars, manufacturing robots, and devices in space. The new cables would be a benefit anywhere that large amounts of data need to be transferred in real-time from device to device. Since the polymer fibers are thin, they could also be used to speed up data transfer between chips within devices.

The data transmission rates on the polymer cables are currently at 100 gigabits per second for a distance of around 30 centimeters. The MIT scientists believe they will be able to goose speeds to as much as a terabit while increasing transmission distances to a meter and beyond.
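To put those line rates in perspective, here is a small sketch of raw transfer times (my own illustrative comparison – the 10 Gbps copper figure is an assumption along the lines of USB 3.1 Gen 2, and protocol overhead is ignored):

```python
# Raw time to move one terabyte at different line rates
# (ignores protocol overhead and encoding; illustrative only)
TB_BITS = 8 * 10**12  # one terabyte expressed in bits

links_gbps = {
    "copper USB (~10 Gbps, assumed)": 10,
    "polymer cable (100 Gbps)": 100,
    "polymer goal (~1 Tbps)": 1000,
}

for label, gbps in links_gbps.items():
    seconds = TB_BITS / (gbps * 10**9)
    print(f"{label}: {seconds:.0f} seconds per TB")
```
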

There is a long way to go to move a new technology from laboratory to production. There would first need to be industry standards developed and agreed upon by the IEEE. Using new kinds of cables means changing the interface into devices and chips. There are also the challenges of mass manufacturing the new cables and of integrating them into the existing supply chain.

I’m always amazed at how modern science seems to always find solutions when we need them. We are just now starting to routinely use computer applications like AI that rely on quickly moving huge amounts of data. Just a decade ago nobody would have been talking about chips that needed anything close to 100 gigabits of input or output. It’s easy to assume that computing devices somehow get faster when chips are made faster, but these new cables act as a reminder that there are numerous components required in making faster computing. Fast chips do us no good if we can’t get data in and out of the chip fast enough.

Comcast Tests DOCSIS 4.0

Comcast recently conducted its first test of the DOCSIS 4.0 technology and achieved a symmetrical 4-gigabit connection. The test was enabled by a DOCSIS 4.0 chip from Broadcom. The DOCSIS 4.0 standard was released in March 2020 and this is the first test of the new standard. The DOCSIS 4.0 standard allows for a theoretical transmission of 10 Gbps downstream and 6 Gbps upstream – this first test achieved an impressive percentage of the capability of the standard.
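As a quick check on how much of the standard’s ceiling that first test reached (simple arithmetic on the figures above):

```python
# Reported symmetrical test result vs. DOCSIS 4.0 theoretical maximums
standard = {"down_gbps": 10.0, "up_gbps": 6.0}
test = {"down_gbps": 4.0, "up_gbps": 4.0}

print(f"downstream: {test['down_gbps'] / standard['down_gbps']:.0%} of the standard")
print(f"upstream:   {test['up_gbps'] / standard['up_gbps']:.0%} of the standard")
```
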

Don’t expect this test to mean that cable companies will be offering fast symmetrical broadband any time soon. There is a long way to go from the first lab test to a product deployed in the field. Lab scientists will first work to perfect the DOCSIS 4.0 chip based upon whatever they found during the trial. It typically takes most of a year to create a new chip and it would not be surprising for Comcast to first spend several years and a few iterations to solidify the chip design. Assuming Comcast or some cable company is ready to buy a significant quantity of the new chips, it would be put into the product design cycle at a manufacturer to be integrated into the CMTS core and home cable modems.

That’s the point when cable companies will face the tough choice of pursuing the new standard. When the new technology was announced last year, most of the CTOs of the big cable companies were quoted as saying that they didn’t foresee implementing the new standard for at least a decade. This is understandable since the cable companies recently made the expensive upgrade to DOCSIS 3.1.

An upgrade to DOCSIS 4.0 isn’t going to be cheap. It first means the replacement of all existing electronics in a rip-and-replace upgrade. That includes cable modems at every customer premises. DOCSIS 4.0 will require network capacity to be increased to at least 1.2 GHz. This likely means replacement of power taps and network amplifiers throughout the outside plant network.

There is also the bigger issue that the copper plant in cable networks is aging in the same manner as telco copper. There are already portions of many cable company networks that underperform today. Increasing the overall bandwidth of the network might result in the need for a lot of copper replacement. And that is going to create a pause for cable company management. While the upgrade to DOCSIS 3.1 was expensive, it’s going to cost more to upgrade again to DOCSIS 4.0. At what point does it make sense to upgrade instead to fiber rather than tackle another costly upgrade on an aging copper network?

There is then the market issue. The cable companies are enjoying an unprecedented monopoly position. Comcast and Charter together have over half of all broadband customers in the country. While there are households that are unhappy with Comcast or Charter broadband, most don’t have any competitive alternative. The FCC statistics and the screwball websites that claim that Americans have multiple broadband choices are all fiction. For the average urban or suburban family, the only option for functional broadband is the cable company.

This market power means that the cable companies are not going to rush into making upgrades to offer greater speeds just because the technology exists. Monopolists are always slow to introduce technology upgrades. Instead, the cable companies are likely to continue to increase customer speeds across the board. Both Charter and Comcast did this recently and increased download speeds (or at least the download speed they are marketing).

I expect that the early predictions that it would be a decade before we see widespread DOCSIS 4.0 were probably pretty prescient. However, a continued clamor for faster upload speeds or rapid deployment of fiber by competitors could always move up the time when the cable companies have to upgrade to DOCSIS 4.0 or fiber. But don’t let headlines like this make you think this is coming soon.

A 10-Gigabit Tier for Grants

One of the biggest flaws in the recent RDOF reverse auction grant was allowing fixed wireless technology to claim the same gigabit technology tier as fiber. The FCC should never have allowed this to happen. While there is a wireless technology that can deliver up to a gigabit of speed to a few customers under specific circumstances, fiber can deliver gigabit speeds to every customer in a network. This is particularly true in a rural setting, where the short reach of gigabit wireless at perhaps a quarter mile is a huge limiting factor for using the technology.

But rather than continue to fight this issue for grant programs there is a much easier solution. It’s now easy to buy residential fiber technology that can deliver 10-gigabits of speed. There have been active Ethernet lasers capable of 10-gigabit speeds for many years. In the last year, XGS-PON has finally come into a price range that makes it a good choice for a new passive fiber network – and the technology can deliver 10-gigabit download speeds.

The FCC can eliminate the question of technology equivalency by putting fiber overbuilders into a new 10-gigabit tier. This would give fiber funding priority over all other technologies. Fixed wireless will likely never be capable of 10-gigabit speeds. Even if that ever is made possible decades from now, by then fiber will have moved on to the next faster generation. Manufacturers are already looking at 40-gigabit speeds for the next generation of PON technology.

Cable company hybrid-fiber coaxial networks are not capable today of 10-gigabit speeds. These networks could possibly deliver speeds of around 6 or 7 gigabits, but only by removing all of the television signals and delivering only broadband.

I don’t know why it was so hard for the FCC to say no to gigabit fixed wireless technology. When the industry lobbied to allow fixed wireless into the gigabit tier, all the FCC had to do was ask to see a working demo of wireless gigabit speeds in a rural farm environment where farms are far apart. The FCC should have insisted that the wireless industry demonstrate how every rural household in the typical RDOF area can receive gigabit speeds. They should have been made to show the technology overcomes distance and line-of-sight issues. There is no such demo because the wireless technology can’t do this – at least not without building fiber and establishing a base transmitter at each farm. The FCC really got suckered by slick PowerPoints and whitepapers when they should have instead asked to see the working demo.

Don’t get me wrong – I don’t hate the new wireless technologies. There are small towns and neighborhoods in rural county seats that could really benefit from the technology. The new meshed networks, if fed by fiber, can bring superfast bandwidth to small pockets of households and businesses. This can be a really attractive and competitive technology.

But this is not fiber. Every rural community in America knows they want fiber. They understand that once you put the wires in place that fiber is going to be providing solutions for many decades into the future. I think if fiber is built right that it’s a hundred-year investment. Nobody believes this to be true of fixed wireless. The radios are all going to be replaced many times over the next hundred years and communities worry about having an ISP who will make that continual reinvestment.

But since there is such an easy way to fix this going forward, these arguments about gigabit wireless can be largely moot. If the FCC creates a 10-gigabit tier for grants, then only fiber will qualify. The fixed wireless folks can occupy the gigabit tier and leave most other technologies like low-orbit satellite to some even lower tier. The FCC made a mistake with RDOF that they can’t repeat going forward – the agency declared that other technologies are functionally equivalent to fiber – and it’s just not true.

Our Evolving Technologies

A client asked me recently for an update on all of the technologies used today to deliver broadband. The last time I talked about this topic with this client was three years ago. As I talked through each technology, it struck me that every technology we use for broadband is better now than it was three years ago. We don’t spend enough time talking about how the vendors in this industry keep improving technology.

Consider fiber. I recently have been recommending that new fiber builders consider XGS-PON. While this technology was around three years ago, it was too expensive and cutting edge at the time to consider for most ISPs. But AT&T and Vodafone have built enough of the technology that the prices for the hardware have dropped to be comparable to the commonly used GPON technology. This means we now need to start talking about FTTP as a 10-gigabit technology – a huge increase in capacity that blows away every other technology. Some improvements we see are more subtle. The fiber used for wiring inside buildings is far more flexible and bendable than three years ago.

There have been big improvements in fixed wireless technology. Some of this improvement is due to the FCC getting serious about providing more spectrum for rural fixed wireless. During the last three years, the agency has approved CBRS spectrum and white space spectrum that is now being routinely used in rural deployments. The FCC also recently approved the use of 6 GHz WiFi spectrum that will add even more horsepower. There have also been big improvements in the radios. One of the improvements that isn’t mentioned is new algorithms that speed up the wireless switching function. Three years ago, we talked about high quality fixed wireless speeds of 25 Mbps to 50 Mbps and now we’re talking about speeds over 100 Mbps in ideal conditions.

All three major cellular carriers are in the process of building out a much-improved fixed cellular broadband product. This has also benefited from new bands of frequencies acquired by the cellular carriers during the last three years. Three years ago, any customer with a cellular hotspot product complained about slow speeds and tiny monthly data caps. The new products allow for much greater monthly usage, up to unlimited, and speeds are better than three years ago. Speeds are still largely a function of how far a home is from the closest cell site, so this product is still dreadful for those without good cellular coverage – but it means improved broadband with speeds up to 50 Mbps for many rural households.

Three years ago, the low-orbit satellites from Starlink were just hype. Starlink now has over 1,000 satellites in orbit and is in beta test mode with customers reporting download speeds from 50 Mbps to 150 Mbps. We’re also seeing serious progress from OneWeb and Jeff Bezos’s Project Kuiper, so this industry segment is on the way to finally becoming a reality. There is still a lot of hype, but that will die when homes can finally buy the satellite broadband products – and when we finally understand speeds and prices.

Three years ago, Verizon was in the early testing stage of fiber-to-the-curb. After an early beta test and a pause to improve the product, Verizon is now talking about offering this product to 25 million homes by 2025. This product uses mostly millimeter-wave spectrum to get from the curb to homes. For now, the speeds are reported to be about 300 Mbps, but Verizon says this will get faster.

We’ve also seen big progress with millimeter-wave mesh networks. Siklu has a wireless product that they tout as an ideal way to bring gigabit speeds to a small shopping district. The technology delivers a gigabit connection to a few customers and the broadband is then bounced from those locations to others. Oddly, some companies are talking about using this product to satisfy the rural RDOF grants, which is puzzling since the transmission distance is only a quarter-mile and also requires great line-of-sight. But expect to see this product pop up in small towns or retail districts all over the country.

Cable company technology has also improved over the last three years. During that time, a lot of urban areas saw the upgrade to DOCSIS 3.1 with download speeds now up to a gigabit. CableLabs also recently announced DOCSIS 4.0 which will allow for symmetrical gigabit plus speeds but which won’t be available for 3-5 years.

While you never hear much about it, DSL technology over copper has gotten better. There are new versions of G.Fast being used to distribute broadband inside apartment buildings that are significantly better than what was on the market three years ago.

Interestingly, the product that got the most hype during the last three years is 5G. If you believe the advertising, 5G is now everywhere. The truth is that there is no actual 5G yet in the market yet and this continues to be marketing hype. The cellular carriers have improved their 4G networks by overlaying new spectrum, but we’re not going to see 5G improvements for another 3-5 years. Unfortunately, I would bet that the average person on the street would say that the biggest recent telecom breakthrough has been 5G, which I guess shows the power of advertising and hype.

The 6G Hype is Already Starting

Even though 5G hasn’t yet made it onto any cellphone, the wireless vendor industry is already off and running looking at the next generation of wireless technology that has been dubbed 6G. This recent article describes the European Union Hexa-X project that started in January to look at developing specifications for next-generation wireless technology using terahertz spectrum. The initiative will be led by Nokia Bell Labs and Ericsson. Similar research is being done elsewhere around the world by companies such as Huawei, NTT, and Samsung.

6G wireless will explore using the high frequencies between 100 GHz and 1 THz (terahertz), which are collectively being referred to as terahertz frequencies. These are radio waves that are just below the frequencies of infrared light. These frequencies have such short wavelengths that, at the upper end, they could carry as much as 1,000 times more bandwidth than the frequencies used in cellphones today.

But there is a huge trade-off for the huge bandwidth capacity in that these frequencies travel only short distances, measured in a few feet, before starting to dissipate. These frequencies will not pass through any obstacle and need a clear line of sight.
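The short reach follows directly from physics. A quick free-space path-loss calculation (the standard textbook formula; real indoor conditions with walls and people in the way are far worse) shows how much more signal is lost at terahertz frequencies than at today’s WiFi frequencies over the same few meters:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8  # speed of light in m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# The same 3-meter indoor link at three frequencies
for label, freq in [("2.4 GHz WiFi", 2.4e9), ("100 GHz", 100e9), ("1 THz", 1e12)]:
    print(f"{label:>12}: {fspl_db(3, freq):.0f} dB of free-space loss")
```

Every tenfold jump in frequency adds 20 dB of loss, which is part of why these bands end up confined to a single room.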

It’s likely that any 6G technology will be used for indoor data transmission, and 6G could become the fastest delivery mechanism of bandwidth to use within a room between devices. The bandwidth capabilities of these superhigh frequencies could finally fully enable technologies like telepresence (I finally get a holodeck!), or cobots (interactive robots).

Of course, like with any new technology, there is also already hype. Samsung recently released a whitepaper that said that using terahertz waves for cellphones is ‘inevitable’. Long before we try to somehow tame terahertz frequencies in the wild, we need to first figure out millimeter-wave cellular technologies. The current use of millimeter-wave hotspots in downtown metropolitan areas has provided cover for cellular carriers to hype gigabit 5G speeds – but this is a miserable technology in terms of usefulness or reliability. The millimeter-wave spectrum is blocked by everything in the environment, including the body of the user.

More importantly, I’ve never heard anybody make a coherent description of why we need to deliver gigabit or faster speeds to cellphones. If we modify cellphones to process data that quickly we’ll need to find a way to recharge the phones every hour. While I understand why engineers go gaga over the idea of delivering a hundred or a thousand times more data to a cellphone, we need a reality check to ask why anybody would want to do that. Smartphones might be the most important technology developed in this century, but there seems to be little need to turn cellphones into a walking data center unless we want to also start carrying around small air-conditioning units to keep the chips cool.

It makes sense that device makers like Nokia and Ericsson would get excited over the next generation of wireless devices. It’s not hard to envision entirely new technologies twenty years from now that take advantage of terahertz frequencies. Seriously, who is not going to want a holodeck in their living room?

Interestingly, the introduction of 6G is likely going to be of less value to the big cellular carriers. These companies have already started to lose the indoor battle for 5G. Verizon and AT&T had once envisioned a world where homeowners would buy monthly 5G data plans for all of the wired devices in our home. But the FCC already gutted that idea by releasing 6 GHz spectrum for free use, which manufacturers are marrying to the new WiFi 6 standard. As is inevitable, a free solution that doesn’t require a monthly subscription is going to capture most of the indoor market. We’re not going to be buying a 5G subscription for our 8K TV when we have WiFi 6 operating from a $100 router.

One has to imagine the same future for terahertz frequencies. The FCC will eventually create at least one band of terahertz frequency that anybody can use, and that’s the frequency that will power the superfast devices in our homes and offices.

One thing that the early 6G hype fails to mention is the fiber networks that will be needed to fuel superfast applications. We aren’t going to be operating a holodeck using a measly 1 Gbps broadband connection. Twenty years from now, techie households will be screaming for the delivery of 100 Gbps bandwidth to support their terahertz gaming applications.

The Gigabit Wireless Controversy

One of the big controversies in the RDOF auction was that the FCC allowed three of the top ten grant winners to bid using gigabit wireless technology. This was Starry (Connect Everyone), Resound Networks, and Nextlink (AMG Technology). By bidding in the gigabit tier these technologies were given the same technology and dollar weighting as somebody bidding to build fiber-to-the-premise. There was a big outcry from fiber providers that claim that these bidders gained an unfair advantage because the wireless technology will be unable to deliver gigabit speeds in rural areas.

Fiber providers say that the bidding with gigabit wireless violates the intent of the grants. Bidding in the gigabit tier should mean that an ISP can deliver a gigabit product to every customer in an RDOF grant area. Customers don’t have to buy a gigabit product, but the capability to provide that speed to every customer must be there. This is something that comes baked-in with fiber technology – a fiber network can deliver gigabit speeds (or 10-gigabit speeds these days) to any one customer, or easily give it to all customers.

There is no denying that there is wireless technology that can deliver gigabit speeds. For example, there are point-to-point radios using millimeter-wave spectrum that can deliver a gigabit path for up to two miles or a multi-gigabit path for perhaps a mile. But this technology delivers the bandwidth to only a single point. This is the technology that Starry and others use in downtown areas to beam a signal from rooftop to rooftop to serve apartment buildings, with the bandwidth shared with all of the tenants in the building. This technology delivers up to a gigabit to a building, but something less to tenants. We have a good idea of what this means in real life because Starry publishes the average speed of its customers. In March 2021, the Starry website said that its average customer received 232 Mbps download and 289 Mbps up. That’s a good bandwidth product, but it is not gigabit broadband.

There is a newer technology that is more suited for areas outside of downtown metropolitan areas. Siklu has a wireless product that uses unlicensed spectrum in the V-band at 60 GHz and around 70 GHz. This uses a Qualcomm chip that was developed for the Facebook Terragraph technology. A wireless base station that is fiber-fed can serve up to 64 customers – but the catch is that the millimeter-wave spectrum used in this application travels only about a quarter of a mile. Further, this spectrum requires a nearly perfect line-of-sight.

The interesting feature of this technology is that each customer receiver can also retransmit broadband to make an additional connection. Siklu envisions a network where four or five hops are made from each customer to extend broadband around the base transmitter. Siklu advertises this product as being ideal for small-town business districts where a single fiber-fed transmitter can reach the whole downtown area through the use of the secondary beams. With a handful of customers on a system, this could deliver a gigabit wireless product. But as you start adding secondary customers, this starts acting a lot like a big urban apartment building, and the shared speeds likely start looking like what Starry delivers in urban areas – fast broadband, but that doesn’t meet the definition that every customer can receive a gigabit.
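A toy model makes the dilution concrete (my own simplifying assumption that the base station’s gigabit feed is shared equally among active subscribers; real mesh scheduling and per-hop radio losses make it worse):

```python
# Toy model: one fiber-fed base station with a 1 Gbps feed, shared
# equally by every subscriber hanging off it (illustrative only --
# not a model of any vendor's actual scheduler)
BASE_FEED_MBPS = 1000

for subscribers in (1, 4, 16, 64):
    share_mbps = BASE_FEED_MBPS / subscribers
    print(f"{subscribers:>2} active subscribers -> ~{share_mbps:.0f} Mbps each")
```
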

The real catch for this technology comes in the deployment. The broadband strength is pretty decent if every base transmitter is on fiber. But ISPs using the technology are likely going to cut costs by feeding additional base stations with wireless backhaul. That’s when the bandwidth starts to get chopped down. An RDOF winner would likely have to build a lot of fiber and have transmitters every mile to get the best broadband speeds – but if they dilute the backhaul by using wireless connections between transmitters, or spacing base stations farther apart, then speeds will drop significantly.

The other major issue with this technology is that it’s great for the small-town business district, but how will it overlay in the extremely rural RDOF areas? The RDOF grants cover some of the most sparsely populated areas in the country. The Siklu technology will be quickly neutered by the quarter-mile transmission distance when customers live more than a quarter-mile apart. Couple this with line-of-sight issues and it seems extremely challenging to reach a lot of the households in most RDOF areas with this technology.

I come down on the side of the fiber providers in this controversy. In my mind, an ISP doesn’t meet the grant requirements if they can’t reach every customer in an RDOF area. An ISP also doesn’t meet the gigabit grant requirements if only some customers can receive the gigabit speeds. That’s the kind of bait-and-switch we’ve had for years, thanks to an FCC that has allowed an ISP to bring fast broadband to one customer in a Census block and declare that everybody has access to fast speeds.

It’s a shame that I feel obligated to come to this conclusion because, deployed well, these wireless technologies can probably bring decent broadband to a lot of homes. But if these technologies can’t deliver a gigabit to everybody, then the ISPs gained an unfair advantage in the RDOF grant bidding. When I look at the widely spaced homes in many RDOF areas I can’t picture a wireless network that can reach everybody while also delivering gigabit capabilities. The only way to make this work would be to build fiber close to every customer in an RDOF area – and at that point, the wireless technology would be nearly as costly as FTTH and a lot more complicated to maintain. I think the FCC bought the proverbial pig-in-a-poke when they approved rural gigabit wireless.

Next Generation PON is Finally Here

For years, we’ve been checking the prices of next-generation passive optical network (PON) technology as we help clients consider building a new residential fiber network. As recently as last year there was still at least a 15% or more price penalty for buying 10 Gbps PON technology using the NG-PON2 or XGS-PON standards. But recently we got a quote for XGS-PON that is nearly identical in price to buying the GPON that’s been the industry standard for over a decade.

New technology is usually initially more expensive for two reasons. Manufacturers hope to reap a premium price from those willing to be early adopters. You’d think it would be just the opposite since the first buyers of new technology are the guinea pigs who have to help debug all of the inevitable problems that crop up in new technology. But the primary reason that new technology costs more is economy of scale for the manufacturers – prices don’t drop until manufacturers start producing large quantities of a new technology.

The XGS-PON standard provides a lot more bandwidth than GPON. The industry-standard GPON technology delivers 2.4 Gbps download and 1 Gbps upload to a group of customers – most often configured as 32 passings. XGS-PON technology delivers a symmetrical 10 Gbps downstream and upstream to the same group of customers – a big step up in bandwidth.
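To put those shared speeds in perspective, here is a minimal sketch of the average bandwidth available per passing on a fully loaded 32-passing PON. The line rates and split come from the figures above; real networks lose some capacity to protocol overhead, which is ignored here.

```python
# Average shared bandwidth per passing on a PON, ignoring protocol
# overhead. The 32-passing split and line rates come from the text.

def per_passing_mbps(total_gbps: float, passings: int = 32) -> float:
    """Average downstream bandwidth per passing, in Mbps."""
    return total_gbps * 1000 / passings

gpon = per_passing_mbps(2.4)   # GPON: 2.4 Gbps shared downstream
xgs = per_passing_mbps(10.0)   # XGS-PON: 10 Gbps shared downstream

print(f"GPON:    {gpon:.1f} Mbps per passing")
print(f"XGS-PON: {xgs:.1f} Mbps per passing")
```

Of course, subscribers rarely pull data at the same instant, which is why oversubscribed PONs work as well as they do – but the headroom difference between the two technologies is still stark.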

The price has dropped for XGS-PON primarily due to its use by AT&T in the US and Vodafone in Europe. These large companies and others have finally purchased enough gear to drive down the cost of manufacturing.

The other next-generation PON technology is not seeing the same price reductions. Verizon has been the only major company pursuing the NG-PON2 standard and is using it in networks to support large and small cell sites. But Verizon has not been building huge amounts of last-mile PON technology and seems to have chosen millimeter-wave wireless technology as the primary technology for reaching into residential neighborhoods. NG-PON2 works by having tunable lasers that can function at several different light frequencies. This would allow more than one PON to be transmitted simultaneously over the same fiber but at different wavelengths. This is a far more complex technology than XGS-PON, which basically has faster lasers than GPON.

One of the best features of XGS-PON is that some manufacturers are offering it as an overlay onto GPON. An overlay means swapping out some cards in a GPON network to provision some customers with 10 Gbps speeds. This means that anybody using GPON technology ought to be able to ease into the faster technology without a forklift upgrade.

XGS-PON is not a new technology – it’s been around for about five years. But the price differential stopped most network owners from considering it. Most of my clients tell me that their residential GPON networks average around 40% utilization, so there has been no performance reason to upgrade to faster technology. But averages are just that, and some PONs (neighborhood nodes) are starting to get a lot busier, meaning that ISPs are having to shuffle customers to maintain performance.

With the price difference finally closing, there is no reason for somebody building a new residential network not to buy the faster technology. Over the next five years, as customers start using virtual reality and telepresence technology, there is likely to be a big jump in bandwidth demand from neighborhoods. This is fueled by the fact that over 9% of homes nationwide now subscribe to gigabit broadband service – and that’s enough homes for vendors to finally roll out applications that can use gigabit speeds. I guess the next big challenge will be finding 10-gigabit applications!

Demystifying Fiber Terminology

It’s common when a community is getting fiber to have the engineers tossing around technical terms that a layperson is not going to understand. Today’s blog will try to demystify the more common fiber terminology that you’ll likely hear.

Absorption: This is the phenomenon where the natural impurities in glass absorb some of the light signal inside of a fiber path.

ADSS (All-Dielectric Self-Supporting): This is a hardened fiber cable that can be hung directly without a supporting messenger wire. It is used primarily for hanging fiber near electric wires, since it avoids having a metal wire close to the power lines.

Attenuation: This is a term that defines the amount of light that is lost during a fiber transmission due to absorption and scattering. Attenuation is usually measured in decibels (dB) per kilometer.

Attenuator: This is a device that is used to purposefully reduce signal power in a fiber optic link.

Back Reflection (BR): This refers to any situation that causes the light signal inside a fiber to change direction. The most common form of back reflection happens at an interface between a lit fiber and air.

Buffer: This is the protective outer layer of material that is in direct contact with the fiber. Manufacturers offer a wide range of different buffering materials.

Insertion Loss: This describes the loss of signal power that occurs when the light encounters a splice point, connector, or other component in the network.

Messenger: This refers to a galvanized steel cable that is strung between poles and used to support fiber cable.

Multimode: This is a fiber with a core large enough to carry multiple modes (paths) of light at once. Multimode fiber cores come in two typical sizes – 50µm (microns) or 62.5µm – compared to roughly 8-10µm for single-mode fiber. Multimode fiber is most commonly used for short transmission distances.

Return Loss: This measures the amount of light reflected back toward the source, expressed in decibels. The higher the return loss, the less light is reflected – so higher is better.

Scattering: This is the other primary reason for signal loss in a fiber (along with absorption). Scattering occurs when light collides with small particles in the fiber path.

Single Mode: This is a fiber with a small core size of 8-10 µm (microns) that carries a single mode (path) of light. This fiber is used to transmit signals over long distances at high speeds.

Wavelength: This describes the color of light, inversely related to its frequency, and is expressed in nanometers. The most typical wavelengths used in fiber optics are 850nm, 1310nm, and 1550nm.
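Several of these terms come together in a simple optical link budget, which engineers use to check that enough light survives the path from transmitter to receiver. Here is a minimal sketch; the specific loss figures (0.35 dB/km attenuation, 0.1 dB per splice, 0.5 dB per connector) are illustrative assumptions, not values from any particular standard.

```python
# Illustrative optical link budget combining the glossary terms above.
# All loss figures are assumed typical values, not from a standard.

def link_loss_db(km: float, splices: int, connectors: int,
                 atten_db_per_km: float = 0.35,    # attenuation (absorption + scattering)
                 splice_loss_db: float = 0.1,      # insertion loss per splice
                 connector_loss_db: float = 0.5):  # insertion loss per connector
    """Total end-to-end loss in dB for a simple fiber span."""
    return (km * atten_db_per_km
            + splices * splice_loss_db
            + connectors * connector_loss_db)

# Example: a 20 km rural span with 4 splices and 2 connectors.
loss = link_loss_db(20, splices=4, connectors=2)
print(f"Total loss: {loss:.1f} dB")  # 7.0 + 0.4 + 1.0 = 8.4 dB
```

Because losses in decibels simply add along the path, an engineer can compare this total against the power budget of the electronics at each end to see whether the link will work.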

High Precision GPS

Sometimes important changes for our industry come from outside the industry. We’re getting close to seeing affordable GPS devices that are accurate to within a centimeter. Higher-precision GPS will be invaluable for ISPs and broadband technologies.

Normal GPS isn’t highly accurate. For example, a GPS-enabled smartphone is only accurate to within 4.9 meters (16 feet) under the open sky. Accuracy is even worse around tall buildings, trees, bridges, or other obstacles that can block or deflect signals from satellites. This is plenty of accuracy for providing driving directions, which is the use of GPS that most people are familiar with – although occasionally you’ll get bad driving directions in a major city center when your mapping software thinks you’re on a different street.

Applications like driving directions use a single GPS frequency, with the device (smartphone or car) calculating its position from the satellites it can see. The GPS satellites operated by the government can theoretically support accuracy to within 2.3 feet. But accuracy is reduced by local factors such as atmospheric conditions, signal blockage, the quality of the receiver, and the position of the satellites in relation to the user. All of these factors contribute to the lessened accuracy of the normal cellphone or car GPS unit.

High-precision GPS has been around for a while, but the current generation of high-precision technology has not been suitable for applications like driving. High-precision GPS devices work by listening on multiple frequencies and using complex mathematical models to calculate precise locations. High-precision devices are also expensive to operate, with annual subscriptions as high as $1,000 per device.

The new generation of GPS devices will overcome several of these major shortfalls. The new devices do away with the need for multiple frequencies. Older high-precision devices also wouldn’t work while moving in a car – the needed mathematical calculations couldn’t keep up in a receiver that’s moving.

The new devices instead are using a clever solution. Each device can create a real-time model of the environment that incorporates all of the known factors in the region that affect GPS accuracy. In essence, a single cellphone is preloaded with a simulation of the GPS environment, and the cellphone can then correct for expected distortions in the GPS measurement – meaning much higher accuracy in locations. These regional models are updated during the day to account for changes in temperature and weather and are beamed to any device trying to use GPS.

Higher-precision GPS opens up applications that were unattainable in the past. The simplest application will be precision locations for things like handholes. Technicians will no longer need to search through tall grass along a rural road to find a handhole because, with the new GPS, they’ll know the exact location.
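The difference is easy to quantify. Here is a rough sketch comparing the ground area a technician might have to search for a buried handhole at the accuracies mentioned above (4.9 meters for a phone, roughly a centimeter for the new receivers), treating the reported position as the center of a circle of uncertainty.

```python
import math

# Compare the circle of uncertainty around a reported GPS position.
# Accuracy figures come from the text: 4.9 m for a smartphone,
# ~1 cm for the new generation of receivers.

def search_area_m2(accuracy_m: float) -> float:
    """Area (in square meters) of the circle of uncertainty."""
    return math.pi * accuracy_m ** 2

phone = search_area_m2(4.9)     # standard smartphone GPS
precise = search_area_m2(0.01)  # centimeter-level GPS

print(f"Phone GPS:     {phone:.1f} m^2 to search")
print(f"Precision GPS: {precise * 10000:.1f} cm^2 to search")
```

That is the difference between poking through roughly 75 square meters of roadside grass and walking straight to a spot the size of a coffee-cup lid.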

Better GPS will be invaluable in locating existing utilities and in siting new buried construction. An engineer who first walks a route can define exactly where to dig or where to place a buried fiber.

Better GPS will be invaluable to rural broadband services like precision agriculture. Once a farmer precisely maps a field, they can tell a self-driving tractor or harvester exactly where to drive without needing a driver in the cab of each device.

And better GPS will help daily work functions in numerous ways we will discover once it’s routinely available. Somebody will be able to survey a site for a new hut and precisely define the location of a concrete pad or a fence without having to return to oversee the construction process. Companies will be able to precisely tell a homeowner where to find a buried conduit without having to always wait for a locator. We’ll quickly get used to more precise field measurements just like we all quickly adapted to trusting GPS driving directions.