Fixed Wireless in Cities

I am often asked by cities about the option of building a municipal fixed wireless broadband network. As a reminder, fixed wireless in this case is not a cellular system but is the point-to-multipoint technology used by WISPs. My response has been that it’s possible but that the resulting network is probably not going to satisfy the performance goals most cities have in mind.

There are several limitations of fixed wireless technology in an urban environment that must be considered. The first is the spectrum to be used. Cities tend to be saturated with unlicensed WiFi signals, and the amount of interference will make it a massive challenge to use unlicensed WiFi spectrum for broadband purposes. Most folks don’t realize that cellular carriers can snag a lot of the free WiFi spectrum in cities to supplement their cellular data networks – meaning that the free public spectrum is even more saturated than might be expected.

Licensed spectrum can provide better broadband results. But in cities of any size, most of the licensed spectrum is already spoken for and belongs to cellular companies or somebody else that plans to use it. It never hurts to see if there is spectrum that can be leased, but often there will not be any.

Even if licensed spectrum is available, there are other factors that affect performance of fixed wireless in highly populated areas. The first is that most fixed wireless radios can only serve a relatively small number of customers. Cities are probably not going to be willing to make an investment that can only serve a limited number of people.

Another issue to consider is line-of-sight. In practical terms, this means that neighbor A’s home might block the signal to reach neighbor B. In the typical city, there are going to be a lot of homes that cannot be connected to a fixed wireless network unless there are a lot of towers – and most cities are averse to building more towers.

Even when there is decent line-of-sight, an urban wireless signal can be disturbed by many routine factors in the city, such as seasonal foliage, bad weather, and even traffic. One of the more interesting phenomena of spectrum in an urban setting is how a signal will reflect and scatter in unexpected ways as it bounces off buildings. These factors tend to cause a lot more problems in a dense neighborhood than in a rural setting.

A point-to-multipoint fixed wireless system is also not a great solution for multi-tenant buildings. These networks are designed to deliver a bandwidth connection to an individual user, and a single connection doesn’t carry enough bandwidth to serve multiple tenants. There are also challenges in finding places to mount antennas for individual apartments.

The combination of these issues means that fixed wireless can only serve a relatively small number of customers in an urban area. The speeds are going to bounce around due to urban interference and are not likely to be good enough to compete with cable technology.

There is a good analogy to understand the limitations on wireless technologies in cities. Cell carriers have one advantage over many WISPs by owning licensed spectrum. But even with licensed spectrum there are usually numerous small dead spots in cities where the signals can’t reach due to line-of-sight. Cellular radios can serve a lot more customers than fixed wireless radios, but there are still limitations on the number of customers who can buy cellular FWA broadband in a given neighborhood. Any issues faced by cellular networks are worse for a point-to-multipoint network.

The bottom line is that there are a lot of limitations on urban fixed wireless networks that make them a risky investment. Tower space is usually at a premium in cities, and it’s hard to build a network that will reach many customers. There are a lot more interference and line-of-sight issues in a city, which makes it hard to maintain a quality connection.

But this doesn’t mean there are no applications that make sense. For example, a fixed wireless network might be ideal for creating a private network for connecting to city locations that don’t need a lot of broadband, like sensor monitoring. That makes a lot more sense than trying to use the technology as an alternative ISP connection for residences and businesses.

When Fiber Construction Goes Wrong

The Common Ground Alliance (CGA) recently issued its 2021 Damage Information Reporting Tool (DIRT) report. CGA was founded in 2000 and is an association of companies that engage in underground construction. Members include excavators, locators, road builders, electric companies, telecom companies, oil and gas distribution and transmission companies, railroads, One Call centers, municipal public works departments, equipment manufacturers, state regulators, insurance companies, emergency services, and engineering/design firms. The goal of the CGA is to highlight and reduce the damage done to utilities during underground work.

Here are the current trends discussed in the DIRT report:

  • CGA used statistical models that show a plateau, or perhaps a tiny decrease, in the frequency of damages caused by underground construction since 2019.
  • Calls to locating services increased by 8% in 2021, which CGA believes is a precursor to the construction that will result from the Infrastructure Investment and Jobs Act. In past years the frequency of damages has correlated to the overall volume of construction work, so the expectation is that damages due to construction will increase over the next few years.
  • The vast majority of damages (80%) are caused by professional excavators. The rest are caused by individuals and farmers, municipalities, and utilities.
  • The most common source of damages (almost half) is work done by a backhoe.
  • The most commonly damaged infrastructure is natural gas and telecom infrastructure.
  • There were 81,000 damage reports to natural gas systems and 75,000 reports of damage to telecom networks.
  • CGA notes 25 different causes of damage, with six causes accounting for 76% of damage reports.
  • The most prevalent cause of damage (25%) occurs when work is done without first calling to locate other utilities. CGA research says that professional awareness of the need for locating services is high, but 60% of all damages due to no notification are attributed to professional excavators.
  • The next three primary reasons for damages are excavators failing to pothole, failing to maintain sufficient clearance between digging equipment and buried facilities, and facilities not being marked or being marked inaccurately due to locator error and/or incorrect facility records/maps.
  • Nearly a quarter of damages reported by excavators resulted in downtime, so better practices would be a time and money saver.
  • CGA gathered over 217,000 reports of damage in the U.S. in 2021 and another 11,000 in Canada.

This report is an interesting reminder that good work practices can make a big difference in avoiding damage. Fiber construction projects are often brought to a screeching halt when damage is done to existing utilities, particularly gas and water lines. This is well worth reading for anybody associated with construction.

Packet Loss and Broadband Performance

In a recent article in FierceWireless, Joe Madden looked at the various wireless technologies he has used at his home in rural central California. Over time, he subscribed to a fixed wireless network using WiFi spectrum, cellular LTE broadband, Starlink, and a fixed wireless provider using CBRS spectrum. A lot of rural folks can describe a similar path where they have tried all of the broadband technologies available to them.

Since Joe is a wireless expert who works at Mobile Experts, he was able to analyze his broadband performance in ways that are not easily understood by the average subscriber. Joe came to an interesting conclusion – the difference in performance between various broadband technologies has less to do with speed than with the consistency of the broadband signal.

The average speed tests on the various products varied from 10/2 Mbps on fixed wireless using WiFi, to 117/13 Mbps on Starlink. But what Joe found was that there was a huge difference in consistency as measured by packet loss. Fixed wireless on WiFi had packet loss of 8.5%, while the packet loss on fixed wireless using CBRS spectrum dropped to 0.1%. The difference is stark and is due to the interference that affects using unlicensed spectrum compared to a cleaner signal on licensed spectrum.

But just measuring packet loss is not enough to describe the difference in the performance of the various broadband connections. Joe looked at the number of lost packets that were delivered over 250 milliseconds. That will require some explanation. Packet loss in general describes the percentage of data packets that are not delivered on time. In an Internet transmission, some packets are always lost somewhere in the routing to customers – although most packets are lost due to the local technology at the user end.

When a packet doesn’t show up as expected, the Internet routing protocols ask for that packet to be sent again. If the second packet gets to the user quickly enough, it’s the same, from a user perspective, as if that packet was delivered on time. Joe says that re-sent packets that don’t arrive until after 250 milliseconds are worthless because by then, the signal has been delivered to the user. The easiest way to visualize this is to look at the performance of Zoom calls for folks using rural technologies. Packets that don’t make it on time result in a gap in the video signal that manifests as fuzziness and unclear resolution on the video picture.
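The 250 ms cutoff is easy to illustrate with a small sketch. This is my own illustration, not Joe’s actual measurement tooling, and the sample delays are invented for the example – it simply sorts retransmitted packets into those that arrive in time to be useful and those that arrive too late:

```python
# Illustrative only: the 250 ms threshold comes from the article; the
# sample delay values below are made up for the example.
LATE_THRESHOLD_MS = 250  # a re-sent packet slower than this is useless

def classify_resends(resend_delays_ms):
    """Split retransmission delays into usable and too-late buckets."""
    usable = [d for d in resend_delays_ms if d <= LATE_THRESHOLD_MS]
    late = [d for d in resend_delays_ms if d > LATE_THRESHOLD_MS]
    return usable, late

# Hypothetical retransmission delays in milliseconds
delays = [40, 120, 260, 90, 800, 150, 300, 30]
usable, late = classify_resends(delays)
print(f"{len(late)} of {len(delays)} re-sent packets arrived too late "
      f"({100 * len(late) / len(delays):.0f}%)")  # 3 of 8 (38%)
```

In a real measurement, the delays would come from timestamped packet captures rather than a hardcoded list, but the classification logic is the same.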

Packet loss is the primary culprit for poor Zoom calls. Not receiving all of the video packets on time is why somebody on a Zoom call looks fuzzy or pixelated. If the packet loss is high enough, the user is booted from the Zoom call.

The difference in the percentage of packets that are delivered late between the different technologies is eye-opening. In the fixed wireless using WiFi spectrum an astounding 65% of re-sent packets took longer than 250 ms. Cellular LTE broadband was almost as bad at 57%. Starlink was better at 14%, while fixed wireless using CBRS was lowest at 5%.
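Combining the two measurements makes the gap even starker. This is my own back-of-envelope arithmetic, not a calculation from the article: multiplying the packet-loss rate by the fraction of re-sent packets that arrive too late gives a rough estimate of the share of traffic that is effectively never delivered in time.

```python
# My own back-of-envelope, not from the article: loss rate times the
# late-resend fraction estimates the share of all packets that are lost
# AND whose retransmission arrives too late to be useful.
measurements = {
    # technology: (packet loss rate, fraction of resends later than 250 ms)
    "Fixed wireless (WiFi spectrum)": (0.085, 0.65),
    "Fixed wireless (CBRS)": (0.001, 0.05),
}

for name, (loss, late_fraction) in measurements.items():
    effectively_lost = loss * late_fraction
    print(f"{name}: ~{effectively_lost:.3%} of packets never usefully delivered")
```

By this rough measure, the WiFi-based network effectively loses over five percent of its packets for good, while the CBRS network loses a few thousandths of a percent – a difference of three orders of magnitude.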

Joe is careful to point out that these figures only represent his home and not the technologies as deployed everywhere. But with that said, there are easily explainable technology reasons for the different levels of packet delay. General interference plays havoc with broadband networks using unlicensed spectrum. Starlink has delay just from the extra time for broadband signals to go to and from the satellite and the ground in both directions. The low packet losses on a CBRS network might be due to having very few other neighbors using the new service.

Joe’s comparison doesn’t include other major broadband technologies. I’ve seen some cable networks with high packet loss due to years of accumulated repairs and unresolved issues in the network. The winner of the packet loss comparison is fiber, which typically has an incredibly low packet loss and also a quick recovery rate for lost packets.

The bottom line from the article is that speed isn’t everything. It’s just one of the characteristics that define a good broadband connection, but we’ve unfortunately locked onto speed as the only important characteristic.

Getting Serious About Satellite Texting

One of the more interesting telecom announcements at the CES electronics show in Vegas was the announcement from the partnership of Qualcomm and Iridium of plans to bring satellite texting capability to many more cell phones and other devices.

We’ve already seen a few other recent announcements of the ability to send emergency texts when out of reach of cell coverage. The team of T-Mobile and SpaceX says that T-Mobile customers will be able to reach 911 through a satellite sometime in 2023. Apple launched an Emergency SOS system for its newest iPhone users in a partnership with Globalstar, but the service is only available in a handful of countries.

Qualcomm is building this feature into its premier Snapdragon 8 Gen 2 chips, so any new phone or other device using that chip will have satellite texting capability. The company says it plans to eventually build the capability into other more affordable chips as well.

For now, Qualcomm has established a 911 service similar to the T-Mobile plans where people can reach 911 when out of the range of the normal cellular network. But the company envisions that cellular carriers will develop price plans to let users text for a fee. That would provide folks with the ability to stay connected while hiking in remote wilderness or during a sea cruise.

Qualcomm is in the business of selling chips, and it would love to see this capability expanded to other places, like built into laptops or new cars. Putting the technology in cars is a major selling point since it would enable features like automatically contacting 911 after an accident.

This first-generation product will be far from perfect, but that’s to be expected from what is basically a beta test. For example, while Iridium satellites blanket the earth, there are times when there is no satellite overhead, and a user might have to wait ten minutes for the next satellite. It seems this issue can be resolved by cell carriers partnering with multiple satellite providers.

This new technology opens up the possibility for people to have some limited connectivity almost anywhere on the globe. For the younger connected generations, this has great appeal. Most people I know with Gen Z kids tell me that it’s like banishment to take kids out of reach of connectivity. But more practically, much of the world does not have reliable cellular coverage, and this can bring some form of communication to all.

I know people will read this and assume that the next step is to use satellites to provide data connectivity to cell phones or laptops from anywhere. However, there are limits of physics that make that unrealistic for a handset. The latest Starlink dishy receiver is 19 by 12 inches, and that much surface area is needed to receive the signal from a satellite. However, it’s not hard to imagine a hiker rolling out a flexible receiver to communicate with a satellite – assuming they bring along some kind of power source, perhaps solar.

I track telecom announcements of new technologies and products to give me a baseline a decade from now to see how various technologies performed. It will be interesting to see if satellite texting becomes a routine part of every cellular plan or if it withers on the vine like many other seemingly good ideas that the market didn’t embrace.

What Ever Happened to IPv6?

It’s been over ten years since the launch of IPv6, the Internet address system that was supposed to give us a nearly infinite number of IP addresses. But after a decade of implementation, just over 21% of all websites worldwide support IPv6 addresses.

On the surface, this makes no sense. The original IPv4 standard only supports about 4.3 billion IP addresses. We clearly have far more people and devices connected to the Internet than that number. By contrast, IPv6 provides 340 trillion trillion trillion IP addresses, a number that, for all practical purposes, is unlimited.
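The gap between the two address spaces is easy to see in concrete numbers, since IPv4 uses 32-bit addresses and IPv6 uses 128-bit addresses:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4: {ipv4_addresses:,}")            # 4,294,967,296
print(f"IPv6: {ipv6_addresses:.2e}")          # about 3.40e+38
print(f"IPv6 per IPv4 address: {ipv6_addresses // ipv4_addresses:.1e}")
```

Every single IPv4 address corresponds to roughly 8 x 10^28 IPv6 addresses, which is why nobody expects IPv6 to ever run out.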

Even though we exhausted the supply of IPv4 addresses years ago, it doesn’t look like there is any rush for most of the world to move to the new IP addresses. There are obvious barriers to making the conversion that most ISPs and businesses are not ready to tackle. Most of the barriers to making the conversion can be categorized as hardware limitations, lack of training, and the overall cost of the conversion.

It’s a little hard to believe after a decade, but many older computers, servers, and routers still will not recognize IPv6 addresses. One would have to think that we’ll eventually ditch the older devices, but there is apparently still a huge number of devices that can’t process IPv6 addresses. The good news is that newer operating systems and devices will handle the new addresses. But the world still has plenty of folks using older versions of Windows, Linux, Android, and iOS. Big corporations are reluctant to make the switch to IPv6 out of fear that older technology around the company would stop working. Smaller companies are not willing to make the change until they have no choice.

This issue is compounded by the fact that direct communication between IPv4 and IPv6 devices is impossible, and all exchange of data must pass through an IPv4/IPv6 dual-stack conversion to enable communications. This was originally envisioned as a temporary fix, but as IPv4 continues to be used, this is looking to be permanent.
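One way to see which side of the divide a host sits on is to check which address families its name resolves to. Here is a small sketch using Python’s standard library (the port and hostname are just illustrative; results depend entirely on the network and DNS records of the machine you run it on):

```python
import socket

def supported_ip_versions(host):
    """Return the IP versions ('IPv4'/'IPv6') that a host resolves to."""
    versions = set()
    # getaddrinfo returns one entry per (family, socket type) combination
    for family, *_ in socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP):
        if family == socket.AF_INET:
            versions.add("IPv4")
        elif family == socket.AF_INET6:
            versions.add("IPv6")
    return versions

# On a dual-stack machine this usually shows both versions; on an
# IPv4-only network you'll only see IPv4.
print(supported_ip_versions("localhost"))
```

A host that only publishes IPv4 records is invisible to an IPv6-only client without a translation layer in between, which is exactly why the dual-stack arrangement persists.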

Companies are also loath to tackle the cost and effort of the upgrade without some compelling reason to do so. Companies that have made the change report a number of unexpected problems with a conversion that can be disruptive, and companies are not willing to tackle something this complicated unless they have to.

It’s interesting to see how various countries have decided to make the switch to IPv6. Google has been collecting statistics on IPv6 conversions that are summarized on this map. At the time I wrote this blog, the world leaders in conversion to IPv6 are France (75%), India (68%), Germany (67%), Malaysia (62%), and Saudi Arabia (61%). Much of the rest of the world is far behind with the upgrade, including Russia (7%), China (3%), and much of Africa below 1%.

The US is just above 50% utilization of IPv6. Interestingly, the US backslid and was at a 56% IPv6 conversion rate in 2019. The resurgence of IPv4 is being credited to the huge flood of folks working at home during the pandemic – since residential ISPs have mostly not made the conversion.

Internet experts believe we’ll still be running dual IPv4 and IPv6 networks for at least a few more decades. We’ve found ways to work around the lack of IPv4 addresses, and very few companies or ISPs are seeing any urgency to rush toward a conversion. But as the worldwide penetration of broadband continues to grow and as we add more connected devices, the pressure will increase to eventually make the conversion. But don’t expect to see any headlines because it’s not happening any time soon.


Space Weather and Broadband

There was an interesting phenomenon in February 2022 when Starlink launched 49 new satellites. The satellites were successfully deployed by the rocket, but as they were being maneuvered to reach their final orbital slots, a geomagnetic storm caused 38 of the satellites to fall back to earth.

Space storms happen when radiation affects the magnetosphere that surrounds the earth. This is a band of particles that are held close to the planet due to the earth’s magnetic field. A geomagnetic storm occurs when there is an exchange of energy from outer space to the orbiting particles. The biggest storms are caused by mass ejections of particles and energy that occur during large solar flares. These solar flares release radiation and highly charged particles into space, which, during a storm, interact with the magnetosphere.

It is the charged particles from the storms that manifest in the Aurora Borealis or northern lights. The extra energy from the storms can also play havoc with GPS and other space-based communications. The earth’s atmosphere keeps most of the radiation from solar flares away from the planet, but strong storms can wreak havoc with radio communications and can even produce feedback in long-haul electric wires that can disrupt the power grid.

During a geomagnetic storm, energy is pushed from the particles in the magnetosphere into the upper reaches of the ionosphere. This can temporarily increase the heat and density of the ionosphere, which is what happened to the satellites. They met unexpected atmospheric drag that the tiny thrusters on the small satellites were unable to overcome.

Scientists have been looking at ways to better predict solar flares and the ensuing storms. In this case, with a warning, the satellite launch would have been delayed until the storm had passed. It’s a big challenge to predict the size and location of solar flares. The sun has an eleven-year cycle for the period of the heaviest solar flare activity, but a solar flare can erupt at any time.

Scientists around the world have been studying the sun using NASA’s Solar Dynamics Observatory. Scientists in China have had some success by tracking changes in the magnetic field of the sun, particularly in how that manifests in changes on the sun’s surface. They say that the temperature temporarily drops on the surface of the sun in the area where a flare is coming. They have predicted several solar flares within 48 hours of an eruption, but they have a long way to go before this is reliably accurate. Even when we get to the point of successfully predicting solar flares, it’s an even bigger challenge to predict if the particles from a flare will hit the earth. The worst impacts come when our planet is in the direct path of the ejected particles.

Tracking space weather matters since we are becoming reliant on space technologies. We’ve all incorporated GPS and satellite weather into our daily routines. We use space monitors for scientific research, to study farm fields, and to keep an eye on the various militaries around the planet. And suddenly, we have a lot of people using satellites for broadband. It was costly to Starlink to lose most of the satellites from a launch. But the potential damage from space storms is going to increase dramatically as we use space more and more. Starlink alone keeps talking about having 30,000 broadband satellites.

It’s not hard to picture the impact of losing these technologies for a few days up to a week. How many of you still carry an atlas in your car in case GPS doesn’t work? Businesses of all types plan outdoor work based on weather predictions that use data gathered by satellites. And having multi-day broadband outages can be devastating, particularly for rural businesses or people working from home. Space technology has become everyday technology, but it’s too easy to take for granted and to assume it will always work.

A New Definition of 6G

We now know how the wireless carriers are going to continue the string of new G generations of cellular technology.

5G was originally defined to include spectrum up to 90 GHz or 100 GHz. In the last few years, international standards bodies have been developing new 6G standards in what are called the terahertz wavelengths, between 100 GHz and 1 THz. By definition, these higher frequency bands are the remaining part of the radio spectrum, and so the 6G being defined by international scientists will be the final generation of G technology.

These super-high frequencies have a lot of interesting potential for indoor uses since this spectrum can transmit an immense quantity of data over short distances. But the high frequencies might never be used for outdoor broadband because the extremely short radio waves are easily distorted and scattered by everything in the environment, including air molecules.
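The scattering problem falls straight out of the physics. Wavelength is the speed of light divided by frequency, so the band the standards bodies are calling 6G has wavelengths of only a few millimeters down to fractions of a millimeter:

```python
# Wavelength = speed of light / frequency. The 100 GHz - 1 THz band has
# wavelengths of roughly 3 mm down to 0.3 mm, which is why these signals
# scatter off nearly everything in the environment.
C = 299_792_458  # speed of light in m/s

for freq_hz in (100e9, 1e12):
    wavelength_mm = C / freq_hz * 1000
    print(f"{freq_hz / 1e9:6.0f} GHz -> {wavelength_mm:.2f} mm wavelength")
```

Radio waves are disturbed by obstacles comparable to their wavelength, and at a fraction of a millimeter, even raindrops and air molecules qualify.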

Scientists have speculated that transmissions in the terahertz frequencies can carry 1,000 times more data than the current 5G spectrum bands. That’s enough bandwidth to create the 3D holograms needed for convincing virtual presence (and maybe my home holodeck).

But terahertz frequencies are going to be of little use to the cellular carriers. While cellular companies have still not deployed a lot of the 5G standards, the marketing folks at these companies are faced with a future where there would be no more G generations of cellphones – and that is clearly a lost marketing opportunity.

Several of the wireless equipment vendors have started to refer to bandwidths in the centimetric range as 6G. These are frequencies between 7 GHz and 20 GHz. I have to admit that I got a really good belly laugh when I read this, because much of this frequency range is already in use – so I guess 6G is already here!

When 5G was first announced, the big news at the time was that 5G would open up the millimeter-wave spectrum between 24 GHz and 40 GHz. The equipment vendors and the cellular carriers spent an immense amount on lobbying and advertising, talking up the wonders of millimeter-wave spectrum. Remember the carefully staged cellular commercials that showed gigabit speeds on cell phones? That was done using millimeter-wave spectrum.

But now, the marketing folks have pulled a big switcheroo. They are going to rename currently used spectrum as 6G. I guess that means millimeter-wave spectrum will become 7G. This also leaves room for several more generations of G marketing before reaching the 100 GHz terahertz spectrum.

This will clearly cause a mountain of confusion. The international folks are not going to rename what they have already labeled as 6G to mollify the cellular marketers. We’re going to have articles, advertising, and lobbying talking about two completely different versions of 6G. And before the ink is dry, we’ll also be talking about 7G.

The cellular vendors also want us to change the way we talk about spectrum. The folks at Nokia are already suggesting that the newly dubbed 6G spectrum bands should be referred to as midband spectrum – a phrase that today refers to lower spectrum bands. That sets the stage for talking about upper bands of frequency as 7G, 8G, and 9G.

What is funniest about this whole process is that there still isn’t any real 5G in use in the world. The cellular carriers have implemented only a small portion of the 5G specification. But that hasn’t deterred the marketers, who have convinced everybody that the new bands of spectrum being used for 4G are actually 5G. It’s a pretty slick marketing trick that keeps the cellular carriers from having to explain why the actual 5G isn’t here yet.

A Look at Smart Agriculture

We are at an interesting time in the history of man. The population just crossed the 8 billion mark. At the same time, we’re seeing big changes in weather patterns all over the globe that are disrupting the traditional ways that we raise crops. Some areas are already looking at prolonged droughts, while other places are a lot wetter than ever before. And just about everywhere is hotter.

I remember when I was a kid that there was a lot of talk about world starvation. The world population in 1960 had just hit 3 billion people, and there were a lot of countries on the edge of starvation. Science came to the rescue with new varieties of wheat, rice, and corn developed by Norman Borlaug and others, and food production around the globe soared.

The way to feed today’s population is through smart agriculture, and we don’t have far to look to see what that looks like. The Netherlands, which is about the same size as Maryland, is one of the major food producers in Europe and the second biggest food exporter behind the U.S. The small country produces 4 million cows, 13 million pigs, and 104 million chickens annually.

The Netherlands is also one of the major providers of vegetables for Europe. The country has an amazing 24,000 acres of greenhouses that grow crops. The greenhouses are efficient and can raise ten times more crops per acre than traditional fields, using less fertilizer. It takes only a half-gallon of water to grow a pound of tomatoes in a greenhouse compared to the global average of 28 gallons.
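Scaled up, those water figures are dramatic. A quick calculation using the per-pound numbers above (the per-pound figures come from the text; the one-ton scale is just for illustration):

```python
# Water needed to grow one ton of tomatoes, greenhouse vs. field,
# using the per-pound figures cited in the text above.
GALLONS_PER_LB_GREENHOUSE = 0.5
GALLONS_PER_LB_FIELD = 28       # global average

POUNDS_PER_TON = 2000
greenhouse = GALLONS_PER_LB_GREENHOUSE * POUNDS_PER_TON
field = GALLONS_PER_LB_FIELD * POUNDS_PER_TON

print(f"Greenhouse: {greenhouse:,.0f} gallons per ton")  # 1,000
print(f"Field:      {field:,.0f} gallons per ton")       # 56,000
print(f"The field crop uses {field / greenhouse:.0f} times more water")
```

A 56-fold difference in water use per ton of tomatoes is a big deal in a world where droughts are becoming more common.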

The Netherlands is also the world’s top supplier of seeds for ornamental plants and vegetables. There are multiple climate-controlled seed banks that maintain multiple strains of plant seeds to provide the diversity that is needed in the race to keep crop strains ahead of the diseases that can destroy them.

Greenhouse agriculture is highly dependent on technology. Greenhouses utilize a system called Scoutbox that captures and analyzes insects to allow for a quick reaction to avoid infestations, and farmers have virtually eliminated pesticides in greenhouses. Greenhouses are automated for the watering, tending, and shipping of produce – they are food-producing factories.

Field crop agriculture is taking advantage of smart tractors and other smart equipment. Drones are widely used to monitor field crops. Satellite images are analyzed to pinpoint areas of fields that need water, fertilizer, or other amendments. Computers track and monitor farm animals from birth. The country has developed a side industry that gathers food and crop waste to feed animals.

The country is a hub for agricultural research, with 15 of the top 20 agribusinesses having research and development labs there. All of this agriculture needs broadband. Like the U.S., the rural areas of the country were the last to get broadband, but the country has put a big push on connectivity. 100% of homes and farms can buy DSL – not the slow rural DSL of the U.S., but DSL with mostly reliable speeds between 25 Mbps and 50 Mbps. Over 92% of residents have access to cable company broadband, and over 30% of homes now have access to fiber.

It’s obviously easier to fully wire a small country than our humongous far-flung farming areas. But the Netherlands example is highlighting a different way to raise food by putting greenhouses close to the people who consume the crops.

The one drawback to the agricultural methods in the country is that greenhouses require a lot of power. That’s a particularly pressing problem in a year when the Ukraine war is restricting oil and natural gas supplies. Like much of Europe, this tough time is goading the country to move more quickly to alternate energy sources. The country is already getting a lot of energy from wind and is working towards creating electricity with biomass and geothermal technologies.

The U.S. is experimenting with all of the same agricultural technologies being used in the Netherlands. But this small country is way ahead of us in terms of implementation. You have to wonder which region of the country will push these new technologies forward the fastest – it could be a big deal for an area looking to create jobs.

Does New Technology Thrill You?

Today’s blog is not about broadband, or perhaps only peripherally. As I write this holiday weekend blog, I find myself thinking a lot about an article written last month by Shannon Vallor in the MIT Technology Review. She asks the question, “We used to get excited about technology. What happened?”

The world is full of new technologies, yet I’ve had the same feeling as Shannon that these new technologies don’t excite me as they once did. She recalls a few technologies that brought her wonder and awe, such as her first ride on the San Francisco BART, seeing a Concorde for the first time, or her first Commodore PET.

We all have our own list of technologies that thrilled us or that we recognized instantly as game changers. My list includes things like Alan Shepard in the first Mercury flight, my first DSL connection that got me off dial-up, online music libraries like Napster and Spotify, and seeing the first iPhone.

The technological breakthroughs I loved the most were good for me or good for mankind. The childhood me saw the Mercury flight as the first step towards mankind expanding our boundaries past this planet. DSL liberated me to finally search the whole world from my living room. Online music meant I was no longer constrained to the music I could afford to buy and could explore the forty different genres of music I like. The iPhone gave everybody a portable handheld computer. The many other technologies I loved at first sight had similar benefits.

The article discusses how a lot of new breakthroughs feel small and somewhat tawdry because they are aimed at helping the companies that sell the technology more than the people who buy it. She cites how farmers feel captive to John Deere because of the way it controls self-driving tractors. She talks about how Roombas and smart refrigerators spy on us – our transaction with technology companies doesn’t stop when we bring the technology home.

I remember going to Epcot when it first opened. I’m the first to admit that Disney’s vision of the future was schmaltzy, but the vision shown in the Epcot globe is how the history of technology is inexorably tied to making people’s lives better. The century before I was born saw amazing new technologies like electricity in homes, automobiles and planes, refrigeration, vaccines against some of the worst diseases, and mass communications through telegraphs, telephones, and radio.

The article talks about how technology breakthroughs today seem to be more about making the developers rich. If there is any one technology trend I’d like to see undone, it is how we’ve decided to reward companies with breakthrough technology as unicorns and make the founders into instant billionaires. I’m having a hard time getting as excited about space as I once was when we’re using the latest technologies to provide private space rides to billionaires. It’s disheartening to see drones becoming the next weapons of war that can threaten us all. It’s disturbing to see vaccines going to wealthy countries instead of everybody. It’s scary that a lot of the electronics we bring into our homes are watching us and reporting back to parties unknown.

However, while I share the same unease as Vallor, I also read a lot about science breakthroughs in labs around the world. We are surrounded by breakthroughs that would have amazed us a few decades ago that barely rate a mention in the press. We’re discovering amazing materials that will enable the next generation of energy use and communications. The breakthroughs in biology are amazing, and we’re probably not far from finding a cure for the common cold and many cancers. We don’t seem to be far away from the first working generation of fusion reactors.

I guess I’m still hopeful, but at the same time, I’ve been thinking about reducing the number of gadgets in my life instead of adding more. I say all of this knowing that I might get thrilled with a new technology announced tomorrow. But then again, maybe I won’t.

Is it Time to Say Farewell to GPON?

GPON is a great technology. GPON stands for gigabit passive optical network, and it is the predominant technology delivering last-mile fiber broadband. The GPON standard was first ratified in 2003, but like most new technologies, it took a few years to hit the market.

GPON quickly became popular because it allowed the provisioning of a gigabit service to customers. A GPON link delivers 2.4 gigabits downstream and 1.2 gigabits upstream to serve up to 64 customers, although most networks I’ve seen serve no more than 32 customers.

There is still some disagreement among ISPs about the best last-mile fiber technology, and some ISPs still favor active Ethernet networks. The biggest long-term advantage of GPON is that the technology serves more customers than active Ethernet, and most of the R&D for last-mile fiber over the past decade has gone to PON technology.

There are a few interesting benefits of GPON versus active Ethernet. One of the most important is the ability to serve multiple customers on a single feeder fiber. PON has one laser at a hub talking to 32 or more customers. This means a lot less fiber is needed in the network. The other advantage of PON that ISPs like is that there are no active electronics in the network – electronics are only at hubs and at the customer. That’s a lot fewer components to go bad and fewer repairs to make in the field.

We’re now seeing most new fiber designs using XGS-PON. This technology increases bandwidth and delivers a symmetrical 10-gigabit path to a neighborhood (for purists, it’s actually 9.953 gigabits). The technology can serve up to 256 customers on a fiber, although most ISPs will serve fewer than that.
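The shared-bandwidth arithmetic behind these split ratios can be sketched in a few lines. This is an illustrative calculation only, using the line rates cited above; real PON networks rely on statistical multiplexing, so actual customer experience is far better than these worst-case shares suggest.

```python
# Worst-case per-subscriber share of a PON's downstream capacity at
# common split ratios. Illustrative arithmetic only - in practice,
# subscribers almost never pull traffic at the same instant.

# Downstream line rates in gigabits per second, from the post.
PON_RATES = {
    "GPON": 2.4,       # 2.4 Gbps down / 1.2 Gbps up
    "XGS-PON": 9.953,  # symmetrical ~10 Gbps
}

def worst_case_share(technology: str, split_ratio: int) -> float:
    """Downstream Gbps per subscriber if every subscriber on the
    PON demanded full throughput simultaneously."""
    return PON_RATES[technology] / split_ratio

for tech in PON_RATES:
    for split in (32, 64):
        share = worst_case_share(tech, split)
        print(f"{tech} at a 1:{split} split: {share:.3f} Gbps per subscriber")
```

Even in this unrealistic worst case, a 1:32 XGS-PON split leaves roughly 300 megabits per home, which is why ISPs feel comfortable selling gigabit tiers on shared fiber.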

The biggest advantage of XGS-PON is that the electronics vendors have all gotten smarter, and XGS-PON is being designed as an overlay onto GPON networks. An ISP can slip an XGS-PON card into an existing GPON chassis and instantly provision customers with faster broadband. The faster speeds just require an upgraded ONT – the electronics at the customer location.

The vendors did this because they took a lot of grief from the industry when they converted from the earlier BPON or APON to GPON. The GPON electronics were incompatible with older PON, and it required a forklift upgrade, meaning a replacement of all electronics from the core to the customer for the upgrade. I helped a few clients through the BPON to GPON upgrade, and it was a nightmare, with staff working late nights since neighborhood networks had to be taken out of service one at a time to make the upgrade.

The other interesting aspect of XGS-PON is that the technology is also forward-looking. The vendors are already field-testing 25-gigabit cards and are working on 40-gigabit cards in the lab. A fiber network provisioned with XGS-PON has an unbelievable capacity, and adding the new cards will make networks ready for the big bandwidth needs of the future. Any talk of online virtual reality and telepresence can’t happen until ISPs can provision multi-gigabit connections to multiple homes in a neighborhood – something that would stress even a 10-gigabit XGS-PON connection.

XGS-PON is going to quickly open up a new level of speed competition. I have one new ISP client using XGS-PON that has three broadband products with download speeds of 1, 2, and 5 gigabits, all with an upload speed of 1 gigabit. The cable companies publicly say they are not worried about fiber competition, but they are a long way away from competing with those kinds of speeds.

I’m sure GPON will be around for years to come. But as happens with all technology upgrades, there will probably come a day when the vendors stop supporting old GPON cards and ONTs. The good news for ISPs is that many of my clients have GPON electronics that have worked for over a decade without a hiccup, and there is no rush to replace something that is working great.