Getting Serious About Satellite Texting

One of the more interesting telecom announcements at the CES electronics show in Las Vegas came from the partnership of Qualcomm and Iridium, which plans to bring satellite texting capability to many more cell phones and other devices.

We’ve already seen a few other recent announcements of the ability to send emergency texts when out of reach of cell coverage. The team of T-Mobile and SpaceX says that T-Mobile customers will be able to reach 911 through a satellite sometime in 2023. Apple launched an Emergency SOS system for its newest iPhone users in a partnership with Globalstar, but the service is only available in a handful of cities.

Qualcomm is building this feature into its premier Snapdragon 8 Gen 2 chips, so any new phone or other device using that chip will have texting capabilities. The company says it plans to eventually build the capability into other more affordable chips as well.

For now, Qualcomm has established a 911 service similar to the T-Mobile plans where people can reach 911 when out of the range of the normal cellular network. But the company envisions that cellular carriers will develop price plans to let users text for a fee. That would provide folks with the ability to stay connected while hiking in remote wilderness or during a sea cruise.

Qualcomm is in the business of selling chips, and it would love to see this capability expanded to other places, like built into laptops or new cars. Putting the technology in cars is a major selling point since it would enable features like automatically contacting 911 after an accident.

This first-generation product will be far from perfect, but that’s to be expected from what is basically a beta test. For example, while Iridium satellites blanket the earth, there are times when there is no satellite overhead, and a user might have to wait ten minutes for the next satellite. It seems this issue can be resolved by cell carriers partnering with multiple satellite providers.

This new technology opens up the possibility for people to have some limited connectivity almost anywhere on the globe. For the younger connected generations, this has great appeal. Most people I know with Gen Z kids tell me that it’s like banishment to take kids out of reach of connectivity. But more practically, much of the world does not have reliable cellular coverage, and this can bring some form of communication to all.

I know people will read this and assume that the next step is to use satellites to provide data connectivity to cell phones or laptops from anywhere. However, the physics makes that unrealistic for a handset. The latest Starlink dishy receiver is 19 by 12 inches, and that much surface area is needed to receive the signal from a satellite. Still, it’s not hard to imagine a hiker rolling out a flexible receiver to communicate with a satellite – assuming they bring along some kind of power source, perhaps solar.

I track telecom announcements of new technologies and products to give me a baseline a decade from now to see how various technologies performed. It will be interesting to see if satellite texting becomes a routine part of every cellular plan or if it withers on the vine like many other seemingly good ideas that the market didn’t embrace.

What Ever Happened to IPv6?

It’s been over ten years since the launch of IPv6, the Internet address system that was supposed to give us a nearly infinite number of IP addresses. But after a decade of implementation, just over 21% of all websites worldwide support IPv6 addresses.

On the surface, this makes no sense. The original IPv4 standard only supports about 4.3 billion IP addresses. We clearly have far more people and devices connected to the Internet than that number. By contrast, IPv6 provides 340 trillion trillion trillion IP addresses, a number that, for all practical purposes, is unlimited.
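The gap between the two address pools falls straight out of the address widths – 32 bits for IPv4 versus 128 bits for IPv6. A quick sketch of the arithmetic:

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_total = 2 ** 32
ipv6_total = 2 ** 128

print(f"IPv4: {ipv4_total:,}")    # 4,294,967,296 (about 4.3 billion)
print(f"IPv6: {ipv6_total:.2e}")  # about 3.40e+38
```

That 3.4 x 10^38 figure is the “340 trillion trillion trillion” addresses mentioned above – roughly 10^28 addresses for every person alive.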

Even though we exhausted the supply of IPv4 addresses years ago, it doesn’t look like there is any rush for most of the world to move to the new IP addresses. There are obvious barriers to making the conversion that most ISPs and businesses are not ready to tackle. Most of the barriers to making the conversion can be categorized as hardware limitations, lack of training, and the overall cost of the conversion.

It’s a little hard to believe after a decade, but many older computers, servers, and routers will still not recognize IPv6 addresses. One would have to think that we’ll eventually ditch the older devices, but there are apparently still a huge number of devices that can’t process IPv6 addresses. The good news is that newer operating systems and devices will handle the new addresses. But the world still has plenty of folks using older versions of Windows, Linux, Android, and iOS. Big corporations are reluctant to make the switch to IPv6 out of fear of older technology around the company that would stop working. Smaller companies are not willing to make the change until they have no choice.

This issue is compounded by the fact that IPv4-only and IPv6-only devices cannot communicate directly; all data exchanged between them must pass through a dual-stack host or an IPv4/IPv6 translation layer. This was originally envisioned as a temporary fix, but as IPv4 continues to be used, the arrangement is looking permanent.
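As a small illustration of how the two address families coexist on dual-stack hosts, IPv6 reserves a block (::ffff:0:0/96) for IPv4-mapped addresses, and Python’s standard ipaddress module can pack and unpack them. This is a sketch of address representation only, not of a full translation gateway:

```python
import ipaddress

# Dual-stack software commonly represents an IPv4 peer as an
# IPv4-mapped IPv6 address inside the reserved ::ffff:0:0/96 block.
v4 = ipaddress.IPv4Address("192.0.2.1")
mapped = ipaddress.IPv6Address(f"::ffff:{v4}")

print(mapped)               # ::ffff:c000:201 (compressed hex form)
print(mapped.ipv4_mapped)   # 192.0.2.1 (the embedded IPv4 address)
```

This mapping lets a single IPv6 socket handle both families, but it only works on hosts that already run both stacks – which is exactly why the dual-stack transition period has dragged on.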

Companies are also loath to tackle the cost and effort of the upgrade without some compelling reason to do so. Companies that have made the change report a number of unexpected problems with a conversion that can be disruptive, and companies are not willing to tackle something this complicated unless they have to.

It’s interesting to see how various countries have decided to make the switch to IPv6. Google has been collecting statistics on IPv6 conversions that are summarized on this map. At the time I wrote this blog, the world leaders in conversion to IPv6 are France (75%), India (68%), Germany (67%), Malaysia (62%), and Saudi Arabia (61%). Much of the rest of the world is far behind with the upgrade, including Russia (7%), China (3%), and much of Africa below 1%.

The US is just above 50% utilization of IPv6. Interestingly, the US backslid and was at a 56% IPv6 conversion rate in 2019. The resurgence of IPv4 is being credited to the huge flood of folks working at home during the pandemic – since residential ISPs have mostly not made the conversion.

Internet experts believe we’ll still be running dual IPv4 and IPv6 networks for at least a few more decades. We’ve found ways to work around the lack of IPv4 addresses, and very few companies or ISPs are seeing any urgency to rush toward a conversion. But as the worldwide penetration of broadband continues to grow and as we add more connected devices, the pressure will increase to eventually make the conversion. But don’t expect to see any headlines because it’s not happening any time soon.


Space Weather and Broadband

There was an interesting phenomenon in February 2022 when Starlink launched 49 new satellites. The satellites were successfully deployed by the rocket, but as they were being maneuvered to reach their final orbital slots, a geomagnetic storm caused 38 of them to fall back to earth.

Space storms happen when radiation affects the magnetosphere that surrounds the earth. This is a band of particles held close to the planet by the earth’s magnetic field. A geomagnetic storm occurs when there is an exchange of energy from outer space to the orbiting particles. The biggest storms are caused by mass ejections of particles and energy during large solar flares. These solar flares release radiation and highly charged particles into space, which, during a storm, interact with the magnetosphere.

It is the charged particles from the storms that manifest in the Aurora Borealis or northern lights. The extra energy from the storms can also play havoc with GPS and other space-based communications. The earth’s atmosphere keeps most of the radiation from solar flares away from the planet, but strong storms can wreak havoc with radio communications and can even produce feedback in long-haul electric wires that can disrupt the power grid.

During a geomagnetic storm, energy is pushed from the particles in the magnetosphere to the upper reaches of the ionosphere. This can temporarily increase the heat and density of the upper atmosphere, which is what happened to the satellites. They met unexpected atmospheric drag that the tiny thrusters on the small satellites were unable to overcome.

Scientists have been looking at ways to better predict solar flares and the ensuing storms. In this case, with a warning, the satellite launch would have been delayed until the storm had passed. It’s a big challenge to predict the size and location of solar flares. The sun has an eleven-year cycle for the period of the heaviest solar flare activity, but a solar flare can erupt at any time.

Scientists around the world have been studying the sun using NASA’s Solar Dynamics Observatory. Scientists in China have had some success by tracking changes in the magnetic field of the sun, particularly in how that manifests in changes on the sun’s surface. They say that the temperature temporarily drops on the surface of the sun in the area where flares are coming. They have predicted several solar flares up to 48 hours before eruption, but they have a long way to go for this to be accurate. Even when we get to the point of successfully predicting solar flares, it’s an even bigger challenge to predict if the particles from the flare will hit the earth. The worst impacts come when our planet is in the direct path of the ejected particles.

Tracking space weather matters since we are becoming reliant on space technologies. We’ve all incorporated GPS and satellite weather into our daily routines. We use space monitors for scientific research, to study farm fields, and to keep an eye on the various militaries around the planet. And suddenly, we have a lot of people using satellites for broadband. It was costly for Starlink to lose most of the satellites from a launch. But the potential damage from space storms is going to increase dramatically as we use space more and more. Starlink alone keeps talking about having 30,000 broadband satellites.

It’s not hard to picture the impact of losing these technologies for a few days up to a week. How many of you still carry an atlas in your car in case GPS doesn’t work? Businesses of all types plan outdoor work based on weather predictions that use data gathered by satellites. And having multi-day broadband outages can be devastating, particularly for rural businesses or people working from home. Space technology has become everyday technology, but it’s too easy to take for granted and to assume it will always work.

A New Definition of 6G

We now know how the wireless carriers are going to continue the string of new G generations of cellular technology.

5G was originally defined to include spectrum up to 90 GHz or 100 GHz. In the last few years, international standards bodies have been developing new 6G standards in what are called the terahertz wavelengths, between 100 GHz and 1 THz. By definition, these higher frequency bands are the remaining part of the radio spectrum, and so the 6G being defined by international scientists will be the final generation of G technology.

These super-high frequencies have a lot of interesting potential for indoor uses since this spectrum can transmit an immense quantity of data over short distances. But the high frequencies might never be used for outdoor broadband because the extremely short radio waves are easily distorted and scattered by everything in the environment, including air molecules.
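The scattering problem comes down to wavelength: at terahertz frequencies, the waves are only fractions of a millimeter long, comparable in size to dust and water droplets in the air. A quick back-of-the-envelope calculation using λ = c / f shows how short these waves are compared to today’s cellular bands:

```python
C = 299_792_458  # speed of light in m/s

bands = [
    ("5G mid-band (3.5 GHz)", 3.5e9),
    ("millimeter wave (39 GHz)", 39e9),
    ("low terahertz (100 GHz)", 100e9),
    ("terahertz (1 THz)", 1e12),
]

for label, freq_hz in bands:
    wavelength_mm = C / freq_hz * 1000  # convert meters to millimeters
    print(f"{label}: {wavelength_mm:.2f} mm")
```

The output runs from roughly 86 mm at mid-band down to 0.3 mm at 1 THz, which is why obstacles that are invisible to today’s cellular signals become walls for terahertz transmissions.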

Scientists have speculated that transmissions in the terahertz frequencies can carry 1,000 times more data than the current 5G spectrum bands. That’s enough bandwidth to create the 3D holograms needed for convincing virtual presence (and maybe my home holodeck).

But terahertz frequencies are going to be of little use to the cellular carriers. While cellular companies have still not deployed a lot of the 5G standards, the marketing folks at these companies are faced with a future where there would be no more G generations of cellphones – and that is clearly a lost marketing opportunity.

Several of the wireless equipment vendors have started to refer to bandwidths in the centimetric range as 6G. These are frequencies between 7 GHz and 20 GHz. I have to admit that I got a really good belly laugh when I read this, because many of these frequencies are already in use – so I guess 6G is already here!

When 5G was first announced, the big news at the time was that 5G would open up the millimeter-wave spectrum between 24 GHz and 40 GHz. The equipment vendors and the cellular carriers spent an immense amount on lobbying and advertising, talking up the wonders of millimeter-wave spectrum. Remember the carefully staged cellular commercials that showed gigabit speeds on cell phones? That was done using millimeter-wave spectrum.

But now, the marketing folks have pulled a big switcheroo. They are going to rename currently used spectrum as 6G. I guess that means millimeter-wave spectrum will become 7G. This also leaves room for several more generations of G marketing before reaching the 100 GHz terahertz spectrum.

This will clearly cause a mountain of confusion. The international folks are not going to rename what they have already labeled as 6G to mollify the cellular marketers. We’re going to have articles, advertising, and lobbying talking about two completely different versions of 6G. And before the ink is dry, we’ll also be talking about 7G.

The cellular vendors also want us to change the way we talk about spectrum. The folks at Nokia are already suggesting that the newly dubbed 6G spectrum bands should be referred to as midband spectrum – a phrase that today refers to lower spectrum bands. That sets the stage for talking about upper bands of frequency as 7G, 8G, and 9G.

What is funniest about this whole process is that there still isn’t any real 5G being used in the world. The cellular carriers have implemented only a small portion of the 5G specification. But that hasn’t deterred the marketers, who have convinced everybody that the new bands of spectrum being used for 4G are actually 5G. It’s a pretty slick marketing trick that lets the cellular carriers avoid explaining why the actual 5G isn’t here yet.

A Look at Smart Agriculture

We are at an interesting time in the history of man. The population just crossed the 8 billion mark. At the same time, we’re seeing big changes in weather patterns all over the globe that are disrupting the traditional ways that we raise crops. Some areas are already looking at prolonged droughts, while other places are a lot wetter than ever before. And just about everywhere is hotter.

I remember when I was a kid that there was a lot of talk about world starvation. The world population in 1960 had just hit 3 billion people, and there were a lot of countries on the edge of starvation. Science came to the rescue with new varieties of wheat, rice, and corn developed by Norman Borlaug and others, and food production around the globe soared.

The way to feed today’s population is through smart agriculture, and we don’t have far to look to see what that looks like. The Netherlands, about the same size as Maryland, is one of the major food producers in Europe and the second biggest food exporter behind the U.S. The small country produces 4 million cows, 13 million pigs, and 104 million chickens annually.

The Netherlands is also one of the major providers of vegetables for Europe. The country has an amazing 24,000 acres of greenhouses that grow crops. The greenhouses are efficient and can raise ten times more crops per acre than traditional fields, using less fertilizer. It takes only a half-gallon of water to grow a pound of tomatoes in greenhouses compared to the global average of 28 gallons.

The Netherlands is also the world’s top supplier of seeds for ornamental plants and vegetables. There are multiple climate-controlled seed banks that maintain multiple strains of plant seeds to provide the diversity needed in the race to keep crop strains ahead of the diseases that can destroy them.

Greenhouse agriculture is highly dependent on technology. Greenhouses utilize a system called Scoutbox that captures and analyzes insects to allow for a quick reaction to avoid infestations. Farmers have virtually eliminated pesticides in greenhouses. Greenhouses are automated for the watering, tending, and shipping of produce – they are food-producing factories.

Field crop agriculture is taking advantage of smart tractors and other smart equipment. Drones are widely used to monitor field crops. Satellite images are analyzed to pinpoint areas of fields that need water, fertilizer, or other amendments. Computers track and monitor farm animals from birth. The country has developed a side industry that gathers food and crop waste to feed animals.

The country is a hub for agricultural research, with 15 of the top 20 agribusinesses having research and development labs there. All of this agriculture needs broadband. Like the U.S., the rural areas of the country are the last to get broadband. But the country has put a big push on connectivity. 100% of homes and farms can buy DSL – not slow rural U.S. DSL, but connections with reliable speeds between 25 Mbps and 50 Mbps. Over 92% of residents have access to cable company broadband. Over 30% of homes now have access to fiber.

It’s obviously easier to fully wire a small country than our humongous far-flung farming areas. But the Netherlands example is highlighting a different way to raise food by putting greenhouses close to the people who consume the crops.

The one drawback to the agricultural methods in the country is that greenhouses require a lot of power. That’s a particularly pressing problem in a year when the Ukraine war is restricting oil and natural gas supplies. Like much of Europe, this tough time is goading the country to move more quickly to alternate energy sources. The country is already getting a lot of energy from wind and is working towards creating electricity with biomass and geothermal technologies.

The U.S. is experimenting with all of the same agricultural technologies being used in the Netherlands. But this small country is way ahead of us in terms of implementation. You have to wonder which region of the U.S. will push these new technologies forward the fastest – it could be a big deal for an area looking to create jobs.

Does New Technology Thrill You?

Today’s blog is not about broadband, or perhaps only peripherally. As I write this holiday weekend blog, I find myself thinking a lot about an article written last month by Shannon Vallor in the MIT Technology Review. She asks the question, “We used to get excited about technology. What happened?”.

The world is full of new technologies, yet I’ve had the same feeling as Shannon that these new technologies don’t excite me as they once did. She recalls a few technologies that brought her wonder and awe, such as her first ride on the San Francisco BART, seeing a Concorde for the first time, or her first Commodore PET.

We all have our own list of technologies that thrilled us or that we recognized instantly as game changers. My list includes things like Alan Shepard in the first Mercury flight, my first DSL connection that got me off dial-up, online music libraries like Napster and Spotify, and seeing the first iPhone.

The technological breakthroughs I loved the most were good for me or good for mankind. The childhood me saw the Mercury flight as the first step towards mankind expanding our boundaries past this planet. DSL liberated me to finally search the whole world from my living room. Online music meant I was no longer constrained to the music I could afford to buy and could explore the forty different genres of music I like. The iPhone gave everybody a portable handheld computer. The many other technologies I loved at first sight had similar benefits.

The article discusses how a lot of new breakthroughs feel small and somewhat tawdry because they are aimed at helping the companies that sell the technology more than the people who buy it. She cites how farmers feel captive to John Deere because of the way it controls self-driving tractors. She talks about how Roombas and smart refrigerators spy on us – our transaction with technology companies doesn’t stop when we bring the technology home.

I remember going to Epcot when it first opened. I’m the first to admit that Disney’s vision of the future was schmaltzy, but the vision shown in the Epcot globe is how the history of technology is inexorably tied to making people’s lives better. The century before I was born saw amazing new technologies like electricity in homes, automobiles and planes, refrigeration, vaccines against some of the worst diseases, and mass communications through telegraphs, telephones, and radio.

The article talks about how technology breakthroughs today seem to be more about making the developers rich. If there is any one technology trend I’d like to see undone, it is how we’ve decided to reward companies with breakthrough technology as unicorns and make the founders into instant billionaires. I’m having a hard time getting as excited as I once was about space when we’re using the latest technologies to provide private space rides to billionaires. It’s disheartening to see drones becoming the next weapons of war that can threaten us all. It’s disturbing to see vaccines going to wealthy countries instead of everybody. It’s scary that a lot of the electronics we bring into our homes are watching us and reporting back to parties unknown.

However, while I share the same unease as Vallor, I also read a lot about science breakthroughs in labs around the world. We are surrounded by breakthroughs that would have amazed us a few decades ago that barely rate a mention in the press. We’re discovering amazing materials that will enable the next generation of energy use and communications. The breakthroughs in biology are amazing, and we’re probably not far from finding a cure for the common cold and many cancers. We don’t seem to be far away from the first working generation of fusion reactors.

I guess I’m still hopeful, but at the same time, I’ve been thinking about reducing the number of gadgets in my life instead of adding more. I say all of this knowing that I might get thrilled with a new technology announced tomorrow. But then again, maybe I won’t.

Is it Time to Say Farewell to GPON?

GPON is a great technology. GPON stands for gigabit passive optical network, and it is the predominant last-mile fiber broadband technology in place today. The GPON standard was first ratified in 2003, but like most new technologies, it took a few years to hit the market.

GPON quickly became popular because it allowed the provisioning of a gigabit service to customers. A GPON link delivers 2.4 gigabits downstream and 1.2 gigabits upstream to serve up to 64 customers, although most networks I’ve seen don’t deliver to more than 32 customers.
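The arithmetic behind those split sizes is worth making explicit. A hedged sketch (I’m ignoring the small framing overhead, so real-world capacity is slightly lower), showing the worst-case guaranteed share per customer if everyone on the splitter transmits at once:

```python
def per_customer_floor(link_mbps: float, split: int) -> float:
    """Worst-case bandwidth per customer if all customers transmit at once."""
    return link_mbps / split

# GPON: 2.4 Gbps down / 1.2 Gbps up, shared by the whole splitter group.
for split in (32, 64):
    down = per_customer_floor(2400, split)
    up = per_customer_floor(1200, split)
    print(f"{split}-way split: {down:.1f} Mbps down / {up:.1f} Mbps up minimum")
```

In practice, customers rarely transmit simultaneously, so statistical multiplexing lets an ISP sell gigabit products on a 32-way split – but the floor of 75 Mbps per customer on a full 32-way split explains why most ISPs stop short of the full 64-customer limit.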

There is still some disagreement among ISPs about the best last-mile fiber technology, and some ISPs still favor active Ethernet networks. The biggest long-term advantage of GPON is that the technology serves more customers than active Ethernet, and most of the R&D for last-mile fiber over the past decade has gone to PON technology.

There are a few interesting benefits of GPON versus active Ethernet. One of the most important is the ability to serve multiple customers on a single feeder fiber. PON has one laser at a hub talking to 32 or more customers, which means a lot less fiber is needed in the network. The other advantage of PON that ISPs like is that there are no active electronics in the outside plant – electronics sit only at hubs and at the customer. That’s a lot fewer components to go bad and fewer repairs to make in the field.

We’re now seeing most new fiber designs using XGS-PON. This technology increases bandwidth and delivers a symmetrical 10-gigabit path to a neighborhood (for purists, it’s actually 9.953 gigabits). The technology can serve up to 256 customers on a fiber, although most ISPs will serve fewer than that.

The biggest advantage of XGS-PON is that the electronics vendors have all gotten smarter, and XGS-PON is being designed as an overlay onto GPON networks. An ISP can slip an XGS-PON card into an existing GPON chassis and instantly provision customers with faster broadband. The faster speeds just require an upgraded ONT – the electronics at the customer location.

The vendors did this because they took a lot of grief from the industry when they converted from the earlier BPON or APON to GPON. The GPON electronics were incompatible with older PON, and it required a forklift upgrade, meaning a replacement of all electronics from the core to the customer for the upgrade. I helped a few clients through the BPON to GPON upgrade, and it was a nightmare, with staff working late nights since neighborhood networks had to be taken out of service one at a time to make the upgrade.

The other interesting aspect of XGS-PON is that the technology is also forward-looking. The vendors are already field-testing 25-gigabit cards and are working on 40-gigabit cards in the lab. A fiber network provisioned with XGS-PON has an unbelievable capacity, and with new cards added is going to make networks ready for the big bandwidth needs of the future. Any talk of having online virtual reality and telepresence can’t happen until ISPs can provision multi-gigabit connections to multiple homes in a neighborhood – something that would stress even a 10-gigabit XGS-PON connection.

XGS-PON is going to quickly open up a new level of speed competition. I have one new ISP client using XGS-PON that has three broadband products with download speeds of 1, 2, and 5 gigabits, all with an upload speed of 1 gigabit. The cable companies publicly say they are not worried about fiber competition, but they are a long way away from competing with those kinds of speeds.

I’m sure GPON will be around for years to come. But as happens with all technology upgrades, there will probably come a day when the vendors stop supporting old GPON cards and ONTs. The good news for ISPs is that I have a lot of clients that have GPON connections that have worked for over a decade without a hiccup, and there is no rush to replace something that is working great.

Using AM Radio Towers

One existing resource that is often overlooked in designing wireless networks is AM radio towers. For the most part, companies deploying fixed wireless and microwave antennas have avoided these towers. This is due to the nature of AM radio, which transmits at such a low frequency that the entire tower is effectively used as the transmitting antenna. The entire tower is energized with the AM signal, and the typical AM tower sits on a base insulator that keeps the tower from being grounded. The conventional wisdom has been to avoid AM towers as being too hot in power and frequency to use for other purposes.

There is an additional problem with AM towers in that any tall metal structure within about three kilometers of one can become what is called a parasitic radiator and interfere with AM transmission. This has meant that nobody builds other wireless towers close to an AM tower, and the areas around AM towers are often cellular dead spots – to the detriment of folks who happen to live close to one. Since there are around 10,000 AM broadcast towers, this implies many thousands of wireless dead zones.
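The reason nearby metal structures cause trouble is easy to see from the wavelengths involved. The U.S. AM broadcast band runs roughly 540 kHz to 1700 kHz, so the waves are hundreds of meters long, and any metal structure near a resonant fraction of that length (a quarter-wave, for instance) will re-radiate the signal. A rough sketch of the numbers:

```python
C = 299_792_458  # speed of light in m/s

for freq_khz in (540, 1000, 1700):
    wavelength_m = C / (freq_khz * 1000)  # wavelength = c / frequency
    quarter_wave_m = wavelength_m / 4
    print(f"{freq_khz} kHz: wavelength {wavelength_m:.0f} m, "
          f"quarter-wave {quarter_wave_m:.0f} m")
```

The quarter-wave lengths work out to roughly 44 to 139 meters across the band – squarely in the height range of cellular and broadcast towers, which is why structures that tall become unintended antennas near an AM site.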

But the AM towers don’t have to be a wasted asset. There are two methods that can be used to install other radios on AM towers that often get overlooked by cellular companies and WISPs. The methods both rely on isolating the new antennas from ground at the same frequency as the AM transmission.

The first technique is known as a folded unipole. This consists of a vertical metal rod, called a mast, that is connected at the base of the AM tower to a conductive surface called a ground plane. The mast is surrounded by a series of vertical wires attached at the top of the mast and extended to a metal ring near the mast base. The feed line for the mast is connected between the ring and the ground. These wires must be mounted at carefully calculated heights. If installed properly, the tower can be isolated and used for other radios. This is a common technique used to connect an FM transmitter to an existing AM tower, but it can also allow for cellular or fixed wireless radios.

The other method for isolation is to install electronics on the transmission line that carries the radio content signal to the antenna. The most common device is called an iso coupler, which allows RF signals within a certain frequency range to pass through while continuing to isolate the AM signal from ground. That might mean allowing through the signal from cellular or fixed wireless electronics to bypass the effects of the AM signals on the tower. Another device that performs roughly the same function is a coil device that can isolate the new antenna signals from the AM signals.

Both of these methods are referred to as detuning, meaning that a new radio can be isolated from the tuned AM signal that permeates the whole tower. Most engineers who are looking for towers avoid AM towers in the belief that it’s too complicated or costly to detune the tower to add other transmitters. Admittedly, getting this to work requires an experienced RF engineer who understands AM towers. But it’s a common practice used most often for adding FM transmitters. I’ve talked to some folks who say the process can be surprisingly affordable.

Anybody looking for tower space shouldn’t shy away from this option because the folks who own AM towers are likely open to negotiating an affordable connection since they don’t often get the opportunity.

New Science – October 2022

Today’s blog looks at some new technologies that may someday have an impact on computing and broadband. We’re living in a time when labs everywhere are making some big breakthroughs with new technology, and it’s hard to predict which ones will become part of our everyday lives.

Artificial Synapses. Engineers at MIT have developed a new kind of artificial synapse that can process data several million times faster than the human brain. The human brain is still the best computer in the world due to the unique structure of neurons and synapses. Scientists have been working for years to try to mimic the structure of the human brain by developing chips that can perform multiple computations simultaneously using data stored in local memory instead of elsewhere. Early work in the field has created neural networks to mimic the way the brain works.

The new technology differs from past attempts by using protons instead of electrons to shuttle data. The scientists created a new kind of programmable resistor that uses protons, which allows for analog processing instead of precise digital processing. The core of the new device is phosphosilicate glass (PSG), which is silicon dioxide with added phosphorus. This material allows for the passage of protons at room temperature while blocking electrons. A strong electric field can move protons through the chip at almost the speed of light, allowing for the processing of data a million times faster than earlier neural nets.

Replacement of Silicon? Researchers at the EPFL School of Engineering in Lausanne, Switzerland, have discovered some interesting properties of vanadium dioxide that would allow building devices that can remember previous external stimuli. This might allow for making chips out of VO2 that would play the same role as silicon does today while also acting as a data storage medium. This would allow for the storage of data directly as part of the structure of a chip.

Scientists found in the past that VO2 can outperform silicon as a semiconductor. VO2 also has an interesting characteristic in that it changes from an insulator to a metal at 154 degrees Fahrenheit. Researchers found that when VO2 is heated and then cooled, it remembers any data stored at the higher temperature. The researchers believe that VO2 can be used to create permanent data storage embedded directly into the material comprising a chip.
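As a quick sanity check on that number (this arithmetic is mine, not the researchers'), 154 degrees Fahrenheit works out to about 68 degrees Celsius, the insulator-to-metal transition temperature commonly reported for VO2:

```python
# Convert the article's Fahrenheit figure to Celsius (standard formula).
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(154)))  # -> 68
```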

One-Way Superconductor. Scientists at Delft University of Technology in the Netherlands, along with scientists from Johns Hopkins, have been able to create one-way superconductivity without using magnetic fields – something that was thought to be impossible. This would be an amazing breakthrough because semiconductors made with superconducting materials would be hundreds of times faster than today's chips, with zero energy loss during data processing – something that might remove much of the heat created in data centers.

The researchers achieved this using the quantum material Nb3Br8 (niobium bromide). With a film of the material only a few atoms thick, they were able to create a Josephson diode, a core component for quantum computing.

The biggest challenge remaining for the team is to enable the superconducting diode to function at temperatures above 77K, which would allow it to operate with liquid nitrogen cooling. One of the challenges of all superconductors has been getting the effect to work at anything other than super-cold temperatures. But it's not hard to envision using the technology to create large data centers of quantum computers.
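To put 77K in everyday terms (my conversion, not from the research), that liquid-nitrogen threshold is still brutally cold:

```python
# Convert the 77 K liquid-nitrogen threshold to Fahrenheit for perspective.
def kelvin_to_fahrenheit(k: float) -> float:
    return (k - 273.15) * 9 / 5 + 32

print(round(kelvin_to_fahrenheit(77)))  # -> -321
```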

The 12 GHz Battle

A big piece of what the FCC does is to weigh competing claims to use spectrum. It seems like there have been non-stop industry fights over the last decade on who gets to use various bands of spectrum. One of the latest fights, which is the continuation of a fight going on since 2018, is for the use of the 12 GHz spectrum.

The big wrestling match is between Starlink, which wants to use the spectrum to communicate with its low-orbit satellites, and the cellular carriers and WISPs that want to use the spectrum for rural broadband. Starlink uses this spectrum to connect its ground-based terminals to satellites. Wireless carriers argue that the spectrum should also be shared to enhance rural broadband networks.

The 12 GHz band is attractive to Starlink because it contains 500 MHz of contiguous spectrum with 100 MHz channels – a big data pipe for reaching between satellites and earth. The spectrum is attractive to wireless ISPs for the same reasons, along with other characteristics. The 12 GHz spectrum will carry twice as far as other spectrum used in point-to-multipoint broadband networks, meaning it can cover four times the area from a given tower. The spectrum is also clear of any federal or military encumbrance – something that restricts other spectrum like CBRS. The spectrum is also being used for cellular purposes internationally, which makes for an easy path to finding the radios and receivers to use it.
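The reach-to-coverage math is simple geometry – coverage from a tower is roughly circular, so the area grows with the square of the radius. A quick sketch (the 5-mile figure is purely illustrative, not from any FCC filing):

```python
import math

def coverage_area_sq_miles(radius_miles: float) -> float:
    """Area of a circular coverage footprint around a tower."""
    return math.pi * radius_miles ** 2

base = coverage_area_sq_miles(5)      # illustrative 5-mile reach
doubled = coverage_area_sq_miles(10)  # twice the reach

print(round(doubled / base, 1))  # -> 4.0
```

Doubling the reach quadruples the footprint, which is why a longer-range band can sharply cut the number of towers needed to cover a rural area.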

In the current fight, Starlink wants exclusive use of the spectrum, while wireless carriers say that both sides can share the spectrum without much interference. These are always the hardest fights for the FCC to figure out because most of the facts presented by both sides are largely theoretical. The only true way to find out about interference is in real-world situations – something that is hard to simulate any other way.

A few wireless ISPs are already using the 12 GHz spectrum. One is Starry, which has recently joined the 12 GHz Coalition, the group lobbying for terrestrial use of the spectrum. This coalition also includes other members like Dish Network, various WISPs, and the consumer group Public Knowledge. Starry is one of the few wireless ISPs currently using millimeter-wave spectrum for broadband. The company added almost 10,000 customers to its wireless networks in the second quarter and is poised to grow a lot faster. If the FCC opens the 12 GHz spectrum to all terrestrial uses, it seems likely the spectrum would quickly be put to use in many rural areas.

As seems usual these days, both sides in the spectrum fight say that the other side is wrong about everything they are saying to the FCC. This must drive the engineers at the FCC crazy since they have to wade through the claims made by both sides to get to the truth. The 12 GHz Coalition has engineering studies showing that terrestrial use could coexist with satellite use with a 99.85% assurance of no interference. Starlink, of course, says that engineering study is flawed and that there would be significant interference. Starlink wants no terrestrial use of the spectrum.

On the flip side, the terrestrial ISPs say that the spectrum in dispute is only 3% of the spectrum portfolio available to Starlink, and the company has plenty of bandwidth and is being greedy.

I expect that the real story is somewhere in between the stories told by both sides. It's these arguments that make me appreciate the FCC technical staff. It seems every spectrum fight has two totally different stories defending why each side should be the one to win the use of the spectrum.