Categories
Technology

Is Jitter the Problem?

Most people assume that when they have broadband problems, the cause is that their broadband speed isn’t fast enough. But in many cases, problems are caused by high jitter and latency. Today, I’m looking at the impact of jitter.

What is Jitter? Jitter happens when incoming data packets are delayed and don’t show up at the expected time or in the expected order. When data is transmitted over the Internet it is broken into small packets. A typical packet is approximately 1,000 bytes or 0.001 megabytes. This means a lot of packets are sent to your home computer for even basic web transactions.

Packets are created at the location that originates a web signal. This might be a site that is streaming a video, sending a file, completing a voice over IP call, or letting you shop online. The packets are sent in the order that the original data stream is encoded, but each packet can take a separate path across the Internet. Some packets arrive quickly, while others are delayed for some reason. Measuring jitter means measuring the degree to which packets end up at your computer late or in the wrong order.
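As a rough illustration of what measuring jitter looks like in practice, here is a minimal Python sketch of the interarrival jitter estimate used by RTP (RFC 3550), which smooths the variation in packet transit times. The timestamps are made-up numbers for the example, not real measurements.

```python
# Minimal sketch of the RTP (RFC 3550) interarrival jitter estimate.
# send_times and recv_times are hypothetical timestamps in milliseconds.

def interarrival_jitter(send_times, recv_times):
    """Return the smoothed jitter estimate after processing all packets."""
    jitter = 0.0
    prev_transit = None
    for sent, received in zip(send_times, recv_times):
        transit = received - sent            # one-way transit time for this packet
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # change in transit time vs. prior packet
            jitter += (d - jitter) / 16.0    # exponential smoothing per RFC 3550
        prev_transit = transit
    return jitter

# Example: packets sent every 20 ms, but arriving with variable delay.
send_times = [0, 20, 40, 60, 80]
recv_times = [35, 57, 74, 101, 118]
print(f"estimated jitter: {interarrival_jitter(send_times, recv_times):.2f} ms")
```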

Why Does Jitter Matter? Jitter matters the most when you are receiving packets for a real-time transaction like a streaming video, a Zoom call, a voice over IP call, or a video connection with a classroom. Your home computer is going to do its best to deliver the transmissions on time, even if all the packets haven’t arrived. You’ll notice missing packets of data as pixelation or fuzziness in a video, or as poor sound quality on a voice call. If enough packets are late, you might drop a VoIP call or get kicked out of a Zoom session.

Jitter doesn’t matter as much for other kinds of data. Most people are not concerned if it takes slightly longer to download a data file or to receive an email. These transactions don’t show up as received on your computer until all (or mostly all) of the packets have been received.

What Causes Jitter? The primary cause of jitter is network congestion. This happens when places in the network between the sender and the receiver are sent more data packets than can be processed in real time.

Bandwidth constraints can occur anywhere in a network where there is a possibility of overloading the capacity of the electronics. The industry uses the word chokepoint to describe any place where data can be restricted. On an incoming data transmission, an ISP might not have enough bandwidth on the incoming backbone connection. Every piece of ISP network gear that routes traffic within an ISP network is a potential chokepoint – a common chokepoint is where data is handed off to a neighborhood. The final chokepoint is at the home if data is coming in faster than the home broadband connection can handle it.

A common cause of overloaded chokepoints is old or inadequate hardware. An ISP might have outdated or too-small switches in the network. The most common chokepoints at homes are outdated WiFi modems or older computers that can’t handle the volume of incoming data.

One of the biggest problems with network chokepoints is that any time an electronics chokepoint gets too busy, packets can be dropped or lost. When that happens, your home computer or your ISP will request that the missing packets be sent again. The higher the jitter, the more packets are lost and have to be sent multiple times, and the greater the total amount of data being pushed through the network. With older and slower technologies like DSL, the network can get paralyzed if failed packets accumulate to the point of overwhelming the technology.
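As a back-of-the-envelope illustration (my own simplified model, not anything from a specific ISP), the sketch below shows how quickly lost packets inflate total traffic: if each transmission attempt fails with probability p, the expected number of sends per delivered packet is 1 / (1 - p).

```python
# Each transmission attempt is lost with probability p, so on average a
# packet must be sent 1 / (1 - p) times before it gets through.

def expected_transmissions(loss_rate):
    return 1.0 / (1.0 - loss_rate)

for loss in (0.01, 0.05, 0.10, 0.20):
    sends = expected_transmissions(loss)
    extra = (sends - 1.0) * 100
    print(f"{loss:.0%} packet loss -> {sends:.2f} sends per packet ({extra:.0f}% extra traffic)")
```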

Contrary to popular belief, faster speeds don’t reduce jitter, and can actually increase it. If you have an old, inadequate WiFi modem and upgrade to a faster technology like fiber, the WiFi modem will be even more overwhelmed than it was with a slower bandwidth technology. The best solution for lowering jitter is for ISPs and customers to replace the equipment that causes chokepoints. Fiber technology isn’t better just because it’s faster – it also includes technology that moves packets quickly through chokepoints.

Categories
Technology

What Happened to Quantum Networks?

A few years ago, there were a lot of predictions that we’d see broadband networks converting to quantum technology because of the enhanced security. As happens with many new technologies, quantum computing is advancing at a slower pace than the wild predictions that accompanied its launch suggested.

What are quantum computing and quantum networks? The computers we use today are all Turing machines that convert data into bits represented by either a 1 or a 0 and then process data linearly through algorithms. Quantum computing takes advantage of a property of subatomic particles called superposition, meaning that particles can operate simultaneously in more than one state, such as an electron that is at two different energy levels at once. Quantum computing mimics this subatomic world by creating what are called qubits, which can exist as both a 1 and a 0 at the same time. One qubit can perform two calculations at once, but when many qubits are used together, the number of simultaneous calculations grows exponentially. A four-qubit computer can perform 2^4, or 16, calculations at the same time. Some quantum computers are currently capable of 1,000 qubits, or 2^1,000 simultaneous calculations.
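To make the exponential growth concrete, here is a tiny sketch that prints how many simultaneous states a register of n qubits can represent (2^n):

```python
# Number of basis states an n-qubit register can hold in superposition: 2**n.
for n in (1, 4, 10, 50, 1000):
    print(f"{n:>5} qubits -> 2^{n} ≈ {float(2 ** n):.3e} simultaneous states")
```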

We are starting to see quantum computing in the telecom space. In 2020, Verizon conducted a network trial using quantum key distribution (QKD) technology. This uses a method of encryption that might be unhackable. Photons are sent one at a time alongside an encrypted fiber optic transmission. If anybody attempts to intercept or listen to the encrypted light stream, the polarization of the photons is affected, and the sender and receiver of the message both know instantly that the transmission is no longer safe. The theory is that this will stop hackers before they can learn enough to crack into and analyze a data stream. Verizon also added a second layer of security using a quantum random number generator that updates the encryption key randomly in a way that can’t be predicted.
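QKD systems like the one Verizon trialed are generally built on the BB84 idea of encoding key bits in randomly chosen photon polarization bases. The toy simulation below is only a conceptual sketch of that idea, with classical random numbers standing in for photons; it is not a description of Verizon’s actual implementation.

```python
import secrets

# Toy BB84 sketch: Alice encodes random bits in random bases ('+' or 'x'),
# Bob measures in random bases, and they keep only the positions where the
# bases happen to match (the "sifted" key).

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def random_bases(n):
    return [secrets.choice("+x") for _ in range(n)]

n = 32
alice_bits, alice_bases = random_bits(n), random_bases(n)
bob_bases = random_bases(n)

# If Bob's basis matches Alice's, he reads her bit; otherwise his result is random.
bob_bits = [bit if ab == bb else secrets.randbelow(2)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (not bits) and keep the matches.
sifted_alice = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

print("sifted key length:", len(sifted_alice))
print("keys match:", sifted_alice == sifted_bob)
# An eavesdropper measuring photons in the wrong basis would disturb them and
# show up as mismatches when Alice and Bob compare a sample of the sifted key.
```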

A few months ago, EPB, the municipal fiber provider in Chattanooga, announced a partnership with Qubitekk to let customers on the City’s fiber network connect to a quantum computer. The City is hoping to attract companies that want to benefit from quantum computing and has already heard from Fortune 500 companies, startups, and government agencies that are interested in using the quantum computer links.

EPB has established the quantum network separately from its last-mile network to accommodate the special needs of quantum network transmissions. The quantum network uses more than 200 existing dark fibers to establish customer links, and EPB engineers will constantly monitor the entangled particles on the network.

Quantum computing is most useful for applications that require large numbers of rapid calculations. For example, quantum computing could produce faster and more detailed weather maps in real time. Quantum computing is being used in research on drugs or exotic materials where scientists can compare multiple complex molecular structures easily. One of the most interesting current uses is that quantum computing can greatly speed up the processing power of artificial intelligence that is now sweeping the world.

It doesn’t look like quantum networking is coming to most fiber networks any time soon. The biggest holdup is the creation of efficient and cost-effective quantum computers. Today, most of these computers are in labs at universities or government facilities. The potential for quantum computing is so large that the technology could explode onto the scene when the hardware issue is solved.

Categories
Technology

The Wireless Innovation Fund

Practically everybody in the country has a cellphone, and mobile communication is now a huge part of daily life and a key driver of the economy. But as we found out during the pandemic, key parts of the economy, like the cellphone market, are susceptible to supply chain issues. The U.S. cellphone industry is particularly susceptible to market forces since the industry is dominated by a small number of manufacturers.

One of the many programs funded by recent legislation is the Public Wireless Supply Chain Innovation Fund, created by the CHIPS and Science Act of 2022. The program provides $1.5 billion in grants to explore ways to support open and interoperable 5G wireless networks.

The specific goals of the grant fund are to provide grants that will:

  • Accelerate commercial deployment of open, interoperable equipment;
  • Promote compatibility of new 5G equipment;
  • Allow the integration of multiple vendors into the wireless network environments;
  • Identify the criteria needed to define equipment as compliant with open standards;
  • Promote and deploy security features and network function virtualization for multi-vendor, interoperable networks.

All of this equates to opening the cellular network to multiple new U.S. vendors. That will make cellular networks far less susceptible to foreign supply chain problems while also creating new U.S. jobs. There is also the additional goal of increasing the security of our wireless networks. This is all being done in conjunction with the other provisions of the CHIPS Act, which have already resulted in over fifty projects to build chips in the U.S.

There have already been 127 applications for grants from the fund, totaling $1.39 billion. Three grants have been announced, with many more to come. The first three are:

Northeastern University for $1.99 million to develop an accurate testing platform to enable the construction of sustainable and energy-efficient wireless networks.

New York University for $2 million to develop testing and evaluation procedures for open and secure adaptive spectrum sharing for 5G and beyond.

DeepSig Inc. for $1.49 million to dramatically improve the fidelity, speed, and repeatability of OpenRAN air-interface performance testing using an AI model to set new standards and tools to revolutionize the evaluation of interoperable ORAN in real world conditions.

I’ve always believed that the government should take the lead on directed research of this type. I’m sure some of the ideas being funded won’t pan out, but the point of directed research is to uncover ideas that make it into the next generation of deployed technology. I’d love to see something similar done for ISP technologies. I hope this is not a one-time grant program because funding this kind of research every year is one of the best ways to keep the U.S. at the forefront of both wireless and broadband technology – using American technology.

Categories
Technology The Industry

DOCSIS 4.0 vs. Fiber

Comcast and Charter previously announced that they intend to upgrade cable networks to DOCSIS 4.0 to be able to better compete against fiber networks. The goal is to be able to offer faster download speeds and drastically improve upload speeds to level the playing field with fiber in terms of advertised speeds. It’s anybody’s guess if these upgrades will make cable broadband equivalent to fiber in consumers’ eyes.

From a marketing perspective, there are plenty of people who see no difference between symmetrical gigabit broadband offered by a cable company or a fiber overbuilder. However, a lot of the public has already become convinced that fiber is superior. AT&T and a few other big telcos say they quickly get a 30% market share when they bring fiber to a neighborhood, and telcos claim aspirations of reaching a 50% market share within 3-4 years.

At least a few big cable companies believe fiber is better. Cox is in the process of overbuilding fiber in some of its largest markets. Altice has built fiber in about a third of its markets. What’s not talked about much is that cable companies can overlash fiber onto existing coaxial cables in the same way that telcos can overlash onto copper cables. It costs Cox a lot less to bring fiber to a neighborhood than a fiber overbuilder that can’t overlash onto existing wires.

From a technical perspective, engineers and broadband purists will tell you that fiber delivers a better broadband signal. A few years back, I witnessed a side-by-side comparison of fiber and coaxial broadband delivered by ISPs. Although the subscribed download speeds being delivered were the same, the fiber connection felt cleaner and faster to the eye. There are several technical reasons for the difference.

  • The fiber signal has far less latency. Latency is a delay in getting bits delivered on a broadband signal. Higher latency means that a smaller percentage of bits get delivered on the first attempt. The impact of latency is most noticeable when viewing live sporting events, where the video has to be displayed before all of the transmitted bits have been received – and this shows up to the eye as pixelation or a less clear picture.
  • Fiber also has much less jitter, which is the variability of the signal from second to second (see the short sketch after this list). A fiber system generally delivers broadband signals on time, while the nuances of a copper network cause minor delays and glitches. As one example, a coaxial copper network acts like a giant radio antenna and, as such, picks up stray signals that enter the network and can disrupt the broadband signal. Disruptions inside a fiber network are comparatively minor and usually come from small flaws in the fiber caused during installation or later damage.
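To show why jitter, and not just raw speed, determines how smooth a video or voice call feels, here is a small sketch of a playout buffer (my own illustrative model with made-up arrival times): packets that arrive after their scheduled playout deadline count as glitches.

```python
# Toy playout-buffer model: packets are scheduled for playout every 20 ms,
# offset by a fixed buffer delay. A packet that arrives after its playout
# deadline causes a visible or audible glitch.

def count_glitches(arrival_times_ms, interval_ms=20, buffer_ms=30):
    glitches = 0
    for i, arrival in enumerate(arrival_times_ms):
        deadline = i * interval_ms + buffer_ms   # when packet i must be ready
        if arrival > deadline:
            glitches += 1
    return glitches

steady = [5 + i * 20 for i in range(10)]                # low jitter: always on time
jittery = [5, 28, 95, 66, 85, 140, 125, 182, 171, 230]  # high jitter: several late

print("glitches with low jitter :", count_glitches(steady))
print("glitches with high jitter:", count_glitches(jittery))
```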

The real question that will have to be answered in the marketplace is whether cable companies can reverse years of public perception that fiber is better. They have their work cut out for them. Fiber overbuilders today tell me that they rarely lose a customer back to the cable company competitor. Even if the cable networks get much better, people are going to remember when they used to struggle to hold a Zoom call on cable.

Before the cable companies can make the upgrade to DOCSIS 4.0, which is still a few years away, the big cable companies are planning to upgrade upload speeds in some markets using a technology referred to as a mid-split. This will allocate more broadband to the upload path. It will be interesting to see if that is enough of an upgrade to stop people from leaving for fiber. I think cable companies are scared of seeing a mass migration to fiber in some neighborhoods because they understand how hard it will be to win people back. Faster upload speeds may fix the primary issue that people don’t like about cable broadband, but will it be enough to compete with fiber? It’s going to be an interesting marketing battle.

Categories
Current News Technology

New Battery Technology

The world is growing increasingly dependent on good batteries. It’s clear that using the new 5G spectrum drains cellphone batteries faster. Everybody has heard horror stories of lithium batteries from lawnmowers or weed eaters catching fire. Flying with lithium batteries is a growing challenge. People with electric cars want better range without having to recharge. The best way to capture and use alternative forms of power is to store electricity in big batteries. The increasing demand for batteries is happening at the same time that trade wars for the raw materials used in batteries are heating up through tariffs and trade restrictions.

Luckily there is a huge amount of research underway to look for batteries that last longer, charge faster, and are made from more readily available minerals.

Zinc-manganese oxide batteries. Researchers at the Department of Energy’s Pacific Northwest National Laboratory have developed a technology that can produce high-energy-density batteries out of zinc and manganese. These are readily available minerals that could be used to create low-cost storage batteries.

Scientists have experimented with zinc-manganese batteries since the 1990s, but they could never find a way to allow the batteries to be recharged more than a few times due to the deterioration of the manganese electrode. The researchers have found a technique that reduces the deterioration and even replenishes the electrode, and they have created batteries that can be recharged over 5,000 times. This technology is aimed at the larger batteries used for electricity storage in solar systems, vehicles, and power plants.

Organosilicon Electrolyte Batteries. Scientists at the University of Wisconsin were searching for an alternative to lithium batteries to avoid the danger of the electrolyte catching fire. Professors Robert Hamers and Robert West developed an organosilicon electrolyte material that can greatly reduce the possibility of fires when added to current Li-ion batteries. The electrolytes also add significantly to battery life.

Gold Nanowire Gel Electrolyte Batteries. Scientists at the University of California, Irvine, have been experimenting with gels as the main filler in batteries since gels are generally not as combustible as liquids. They had also been experimenting with using nanowires as electrodes, but the tiny wires were too delicate and quickly wore out. They recently found that they could use gold nanowires coated with manganese dioxide along with an electrolyte gel. This combination has resulted in a battery that can be recharged 200,000 times, compared to 6,000 times for most good batteries.

TankTwo String Cell Batteries. One of the biggest problems with batteries is the length of time it takes to recharge. The company TankTwo has developed a technique to build batteries out of tiny modular compartments. These are tiny cells with a plastic coating and a conductive outer coating that can self-arrange within the battery. At an electric car charging station, the tiny cells would be sucked out of the battery housing and replaced with fully charged cells – reducing the recharging process to only minutes. The charging station can then recharge the depleted cells at times when electricity is the cheapest.

NanoBolt Lithium Tungsten Batteries. Researchers at N1 Technologies have developed a battery structure that allows for greater energy storage and faster recharging. They have added tungsten and carbon nanotubes into lithium batteries that bond to a copper anode substrate to build up a web-like structure. This web forms a much greater surface area for charging and discharging electricity.

Toyota Solid-state Batteries. Toyota recently announced it is introducing a new solid-state lithium-iron-phosphate battery as a replacement for the lithium-ion batteries currently used in its electric vehicles. These batteries are lighter, cost less, and recharge faster. Toyota claims a range of 621 miles per charge and says the battery can be fully recharged in ten minutes. By comparison, the best Tesla battery is good for about half that distance and can take a half-charge in fifteen minutes.

Categories
Technology The Industry

Getting the Lead Out

There was a recent article in the Wall Street Journal that talks about the possible contamination from copper telephone cables that have outer lead sheathing. I’m not linking to the article because it is behind a paywall, but this is not a new topic, and it’s been written about periodically for decades.

The authors looked at locations around the country where lead cables are still present around bus stops, schools, and parks. The article points out that there are still lead cables hanging on poles, crossing bridges, buried beneath rights-of-way, and underwater.

Let’s start with a little history. Telephone cables with lead outer sheathing were produced and widely used starting in 1888. This was before we understood the dangers of lead in the environment, and lead was also widely used in paint, water pipes, and other materials used in daily life. Western Electric was the manufacturer of telephone cables for AT&T, and from what I can find, the company stopped making lead cables in the late 1940s. Lead cables were first replaced with cables using plastic sheaths and paper insulators. Starting around 1958, the industry transitioned to cables with polyethylene insulation.

I remember when I was first in the industry in the 1970s that there was already a movement to remove and replace lead cables any time there was a network upgrade to aerial cables. Many of the small telcos I worked with slowly replaced lead cables as part of routine upgrades and maintenance. But it’s a different story for the big telcos because starting in the mid-1980s, the big telcos made a decision to stop upgrading or even maintaining copper cables – what was in place stayed in place.

Even where the big telcos like AT&T and Frontier are building fiber today on poles, they keep the old copper wires. The lowest-cost way to build fiber is to lash the fiber onto existing telephone cables. In most neighborhoods, the telcos add fiber and cut the copper cables dead. But those dead copper cables can now be expected to stay on poles for another fifty or more years.

I’ve never heard of any telephone company that has tried to retrieve buried telephone cables at the end of economic life. The cables are cut dead and abandoned underground. The idea of digging lead cables out of the ground sounds unrealistic since doing so will invariably disturb and break water, gas, electric, and telecom lines.

I’m also not surprised that the Wall Street Journal found lead cables crossing under bodies of water for the same reasons – the cables were likely cut dead and left in place. I can’t imagine the process of retrieving abandoned underwater cables – cables are laid with the help of gravity, but it’s hard to picture getting enough leverage to pull dead cables out of the water.

Telecompetitor wrote an article that quoted an estimate by New Street Research that says it might cost $60 billion to remove lead cables. I doubt that anybody has the facts needed to estimate this cost, but it points out that it would be extremely expensive to get lead cables out of the environment. I doubt that anybody even knows the location of most abandoned buried cables. It’s likely that the old hard-copy blueprints of copper networks are long forgotten or lost. It would be particularly expensive to remove lead cables that are now being used to support fiber networks – that would mean moving the fiber cables to a new messenger support wire.

The WSJ article seems to have been the catalyst for a drop in the stock value of the big telcos. The Telecompetitor article implied that the cable replacement cost is so high that it could kill the willingness of the big telcos to participate in BEAD grants.

When the WSJ article first hit, I assumed this would make a loud noise for a few weeks and would quickly fade away, as has happened every decade since the 1960s. But in this day of social media and sensationalism, there is already talk of having the EPA take up the issue. Even if that happens, there will be huge push-back from the telcos and it will likely take many years before the remaining lead wires are removed. The public should be comforted to know that the vast majority of copper cables on poles are not covered with lead – only cables built from the 1950s or earlier. The bigger concern is probably underground and underwater cables, and those have probably already been in place for at least seventy years.

Categories
Technology

Unintended Consequences of Satellite Constellations

Astronomy & Astrophysics published a research paper recently that looked at “Unintended Electromagnetic Radiation from Starlink Satellites”. The study was done in conjunction with the Low Frequency Array (LOFAR) telescope in the Netherlands.

The LOFAR telescope is a network of over forty radio antennas spread across the Netherlands, Germany, and the rest of Europe. This array can detect extremely long radio waves from objects in space. The antennas are purposely located in remote areas to reduce interference from other radio sources.

The study documents that about fifty of the 4,000 current Starlink satellites are emitting frequencies in the range between 150.05 and 153 MHz, which have been set aside worldwide for radio astronomy by the International Telecommunications Union. The emitted radiation from the satellites is not intentional, and the guess is that these are stray frequencies being generated by components of some of the electronics. This is a common phenomenon for electronics of all sorts, but in this case, the stray frequencies are interfering with the LOFAR network.

This interference adds to the larger ongoing concern about the unintended impact of large satellite constellations on various branches of science. We already can see that satellites mar photographs of deep space as they pass in front of cameras. The intended radiation from the satellite constellations can accumulate and interfere with other kinds of radio telescopes. There is a fear that this current radiation will interfere with the Square Kilometer Array Observatory that is being built in Australia and South Africa. This new project is being built in remote locations away from cellphones, terrestrial TV signals, and other radios. But satellite arrays will still pass within the range of these highly sensitive radio sites.

The fear of scientists is that interference will grow as the number of satellites increases. Starlink’s current plans are to grow from the current 4,000 satellites to over 12,000 satellites – and the company has approval from the FCC to launch up to 30,000 satellites. There are numerous other satellite companies around the world with plans for constellations – and space is going to get very busy over the next decade.

One of the issues that concern scientists is that there is nowhere to go for relief from these kinds of issues. There are agreements reached at the International Telecommunications Union for setting aside various bands of spectrum for scientific research. But there is no international policeman with the authority to force satellite companies into compliance.

In this case, Starlink is working with the scientists to identify and isolate the issue to hopefully eliminate the stray radiation from future satellites. If the problem gets too bad, the FCC could intercede with Starlink. But who would intercede with satellites launched by governments that don’t care about these issues?

I don’t know how many of you are stargazers. When I was a kid in the early 60s, it was a big deal to see a satellite crossing the sky. A few satellites, like Telstar, were large bright objects crossing the sky. Most of the new satellites are much smaller, but it still doesn’t take very long watching the sky to see a satellite crossing. The sky is going to be busy when there are tens of thousands of satellites passing overhead. It’s hard to think that won’t have unexpected consequences.

Categories
Technology The Industry

Getting DOCSIS 4.0 to Market

If you read the press releases or listen in on investor calls for the big cable companies over the last year, you might think that the latest cable network technology, DOCSIS 4.0, is right around the corner and will be installed soon. Cable companies have been leaving this impression to fend off competition with fiber. There are millions of new fiber passings being constructed this year where cable companies serve today, and most of the companies building fiber say that they reach at least a 30% market penetration rate within the first year after fiber reaches a neighborhood.

The reality is that it will still be a while until DOCSIS 4.0 networks make it out into neighborhoods. A recent blog from CableLabs spells this out well. This month (July 2023), CableLabs is holding the first big interoperability testing event where different manufacturers will test if their DOCSIS 4.0 equipment is interoperable with other vendors. This kind of interoperability testing is a standard step in the process of moving toward gear that is approved for manufacturing.

Per the CableLabs blog, this testing is a precursor for CableLabs to be able to certify specific brands of modems. The blog describes this as the first interoperability testing event that will look to see if a cable modem can be operational when working with the latest version of DOCSIS 4.0 core equipment. The test will also check whether new modems are backward compatible with earlier existing versions of DOCSIS. This is only the first of multiple interoperability tests, and later tests will go deeper into more specific functions such as interfacing with the overall network, backoffice functions, etc.

It’s normal during this kind of testing that bugs are found in the software and hardware, and it’s likely that there will still be tweaks in many of the components of the DOCSIS 4.0 network.

Only after all of the testing is done and CableLabs is happy that all components of the system are operating correctly and will work together properly can the process of certifying equipment from each vendor begin. That involves sending devices to CableLabs for extensive testing and final approval by the CableLabs Certification Board. Only then will any manufacturer put a device into mass production. Any device that doesn’t pass certification will have to be reworked, and the process started again.

It’s hard to think that it won’t be at least another year until devices start to get certified. After that will be the time needed to mass produce, distribute, and install devices. That could easily mean two years before we might see the first DOCSIS 4.0 network being installed.

With that said, this entire process has been exceedingly fast by industry standards. The DOCSIS 4.0 standard was completed in early 2020. This process is far ahead of where most new technologies would be only three years after the standard was completed.

The cable companies are in a huge hurry to be able to declare superfast symmetrical speeds to compete against fiber. I’m sure there has been tremendous pressure on CableLabs to speed up each step of the process. This likely meant faster than normal efforts to create breadboard chips and the components needed for equipment. For example, the normal timeline for getting a new chip designed and built can easily take 18 months. DOCSIS 4.0 chips are likely on an accelerated timeline.

Who can say how long it will take cable companies to upgrade networks to DOCSIS 4.0? They will certainly start in the markets where they think the technology makes the most market sense. It could easily take several years to make this upgrade nationwide, assuming that manufacturers will be able to keep up with the demand.

Categories
Regulation - What is it Good For? Technology

FWA Mapping and BEAD Grants

There is one mapping issue that unfortunately messed up the NTIA’s count of eligible passings for BEAD grants, and that is going to be a real concern for folks who file BEAD grants. Over the last year, both T-Mobile and Verizon have activated rural cell sites that can deliver home broadband using licensed cellular spectrum at speeds of 100/20 Mbps or a little faster. According to the way that the NTIA and the BEAD grants determine grant eligibility, these locations are considered served.

There are several reasons why this is going to be a practical problem in the BEAD grant process. First, the areas claimed by the cellular carriers on the FCC maps are not accurate. Cellular broadband signal strength decreases with the distance between the cell tower and a customer. The easiest way to explain that is with an example. I talked to a farmer in Illinois who has the T-Mobile FWA broadband and is thrilled with it. The T-Mobile tower is on his farm, and he’s getting over 200 Mbps download speed. He bragged about the technology to his neighboring farmers. One of his neighbors over a mile away is getting download speeds over 100 Mbps. But another neighbor over two miles away is getting speeds closer to 50 Mbps and doesn’t like the product.
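The pattern the farmer describes follows directly from basic radio physics: received signal strength falls off quickly with distance. The sketch below is only an illustrative free-space path-loss calculation using a textbook formula and an assumed mid-band frequency; real FWA throughput also depends on terrain, spectrum band, antenna gain, and how loaded the cell site is.

```python
import math

# Free-space path loss in dB: FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.45.
# Used here only to show how quickly signal weakens with distance; it is not
# a model of any specific carrier's network.

def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

freq_mhz = 2500  # an assumed mid-band cellular frequency
for d in (0.5, 1.0, 2.0, 4.0):
    print(f"{d:4.1f} km from the tower: ~{fspl_db(d, freq_mhz):.0f} dB free-space loss")
```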

At some future point, the FCC is supposed to require heat maps around each cell site to more accurately show the actual speeds that can be delivered. But for now, T-Mobile and Verizon are typically claiming speeds of 100/20 Mbps or faster for a sizable area around each cell site. This speed is true for the folks close to the tower, but at the outer fringe of each claimed circle are customers who are not able to receive 100/20 Mbps broadband. Those areas should be eligible for BEAD grant funding. I have no idea how State Broadband offices are going to deal with this. Any grant office that decides to stick with the FCC maps will be condemning small pockets of folks to have worse broadband than everybody around them.

This is also another problem to deal with for an ISP seeking BEAD grants. I’ve described in the past how RDOF carved up the unserved and underserved areas in many counties into a jumbled mess, and FWA cellular coverage makes it that much harder to put together a BEAD serving area that makes both engineering and financial sense.

There is a more subtle issue that is even more troubling. The cellular carriers have no intention of serving everybody within the range of a cell site. There are constraints on the number of people they are willing to serve. This is similar to the constraints that Starlink has with serving too many people in a given small geographic area. This makes it hard to understand why NTIA rushed to define this technology as qualifying as served broadband. The willingness and ability to serve everybody ought to be one of the most prominent factors when declaring a technology to be creating served areas.

Even worse, T-Mobile says in the terms of service that it reserves the right to throttle usage on the FWA service. The bread and butter product for cellular companies is people with cell phones, and they are giving those customers priority access to the bandwidth at each tower. Any time cellular traffic demand gets too high, the usage to FWA customers will be restricted. That may not be a problem for low-population cell towers – but customers at any tower that has this restriction are going to be unhappy if broadband slows to a crawl in the evening.

My final issue with FWA cellular technology is that it is expanding rapidly. Soon, it won’t just be Verizon and T-Mobile deploying the technology. UScellular, DISH, and AT&T are likely to start popping up in rural areas. I’ve been scratching my head wondering how State Grant offices and ISPs are going to deal with the technology if it’s activated during the grant review process. Cellular companies have every motivation in the world to intervene in grant applications and declare that areas are served and ineligible for grants. If the FWA carriers are allowed to make this claim for new cell sites, I can foresee numerous ISPs walking away from BEAD applications if the serving areas get carved up too badly.

This is a new technology, and, in my opinion, the NTIA rushed to accept these areas as served. The technology is so new that there was almost nobody served with cellular FWA back when the IIJA legislation enabled the BEAD grants. For the reasons I’ve discussed, it makes no sense to give cellular companies little broadband monopolies around their cell sites.

Categories
Technology

Interesting Science Summer 2023

I ran across two interesting new technologies.

Power Out of Humidity. Jun Yao, a professor of engineering at the University of Massachusetts at Amherst, published a paper in Advanced Materials that shows that energy can be pulled from the moisture in the air using material harvested from bacteria. The study says that almost any material can be used for this purpose as long as it can be smashed into tiny particles and then reformed to include microscopic pores that are less than 100 nanometers in size. The tiny holes work by allowing water vapor to pass through the device in a way that creates an electric charge imbalance between the top and bottom of the device. Each of these pores effectively acts as a tiny battery that can continuously produce electricity.

The test devices created by the Jun Yao team have been labeled as Air-gen. One Air-gen unit is about the size of a fingernail and as thin as a hair. One unit can produce continuous energy that is strong enough to power a dot on a computer screen.

Jun Yao refers to the Air-gen as creating a tiny man-made cloud. The next step for the team is to determine the materials that produce the most electricity. There are also challenges to efficiently harvesting and storing the power from multiple Air-gen units. Making this into a useful technology will mean somehow stacking large numbers of these units together.

The potential for the technology is immense if it can ever be scaled. This would enable power to be generated locally in a way that produces no waste or byproducts. Since humidity drives the electric power generation, this would best be used in places with high humidity instead of in deserts. The ideal clean energy technology has always been described as one that can pull power out of nature – and this technology might become the ideal source of power if it can pull electricity out of the water vapor in the air.

The Anti-Laser. Physicists at the Hebrew University of Jerusalem and the Vienna University of Technology have developed what is being dubbed the anti-laser. This is a device that traps light until it is fully absorbed.

There are a lot of uses for technologies that can absorb light. Photovoltaic cells would be more efficient if all incoming light could be absorbed and turned into electricity. Light sensors could be far more precise by eliminating stray light signals. The ability to capture faint light images with a telescope could be enhanced by eliminating spurious light.

The technology takes advantage of the wave properties of electromagnetic radiation, where waveforms can undergo destructive interference when combined in exactly the right way. The scientists have created a device that pulses light in a way that enhances the interference. This is paired with a set of mirrors and lenses that trap light inside a cavity and bounce it over and over again until the light is absorbed by light-absorbing materials.
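The destructive interference at the heart of the device can be illustrated with a simple classical example: two identical waves that are half a wavelength out of phase cancel each other completely. The sketch below is just that textbook illustration, not a model of the actual anti-laser.

```python
import math

# Two equal-amplitude waves, the second shifted by half a wavelength (pi radians).
# Sampled point by point, their sum is (numerically) zero everywhere -
# the destructive interference the anti-laser exploits to trap and absorb light.

samples = 8
for k in range(samples):
    t = 2 * math.pi * k / samples
    wave_a = math.sin(t)
    wave_b = math.sin(t + math.pi)   # 180 degrees out of phase
    print(f"t={t:4.2f}  a={wave_a:+.3f}  b={wave_b:+.3f}  sum={wave_a + wave_b:+.3f}")
```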

Interestingly, this mimics a phenomenon in nature. When a flashlight is shined in the eyes of animals with night vision, like owls, the light appears to be reflected back. This is due to a reflective layer that sits behind the retina of such animals. Reflecting the light back out allows two chances for the retina to capture what is being seen in the near-dark.

When the researchers started this experiment, they found that light entering the trap from different angles was not fully absorbed, and some light escaped. They solved the problem by arranging the mirrors in a way to force all light into a circular path until it is fully absorbed.
