Hydrogen Generators

There is an interesting technology that is slowly edging into the telecom industry. There are a handful of places that are using hydrogen fuel cell generators instead of the more standard diesel generators for backup power. Everybody who works with a telecom network is aware of the wide use of diesel backup generators that kick in when commercial power fails. Diesel generators are permanently installed for critical hub sites, and telecom companies use portable generators that can be quickly driven to remote powered sites like huts and cabinets.

Diesel generators have a few drawbacks. Diesel fuel is notoriously challenging to use in very cold weather. Diesel generators also expel clouds of oily smoke. The biggest downside to diesel generators is that they are loud – the larger the generator, the louder. The largest diesel generators used for sites like data centers can operate at 110 decibels, the same sound level as a rock concert. One of the biggest complaints from neighbors of data centers is the loud noise when generators are being tested.

Hydrogen fuel cells offer an alternative that addresses the shortcomings of diesel generators. They are nearly silent in operation. The technology generates very little heat. Most impressively, hydrogen generators don’t generate any pollution, since the waste product of a hydrogen fuel cell is water.

Hydrogen fuel cells operate by a simple chemical reaction. In a hydrogen fuel cell, pure hydrogen is passed over an anode, which splits each hydrogen molecule into protons and electrons. The electrons flow through an external circuit to power the load – in this case, the site being backed up. The protons pass through an electrolytic membrane to the cathode, where they recombine with the electrons and with oxygen to form water.
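For a rough sense of the energy involved, here is a back-of-the-envelope sketch. The heating value and the fuel cell efficiency below are typical textbook figures I am assuming for illustration, not numbers from any specific generator.

```python
# Illustrative sketch (assumed round numbers): rough electrical energy
# available from a mass of hydrogen run through a fuel cell.

LHV_H2_MJ_PER_KG = 120      # approximate lower heating value of hydrogen
EFFICIENCY = 0.50           # assumed electrical efficiency of the fuel cell

def electrical_energy_kwh(hydrogen_kg: float) -> float:
    """Electrical energy (kWh) produced from a given mass of hydrogen."""
    energy_mj = hydrogen_kg * LHV_H2_MJ_PER_KG * EFFICIENCY
    return energy_mj / 3.6  # 1 kWh = 3.6 MJ

# One kilogram of hydrogen yields roughly 16-17 kWh of electricity
# under these assumptions.
print(round(electrical_energy_kwh(1.0), 1))
```

Under these assumptions, a backup generator drawing 10 kW would consume a bit over half a kilogram of hydrogen per hour, which is why on-site storage capacity matters so much.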

Hydrogen fuel cell technology has been used in practical applications for decades. An early version of a hydrogen fuel cell provided electricity for the Apollo spacecraft in the 1960s. The technology began to see practical use in the 1990s when cities created zero-emission bus fleets powered by hydrogen. There are now delivery trucks that use hydrogen technology. There have been successful tests using hydrogen fuel cells to power trains and airplanes. Most car companies have experimented with making hydrogen-fueled cars. Several countries are experimenting with hydrogen power in submarines because of the silent operation and the lack of heat.

Hydrogen fuel cells have a potential place in telecom. In 2020, Microsoft was able to operate a data center continuously for two days with hydrogen fuel cell generators. Tele2 and Telia are using hydrogen fuel cell generators for telecom sites in Estonia.

https://www.popularmechanics.com/science/a33499249/microsoft-hydrogen-generator-test/

There are practical downsides to using hydrogen on a commercial basis, although cities with fleets of hydrogen buses have solved the biggest problems. Hydrogen has a low volumetric energy density, which means that storing a useful amount of energy requires large volumes. Bus fleets have addressed this by storing hydrogen in vehicles at high pressure, which carries a different set of risks. Hydrogen is flammable, but so are the fossil fuels used for combustion generators. The solution to the widespread use of hydrogen as a fuel would be to develop hydrogen depots, which would be the equivalent of gas stations, where hydrogen canisters could be refilled or swapped.
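The low volumetric energy density is easy to quantify. The densities below are assumed approximate textbook values (diesel at roughly 36 MJ per liter, hydrogen gas at 700 bar at roughly 4.5 MJ per liter), used only to illustrate the scale of the storage problem.

```python
# Illustrative sketch (assumed round numbers): the storage volume needed
# for hydrogen to match the energy in a tank of diesel.

DIESEL_MJ_PER_L = 36        # approximate volumetric energy density of diesel
H2_700BAR_MJ_PER_L = 4.5    # hydrogen compressed to ~700 bar (approximate)

def equivalent_h2_volume_l(diesel_liters: float) -> float:
    """Liters of 700-bar hydrogen holding the same energy as the diesel."""
    return diesel_liters * DIESEL_MJ_PER_L / H2_700BAR_MJ_PER_L

# Matching a 1,000-liter diesel tank takes roughly 8,000 liters of
# hydrogen, even at very high pressure.
print(round(equivalent_h2_volume_l(1000)))
```

An eight-to-one volume penalty, before accounting for the pressure vessels themselves, is why hydrogen depots and canister swapping keep coming up as the practical path forward.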

For now, the biggest downside is probably the upfront cost of the generators and the infrastructure that is needed to store the gas to support them. However, cities say that ongoing costs compare favorably to diesel generators. The number one way to get costs down would be widespread adoption, which would bring economies of scale to manufacturing the units.

This seems like a technology that data center operators should be interested in. The public is increasingly pushing back against the noise and pollution created by data centers, and hydrogen generators would help to lessen the negative impacts on those living close to a data center.

The Regulatory Death Knell for Copper?

In March, the FCC issued an order in dockets WC 25-209 and 25-208 that may signal the final regulatory death knell for telephone copper networks. The proceeding is titled Reducing Barriers to Network Improvements and Service Changes / Accelerating Network Modernization.

The stated purpose of the order is to speed up the transition from the TDM technology used in copper telephone networks to all IP-based networks used by fiber and other newer technologies. The secondary purpose of the docket is to override state regulations that are slowing down the transition away from copper networks.

The order comes in eight parts:

  1. Adopt one consolidated rule applicable to all efforts to discontinue services and applications, eliminating all prior rules.
  2. Give blanket authority to grandfather copper services, meaning telcos don’t have to accept new orders for these services.
  3. Allow ISPs to use technologies other than copper T1s to connect to 911 centers.
  4. Give carriers the right to discontinue wholesale products provisioned over copper.
  5. Create a 31-day automatic grant of a request to discontinue services.
  6. Clarify discontinuance requests to ensure they do not adversely affect the public.
  7. Revise the rules applicable to emergency discontinuance of services during and after disasters.
  8. Eliminate redundant rules and regulations related to discontinuing services.

In probably the most consequential part of the order, the FCC is preempting all state laws and processes that hinder telcos from decommissioning copper networks.

Taken as a whole, these new rules greatly reduce the paperwork involved with decommissioning copper networks and related services, which should speed up the stated intent of the big telcos to get out of the copper business.

There are instant impacts on the public. Telcos no longer have to sell telephone service or DSL provided over copper lines. Somebody moving into a rural home with no cell service will no longer be able to count on buying a traditional telephone line. While the FCC order didn’t explicitly state it, this change effectively kills the carrier of last resort responsibility for copper-based telcos.

The new FCC rules still retain the requirement that customers must be provided an alternative when their copper services are discontinued. However, under the new rules, a telco doesn’t have to provide the services itself if it can point to an alternative from a different provider. I’m envisioning telcos telling the FCC and customers that the alternative is satellite broadband for voice and data – eliminating the need for the telco to provide anything.

One of the biggest concerns of discontinuing rural copper is connectivity to 911. A lot of rural residents still buy landlines to have a sure way to connect to 911. Consider the many people who live in rural areas with little or no cellular coverage. If these households don’t have broadband, either by choice or because of affordability, they will lose the ability to call 911. Even customers buying broadband do not automatically have connectivity to 911 through their broadband connection. Rural broadband customers have to buy a VoIP product and register their home address with it to be sure that a 911 call made over the broadband connection identifies where they live.

We’ve known this was coming for a long time, and the FCC just took the steps to make it quick and painless for telcos to walk away from copper, including in states like California that have still been enforcing copper regulations. It’s going to take a few more years for telcos to abandon copper. For example, AT&T has a goal of being out of the copper business by 2030. But this order makes it clear that the end of copper is on the horizon.

Some Thoughts on Convergence

An article in Light Reading reported that the largest cable companies captured about one-third of the net cellular customer additions in the fourth quarter of 2025. That statistic combines the cellular sales of Charter, Comcast, and Optimum. The overall cable industry number would be even higher if it included sales from Cox and Mediacom, which are privately held.

Industry analysts are using the word convergence as shorthand for competition that bundles cell service with broadband. Convergence is the newest strategy that replaces the traditional bundling strategy of selling a package of broadband, cable TV, and voice.

Industry press over the last year is full of articles that wonder about the ultimate success of the strategy. Cable companies seem to have the upper hand in a convergence bundle since they collectively pass roughly 122 million homes. I’ve read a few analysts who argue that the big telcos like AT&T and Verizon are at a disadvantage since they pass a lot fewer homes with fiber.

But I think these analysts are missing something. There are three players in the convergence battle, and each is using a different tactic:

  • Cable companies are finding success with the convergence bundle by combining full-price broadband with inexpensive cellular service. The main goal of the cable companies is to reduce broadband churn, and a customer loses their cable company cell service if they drop broadband.
  • The fiber parts of the telcos don’t seem to be pushing the convergence package to the same extent. They are mostly still betting that people like fiber a lot more than cable broadband. However, AT&T just announced a fiber/cellular bundle with gigabit broadband and one cellphone for $90, or two cellphones for $120.
  • The third competitors are the FWA cellular companies. They are bundling full-price cell service with inexpensive broadband. At least for now, they seem to be winning the convergence battle. In the fourth quarter of last year, AT&T, T-Mobile, and Verizon added over 1 million net FWA customers while the rest of the industry barely grew.

I know it seems odd to count the FWA competitors as different from the fiber telcos, since they are largely the same companies. But anybody who follows these companies understands there is not a lot of bleed-over between the wireline and the cellular parts of the businesses. The FWA divisions of the telcos are willing to compete for customers against their own fiber sister companies.

It’s becoming clear that affordability is a major issue for a huge number of households. As long as that stays in the forefront, it seems like many households will lean towards the convergence plan that gives them a significant discount. I doubt that customers care if the discount comes from a lower price for broadband or cellular.

I think the cable companies are on to something with their focus on reducing churn. I talked to a few people in the last year who wanted to leave Charter and move to fiber broadband but didn’t want to lose their cheap cell service – and didn’t want to go through the hassle of replacing both services at the same time. The cable companies were really good at the triple play bundle in the 2010s, and a huge number of households felt they were held captive by the bundle. Are we headed back to that same place, but this time with multiple bundle options that force customers to buy both services from the same company?

Perhaps led by the recent new plan from AT&T, the fiber telcos may finally be ready to jump into the convergence battle. I have wondered for years why they didn’t lead the market in this effort, and I guess it was due to internal battles over which division swallowed the bundling discount.

A Peek Into the Latest Merger

The most recently announced merger is between GFiber and Astound. It’s an interesting merger that brings together a premium fiber overbuilder and a traditional cable company that also owns some fiber assets.

GFiber has been somewhat of a mystery in the industry since its splashy launch in 2012. Known then as Google Fiber, the company was the first to introduce the whole country to the idea of gigabit fiber. There had been a few municipalities, cooperatives, and small telcos that offered gigabit broadband before 2012, but Google Fiber made big national news when it said it was going to overbuild the Kansas City metropolitan area and offer symmetrical gigabit fiber as its only broadband product. Google Fiber believed in simplicity, and originally only offered broadband before eventually layering on Google Voice and YouTube video. The company has always guarded any discussion of customer counts, but we are learning through news of the merger that GFiber has over 2.6 million passings, which means it probably has more than 1 million fiber customers.

Astound Broadband is a conglomerate of three broadband businesses.

  • The original Astound started as a cable company in the San Francisco Bay area. The company purchased additional cable properties in Washington and Oregon and rebranded as Wave Broadband.
  • RCN was founded in 1993 and had the unique business plan of overbuilding existing cable companies using cable company technology. The company was concentrated in the northeast, with the most customers in Boston, New York City, Philadelphia, Allentown, and Washington DC.
  • Grande Communications was founded in San Marcos, Texas, in 1999. The company started by providing cable TV to campuses at Texas State University, the University of the Incarnate Word, Baylor University, and the University of Texas at Austin. The company grew to have over 1.1 million passings.

The merger announcement says that Astound covers around 4.6 million passings and has around 1 million broadband customers. The combined company would have 2 million customers and 7.1 million passings. This would make the company the seventh-largest ISP after Comcast, Charter, AT&T, Verizon, Altice, and T-Mobile. The seventh ranking recognizes the merger of Frontier with Verizon, the sale of Lumen fiber customers to AT&T, and the upcoming merger of Cox and Charter.
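A quick back-of-the-envelope look at the penetration rates implied by those counts (customers as a share of homes passed), using the approximate figures cited above:

```python
# Penetration rates implied by the merger announcement, using the
# approximate passings and customer counts cited in the article.

def penetration(customers_m: float, passings_m: float) -> float:
    """Broadband customers as a share of homes passed (both in millions)."""
    return customers_m / passings_m

print(f"GFiber:   {penetration(1.0, 2.6):.0%}")   # over 1M customers, 2.6M passings
print(f"Astound:  {penetration(1.0, 4.6):.0%}")   # ~1M customers, 4.6M passings
print(f"Combined: {penetration(2.0, 7.1):.0%}")   # 2M customers, 7.1M passings
```

The gap between the two rates – GFiber near 38% versus Astound near 22% – hints at why the combined company might want to push fiber upgrades across the Astound footprint.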

The merger has GFiber spinning off from Google’s Alphabet. The majority owner of the combined company will be Stonepeak, with the GFiber parent retaining a significant minority stake. The merger is supposed to close in the fourth quarter of this year. The GFiber executive team will lead the combined company.

This is an interesting merger that brings together companies using different technologies. I would have to think that the goal will be to upgrade the coaxial networks to fiber, or possibly to DOCSIS 4.0, to bring symmetrical gigabit speeds.

After this merger is completed, the only remaining large merger target is Altice, with over 4 million customers. There are no other ISPs left in the market that have more than a million broadband customers.

Supreme Court Rules on ISPs and Copyrights

The Supreme Court ruled in favor of Cox Communications in the longstanding lawsuit by Sony that sought to hold Cox liable for customers who download copyrighted material. The Court’s ruling was unanimous, which is a big win for ISPs. The Supreme Court went further and said that ISPs would only be liable if they intended for their service to be used for copyright infringement.

The Supreme Court’s ruling goes back to a 2018 lawsuit where Sony sued Cox when the ISP refused to disconnect customers who were reported to be downloading music. The record labels insisted that Cox should permanently disconnect any customer who engages in repeated copyright infringement. Cox argued this would turn ISPs into Internet policemen who would have to monitor and punish customers who engage in copyright infringement. That doesn’t just mean people who download copies of music, but could also apply to movies, games, books, pirated sports events, and copyrighted written materials. A Virginia Court agreed with the music labels and found Cox liable for both contributory and vicarious copyright infringement and awarded the record companies an astonishing $1 billion in damages.

Cox appealed the ruling, and the Fourth Circuit U.S. Court of Appeals reversed the penalties for vicarious infringement and vacated the $1 billion of damages. This was still a troublesome ruling for Cox because the company was still considered to be liable for contributory damages for actions taken by its customers.

ISPs would be in an uncomfortable position if the Court had ruled in favor of the music labels. ISPs don’t monitor customer usage because that would mean looking closely at everything that customers do. The music labels wanted Cox and other ISPs to react to complaints made by copyright holders. That might make sense in a perfect world, but complaints to ISPs rarely come directly from copyright holders. Instead, there is an entire industry that makes a living by issuing takedown requests for infringements of copyrighted materials.

The music companies expected Cox to cut off subscribers after only a few violations of copyright. The permanent loss of a customer would be a severe financial penalty for Cox. It would also be a severe penalty for the public, since a household would lose broadband in a world where a lot of markets have only one realistic choice of fast broadband. It’s not hard to imagine a scenario where a teenager or visitor to a home violates copyrights and gets the entire household disconnected from broadband. The music industry was trying to avoid the harder solution, which is to legally pursue damages from the individuals who violate copyright.

The ruling has wider implications than just for ISPs and record labels. Social media platforms are filled with news articles, video clips, and other copyrighted materials that users post. The Court’s ruling was an affirmation that companies are not liable for users who abuse online services by downloading or posting copyrighted material. Ultimately, the people who violate copyright should be held liable, regardless of how hard that is to implement.

This ruling doesn’t take online platforms off the hook for responding to takedown requests to block copyrighted materials. However, the ruling does lower the potential for copyright holders to seek huge dollar judgments against ISPs that refuse to act as copyright policemen.

Americans and Our Smartphones

According to a survey conducted by Reviews.org, many Americans spend a lot of their waking day with their smartphones. The survey was conducted with 1,000 people in the fourth quarter of 2025. This was not a high-precision survey, and the results have a margin of error of plus or minus 4%. But the overall trends are clear, and the survey results don’t vary a lot from year to year. Consider some of the following statistics:

The average participant in the survey used their phone 5 hours and 1 minute per day, which works out to about 76 full days over the course of a year. Boomers used their phones the least, at an average of 2 hours and 8 minutes per day.
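Converting the reported daily averages into equivalent 24-hour days per year is simple arithmetic:

```python
# Convert an average daily phone time into equivalent 24-hour days per year.

def days_per_year(hours: int, minutes: int) -> float:
    """Equivalent full days per year spent on the phone."""
    daily_hours = hours + minutes / 60
    return daily_hours * 365 / 24

print(round(days_per_year(5, 1), 1))   # average respondent: ~76 days
print(round(days_per_year(2, 8), 1))   # Boomers: ~32 days
```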

The average American checks their smartphone 186 times per day. That works out to almost 8 times per hour. This is lower than the statistic from 2024 of 205 times per day.

84% of respondents checked their phone within ten minutes of getting out of bed.

50% of people sleep with their phone by their bed.

Something that doesn’t surprise anybody who has gone to a restaurant lately, 56% of respondents use their phone while eating dinner.

71% of respondents check their phone within five minutes of getting a notification.

Something that sounds icky to me, 68% of people use their phone while sitting on the toilet.

87% of people use their phone while watching TV.

72% of respondents use their phone while at work.

A scary statistic is that 29% of respondents use their phone while driving.

61% of respondents have texted somebody who is in the same room.

53% of respondents have never gone an entire day without using their smartphone.

41% of respondents panic when their battery drops below 20%.

Probably the most telling statistic is that 46% of respondents say they are addicted to their smartphone. This is up from 43% reported in the 2024 survey.

I’m not a big smartphone user, and these statistics always surprise me. The statistics help to explain why the new converged bundle of broadband and cellular is so powerful.

Impacts of the RAM Shortage

Starting in late 2025, the world began experiencing a big shortage of memory chips used in the manufacture of smartphones, computers, and other consumer electronics. The shortage has been caused by chip makers across the industry deciding to manufacture more lucrative chips for AI data centers. As an example, during the last year, we saw Micron, Samsung, and SK Hynix stop making RAM for consumer devices in favor of AI chips.

Random access memory, or RAM, is a crucial component in devices like smartphones, computers, and game consoles. RAM chips are what allow a computer to perform functions like keeping multiple tabs open in a browser, running several applications at once, and switching quickly between tasks.

In the fourth quarter of last year, the demand for RAM chips exceeded supply by 10%, and the shortage is quickly growing. By the end of 2025, the price of RAM had increased by 50%, and supply chain delays suddenly slowed chip deliveries to a crawl. If an electronics factory wants chips sooner, it is forced to pay a premium price and pre-pay for a large supply. The shortage is expected to last at least into 2027. A few companies, like ChangXin Memory Technology and Yangtze Memory Technologies Corp., have stepped up to enter the consumer RAM market. There are predictions that RAM prices will increase at least 60% this year, with specialty chips possibly doubling or tripling in price.
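As a rough illustration of how those increases compound, here is a sketch using the 50% increase already seen and an assumed further 60% increase this year (the dollar figure is hypothetical):

```python
# Illustrative compounding of sequential price increases. The starting
# price and the second increase are assumptions for illustration only.

def compounded_price(base: float, increases: list[float]) -> float:
    """Apply a sequence of fractional price increases to a starting price."""
    for pct in increases:
        base *= 1 + pct
    return base

# A module that cost $100 before the shortage would end up around $240
# after a 50% increase followed by a 60% increase.
print(round(compounded_price(100, [0.50, 0.60])))
```

Back-to-back increases multiply rather than add, which is why a 50% jump followed by a 60% jump is a 140% increase, not 110%.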

This is bad news for the broadband industry, since the price of computers and smartphones will climb, likely putting them out of reach for many household budgets. The shortage is also going to increase the cost of all of the network electronics used for fiber, cable HFC, and wireless networks.

This is bad news for the nonprofits that have been refurbishing used computers and smartphones. One important part of many upgrades is to increase RAM capacity so that old computers can handle modern web needs. If RAM prices double, these entities will not be able to help nearly as many people. The problem will be made worse since small buyers of RAM will probably see the biggest price increases.

Digitunity recently published an article that estimates that 32.9 million people can’t access broadband due to the lack of a computer. That’s about 10% of the population, a number that lines up with other estimates of homes without broadband.

More expensive computers will hurt broadband adoption, and that hurts the public and the economy. People are increasingly reliant on access to broadband. The federal government, and many state and local governments, are eliminating the ability to communicate with the government by anything other than web portals. Federal services of all sorts, like veterans benefits and Social Security, are moving online.

The IRS and many states expect taxpayers to file tax returns using online software. This software is difficult to navigate with a smartphone, as are many other government portals. The IRS and other federal agencies will also no longer issue paper checks, forcing people to have an electronic way to receive and access payments from the government.

FEMA announced last year that anybody affected by a disaster must make a claim online, which is a particularly ironic requirement for folks who might have just lost a home to a flood, tornado, or hurricane. As anybody who has ever dealt with a disaster claim knows, there is a mountain of communication needed to push a claim through to the finish line.

People in rural areas increasingly need to use telemedicine as rural hospitals and clinics continue to fail and close.

A computer at home is vital for working from home or taking college and other classes online. These are also tasks that can’t easily be done by smartphone.

A Rural Cellular Story

I was looking through the FCC cellular map in Buncombe County, North Carolina, where I live. For those not fully familiar with the FCC broadband maps, the agency publishes two maps: the more familiar one that shows broadband coverage and a second that shows cellular coverage. You can toggle between the two maps at the FCC’s map website.

It struck me while looking at the details in the maps that rural cellular coverage is changing, and not in a good way. I started by looking at a small section of the county that is on the outer fringe of where the Asheville outer suburbs turn rural. According to the FCC cellular map, the area I selected has the following cellular coverage:

These two tables tell me the following:

  • AT&T and Verizon have some 4G coverage. But the Verizon coverage is likely very weak since they don’t claim it will work in a moving vehicle. While AT&T claims its 4G coverage will work in a moving vehicle, it’s curious that AT&T doesn’t claim 5G here. This tells me the AT&T signal is also likely weak, since the area sits outside the company’s 5G coverage.
  • The only carrier claiming relatively solid 5G (35/3 Mbps) is Project Genesis, which is EchoStar. The company has exited the facility-based cellular business and is in the process of dismantling cell sites.
  • T-Mobile claims both 4G and 5G for outdoor cellular coverage, but doesn’t claim it can work in a moving vehicle, meaning the coverage is also probably weak.
  • The last carrier listed is UScellular, which claims 7/1 speeds on 5G, but doesn’t claim to be able to provide coverage in vehicles. UScellular was purchased by T-Mobile, and the rumor is that any UScellular towers that already duplicate T-Mobile coverage are likely to be decommissioned.

The bottom line is that this particular neighborhood has weak cell coverage. The only carrier that claimed to be able to deliver 5G to a moving vehicle is now out of business.

I picked this neighborhood at random, but I think I would find the same story in most of the areas on the fringe of the metropolitan area. The coverage in areas that are completely rural is worse. The story I gleaned from this neighborhood is troublesome for several reasons.

  • The folks who live here don’t have a lot of options. The only carrier that might work in the way people need cellular to work is AT&T, but this neighborhood is outside the AT&T 5G coverage, and the 4G coverage is likely weak.
  • It looks like decent coverage was finally becoming available from EchoStar, but that’s now gone.
  • The speeds shown in the table are for outdoor coverage, and speeds inside homes are typically half of outdoor speeds.
  • When you look at the details in the FCC cellular map you quickly understand how the advertised national footprints of the big carriers are exaggerated.
  • The bad news is that the FCC considers this neighborhood to be served by cellular. That means if the FCC finally launches the 5G Fund for Rural America, this neighborhood will not be considered for funding to add a new cell tower.

Filling the Sky with Satellites

The skies are quickly filling with communications satellites. Following is a short list of the many ventures that have or will soon be launching large numbers of broadband satellites.

Starlink now has over 10,000 operational satellites in orbit, with an ultimate announced goal of 42,000 satellites. The company is not sitting still and will be introducing its new V3 satellites sometime this year, which promise to provide 10 times the download and 24 times the upload capacity of the current V2 satellites. That should mean a big boost in the capacity of the Starlink constellation and faster speeds. Starlink is likely to maintain a major advantage over competitors through its use of the reusable Starship rocket.

Amazon Leo (formerly Project Kuiper) currently has around 212 satellites in orbit. The company was recently granted a two-year delay by the FCC of its original commitment to have an operational network by this summer. The company also recently got approval from the FCC to increase the constellation size to 7,700 satellites. The company is working to accelerate satellite launches and launched 32 satellites in February using the Ariane 64 rocket. Amazon Leo has contracted for 18 additional launches with Arianespace.

Eutelsat OneWeb is currently operating a 648-satellite constellation in twelve polar planes that provides broadband to enterprise, government, and maritime customers. Its key markets today are in places like Ukraine, Saudi Arabia, and Taiwan. The company has ordered over 300 additional second-generation satellites that should start being deployed later this year.

Blue Origin, a rocket company, plans to launch a constellation of 5,408 TeraWave satellites starting at the end of 2027. The company is promising speeds up to 6 Tbps. The constellation will use both low Earth orbit (LEO) and medium Earth orbit (MEO) satellites, interconnected with optical lasers. The target market for Blue Origin will be enterprise, data center, and government customers who need a reliable primary or secondary broadband connection. They think their primary market will be in remote, rural, and suburban areas around the world, where the cost of providing diverse fiber paths is too expensive.

Telesat’s Lightspeed satellite business got its start in December 2026 with the launch of its first two satellites. The plan is to launch 157 satellites by the end of 2027, with an ultimate goal of 298. The first 156 satellites will focus on support for NATO and allied nations. After that, the company hopes to provide global coverage for enterprise customers, including the aviation, maritime, energy, and government sectors.

China’s Guowang (the National Network) has launched 164 satellites and has plans to launch 12,992 satellites to compete with Starlink. The company plans to launch 310 satellites in 2026, 900 in 2027, and 3,600 per year starting in 2028. There will be two separate constellations, one at 500 to 600 kilometers and a second around 1,145 km.

Qianfan (also known as Spacesail or G60) is being developed by Shanghai Spacecom Satellite Technology (SSST). The company currently has 108 satellites in polar orbit as part of its first constellation of 648 satellites. The company has announced long-term plans to reach over 15,000 satellites.

Meanwhile, there is another space race happening for companies wanting to provide direct-to-device cellular service. The key players are Lynk Global, Skylo, a partnership between SpaceX and T-Mobile, a partnership between AST SpaceMobile and AT&T/Verizon, and a partnership between Globalstar and Apple.

WiFi Router Ban

The FCC issued a ban on March 23 on new consumer-grade routers made in foreign countries. A router is the device in your home that connects your ISP broadband to the WiFi that almost everybody uses to connect devices in the home. Businesses use routers to direct ISP broadband around the business on fiber or copper networks. The ban covers all new brands and models of routers except those that have been granted a Conditional Approval by the Department of Defense or the Department of Homeland Security.

The ban comes after the White House convened an interagency group comprised of government security experts, which collectively decided that new routers made overseas “pose unacceptable risks to national security of the United States and the safety and security of United States persons”. There have been previous technology bans for security reasons, such as a ban on using software from Kaspersky Lab, and telecommunications services provided by China Telecom and China Mobile International USA. It’s worth noting that the FCC cannot decide to ban any equipment or service and can only do so if directed by national security agencies.

The ban noted that malicious actors have exploited security gaps in foreign-made routers to attack households, disrupt networks, engage in espionage, and steal intellectual property. The notice says that foreign-made routers were involved in cyberattacks from Volt, Flax, and Salt Typhoon.

The ban does not stop consumers from using existing routers. It doesn’t stop retailers from selling existing stocks of routers or from continuing to buy routers that previously have been approved by the FCC’s equipment authorization process. All that is blocked is any new models or generations of routers.

Router manufacturers can petition the DoD or DHS for conditional approval, which would allow them to apply to the FCC for equipment authorization for new routers. There are no manufacturers today that have this conditional approval.

It’s hard to know where this ban will lead, but this could become a big concern for ISPs, since most ISPs provide a WiFi router for new customers. Many cable companies and fiber builders build the router into the modem. Any ISP that is currently using a router that has not been approved by the FCC is in trouble, because according to this ban, they can’t give an unauthorized router to a new customer. Every ISP should be checking this week to make sure the routers they are providing have been blessed by the FCC.

This has longer-term implications since virtually all routers are made overseas, including those made by American companies like TP-Link, which manufactures its routers in Vietnam. Manufacturers routinely upgrade and improve routers every few years, and American ISPs will be stuck with older routers if the government doesn’t approve any new brands or models of routers.

One unspoken intent of the order is probably to promote the manufacture of routers in the U.S. I have to wonder if an American-made router would be any less susceptible to hacking than a foreign-made one. If not, I’m not sure what this ban will accomplish, other than making it more expensive to get routers. It will be interesting to see if any router companies move manufacturing to the U.S. due to this ruling. A more likely outcome might be that American consumers won’t be able to get some of the newest routers that are available to the rest of the world.