Robocalls Growing

The FCC has taken a number of steps in recent years to try to cut down on nuisance robocalls. Some of the FCC’s actions include:

  • The agency mandated that companies that originate voice calls implement STIR/SHAKEN, a set of rules that require validation of caller ID. One of the primary functions of this effort is to make sure that the number displayed on caller ID is the same as the number of the originating caller. The big telcos implemented this in 2021, and smaller carriers a year later.
  • Over time, the FCC expanded STIR/SHAKEN to gateway providers (tandems) and eventually to intermediate non-gateway providers.
  • In conjunction with the industry, the FCC actively tries to shut down active robocall campaigns.
  • The FCC has taken enforcement actions, including large fines against robocallers. In 2021, the FCC issued a $225 million fine against a group of telemarketers in Texas who were selling scam life insurance plans. In February of this year, the FCC issued a fine of $4.5 million against Telnyx for making sham calls that impersonated politicians.
  • The FCC requires that anybody selling a voice product give customers a free call-blocking tool.
  • Phone companies are free to block calls by default that they believe are illegal or unwanted.
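STIR/SHAKEN works by having the originating carrier attach a signed token (a PASSporT, defined in RFC 8588) to each call in a SIP Identity header; the terminating carrier verifies the signature and checks that the signed originating number matches the displayed caller ID. Here is a minimal sketch of what's inside one of those tokens, using made-up illustrative numbers and no real certificate or signature:

```python
import base64, json

def b64url(data: dict) -> str:
    """Base64url-encode a JSON object with no padding (the JWT convention)."""
    raw = json.dumps(data, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

def b64url_decode(part: str) -> dict:
    """Reverse of b64url: restore padding and parse the JSON."""
    pad = "=" * (-len(part) % 4)
    return json.loads(base64.urlsafe_b64decode(part + pad))

# A sample SHAKEN PASSporT header and claims -- all values illustrative
header = {"alg": "ES256", "ppt": "shaken", "typ": "passport",
          "x5u": "https://cert.example.com/sp.pem"}
claims = {"attest": "A",                      # full attestation: carrier knows the caller
          "orig": {"tn": "12025551000"},      # originating telephone number
          "dest": {"tn": ["12025552000"]},    # destination telephone number
          "iat": 1740000000,                  # issued-at timestamp
          "origid": "d5a1-4f2b-illustrative"} # opaque originator identifier

token = b64url(header) + "." + b64url(claims) + ".<signature>"

# The terminating carrier decodes the token and compares the signed
# originating number to the caller ID before marking the call verified.
hdr, body = (b64url_decode(p) for p in token.split(".")[:2])
print(body["attest"], body["orig"]["tn"])
```

The "attest" level is the key output: "A" means the carrier vouches for the caller's right to use the number, while "B" and "C" are progressively weaker claims, which is one reason signed calls can still be robocalls.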

The FCC’s actions put a dent in robocalling. According to the Robocall Index from YouMail, there were 4.8 billion robocalls in February 2020. That dropped to 4.6 billion in February 2021 and 3.8 billion in February 2022. But since then, volumes have climbed back to 4.2 billion in February 2023, and now 4.5 billion in February 2025.

Not all robocalls are bad, and the FCC’s efforts are aimed at eliminating unwanted robocalls. YouMail classifies robocalls into four categories:

  • Alerts and Reminders are calls from places like a school, a local government, or a doctor or dentist office.
  • Payment Reminders are to remind folks that a payment of some sort is due, and perhaps late.

The two troublesome categories are:

  • Telemarketing calls come from a company with which a consumer is not a current or past customer.
  • Scams are calls that the FCC considers to be fraudulent and that try to get money from folks.

In February 2024 there were 1.32 billion telemarketing calls; that rose to 1.575 billion in February 2025, up 19%. There were 840 million scam calls in February 2024, rising to 1.035 billion in 2025, a 23% increase.
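The growth percentages can be verified with quick arithmetic on YouMail's monthly counts:

```python
def yoy_growth(prev: float, cur: float) -> float:
    """Percentage change from one February to the next."""
    return (cur - prev) / prev * 100

# YouMail's February 2024 vs. February 2025 call counts
telemarketing = yoy_growth(1.32e9, 1.575e9)
scams = yoy_growth(0.840e9, 1.035e9)
print(f"telemarketing +{telemarketing:.0f}%, scams +{scams:.0f}%")
```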

Robocallers have obviously figured out ways around the various network tools that have been implemented to stop them. Considering the sizes of the fines that have been issued, there must be a big payday from making large numbers of scam calls. The FCC says that stopping robocalls is still a top priority and posted this message to consumers earlier this year.

The bottom line is that the FCC seems to be losing the war against robocallers. The agency has shifted the battleground by killing techniques that used to work, but robocallers have gotten adept at still getting calls through to the public. There is a lot of speculation that scam artists are using AI to be more effective in their calling efforts.

Status of NG 911

The 911 emergency calling system got started in 1968 when AT&T established the digits 9-1-1 as a universal number to reach emergency services. In 1999, the industry started to tackle an upgrade labeled enhanced 911 that integrated caller ID, first for landlines and eventually using triangulation from cell sites.

By 2007, there was recognition that 911 could do a lot more, and next generation (NG 911) was created. NG 911 enables 911 callers and first responders to transmit text, photos, videos, and other data in real-time.

There is recognition today that 911 ought to be able to integrate IoT devices like home security cameras and wearables. There is work being done to introduce proactive incident detection and predictive analytics using AI to further assist first responders. There is also a recognition that 911 systems need better cybersecurity. The movement to introduce the latest technology is referred to as NGERS – next generation emergency response systems.

But before introducing the newest technologies, a lot of states still have a lot of work to do to implement NG 911. Maine, Tennessee, and Vermont implemented NG 911 in 2015, followed within a few years by Connecticut and Massachusetts.

A lot of states still have not completed the full transition to NG 911. The steps needed to implement it include a lot of technology upgrades such as compliant call handling, computer-aided dispatch, and call recording systems. Work is also needed to align 911 with GIS data – and there are still many counties that have not converted property records to GIS. The number one issue cited by states that haven’t made the transition to NG 911 is funding. Many states have been hoping for more federal funding for the transition, which largely never came. An effort to create federal legislation to promote NG 911 also never materialized.

Meanwhile, there was a federal effort through the National Highway Traffic Safety Administration (NHTSA) and NTIA to create a framework where state 911 systems could be integrated into a national 911 grid. The effort of these two agencies seems to have slowed to nearly a stop since 2022.

One of the biggest challenges today is the introduction of new technologies that have changed the path needed to implement NG 911. It’s looking very likely that the big telcos will finally be abandoning the TDM-based public switched telephone network in favor of a digital SIP-based network. Technologies involving GIS and mapping have changed a lot in the last decade. It feels like trying to hit a moving target to complete an implementation of NG 911.

Another surprising roadblock is the number of 911 centers (PSAPs) in a given state. PSAPs have largely been funded and created locally, and that means that each 911 call center uses different hardware and software, making it hard to make any statewide updates.

Another missing part of the picture is the vendors that supply the systems to support 911. The technology that was used a decade ago is obsolete, and in many cases, there are no vendors rushing to do the R&D needed to develop the next generation of systems until there is a proven market to buy the new systems. In many ways, the rapid evolution of communications technologies has moved faster than the systems that incorporate them. One simple example is the recent announcement that satellite networks will be supporting cell phones to make 911 calls.

I think a lot of folks will be surprised that the conversion to NG 911 is still not close to complete, since industry press was full of success stories a decade ago.

Repairing Undersea Fiber

I saw several articles voicing concern about sabotage when two different undersea fiber operators, C-Lion and BCS East-West Interlink, reported breaks in fiber in the Baltic Sea in the same week. There was speculation that Russia was cutting fibers to try to disrupt European broadband. It was eventually reported that the cuts looked like accidents, but conspiracy theorists still like the sabotage story better. Having two cables broken in the Baltic Sea got headlines because of tensions caused by the war in Ukraine. To put the Baltic Sea fiber cuts into perspective, there are two to four cable cuts to undersea fiber somewhere in the world every week.

Interestingly, a fiber cut to an undersea fiber doesn’t cause as much harm as most people imagine. This map shows all of the current submarine cable routes. There are a huge number of redundant routes to most of the world. A single fiber getting cut is an inconvenience and not a huge problem. Even if all of the fibers in the Baltic Sea were cut, Internet traffic would still be delivered through long-haul fiber routes across Europe.

There are exceptions, and there are island nations that can be isolated by even a single fiber cut. Multiple fiber cuts can cause localized slowdowns. There were four cable cuts off Africa in a relatively short time in 2024 that caused broadband outages in Ivory Coast, Liberia, and Benin. It’s more of a challenge in Africa to reroute traffic using landline fiber since much of the continent still has inadequate middle-mile and long-haul fiber routes.

There are a wide variety of ways that undersea cables get cut. The two predominant causes are fishing vessels and fibers snagged by anchors. Breaks can also come from natural causes like earthquakes, volcanoes, or heavy seas. Fibers are cut more often in relatively shallow water than in the deep water in the middle of oceans.

The constant weekly cuts to fiber have spawned a marine repair industry of ships that constantly circle the globe to fix cable breaks. While that might sound like an exotic job, fixing fibers in the Baltic Sea in February sounds like it deserves hazard pay to me.

The process of repairing cut fibers is interesting. In shallow water, the repair ships locate and grab the ends of the cut fibers using ROVs (remotely operated vehicles). The mini-submarines grab the fiber with robotic claws and drag the fiber to the surface.

Repairing fibers in deep water is harder. The repair ships use sonar and voltage drop test equipment to locate the ends of the cut fiber. Anybody who has located buried fiber would be intrigued by the process. They then use grapnels, which are large hooks, to snag the fiber and pull it to the surface.
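The localization math is straightforward in principle. The cable's copper power conductor has a known resistance per kilometer, so a one-ended resistance measurement to a shunt fault yields a distance; optical test gear does the equivalent by timing a reflected light pulse. A toy sketch, with assumed cable parameters (real values vary by cable design):

```python
# Illustrative cable parameters -- assumptions, not any specific cable's specs
CONDUCTOR_OHMS_PER_KM = 0.7   # assumed resistance of the power conductor
C_VACUUM_KM_PER_S = 299_792.458
GROUP_INDEX = 1.468           # typical group index of single-mode fiber

def fault_distance_electrical(measured_ohms: float) -> float:
    """Distance in km to a shunt fault from a one-ended resistance reading."""
    return measured_ohms / CONDUCTOR_OHMS_PER_KM

def fault_distance_otdr(round_trip_seconds: float) -> float:
    """Distance in km from a reflected-pulse time: light travels out and back."""
    return (C_VACUUM_KM_PER_S / GROUP_INDEX) * round_trip_seconds / 2

print(f"{fault_distance_electrical(322.0):.1f} km (electrical)")
print(f"{fault_distance_otdr(4.9e-3):.1f} km (optical)")
```

In practice the two methods cross-check each other, which matters when a grapnel run costs a day of ship time per attempt.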

Once the two fiber ends are retrieved, the repair process would be familiar to any fiber field technician. The one difference is that long-haul fiber routes have periodic light repeaters built into the fiber, so the repair is more challenging if one of those is part of the break.

A Spectrum Crisis?

CTIA, the trade association for cellular companies, published a recent blog titled “The Looming Spectrum Crisis.” The blog quotes a study from Accenture that concludes that a lack of spectrum for 5G is reaching a point of crisis. The Accenture study says that cellular networks will be unable to meet nearly one-fourth of peak-period requests for connection as soon as 2027.

My first reaction to this headline was, “Here we go again,” because this feels like the giant industry drama eight years ago when the wireless industry told everybody who would listen that the U.S. was losing the 5G war to China. That effort was also aimed at getting more spectrum to support 5G. In retrospect, it turned out that nobody cared what China did with wireless inside its own country.

The other original promise was that 5G was going to revolutionize connectivity. Cell sites were going to be upgraded so that customers could get huge amounts of bandwidth by combining signals from multiple small cell sites that were going to be on every corner. 5G was going to unleash self-driving cars, virtual reality, and even the ability for doctors to do remote operations. It turns out that none of those things were ever implemented because cell carriers quickly realized that people weren’t willing to pay extra for a faster cell signal or for the bells and whistles.

However, the scare tactics worked, and the carriers got the new spectrum. The public didn’t get the bells and whistles, but we got faster cellular networks that work better, and that’s okay.

The CTIA blog seems to be rehashing the same old claims. The blog says that without new spectrum, consumers won’t have access to next-generation products and services like remote robotics, extended reality devices, and autonomous vehicles. Lack of spectrum also means that AI will be stifled.

The biggest threatened consequence of not getting more spectrum is that competition will suffer. By that, CTIA means that the carriers want more spectrum to expand 5G FWA home broadband. That’s interesting because the CEOs of the cellular carriers have all publicly been saying that 5G home broadband is a sideline and was implemented to use up excess capacity in the network. This is the first time I can recall seeing FWA as the justification for needing more spectrum. I can understand why the carriers want more FWA – they had grown the business in only a few years to over 11.6 million customers at the end of 2024. However, wanting more spectrum to sell to more FWA customers is not a looming crisis.

It is true that cellular traffic has been growing rapidly and likely will continue to do so. Ericsson says the rate of growth of cell phone data usage in North America will be 16% per year through 2030. That prediction must be tempered by the fact that OpenSignal says that 85% of cell phone traffic is now handled by WiFi rather than cellular spectrum.
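For perspective, if the 16% annual figure compounds from 2025 through 2030, total traffic would roughly double in five years:

```python
# Compound five years of 16% annual growth (an assumption about how the
# forecast is applied over 2025-2030, not Ericsson's exact methodology)
growth_multiple = 1.16 ** 5
print(f"{growth_multiple:.2f}x current traffic by 2030")
```

That is a big number, but it is the same order of growth the networks have absorbed every five years for decades.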

I guess the wireless industry saw that crying wolf worked eight years ago and is adopting the same tactic again. The industry clearly needs more spectrum in the future, but it’s not particularly believable that cell networks will be unable to complete huge numbers of connection requests only a year and a half from now.

If the industry is really going to run out of 5G spectrum by 2027, you would think there would have been a much louder stink about this before the second quarter of 2025. You also might think that an industry facing that kind of crisis wouldn’t have connected 11.6 million FWA home broadband customers to scarce 5G spectrum in the last few years – particularly since the average FWA customer uses up to 100 times more cellular data in a month than the average cell customer. I am sure that the real purpose of this kind of headline is to give cover for the FCC to release more spectrum. But it’s so damned dramatic.

Rural 5G

The FCC voted last year to launch the 5G Fund for Rural America to expand 5G coverage into the many parts of the country with poor cell coverage. It may turn out that some of that subsidy won’t be needed, since market forces are already pushing the big carriers to expand into rural areas. A recent blog from Ookla documents the rural expansion of 5G. Ookla concludes that fierce nationwide competitive pressure is driving the carriers to look harder at rural areas to gain every possible customer.

Ookla, which collects a huge volume of speed tests, is one of the few companies that can look at carrier expansion using its own data. When Ookla sees multiple speed tests on 5G, it has definitive proof that coverage is present in an area. Ookla looked at the recent rural expansion from each of the three primary carriers.

T-Mobile. Ookla shows that T-Mobile has the largest rural 5G footprint today. T-Mobile claims it covers 323 million people, or 98% of U.S. households, with 5G using its low-band 600 MHz spectrum. This low-band spectrum carries for a greater distance than the spectrum used by other carriers. The company was required to expand coverage to 97% of the population as part of the agreement with the FCC when it purchased Sprint. I have to wonder about the 98% coverage. If you look closely at the FCC cellular maps, T-Mobile shows coverage at very slow speeds over a lot of rural America, and you have to wonder if this coverage is real enough to even use for voice calls.

T-Mobile also is the fastest carrier in much of the country, which came from the deployment of the 2.5 GHz spectrum that the company acquired with the Sprint purchase. The company has used 150 MHz of that spectrum to increase speeds in the top 100 markets in the country. We know that T-Mobile has rural plans since the company announced in 2024 that it is hoping to achieve a 20% market share in rural America by the end of 2025. That claim is bolstered by the pending close of the purchase of 30% of the spectrum and all 4.5 million customers of UScellular.

AT&T. A lot of the company’s rural expansion comes from FirstNet. This is a federally funded program to create a nationwide first responder network. AT&T was awarded $6.5 billion to build the network and also given 20 MHz of 700 MHz spectrum. FirstNet brought AT&T a 25-year contract with the government. There is an expected $2 billion additional investment to upgrade the network to 5G everywhere.

One of the key requirements for FirstNet is that it must be made available to first responders in rural areas. This led AT&T to install FirstNet on all of its own towers and to build over 1,000 rural towers. AT&T announced in October 2024 that it has 6.4 million connections and 29,000 public safety agencies on the network. AT&T has also invested heavily in spectrum auctions and spent $37 billion in the FCC’s C-band and 3.45 GHz auctions.

Verizon. Verizon doesn’t own much low-band spectrum that would give it coverage in rural areas. Instead, the company relied on a technology called Dynamic Spectrum Sharing (DSS) that allows one spectrum band to toggle between 4G LTE and 5G in 1 millisecond increments. While it works, this didn’t give the company the boost it was hoping for.

Verizon’s rural strategy seems to be through acquisition, and the company has bought cell carriers operating in Kentucky, Iowa, New York, Pennsylvania, Missouri, and Montana. Verizon is also buying $1 billion of 850 MHz, AWS and PCS spectrum from UScellular.

Verizon is betting on the C-Band spectrum that it purchased in 2021 for $52 billion. It’s hoping that the 161 MHz band of spectrum will carry it into the future. The company has announced it intends to deploy more rural spectrum.

None of the carriers are likely to expand into sparsely populated rural areas where coverage is often nonexistent. But the current expansion plans likely will bring cellular relief to a lot of rural areas, long before any solution might come from the FCC.

Pushing the Speed Limit

Today’s blog is about several new fast broadband deployments. It seems that every year vendors develop new technologies that speed up our networks and broadband connections.

The first was an announcement from AT&T that the company completed a live test of a 1.6 terabit fiber connection on a route between Newark and Philadelphia. The connection was tested over AT&T’s long-haul network that was also running 100 Gbps and 400 Gbps links.

The fast link was created by combining two 800 Gbps links built on white box hardware operating with the Broadcom Jericho3 packet processor chip. The two links were combined using Ciena’s WaveLogic 6 Extreme coherent optical transponder. At the two ends of the link, the signal was processed by 800G DR8 pluggable transceivers from Coherent, which created the cross-connectivity to communicate with other packet and optical technologies.

This new link is four times faster than the 400 Gbps lasers that are being installed nationwide as the newest iteration of long-haul and middle-mile networks – replacing the 100 Gbps lasers that were the standard for the last decade.

Lumen announced that the company successfully created a 1.2 terabit connection on an 1,800-mile link. The link was accomplished on what Lumen calls its ultra-low-loss (ULL) network. The test used Ciena’s WaveLogic 6 Extreme technology and Ciena’s Waveserver platform along with Juniper 800 Gbps routers.

The AT&T news release of the test quoted Mike Satterlee, VP of Network Infrastructure and Services, as saying that these tests are vital for AT&T to keep up with future demand. He was quoted as saying that AT&T expects overall long-haul network traffic to double by 2028.

The other groundbreaking speed trial was conducted by T-Mobile. The company was able to achieve a 6.3 Gbps connection on a Samsung Galaxy S25 cell phone. The phone was using the Snapdragon X80 5G modem-RF system. The 6.3 Gbps test was achieved in the lab, and the speed achieved on a real-world 5G network was 4.3 Gbps. A second test achieved the same speeds using a non-commercial handset that used Qualcomm’s X85 5G Modem.

The network test was conducted using Nokia’s 5G radio access network (RAN). The speed was achieved by aggregating 2.5 GHz, AWS, and 600 MHz spectrum. The test was not as much about speed as it was about the ability to combine multiple frequencies to create a high-bandwidth path.

These trials are proof that carriers are constantly pushing vendors to develop the next generation of network gear that brings greater capacity. Middle-mile and long-haul routes are under strain from unexpected traffic from AI data centers. But long-haul network operators are reporting a big uptick in requests for 100-gigabit data connections across markets and unrelated to AI.

Fixing Urban Cellular Coverage

Anybody who lives in an urban or suburban area knows that cell coverage is not the same everywhere. There are neighborhoods with great cell coverage, neighborhoods with so-so coverage, and neighborhoods with little or no coverage. Nobody understands this better than first responders and city employees who work in all parts of a city.

This is all due to the physics of cell coverage. The FCC has purposefully restricted cell towers to low power levels in order to create discrete coverage areas or cells. This was done so that neighboring towers don’t interfere and cancel each other out. Coverage is also affected by the specific frequencies being used by cell carriers, with some of the higher frequencies used for 5G having shorter coverage distances. Another important factor that affects cell quality is the number of users in a neighborhood. Anybody who lives close to a busy road or a high school knows there are certain times of the day when coverage gets worse due to heavy cell usage.
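The frequency effect alone can be illustrated with the free-space path loss formula, a simplified model that ignores terrain and buildings. At the same distance from the tower, higher-frequency signals arrive much weaker:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare low-band, mid-band, and mmWave frequencies at 2 km from a tower
for f in (600, 2500, 28000):
    print(f"{f:>6} MHz: {fspl_db(2.0, f):.1f} dB of path loss")
```

Every extra 6 dB of loss halves the usable radius in this model, which is why mid-band and mmWave 5G need far denser cell sites than low-band spectrum to cover the same neighborhood.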

The final factor that creates cellular deserts is the placement of cell sites. The big tall cell towers were located years ago largely to cover highways – not where people live. This was done due to a compensation system where carriers got wealthy from carrying vehicle roaming traffic for other carrier networks. Cell towers have also often been forced to locate on taller hills or away from residential neighborhoods where residents didn’t want a giant unsightly tower in their backyards. Unfortunately, cities are now largely stuck with the original cell tower configuration.

A lot of the poor coverage can be solved with the placement of additional small cell sites to fill in neighborhoods with poor coverage. You might recall five years ago when the carrier industry promised to build a million small cell sites. For various reasons that never happened. The primary reason came when carriers realized they weren’t going to make any incremental new revenues from 5G, and they lost interest in investing in cell site infrastructure.

The good news is there is a way for cities to tackle the cellular coverage issue. My consulting firm recently helped a city in a major urban area that knew it had poor cell coverage. Using various tools, we were able to fully map all of the important factors that measure cell phone call quality.

We were able to create separate coverage maps for AT&T, T-Mobile, and Verizon, which is important because every carrier has distinctly different coverage areas based on the specific cell sites and frequencies they are using. Probably the best result of this study was a map that showed the unfortunate neighborhoods where all three carriers have poor coverage. A map overlaying poor cell coverage and household incomes was also eye-opening.

There are a lot of consequences of poor cellular coverage. The national statistics show that about 11% of homes have no home broadband and must rely on cell phones as the only source of broadband. Folks who live in neighborhoods with weak cell coverage can’t use their cell phone indoors. First responders struggle in these communities. Delivery companies struggle to find addresses when they lose cell and GPS coverage. Folks who can’t afford home broadband and who live in cellular deserts have the worst of all worlds for connectivity and are stuck having to seek out public WiFi for connectivity.

We think cities will find a cellular mapping study to be invaluable. For the first time, they’ll be able to visualize cellular coverage in ways they can understand. Armed with coverage maps, cities can have conversations with carriers about addressing some of the worst coverage. A next logical step might be forming public-private partnerships or economic development initiatives to help fund improved cell coverage. But none of that can be contemplated until a city knows the facts.

Pew Report on Pole Attachments

Jake Varn of Pew authored a report based on a deep dive into the impact of pole attachments on broadband expansion. The report highlights how issues with pole attachments are adding extra costs to the many grant projects being funded by programs like BEAD and the Capital Projects Fund.

The report highlights what anybody who has been building networks has known for many years. The process of adding fiber to poles can be horrendously complex and time consuming. The rules on how to get onto poles differ significantly by state, and also by pole owner. 23 states and Washington D.C. manage the pole attachment process, while the rest of the states follow rules established by the FCC.

Every step in the pole attachment process can be expensive and time-consuming. Applications to attach fiber to a pole often must be done on a pole-by-pole basis since neighboring poles can be significantly different. Many pole owners have no good inventory of poles and must send somebody out to look at each pole to compare to what’s being requested in an application. All of the costs of this process are ultimately borne by the ISP making the application.

The real issues arise when there is not enough room to add a new fiber that will meet national standards for clearance with other wires. In many cases, there’s no room because parties that added wires in the past did not follow standards when placing their wires, and the pole owners never made sure it was done right. The new attacher must pay to fix these old errors.

The most expensive pole attachments come when a pole is too old or too full of wires to add a new fiber, in which case the pole must be replaced. The applicant is saddled with the entire cost of replacing the pole, although a federal order last year tried to soften that impact. But the real damage from having to replace poles is the time required, which differs by state and pole owner.

As the report highlights, the process to do the make-ready work just to get ready for fiber construction can take a long time, which isn’t compatible with the construction deadlines required by the many grant programs. There are already examples of ISPs who have returned RDOF and other grant awards once they realized that the poles were in worse shape than they thought, or they learned the pole owner was not going to be cooperative.

The Pew report made some great recommendations, which are aimed at state and federal policymakers and pole owners:

  • The process would be sped up if there was a requirement to create an electronic inventory of the basic data about every pole – how old, how tall, and how many existing attachments. Some pole owners have done this, and it really eases the process.
  • Pew recommends that states standardize the application and permitting process to eliminate the maze of different paperwork and processes needed. A single broadband project might include poles on city, county, and state roads, with each jurisdiction requiring a different paperwork process and forms.
  • Pew recommends that governments form rapid response teams to resolve disputes or delays in the pole attachment process.
  • Pew recommends that states consider pole replacement funds to replace bad poles. They also recommend that states take advantage of any disaster relief funding to replace poles damaged by storms or other disasters.

There is still time for state and local governments to act on these recommendations – but they need to act now to ease the process for ISPs trying to bring better broadband to rural areas.

In full disclosure, I provided some input to this report.

An Industry on Hold

I keep seeing articles or podcasts every week speculating on what the new administration and Congress might change in the $42.5 billion BEAD grant program. This all seems like speculation to me since only a few people really know what might happen, and I don’t think they are talking. I don’t think any of the pundits know any more about what will happen to BEAD than what I included in a tongue-in-cheek blog last year that included a BEAD bingo card.

There is one thing that definitely has occurred. A large chunk of the industry that was expecting to participate in BEAD is largely on hold.

That obviously includes the many ISPs that have filed or plan to file BEAD applications. There is a huge amount of speculation that any significant changes to BEAD will mean repeating the BEAD application processes in the three states that have already announced awards and the twenty-plus with open BEAD grant windows.

While State Broadband Offices are marching forward with the BEAD process, they are all spending a lot of energy speculating on what they might have to rework – and worrying that they’ll not have enough money to do this all a second time.

The group feeling the most pain is the vendors expecting to sell to BEAD grant winners. This group already had a let-down when many of them guessed at the beginning of 2024 that BEAD grants would be made that year. They now see the process entering April 2025 with no idea of when grants will be made and when ISPs might start ordering equipment. The one thing they are now seeing is that the money might finally be released for a lot of states at the same time instead of BEAD awards dribbling out over a year.

Another group that is getting very concerned is elected officials in counties across the country. A lot of counties devoted significant resources to participating in the BEAD process. Many states gave counties some power in choosing BEAD winners by giving a lot of grant points for local endorsement and local funding. A lot of counties have made broadband grants to ISPs that are contingent on them winning BEAD – and many of those grants are from ARPA funding that has a ticking time clock and expiration date. There are also a lot of rumors flying around that the federal government might claw back unspent ARPA funds.

Everybody is on hold for the big decision of how much BEAD funding will go to satellite. Will it be 5%, 10%, 20%, 50%, or 80%? I’ve heard industry pundits making all of these guesses. That split is vitally important to both the ISPs and the vendors. The group most worried about this is local elected officials, who almost universally want fiber built in their counties.

The other big question that has everybody in knots is how much of the process can be changed by NTIA versus what needs to come from Congress. There is legislation still in the early stages in the House that addresses the issues, with other lawmakers drafting alternate ideas. While Congress could act quickly on this if they want to, they have a lot of other big issues on their plate right now.

Interestingly, the first Congressional bill on the issue is called the SPEED Act, but none of this is feeling very speedy. But who knows? Edicts could come down quickly and State Broadband Offices could issue grants quickly if there isn’t a lot of paperwork involved in reshuffling the rules. Meanwhile, my original BEAD bingo card is still intact.

Regulatory Shorts March 2025

Regulatory announcements seem to be hot and heavy these days, mostly related to the new administration.

USF Fee Increase. The FCC recently voted to increase the end-user fee used to fund the Universal Service Fund. The new fee for the second quarter of 2025 is now calculated at 36.6% of interstate telecommunications revenues. Everybody in the industry recognizes this fee as unsustainable, and there has been a lot of discussion, but no action taken to expand the base on which the fee is calculated. This may come to a head later this year since the Supreme Court is currently reviewing a case that challenges whether the Universal Service Fund structure is constitutional.
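To make the 36.6% factor concrete, here is what the pass-through looks like on a customer bill (carriers typically recover the fee as a line-item surcharge on the interstate portion of charges; the dollar amounts here are illustrative):

```python
USF_CONTRIBUTION_FACTOR = 0.366   # Q2 2025 factor set by the FCC

def usf_surcharge(interstate_charges: float) -> float:
    """USF amount a carrier could pass through on interstate charges."""
    return round(interstate_charges * USF_CONTRIBUTION_FACTOR, 2)

# e.g. a bill with $10 of interstate long-distance charges
print(f"${usf_surcharge(10.00):.2f} USF surcharge on $10.00 of interstate charges")
```

A surcharge of more than a third of the underlying charge is why expanding the contribution base keeps coming up: the same dollars collected over a broader base would mean a far smaller percentage.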

Net Neutrality Now a State Issue. Add Pennsylvania to the list of states that are considering a state version of a net neutrality law in a new bill introduced into the legislature. The current states that have adopted net neutrality rules are California, Colorado, Maine, Oregon, Vermont, and Washington. One of the interesting features of the Pennsylvania legislation is that it would classify ISPs as utilities, opening them to more state regulation.

Changing Regulators. FCC Commissioner Geoffrey Starks has announced he’ll be leaving the FCC some time this spring. Commissioner Starks is a Democrat, and the new administration is required by law to replace him with another Democrat. The Senate has not yet confirmed Olivia Trusty, a Republican who is slated to fill the current open spot.

Adam Cassady has been named as the acting administrator of NTIA. Adam was formerly the chief of staff for FCC Commissioner Nathan Simington. The acting administrator slot is open until the Senate votes to confirm Arielle Roth.

Administration Support for the USF. The Trump administration filed a brief with the Supreme Court on March 13 that defends the Universal Service Fund and says that the claim that the USF is unconstitutional is a strawman. There are legal scholars saying the Court might rule against the USF, even while the White House supports it, since this case has wider implications for other federal agencies.

Tower Dumps are Unconstitutional. A Mississippi federal judge has ruled that the law-enforcement technique known as a ‘tower dump’ is unconstitutional. Law enforcement agencies have often pulled large swaths of calling data in the area of crimes in the hope of identifying suspected criminals. In the case that triggered the order, the FBI had requested search warrants to pull cellular data from four carriers across nine cell towers in the hope of being able to sift through records to identify local gang members.

FCC Issues Shutdown Contingency Plan. Acknowledging this crazy year, the FCC issued a Plan for Orderly Shutdown Due to Lapse of Congressional Appropriation. The document surprised me because the FCC is self-funded through regulatory fees, and I assumed that meant they would not be subject to a general government shutdown. And perhaps that is the case since the FCC shutdown would not happen “if prior year funds are made available.” I guess that raises the possibility that the FCC’s own funds could somehow be frozen in a larger shutdown.