FCC Raises Inmate Calling Rates

In what would normally be an extraordinary action, but which is becoming commonplace, the FCC thumbed its nose at Congress and raised rates for telephone and video calls placed from jails and prisons.

The FCC under Jessica Rosenworcel issued new rules in 2024 that lowered prison calling rates, as required by the Martha Wright-Reed Fair and Just Communication Act enacted by Congress. Earlier this year, the FCC put the rate reductions on hold, and it recently ruled to make that reversal permanent.

The FCC was reacting to heavy lobbying from the handful of companies that specialize in providing prison calling, along with lobbying from jail and prison officials who benefit by sharing in the revenues from inmate calls. The new Congressional Act not only cut rates but also eliminated the payment of commissions to jails.

The FCC did more than raise calling rates. The new order adds a 2-cent facility fee to each call. The FCC rules now allow the prison carriers to add a fee to cover “safety and security” costs. The FCC also created a new category of even higher rates for extremely small jails that house fewer than 50 inmates. Finally, the FCC now allows prison carriers to add a 6.7% increase to rates for inflation.

The new FCC rules kept a few of the provisions of the original Act. Carriers still can’t pay commissions to jails and prisons from interstate calling. The rules also retain the Act’s elimination of many ancillary fees that were charged mostly to families of inmates.

I call the ruling extraordinary because I can’t recall a past FCC that so blatantly decided to ignore a law passed by Congress. The only other time I recall the FCC having a major issue with a law was when Chairman Mark Fowler in the 1980s took exception to Congressional rules related to the Fairness Doctrine, although there might have been other instances. The FCC was created as an independent agency by Congress, and it is assumed that the agency must follow explicit laws enacted by Congress.

This is not the only federal agency ignoring Congress. NTIA, at the behest of the Administration, is refusing to award grants from the Digital Equity Act. The NTIA is also mulling over sending excess BEAD funds, called non-deployment funds, back to the Treasury. These actions are in direct violation of the funding rules created by the IIJA legislation that funded BEAD and Digital Equity.

The obvious party to rein in a rogue FCC and NTIA is Congress, but for now, it seems content to cede power to the Executive branch. It’s possible that the courts could force the FCC and the Administration to follow the law. NDIA recently filed a lawsuit challenging the ability of the Administration and the NTIA to kill the Digital Equity Act.

There are also Supreme Court rulings from the last few years that seemingly make it more difficult for federal agencies to act on their own. In Loper Bright Enterprises v. Raimondo, the Supreme Court effectively ended Chevron deference and said that federal agencies are on shaky ground when they make decisions that are not explicitly directed by Congress. In the 2025 ruling McLaughlin Chiropractic Associates, Inc. v. McKesson Corp., the Supreme Court held that courts can more easily disagree with rulings made by federal agencies. Those two decisions would seem to provide a lot of ammunition for attacking the FCC decision on prison calling rates.

Limiting Digital Device Use

The City Assembly of Toyoake, Japan, a city with 68,000 residents, recently introduced a rule limiting the use of digital devices to two hours per day outside of work and school. The idea was introduced by Mayor Masafumi Kouki, who has grown concerned that residents, particularly children, have become addicted to digital devices.

The new law has no penalties for exceeding the 2-hour limit, and the government will not be tracking cellphone and tablet usage. The hope is that in a society that has social pressure to follow official guidelines, the amount of device usage will drop.

Toyoake is like much of the rest of the world in its heavy cellphone use. A government study last year found that Japanese elementary and secondary school students use their phones for about five hours each day on average. Several recent studies in the U.S. put average usage here at over 4.5 hours per day.

This is not the first time a local government in Japan has tried to curb computer usage. In 2020, Kagawa Prefecture on the Japanese island of Shikoku passed a law limiting the time young people can spend playing video games. That law tried to limit game-playing to one hour per day on weekdays and 90 minutes per day on weekends.

While it has not yet been officially recognized as a psychological diagnosis, youth phone addiction is a growing concern for many health officials. Excess phone usage has been linked to declining grades, social withdrawal, anxiety, and depression.

There is a growing movement in the U.S. to ban smartphones from schools. Schools in more than thirty-five states banned phones last year, with even more this school year.

There have been some controlled studies that measure the impact of banning phones in school. A trial in 200 French schools showed reduced cyberbullying and increased social interaction among students. A trial ban in middle schools showed a 35% increase in student participation, with students “talking to each other again”. A county-wide ban in Brazil showed a 35% improvement in mathematics grades and a 13% improvement in Portuguese grades.

A more detailed look at the data shows some interesting trends. The impact seems biggest for middle-school students, who generally struggle with impulse control at that age. The bans seem to have a bigger impact on grades in high-poverty areas. The bans also seem to affect girls a lot more than boys.

A New ReConnect

Senators Roger Marshall (R-Kansas) and Peter Welch (D-Vermont) have introduced a new bill that would fund USDA ReConnect grants with $650 million annually through 2030. It seems probable that, if passed, this would be the only federal broadband grant program for the foreseeable future.

The requirements listed in the bill are pretty basic and reminiscent of prior rounds of ReConnect:

  • The funding can only be used in rural areas.
  • At least 75% of households in a grant area must be unable to buy broadband of at least 100/20 Mbps. Grant applicants will get priority if 90% of locations don’t meet that speed.
  • Constructed networks must provide speeds of at least symmetrical 100 Mbps.
  • The money can be awarded as grants, loans, or grant/loan combinations.
  • Applicants must make a 25% match, but this can be waived.
  • Grantees have five years to construct the network.
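The two speed-based eligibility tests in the bullet list can be sketched as a simple check. This is a hypothetical illustration only; the bill defines eligibility in legal terms, and the function name and data structure here are invented for the example:

```python
# Hypothetical sketch of the ReConnect eligibility tests described above.
# "locations" is a list of (download, upload) speeds in Mbps representing
# the best broadband available for purchase at each location.

def reconnect_eligibility(locations, threshold=(100, 20)):
    """Return (eligible, priority) for a proposed grant area.

    A location counts as served if it can buy at least 100/20 Mbps.
    The area is eligible if at least 75% of locations lack that speed,
    and gets priority review if at least 90% lack it.
    """
    down, up = threshold
    lacking = sum(1 for d, u in locations if d < down or u < up)
    share = lacking / len(locations)
    return share >= 0.75, share >= 0.90

# Example: 8 of 10 locations can only buy 25/3 Mbps DSL.
area = [(25, 3)] * 8 + [(1000, 1000)] * 2
print(reconnect_eligibility(area))  # (True, False): eligible, but no priority
```

The interesting policy question, discussed below, is how the speeds fed into a test like this get determined in the first place.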

As I wrote recently, I believe there will still be many millions of locations left behind after the BEAD grants. But there are a few issues that will stop the proposed new ReConnect grants if not remedied:

  • If satellite is considered when looking at existing broadband coverage, then every place in the U.S. will be considered to have adequate broadband. The last round of ReConnect allowed applicants to disregard locations where the only fast broadband came from satellite and unlicensed fixed wireless. Without a similar exclusion, there will be no chance of creating a ReConnect grant serving area.
  • I believe there are many millions of locations where an ISP claims the capability to provide 100/20 Mbps in the FCC mapping data, but delivers slower speeds. Applicants need to have some reasonable way to challenge exaggerated FCC map data. In the last round of ReConnect, USDA largely accepted the FCC maps as gospel.
  • The requirement to propose service areas where at least 75% of locations don’t have fast broadband is based on an assumption that there are still entire neighborhoods that don’t have good broadband. The reality on the ground is that most places are a jumble of served and unserved locations mixed across the geography. The only way the 75% rule can be realistically met is if ISPs can cobble together a dozen little pockets of folks with poor broadband into a single grant application. USDA theoretically allows that, but practically, they have made this difficult. This will be totally impossible if USDA sticks with looking at broadband coverage at the hexagon level instead of at individual homes.

It would be very beneficial for USDA if the proposed bill gave the agency the authority to assemble study areas that ignore satellite and provided a reasonable way for applicants to challenge faulty FCC maps. I fear that, without that specific guidance in the new bill, there is a decent chance that USDA will find itself paralyzed and unable to structure grants that can work.

It’s not hard to picture the three broadband arms of the government at odds with each other in the near future concerning the definition of broadband. NTIA won’t want to see ReConnect used to ‘overbuild’ areas where BEAD awarded grants to satellite. It’s easy to imagine the FCC defending the broadband maps and not making it easy for USDA to ignore exaggerated speed claims.

I see this bill as a positive move. It’s bipartisan, which has normally been the case for rural broadband policy. But there are some real challenges to making this work, and if the legislation is the only guiding document for the USDA, I’m not sure it is enough to create a viable future ReConnect program.

Justifying Cuts to BEAD

NTIA Assistant Secretary Arielle Roth recently made a speech at the Hudson Institute that outlined her policy positions related to reshaping the BEAD program. The changes to BEAD were initiated by Commerce Secretary Lutnick and are now being finalized and implemented by Roth. The bottom-line impact of the changes will be to cut the amount of spending from the BEAD grant program roughly in half, with the savings returned to the Treasury.

Roth says that the changes are not just about saving money. Her position is that the cuts are being made to make sure that the government doesn’t distort or sabotage the pace of technological innovation. To quote Roth: “When the government overspends or over-subsidizes a single technology, it doesn’t just waste funds; it warps the progress of innovation. Excessive subsidies crowd out private investment, slow down research and development, and delay technological progress. That’s counterproductive to BEAD’s mission, which is to close broadband gaps, not freeze technology in place. In a field as dynamic as broadband, minimizing distortion is critical, because in every case, the most significant breakthroughs in connecting rural Americans have come not from subsidies, but from technological innovation itself.”

That’s an amazing policy position to take related to rural broadband. I’d like to put her position into a bigger perspective. Her argument is a good one related to the whole broadband industry. I don’t know anybody who would argue that the federal government should help to pick technology winners for the entire industry. The market today is taking care of that pretty well. Over the past several years, over 14.5 million customers have chosen to buy broadband from the FWA cellular companies, much to the dismay of the cable companies. A recent article said that major fiber overbuilders in urban and suburban markets are seeing penetration rates between 35% and 45%. Starlink has grown to have four million U.S. broadband customers. It seems like the overall market is working pretty well, being fueled by private capital and head-to-head competition.

But even there, the federal government can’t help itself from fiddling at least a little bit with the market. Congress recently directed the FCC to rework mid-band spectrum allocations to bring 600 megahertz of spectrum to auction. That action will benefit FWA broadband competition even more than mobile cellular service. That’s what incessant lobbying and contributions from the cellular industry buy. But mostly, the government hasn’t been intervening in the broader broadband market.

Roth’s policy position falls flat when applied to the tiny world of BEAD. The BEAD grant program aims to bring broadband to the last 6 to 7 million locations that got left behind by the market. For the most part, the BEAD locations are the most remote or least dense areas where nobody has been willing to make private investments. Congress recognized this when they developed BEAD, and it’s still the market reality. Without a one-time grant subsidy, these locations will likely never see better broadband.

How BEAD grants are awarded is extremely important to the households that will get the broadband, along with the construction companies and vendors that will supply the materials and labor to implement the grants. A lot of rural people put effort into the BEAD process in the hopes of getting a generational broadband upgrade.

But the technology chosen for BEAD has almost zero impact on the bigger U.S. broadband market. It really doesn’t matter to the larger market if the BEAD money all goes for fiber or all goes for satellite. It’s absolutely impossible to make an argument with a straight face that the technologies chosen for BEAD will somehow “warp the progress of innovation” if it’s not done just right.

I know Assistant Secretary Roth is looking for a good argument for awarding all of the locations in a rural county to satellite instead of fiber – but this argument is not it. The truth is that the NTIA changes to BEAD are only about saving money. NTIA could have completed the BEAD grant award on the same timeline by deciding to spend the full $42.5 billion, but they chose not to. They whacked the spending in half and are now searching for a clever way to justify their choices. I can’t think of a more ludicrous argument than one that says that spending more BEAD money on fiber would somehow sabotage the overall pace of broadband technology innovation for the country.

I’m Ready to Call It

I think we can now foresee the demise of traditional telephone service delivered over the PSTN (public switched telephone network). My best guess is the PSTN will either be dead or dying by the end of 2030. This doesn’t mean the death of telephone voice service, but the end of the regulated service that has been offered by telephone companies. Any voice products that remain will be delivered using VoIP.

The death of the PSTN is being fostered by the FCC, which has made it much easier for telephone companies to tear down or decommission copper telephone networks. The FCC began the process in July by providing a two-year moratorium on notifications for taking down copper, and followed that up more recently with a formal docket to make the rules permanent.

Eliminating copper lines is not the same as eliminating the PSTN. I expect the FCC will formally announce rules to end the PSTN soon. But even if the FCC doesn’t take specific action, I expect the big telcos to start dismantling the PSTN in pieces on their own.

The PSTN consists of a private network owned collectively by telephone companies. The PSTN is a series of regional networks, each built around a large tandem switch that connects to the telcos and CLECs in the region. The connections between each voice provider and the tandem are called trunks: transport routes, many still using old TDM technology based on T1s, that deliver the traffic. Local voice providers can also have direct trunks to other local voice providers in the area, to the largest long-distance carriers, or to the large cellular carriers.

The PSTN is also the mechanism used to route calls between a local voice provider and the many other carriers in the country. There is a complex set of routing tables that instruct tandem switches how to route calls to reach every registered telephone number in the system. The PSTN is also the starting point for routing other kinds of calls, like international long distance and 800 numbers.
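As a loose illustration of the routing-table function described above: a tandem looks at the first six digits of a dialed number (the NPA-NXX) to decide which trunk group carries the call. The prefixes, trunk names, and data structure below are invented for the example; real PSTN routing relies on industry databases like the LERG, not a lookup table like this:

```python
# Toy model of tandem-switch call routing. The NPA-NXX (first six digits
# of a ten-digit number) identifies the switch serving the number, and
# the tandem picks the matching trunk group. All entries are invented.

ROUTING_TABLE = {
    "202555": "trunk-to-telco-A",      # local incumbent telco
    "202556": "trunk-to-clec-B",       # competitive local carrier
    "303555": "trunk-to-tandem-west",  # hand off to another region's tandem
}

def route_call(dialed_number: str) -> str:
    """Return the trunk group a tandem would use for this number."""
    npa_nxx = dialed_number[:6]
    # Numbers with no local routing entry get handed to a default
    # long-distance (interexchange) trunk.
    return ROUTING_TABLE.get(npa_nxx, "trunk-to-interexchange-carrier")

print(route_call("2025551234"))  # trunk-to-telco-A
print(route_call("9075551234"))  # trunk-to-interexchange-carrier
```

The AT&T vision discussed below amounts to collapsing the number of places where lookups like this happen, not eliminating the lookup itself.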

This may sound too complex to break apart, but the biggest telcos have been talking about this for over twenty years. They do not want to be responsible for taking care of the local PSTN arrangements, which cost them money and require a lot of maintenance. I remember sitting in meetings twenty years ago that discussed ways that the regional tandem switching network could be deactivated over time. There was a lot of investigation of the topic at the FCC ten years ago, but that effort somehow fizzled out.

The impetus to dismantle the PSTN was always driven by money. The big long-distance carriers were paying huge amounts in access charges to get ‘access’ to the local networks of the many voice providers in the country. The FCC took an axe to many of those fees, and after the magnitude of spending on access decreased, I think the focus on finishing the process died.

The largest telcos like AT&T have always envisioned a much-simplified replacement for the PSTN. Twenty years ago, AT&T talked about a vision where it would replace hundreds of tandem switches nationwide with perhaps two for the whole country. Every carrier that used one of its tandems would be responsible for buying transport to reach one of the big new switches. We can’t ever get rid of the function of routing calls, but this vision would shift most of the cost of the PSTN function away from the big telcos onto each company that originates or terminates voice calls. Under the AT&T vision, the PSTN would be greatly simplified by greatly decreasing the number of locations where calls are exchanged.

There is nothing stopping the big telcos from doing this, other than having a method in place to make sure that calls continue to route. The big carriers are feeling emboldened by the current FCC to wash away old systems, and I think they are now ready to finally tackle this.

Missed by BEAD

An article from the Advanced Communications Law and Policy Institute at the New York Law School claims that over 1 million locations were missed by the BEAD grants. They identified these as locations that are still shown as unserved and underserved on the FCC broadband maps, but which did not make it into the BEAD program.

ACLP also identified two other sources of locations that will likely not get broadband. They predict some BEAD defaults since a number of small and untested ISPs won sizable BEAD grants. They also believe there will continue to be defaults in other grant programs.

ACLP recommends that up to half of the $20 billion+ that will not be spent on BEAD grants be deposited into a BEAD Reserve Fund to be used to cover the shortfalls.

It’s a sensible idea, but unlikely to gain any momentum. It seems clear that NTIA wants to take credit for solving the rural broadband gap while also returning $20 billion to the U.S. Treasury. I can’t think of any mechanism that would allow NTIA to keep unspent monies alive once BEAD grants are awarded and NTIA makes a final announcement on non-deployment funds. The general consensus I’m hearing is that NTIA will award little or nothing in non-deployment funds.

I think ACLP is missing the bigger picture, and I think there are many millions more locations that should have rightfully been included in BEAD.

ACLP’s math starts with the assumption that the speeds reported to the FCC in the broadband maps are right. Anybody who has worked at a local level knows this is often not the case. There are a lot of ISPs that claim a speed of exactly 100/20 Mbps in the FCC maps, and I believe that millions of these locations have been falsely excluded from BEAD.

Each State had a BEAD map challenge that was supposed to result in an accurate map, but that process was largely a bust. The challenge rules made it much easier to exclude locations from the preliminary BEAD maps than to add them. Proving that an ISP was overstating its speed capabilities in the FCC maps was nearly impossible.

Additionally, NTIA declared that licensed fixed wireless was to be treated as served as long as speeds were reported at 100/20 Mbps. In many counties I worked with for the map challenges, it became obvious that reporting by some WISPs was a joke. I remember one WISP that drew an eleven-mile radius circle around every tower and claimed the ability to serve every place in that circle. Numerous WISPs used seven- and nine-mile circles and also claimed full coverage. The irony of the NTIA ruling was that the only requirement for blocking off big areas from BEAD was adding CBRS spectrum to the spectrum mix. Many WISPs tell me that CBRS is an unremarkable spectrum band due to its small channel sizes.
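To put those claimed coverage circles in perspective, the area math is just πr². The radii below are the ones mentioned above:

```python
import math

# Square miles claimed by a WISP drawing a coverage circle of a given
# radius around a single tower (area = pi * r^2).
for radius_miles in (7, 9, 11):
    area = math.pi * radius_miles ** 2
    print(f"{radius_miles}-mile radius: {area:,.0f} square miles claimed")
```

An eleven-mile circle claims roughly 380 square miles of coverage from one tower, which is why these filings blocked off such large areas from BEAD.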

The other big category of locations that could have been covered by BEAD was low-income MDUs. The BEAD legislation suggested that States attack this issue using non-deployment funds. The number of such locations is hard to identify because many MDUs show as served in the FCC map since there is fiber nearby. But an MDU is not served until somebody is willing to invest in the inside wiring needed to bring better broadband to residents.

My guess is that the number of locations missed by BEAD is likely 6 to 7 million, much higher than the number suggested by ACLP. I have no analytical basis for that guess other than I seem to find examples of places missed by BEAD in every community I dig deeply into.

At some point, this will all become clear as folks without good broadband continue to complain to their elected officials. RDOF was supposed to fix the digital divide. BEAD was supposed to solve it. Maybe the next time will be the charm, although I’m not taking any bets on it.

Technology Neutral

I cringe every time I see the term “technology neutral”. Over the last few years, NTIA has morphed the phrase into a euphemism to mean we should favor the cheapest technology over the best technology.

And it clearly is a euphemism meant to disguise the true nature of the broadband policy discussion from those not involved in the topic every day. Governments have gotten so good at developing such phrases that the euphemisms replace the right language and become common usage. We routinely hear phrases like revenue enhancement instead of tax increase, or negative growth instead of losses without fully realizing what is not being said.

The phrase technology neutral didn’t start as a euphemism. It comes from a policy paper issued during the Clinton Administration, “Framework for Global Electronic Commerce”, which used the term “technology-neutral” to warn that governments shouldn’t get involved in trying to steer the technology direction for the budding Internet industry. The Administration at the time believed that a hands-off market approach would best allow the Internet to develop. It turns out they were right.

It seems pretty clear that the term was tossed into the IIJA legislation as a bone for WISPs. They badly wanted to participate in BEAD and used the term technology-neutral to plant the idea that all technologies that could deliver speeds of 100/20 Mbps were equivalent. Until Tarana came out with much faster radios, the fixed wireless technology at the time didn’t deserve to be considered for long-term grants, and sure enough, five years later, the older radios have already joined DSL and other older technologies in the obsolete technology trash bin.

I’ve been searching for a good analogy for the current use of technology neutral and think I have one. Consider a tiny village that is not connected to the power grid. There is a wide range of technology solutions for providing homes with heat and light. The village could be given a self-sufficient solar power farm. They could be connected to a nuclear power plant. They could be given an obsolete coal-powered plant being decommissioned from somewhere else. Each home could be given a gas generator. They could be provided with the low-tech option of fireplaces and axes to chop firewood.

The various technology choices are clearly different in terms of cost and effectiveness. The NTIA technology neutral position would say that all of these options are acceptable, as long as they deliver heat and light to the homes today and will keep doing so for the foreseeable future. If there were a government grant to bring heat and light to such villages under the NTIA rules, the decision would be made on cost, since all of the solutions are considered to be technology-neutral. I don’t think the rural residents would be thrilled with their government-subsidized axes.

Don’t mistake this as a rant for building fiber instead of other broadband technologies. In the example, it would be extreme to build the most expensive solutions, like a nuclear power plant. I don’t know anybody who supports the idea of spending huge amounts of money to bring broadband to a small number of places. Going back to the village in my example, there are a lot of options between a nuclear power plant and fireplaces.

The real problem I have with the term technology neutral is that it says that all broadband technologies are the same, and they clearly are not. Starlink is not equivalent to fiber for a small community. For one thing, fiber can be used for a lot of other purposes that can benefit the community beyond bringing home broadband. Using a euphemism is a way to disguise the real discussion that should be held at State Broadband Offices – what can be afforded for the funding that is available. I think States were mostly doing that, but the shift to the lowest-cost solution ended all logical deliberation.

As we saw in the first BEAD award from Louisiana, which was done under the original BEAD rules, the State still awarded satellite technology for some locations, because that was the most sensible solution for those places. But when the rules got reshuffled to impose technology neutrality, deliberate decisions of the broadband office were replaced with a simple cost comparison.

Cox Wins DMCA Case

There was another interesting court decision involving Cox. The U.S. Court of Appeals for the Ninth Circuit ruled that ISPs can’t be served subpoenas related to some portions of the Digital Millennium Copyright Act (DMCA).

The lawsuit arose from a subpoena issued to Cox by Capstone Studios concerning the movie Fall. Capstone had identified 29 Cox subscribers that it said had pirated a copy of the movie using BitTorrent. Cox notified its customers and asked them if they wanted to respond to the court, and one subscriber did. The subscriber claimed that they had an open WiFi connection and that somebody else must have downloaded the movie using their bandwidth. The U.S. District Court of Hawaii ruled that Cox qualified for the conduit safe harbor since Cox only provided the Internet connection and had no role or obligations in the transaction of downloading the movie.

The DMCA rules are key to protecting Internet service providers from being prosecuted for the information that passes through their platform at the request of users. The four safe harbors include:

  • The conduit safe harbor is the one invoked by the Court for Cox. This says that online service providers have no obligations under DMCA if they only temporarily store user data for the purpose of transmission. ISPs like Cox don’t look to see what customers are transmitting on their networks.
  • The system caching safe harbor says service providers have no liability if they cache data at the request of others to speed access to users or reduce traffic on the Internet.
  • The information safe harbor protects service providers if they store information on their platforms at the direction of users.
  • The information location tools safe harbor protects service providers from liability if their site includes search engines, directories, or hyperlinks that might send users to infringing materials.

Capstone appealed the ruling to the Appeals Court. The Ninth Circuit agreed with the lower court and ruled that ISPs are only conduits for the data sent over their networks and that Cox hadn’t taken any actions that demonstrated a role in pirating the movie. The Ninth Circuit went further than the District Court and said that the original subpoena was invalid since Cox only acted as an ISP and was not a party to the pirating of the movie.

This is good news for ISPs since they clearly have no idea how customers are using their broadband connection. The Internet would come to a screeching halt if ISPs were held liable for actions taken by subscribers. Most ISPs have terms of service that allow them to disconnect customers who engage in bad behavior on the Internet, but ISPs only hear about that behavior from third parties.

This case is tangentially related to another case involving Cox that was recently accepted by the U.S. Supreme Court. That case involves a longstanding dispute between Cox and music labels. The case originated with a 2019 decision by a Virginia Court that found Cox liable for both contributory and vicarious copyright infringement for actions taken by its customers and awarded the record companies an astounding $1 billion in damages.

Even after an Appeals Court reversed the damage award, Cox was still found liable for contributory infringement over actions taken by its customers. The record labels want Cox to permanently disconnect any customer who engages in repeated copyright infringement. If that ruling holds, it would turn ISPs into Internet policemen who must monitor and punish customers who engage in copyright infringement involving music, movies, games, books, and sports events.

Big ISPs and Speeds

I was recently reminded in a conversation with a client how cable company executives used to tell the public that they didn’t need faster broadband speeds, and what the cable companies offered was fine. Looking through my archives, I found the following statements from 2013, where cable companies were responding to the first Google Fiber offerings of symmetrical gigabit broadband.

In 2013, Time Warner Cable CFO Irene Esteves announced that the company didn’t see the need to deliver Google Fiber speeds to consumers. Comcast Executive Vice President David L. Cohen was quoted as saying that gigabit speeds were pointless due to limitations on the data speeds that could be delivered from websites and the lack of capability of home WiFi routers. Michael Powell, the CEO of the National Cable & Telecommunications Association, characterized gigabit speeds as an “irrelevant exercise in bragging rights”.

The criticisms had some merit at the time. There was no web traffic that operated at speeds even close to a gigabit. Off-the-shelf WiFi routers couldn’t handle anything close to gigabit speeds. But the public didn’t care because performance on fiber was perceived as being significantly better than what was delivered by cable companies, and customers flocked to Google Fiber in the markets where it was introduced. Interestingly, Time Warner obviously thought the Google Fiber threat was real, because the company quickly built fiber-to-the-premise to compete against Google in North Carolina.

There were some customers who benefited from gigabit speeds. I recall talking to a doctor who subscribed to gigabit speeds when they became available from a municipal ISP. His hospital also had gigabit broadband, and the doctor was able to download large MRI files at home in a reasonable amount of time. I also talked to a photographer served by a different municipal ISP who told me that gigabit speeds made it possible for the first time to upload photography and video libraries to clients without having to wait for hours for the uploads to complete.

The next time that cable companies told the public they didn’t need faster speeds was during the pandemic, when it became clear that cable company upload speeds of 10 Mbps were not able to handle multiple people working and schooling at home at the same time. Every big cable company defended its networks. Charter CEO Tom Rutledge said at the time that Charter’s network was adequate and justified that by pointing out that the majority of customer data usage was downstream. But Charter and other cable companies tweaked their networks during the pandemic to improve upload speeds to 15-20 Mbps. Still today, there are numerous cable networks that have not yet implemented any upgrades to bring significant improvement to upload speeds.

Many ISPs subtly tell their customers they don’t need fast broadband through their pricing. I find small ISPs around the country that still charge extremely high prices for anything faster than their basic broadband product.

This frankly mystifies me. I’ve always guessed that this kind of pricing comes down to one of two reasons. First, I think some small ISPs fear that customers who buy faster speeds will somehow cost the ISP a lot more money. But that doesn’t seem to be the case. I recall an Ookla article last year that said that, in some markets, the biggest data users were the customers buying the least expensive broadband package. I’ve had numerous ISPs tell me that their gigabit customers don’t use more broadband than their 100 Mbps customers.

The only other reason for pricing faster speeds so high is to position them as a super-premium product. But I think these ISPs are losing out on a lot of revenue. ISPs that space speed tiers only $15 to $20 apart find that a lot of customers are willing to upgrade to faster speeds when the upgrade doesn’t cost a lot more per month. Most customers are leery about paying $50 or more per month for a faster speed.

AT&T Raises Rates

AT&T announced it will raise broadband rates as of December 1 by $5 per month. This is the second year in a row that the company has raised rates by that amount. The fact that the company is raising rates in today’s environment is an interesting choice. I suspect the rate increase says several things about AT&T. The increase tells me that the company is meeting its fiber penetration goals and doesn’t think a rate increase will hurt its market share. It also speaks to a belief that customers perceive fiber as the superior technology that people are willing to pay for.

This will take AT&T fiber broadband prices to $69 for 300 Mbps, $80 for 500 Mbps, $95 for 1 Gbps, and $160 for 2 Gbps. Before the two rate increases, AT&T was priced noticeably lower than its cable competitors, but that is no longer the case.

The rate increase will apply to existing customers, although AT&T is not raising the rate for its low-income plan. In a move that always mystifies long-time customers, AT&T is still offering aggressively low rates for new customers while asking for more revenue from long-time customers. While writing this blog, I saw that the AT&T website is offering introductory rates of 300 Mbps for $42 and 1 Gbps for $50. AT&T is also offering a low rate of $47 per month for its FWA cellular broadband.

AT&T is giving customers the typical story that the rate increases are needed to ensure that customers will receive a high level of service. But the company is not mentioning to its customers that it had a net income of $4.9 billion and free cash flow generated of $4.4 billion in the second quarter of this year.

This has to be good news for the big cable companies that compete against AT&T fiber. If the cable companies decide not to raise rates now, they can advertise against AT&T for doing so. However, this could also give cable companies cover to raise rates again, and I’m sure this announcement is being discussed in cable boardrooms.

What I find most interesting about the rate increases is that the big cable companies have spent a lot of advertising dollars talking about lower rates. Cable companies are in a panic about losing customers to both fiber and FWA and have mostly fought back with lower introductory rates and special promotions.

Charter had a rate increase this year and raised broadband rates by $2 per month, starting with the July 2025 billing cycle. That’s the lowest rate increase from the company in years and follows a $3 rate increase in the summer of 2024. Charter has been pushing a two- or three-year price lock where rates are guaranteed without customers having to sign a contract.

Comcast has not been so cautious with rate increases and announced an across-the-board 5% broadband rate increase at the end of 2024. It will be interesting to see what the company does this year. But Comcast has also been pushing low-rate deals, including a promotion in April that gave new customers a 5-year price lock.

These annual rate increases always prompt small ISPs to ask if they should raise rates. The majority of small ISPs do not raise rates every year. I know a number of cooperatives that typically only raise rates every three to five years. It’s ironic that, on the whole, these rate increases will mean that urban broadband rates will become significantly more expensive than rural rates, mostly due to urban rates getting increased every year. There are exceptions, and some rural companies have high rates, but most do not.
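The divergence between urban and rural rates is just the compounding of different increase schedules. A minimal sketch makes the point, using hypothetical numbers (the $70 starting rate and $5 bump are assumptions for illustration, not figures from any specific ISP):

```python
def project_rate(start, years, bump, every):
    """Monthly rate after `years`, raised by `bump` every `every` years."""
    rate = start
    for year in range(1, years + 1):
        if year % every == 0:
            rate += bump
    return rate

# Both ISPs hypothetically start at $70/month.
urban = project_rate(70, 10, 5, every=1)  # raises $5 every year
rural = project_rate(70, 10, 5, every=4)  # raises $5 every four years
print(urban, rural)  # after a decade: 120 vs. 80
```

After ten years, identical starting rates end up $40 per month apart simply because one ISP raises rates every year and the other every four years.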