4G on the Moon

This blog is a little more lighthearted than my normal fare. An article in FierceWireless caught my eye describing how Nokia plans to establish a 4G network on the Moon.

The primary purpose of the wireless technology will be to communicate between a base station and lunar rovers. 4G LTE is a mature and stable technology that can handle data transmission with ease – particularly in an environment where there won’t be any interference. While the initial communications will be limited to a base station and lunar rovers, the choice of 4G will make it easier to integrate future devices like sensors and astronaut cellphones into the network. NASA historically used proprietary communications gear, but it makes a lot more sense to use a communications platform that can easily communicate with a wide range of existing devices.

One challenge Nokia and NASA have to overcome on the moon is that the transmissions will be made between a low-sitting rover and a base station antenna that probably won’t be more than 3 – 5 meters off the ground. While there are no trees or other such obstacles on the moon, there are plenty of boulders and craters that will be a challenge for communications.

Nokia will have one benefit not available on earth – they can use the best spectrum band possible for the transmissions. They can establish wider data channels than are used on earth to accommodate more data within a transmission. Nobody has ever been handed a clean spectrum slate to develop the perfect 4G system before, and Nokia engineers are probably having a good time with this.

The biggest challenge will be in designing a lightweight cellular base station that contains the core, the baseband, and the radios in a small box. All of the components must be hardened to work in wide-ranging temperatures on the moon, which can range from a high of 260 F in the daytime to minus 280 F in the dark.
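For readers who think in Celsius, those extremes are even more striking when converted (the Fahrenheit figures are the ones cited above; the conversion is just the standard formula):

```python
def f_to_c(fahrenheit):
    """Standard Fahrenheit-to-Celsius conversion."""
    return (fahrenheit - 32) * 5 / 9

# Lunar surface extremes cited above
print(round(f_to_c(260)))   # daytime high: about 127 C
print(round(f_to_c(-280)))  # nighttime low: about -173 C
```

That is a swing of roughly 300 degrees Celsius that the electronics must ride out every lunar day.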

Nokia engineers know they have to test, then retest the gear – there will be no easy repairs on the moon. The vision is that future lunar landings will touch down on the surface and then send off both manned and unmanned rovers to explore the moon’s surface. The 4G gear must survive the rigors of an earth launch, a moon landing, and the vibrations and jolts from rovers and still be guaranteed to always work in the desolate lunar environment.

I have to admit that my first reaction to the article was, “Shouldn’t we be putting 5G on the moon?”. But then it struck me. There is no 5G anywhere in the world other than the marketing product that cellular carriers call 5G. Since there will be no easy upgrades in space, Nokia engineers are being honest in calling for 4G LTE. Honestly labeling this as 4G will remind future engineers and scientists about the technology being used. Wouldn’t it be refreshing if Nokia was as honest about the 5G in our terrestrial cellular networks?

Next Generation PON is Finally Here

For years, we’ve been checking the prices of next-generation passive optical network (PON) technology as we help clients consider building a new residential fiber network. As recently as last year there was still at least a 15% price penalty for buying 10 Gbps PON technology using the NG-PON2 or XGS-PON standards. But recently we got a quote for XGS-PON that is nearly identical in price to the GPON that’s been the industry standard for over a decade.

New technology is usually more expensive at first for two reasons. Manufacturers hope to reap a premium price from those willing to be early adopters. You’d think it would be just the opposite since the first buyers of new technology are the guinea pigs who have to help debug all of the inevitable problems that crop up in new technology. But the primary reason that new technology costs more is economy of scale – prices don’t drop until manufacturers start producing a new technology in large quantities.

The XGS-PON standard provides a lot more bandwidth than GPON. The industry standard GPON technology delivers 2.4 Gbps download and 1 Gbps upload speed to a group of customers – most often configured at 32 passings. XGS-PON technology delivers 10 Gbps downstream and 2.5 Gbps upstream to the same group of customers—a big step up in bandwidth.
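Those are shared capacities, so the raw per-home numbers depend on the split. A rough sketch, assuming the 32-home split mentioned above and a naive even division (a worst case – statistical multiplexing makes real PONs perform better than this):

```python
def per_home_mbps(shared_gbps, homes):
    """Worst-case even split of shared PON capacity across homes on one splitter."""
    return shared_gbps * 1000 / homes

HOMES = 32  # common GPON passing count per PON

gpon_down = per_home_mbps(2.4, HOMES)   # 75 Mbps if every home pulls at once
gpon_up = per_home_mbps(1.0, HOMES)     # 31.25 Mbps
xgs_down = per_home_mbps(10, HOMES)     # 312.5 Mbps
xgs_up = per_home_mbps(2.5, HOMES)      # 78.125 Mbps
```

Homes rarely peak simultaneously, but the worst case shows why XGS-PON gives an ISP so much more headroom per neighborhood.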

The price has dropped for XGS-PON primarily due to its use by AT&T in the US and Vodafone in Europe. These large companies and others have finally purchased enough gear to drive down the cost of manufacturing.

The other next-generation PON technology is not seeing the same price reductions. Verizon has been the only major company pursuing the NG-PON2 standard and is using it in networks to support large and small cell sites. But Verizon has not been building huge amounts of last-mile PON technology and seems to have chosen millimeter-wave wireless technology as the primary technology for reaching into residential neighborhoods. NG-PON2 works by having tunable lasers that can function at several different light frequencies. This would allow more than one PON to be transmitted simultaneously over the same fiber but at different wavelengths. This is a far more complex technology than XGS-PON, which basically has faster lasers than GPON.

One of the best features of XGS-PON is that some manufacturers are offering this as an overlay onto GPON. An overlay means swapping out some cards in a GPON network to provision some customers with 10 Gbps speeds. An overlay means that anybody using GPON technology ought to be able to ease into the faster technology without a forklift upgrade.

XGS-PON is not a new technology; it’s been around for about five years. But the price differential stopped most network owners from considering the technology. Most of my clients tell me that their residential GPON networks average around 40% utilization, so there have been no performance reasons to upgrade to faster technology. But averages are just that, and some PONs (neighborhood nodes) are starting to get a lot busier, meaning that ISPs are having to shuffle customers to maintain performance.

With the price difference finally closing, there is no reason for somebody building a new residential network to not buy the faster technology. Over the next five years as customers start using virtual reality and telepresence technology, there is likely to be a big jump up in bandwidth demand from neighborhoods. This is fueled by the fact that over 9% of homes nationwide are now subscribing to gigabit broadband service – and that’s enough homes for vendors to finally roll out applications that can use gigabit speeds. I guess the next big challenge will be in finding 10 gigabit applications!

Build It and They Will Fill It

Early in my career as a consultant, I advised clients to not adopt the philosophy of “build it and they will come”. Fifteen years ago, when fiber networks were first being built to residential communities, I had clients who were so enamored with fiber technology that they couldn’t imagine that almost every household wouldn’t buy broadband from a new fiber network.

I saw clients invest in fiber networks and take bank loans based upon irrationally high customer penetration rates, with no basis for their projections other than hope. Fiber overbuilders who counted on everybody taking fiber were inevitably disappointed, and over time I saw most fiber builders become more realistic about penetration rates and engage in surveys and pre-sales efforts to get a better idea of how well they would do.

Interestingly, I’m seeing this same concept creep back into the industry. This time it has to do with building middle-mile transport fiber. I have heard the phrase ‘build it and they will fill it’ a number of times over the last few years. There are examples of fiber transport routes being subscribed quickly, and the exuberance from a few such examples has some fiber builders believing that they can’t fail in building transport fiber.

Unfortunately, for every fiber route that is a huge success, I can point to a dozen fiber routes that languish with little traffic. As it turns out, middle-mile fiber is probably the one product in our industry that best illustrates the classic economics of supply and demand.

Buyers of middle-mile transport have explicit needs to get from point A to point B. If a given fiber route can be part of such a solution, then they will consider buying transport. But buyers of transport usually consider all of the alternatives to buying on a given fiber route – there are almost always alternatives. I know one case where three different carriers built fiber to reach a large rural data center. This instantly created price competition and none of the carriers are seeing the revenues they hoped for when building the fiber.

Some of the companies that buy transport will also consider building fiber rather than buying dark fiber or lit bandwidth. Verizon is probably the best example of this – they seem to have an internal formula that determines when building is better than leasing. Even worse for fiber owners, once Verizon builds fiber it is instantly competing with the existing fiber.

Companies that lease fiber also have to deal with other issues. The ideal long-haul fiber route has a minimal number of POPs, and some carriers avoid routes with too many stopping points. Intermediate stopping points and POPs increase electronics costs and maintenance costs and each electronics site degrades the light signal a bit.

I advise that anybody building transport fiber needs an iron-clad reason that justifies building a specific route – even if there are no other revenues. If the carrier can’t enter a new market without the new transport, then the route is mandatory. But a carrier ought to have already lined up enough basic revenues to justify building a non-mandatory transport route. If one major fiber tenant pays enough to recover the cost of building the route, then it might be a good risk.

The same advice to be careful applies whether a route connects major cities or goes to rural areas. I remember years ago helping a client find a connection between Dallas and Kansas City and we found seven separate fibers that made the connection. This level of overbuilding drops the lease price for the route.

We had an interesting national experiment over a decade ago when the ARRA stimulus grants funded a lot of middle-mile fiber to rural communities. A lot of the fiber built with those grants was pure middle-mile transport, with only a few stops along the routes to serve a handful of rural anchor institutions. Looking back a decade later provides a great example of today’s topic. Many of the ARRA routes have attracted almost no interest even after a decade. Some routes built with the grants are doing well and gained transport sales to cellular carriers and to ISPs wanting to serve the last mile. It’s a challenge when comparing the winners and losers among those routes to understand why some rural routes attracted transport customers while other similar routes have not.

Leasing transport in rural markets is a tough business. The big wireless carriers like Verizon and AT&T have grown increasingly leery of entering into long-term fiber leases. Carriers that want to reach small rural towns to provide last mile fiber can’t afford to pay a lot for transport. Many WISPs are notoriously overextended and can’t afford expensive leases. While school systems might lease fiber for a while, they are always looking for grants to build and own the routes directly. The bottom line is that if you build it, there is no guarantee they will fill it.

Cost Models and Grants

Possibly the least understood aspect of the recent FCC RDOF grants is that the FCC established the base grant amount for every Census block using a cost model. These cost models estimate the cost of building a new broadband network in every part of the country – and unfortunately, the FCC accepts the results of the cost models without question.

The FCC contracts with CostQuest Associates to create and maintain the cost estimation models. The cost models have been used in the past in establishing FCC subsidies, such as Universal Service Fund payments made to small telephone companies under the ACAM program. For a peek into how the cost models work, this link is from an FCC docket in 2013 when the small telcos challenged some aspects of the cost models. The docket explains some of the basics of how the cost model functions.

This blog is not meant to criticize CostQuest, because no generic nationwide cost model can capture the local nuances that impact the cost of building fiber in a given community. It’s an impossible task. Consider the kinds of unexpected things that engineers encounter all of the time when designing fiber networks:

  • We worked in one county where the rural utility poles were in relatively good shape, but the local electric company hadn’t trimmed trees in decades. We found the pole lines were now 15 feet inside heavy woods in much of the fiber construction area.
  • We worked in another county where 95% of the county was farmland with deep soil where it was inexpensive to bury fiber. However, a large percentage of homes were along a river in the center of the county that consisted of steep, rocky hills with old crumbling poles.
  • We worked in another county where many of the rural roads were packed dirt roads with wide water drainage ditches on both sides. However, the county wouldn’t allow any construction in the ditches and insisted that fiber be placed in the public right-of-way which was almost entirely in the woods.


Every fiber construction company can make a long list of similar situations where fiber construction costs came in higher than expected. But there are also cases where fiber construction costs are lower than expected. We’ve worked in farm counties where road shoulders are wide, the soil is soft, and there are long stretches between driveways. We see electric cooperatives that are putting ADSS fiber in the power space for some spectacular savings.

Generic cost models can’t keep up with the fluctuations in the marketplace. For example, I saw a few projects where the costs went higher than expected because Verizon fiber construction had lured away all local work crews for several years running.

Cost models can’t possibly account for cases where fiber construction costs are higher or lower than what might be expected in a nearby county with seemingly similar conditions. No cost model can keep up with the ebb and flow of the availability of construction crews or the impact on costs from backlogs in the supply chain.

Unfortunately, the FCC determines the amount to be awarded for some grants using these cost models, such as the recently completed RDOF grants. The starting bid for each Census block in the RDOF auction was determined using the results of the cost models – and the results make little sense to people that understand the cost of building fiber.

One might expect fiber construction costs to easily be three or four times higher per mile in parts of Appalachia compared to the open farmland plains in the Midwest. However, the opening bids for RDOF were not proportionately higher for Appalachia in the way you might expect. The net result is that the grants offered a higher percentage of expected construction cost in the open plains than in the mountains of Appalachia.
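A hypothetical illustration of the distortion (every number here is invented for illustration, not taken from the RDOF results): suppose the true cost difference between the plains and Appalachia is 4-to-1, the cost model only assumes 2-to-1, and grants are sized at 70% of modeled cost.

```python
def grant_coverage(modeled_cost, actual_cost, grant_pct=0.70):
    """Share of the true construction cost covered by a grant sized from the model."""
    return grant_pct * modeled_cost / actual_cost

# Hypothetical per-mile costs (invented for illustration)
plains = grant_coverage(modeled_cost=30_000, actual_cost=30_000)       # 70% of real cost covered
appalachia = grant_coverage(modeled_cost=60_000, actual_cost=120_000)  # only 35% covered
```

The same nominal grant formula ends up covering twice the share of real cost in the easy geography.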

There is an alternative to using the cost models – a method that is used by many state grants. Professional engineers estimate construction costs, and many state grants then fund some percentage of the project cost based upon factors like the technology to be constructed. This kind of grant would offer the same percentage of grant assistance in all different geographies of a state. Generic cost models end up advantaging or disadvantaging grant areas, without those accepting the grants even realizing it. The RDOF grants offered drastically different proportions of the cost of construction – which is unfair and impossible to defend. This is another reason not to use reverse auctions, where the government goofs up the fairness of the grants before they are even open for bidding.

The White House Broadband Plan

Reading the White House $100 billion broadband plan was a bit eerie because it felt like I could have written it. The plan espouses the same policies that I’ve been recommending in this blog. This plan is 180 degrees different from the Congressional plan that would fund broadband using a giant federal reverse auction and a series of state reverse auctions.

The plan starts by citing the 1936 Rural Electrification Act which brought electricity to nearly every home and farm in America. It clearly states that “broadband internet is the new electricity” and is “necessary for Americans to do their jobs, to participate equally in school learning, health care, and to stay connected”.

The plan proposes to fund building “future proof” broadband infrastructure to reach 100 percent broadband coverage. It’s not hard to interpret future proof to mean fiber networks that will last for the rest of the century versus technologies that might not last for more than a decade. It means technologies that can provide gigabit or faster speeds that will still support broadband needs many decades from now.

The plan wants to remove all barriers so that local governments, non-profits, and cooperatives can provide broadband – entities without the motive to jack up prices to earn a profit. The reference to electrification implies that much of the funding for modernizing the network might come in the form of low-interest federal loans given to community-based organizations. The same approach during electrification spurred the formation of electric cooperatives and would do something similar now. I favor this as the best use of federal money because funding the infrastructure with federal loans means that the federal coffers eventually get repaid.

The plan also proposes giving tribal nations a say in the broadband build on tribal lands. This is the third recent funding mechanism that talks about tribal broadband. Most Americans would be aghast at the incredibly poor telecom infrastructure that has been provided on tribal lands. We all decry the state of rural networks, but tribal areas have been provided with the worst of the worst in both wired and wireless networks.

The plan promotes price transparency so that ISPs must disclose the real prices they will charge. This means no more hidden fees and deceptive sales and billing practices. This likely means writing legislation that gives the FCC and FTC some real teeth for ending deceptive billing practices of the big ISPs.

The plan also proposes to tackle broadband prices. It notes that millions of households that have access to good broadband networks today can’t use broadband because “the United States has some of the highest broadband prices among OECD countries”. The White House plan proposes temporary subsidies to help low-income homes but wants to find a solution to keep prices affordable without subsidy. Part of that solution might be the creation of urban municipal, non-profit, and cooperative ISPs that aren’t driven by profits or Wall Street earnings. This goal also might imply some sort of federal price controls on urban broadband – an idea that is anathema to the giant ISPs. Practically every big ISP regulatory policy for the last decade has been aimed at keeping the government from thinking about regulating prices.

This is a plan that will sanely solve the rural broadband gap. It means giving communities time to form cooperatives or non-profits to build broadband networks rather than shoving the money out the door in a hurry in a big reverse auction. This essentially means allowing the public to build and operate its own rural broadband – the only solution I can think of that is sustainable over the long-term in rural markets. Big commercial ISPs invariably are going to overcharge while cutting services to improve margins.

Giving the money to local governments and cooperatives also implies providing the time to allow these entities to do this right. We can’t forget that the electrification of America didn’t happen overnight, and it took some communities more than a decade to finally build rural electric networks. The whole White House infrastructure plan stretches over 8 – 10 years – it’s an infrastructure plan, not an immediate stimulus plan.

It’s probably obvious that I love this plan. Unfortunately, this plan has a long way to go to be realized. There is already proposed Congressional legislation that takes nearly the opposite approach, and which would shove broadband funding out of the door within 18 months in a gigantic reverse auction. We already got a glimpse of how poorly reverse auctions can go in the recently completed RDOF auction. I hope Congress thinks about the White House plan that would put the power back into the hands of local governments and cooperatives to solve the broadband gaps. This plan is what the public needs because it creates broadband networks and ISPs that will still be serving the public well a century from now.

A Surprise

I think my biggest industry surprise of the last year happened recently when I opened the front door and found that a new yellow page directory had been placed on my porch. I haven’t received a yellow pages directory for the last seven years living in the US or the decade before that living in the Virgin Islands. I hadn’t given it much thought, but I assumed the yellow pages were dead.

The yellow pages used to be a big deal. Salespeople would canvass every business in a community and sell ads for the annually produced book. I remember when living in Maryland that the Yellow Pages was at least three inches thick just for the Maryland suburbs of DC and that there were similar volumes for different parts of the DC metropolitan area.

Wikipedia tells me that the yellow pages were started by accident in Cheyenne, Wyoming in 1883 when a printer ran out of white paper and used yellow in printing a directory. The idea caught on quickly and Reuben H. Donnelley printed the first official Yellow Pages directory in 1886.

Yellow Page directories became important to telephone companies as a significant source of revenue. The biggest phone companies produced their directories internally through a subsidiary. For smaller telcos, the yellow page ads were sold, and directories were printed by outside vendors like Donnelley that shared ad revenues with the phone company. The revenue stream became so lucrative in the 1970s and 1980s that many medium-sized telephone companies took the directory function in-house – only to find out how hard it was to sell ads to every business in a market. The market for yellow pages got so crazy that competing books were created for major metropolitan markets.

Yellow pages were a booming business until the rise of the Internet. The Internet was supposed to replace the yellow pages. The original yellow pages vendors moved the entire yellow page directories online, but this was never a big hit with the public. It was so much easier to leaf through a directory, circle numbers of interest, and take notes in a paper copy of the directory than it was to scroll through pages of listings online.

Merchants always swore that yellow page ads were effective. A merchant that was creative in getting listed in the right categories would get calls from all over a metropolitan area if they sold something unique.

Of course, there was also a downside to yellow pages. The yellow paper and the glue used to bind the thick books meant that the paper wasn’t recyclable, so a huge pile of books ended up in landfills every year when the new books were delivered. After the directories lost some of their importance, many cities required that directories be delivered only to homes that asked for them, to reduce the huge pile of paper in the landfills.

Yellow pages are just another aspect of telephony that has largely faded away. There was a time that you saw yellow pages sitting somewhere near the main telephone in every home you visited. It’s something that we all had in common – and it’s something that the consumer found to be invaluable. A new business knew they had made it when they saw their business first listed in the yellow pages.

The Accessible, Affordable Internet for All Act – Part 2

This is the second look at the Accessible, Affordable Internet for All Act, sponsored by Rep. James E. Clyburn of South Carolina and Sen. Amy Klobuchar of Minnesota. The first blog looked at the problems I perceive in awarding most of the funding in a giant reverse auction.

In a nutshell, the bill provides $94 billion for broadband expansion. A huge chunk of the money would be spent in 2022, with 20% of the biggest funds deferred for four years. There are other aspects of the legislation worth highlighting.

One of the interesting things about the bill is the requirements that are missing. I was surprised to see no ‘buy American’ requirement. While this is a broadband bill, it’s also an infrastructure bill and we should make sure that infrastructure funding is spent as much as possible on American components and American work crews.

While the bill has feel-good language about hoping that ISPs offer good prices, there is no prohibition that I can find against practices like data caps imposed in grant-funded areas that can significantly increase monthly costs for a growing percentage of households.

The most dismaying aspect of the bill that is missing is the idea of imposing accountability on anybody accepting the various federal grant funds. Many state grant programs come with significant accountability. ISPs must often submit proof of construction costs to get paid. State grant agencies routinely visit grant projects to verify that ISPs are building the technology they promised. There is no such accountability in the grants awarded by this bill, just as there was no accountability in the recent RDOF grants or the recently completed CAF II grants. In the original CAF II, the carriers self-certify that the upgrades have been made and provide no back-up that the work was done other than the certification. There is a widespread belief that much of the CAF II upgrades were never done, but we’ll likely never know since the telcos that accepted the grants don’t have any reporting requirements to show that the grant money was spent as intended.

There is also no requirement to report the market success of broadband grants. Any ISPs building last-mile infrastructure should have to report the number of households and businesses that use the network for at least five years after construction is complete. Do we really want to spend over $90 billion for grants without asking the basic question of whether the grants actually helped residents and businesses?

This legislation continues a trend I find bothersome. It will require all networks built with grant funding to offer a low-income broadband product – which is great. But it then sets the speed of the low-income service at 50/50 Mbps while ISPs will be required to provide 100/100 Mbps or faster to everybody else. While it’s hard to fault a 50/50 Mbps product today, that’s not always going to be the case as homes continue to need more broadband. I hate the concept that low-income homes get slower broadband than everybody else just because they are poor. We can provide a lower price without cutting speeds. ISPs will all tell legislators that there is no difference in cost in a fiber network between a 50/50 Mbps and a 100/100 Mbps service. This requirement is nothing more than a backhanded way to remind folks that they are poor – there is no other reason for it that I can imagine.

One of the interesting requirements of this legislation is that the FCC gathers consumer prices for broadband. I’m really curious how this will work. I studied a market last year where I gathered hundreds of customer bills and I found almost no two homes being charged the same rate for the same broadband product. Because of special promotional rates, negotiated rates, bundled discounts, and hidden fees, I wonder how ISPs will honestly answer this question and how the FCC will interpret the results.

The bill allocates a lot of money for ongoing studies and reports. For example, there is a new biennial report that quantifies the number of households where cost is a barrier to buying broadband. I’m curious how that will be done in any meaningful way that will differ from the mountains of demographic data that show that broadband adoption has almost a straight-line relationship to household income. I’m not a big fan of creating permanent report requirements for the government that will never go away.

Taking the Short View

We need to talk about the insidious carryover impact of having a national definition of broadband speed of 25/3 Mbps. You might think that the FCC’s definition of broadband doesn’t matter – but it’s going to have a huge impact in 2021 on how we spend the sudden flood of broadband funding that’s coming to bear from the federal government.

First, a quick reminder about the history of the 25/3 definition of broadband. The FCC under Tom Wheeler increased the definition of broadband in 2015 from the paltry former definition of 4/1 Mbps – a sorely overdue upgrade. At the time that the new definition was set it seemed like a fair definition. The vast majority of US homes could comfortably function with a 25/3 Mbps broadband connection.

But we live in a world where household usage has been madly compounding at a rate of over 20% per year. More importantly, since 2015 we’ve changed the way we use broadband. Homes routinely use simultaneous broadband streams and a large and growing percentage of homes now find 25 Mbps download to be a major constraint on how they want to use broadband. The cable companies understood this, and to keep customers happy have upgraded their minimum download speeds from 100 Mbps to 200 Mbps.
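The arithmetic of that growth rate is worth pausing on. At 20% annual growth (the rate cited above), household usage doubles roughly every four years, so a home today uses about triple what it did when the 25/3 definition was set in 2015:

```python
import math

GROWTH = 0.20  # annual household usage growth cited above

# Years for usage to double at this compounding rate
doubling_years = math.log(2) / math.log(1 + GROWTH)   # about 3.8 years

# Usage in 2021 relative to 2015 (six years of compounding)
usage_multiple = (1 + GROWTH) ** 6                    # about 3x
```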

Then came the pandemic and made the whole country focus on upload speeds. Suddenly, every student and every adult who tried to work at home learned that the upload stream for most broadband connections will just barely support one person working at home and is completely inadequate for homes where multiple people are trying to function at the same time.

Meanwhile, the FCC under Chairman Ajit Pai ignored the reality of the big changes in the way that Americans use broadband. The FCC had multiple opportunities to increase the definition of broadband – including after the evident impact of the pandemic – but Chairman Pai stubbornly stuck with the outdated 25/3 definition. He did not want a legacy of suddenly declaring that many millions of homes didn’t have adequate broadband.

We now have an FCC that is likely to increase the definition of broadband, but the FCC is still waiting for a fifth Commissioner to hold a vote on the issue. Meanwhile, we are poised to start handing out billions of dollars of broadband subsidies that come from the $1.9 trillion American Rescue Plan Act. This includes $10 billion that is directly approved as state block grants for broadband plus some portion of the larger $350 billion that is approved for use for more general infrastructure that can include broadband. I can promise you that this money is going to come encumbered in some form or fashion by the old definition of broadband. I can’t predict exactly how this will come into play, but there is no way that the Treasury Department, which is administering these funds, can ignore the official definition of broadband.

As much as federal officials might want to do the right thing, 25/3 Mbps is the current law of the land. The new federal monies are likely to emphasize serving areas that don’t have speeds that meet that 25/3 Mbps definition. Let me rephrase that to be more precise – federal broadband money will be prioritized to give funding to areas where ISPs have told the FCC that households can’t buy a 25/3 broadband product. Unfortunately, there are huge parts of the US where homes don’t get speeds anywhere close to 25/3 Mbps, but where ISPs are safe in reporting marketing speeds to the FCC rather than actual speeds. States like Georgia and North Carolina have estimated that the number of households that can’t buy a 25/3 Mbps broadband product is twice what is reported to the FCC.

What this all means is that we are going to decide where to spend billions in funding from the American Rescue Plan Act based upon the 25/3 Mbps definition of broadband – a definition that will not long survive a fully staffed FCC. The intransigence of Chairman Pai and the big ISPs that strongly supported him will carry over and have a huge impact even after he is gone. The broadband that will be built with the current funding will last for many decades – but unfortunately, some of this funding will be misdirected due to the government taking the short view that we must keep pretending that 25/3 Mbps is a meaningful measurement of broadband.

T-Mobile to Expand Rural Broadband Coverage

T-Mobile will be launching a marketing effort this month for a fixed LTE broadband product that it’s marketing as 5G. This launch was a requirement of the merger with Sprint. In November 2019 T-Mobile agreed that within three years it would provide fixed cellular broadband to cover 97% of the US population, with the goal increased to 99% within six years. T-Mobile’s announcement of the new product says the company plans to extend the product to 97% of US households by the end of 2022, mirroring the agreement it made with the FCC for the merger.

The latest announcement doesn’t mention broadband speeds. In the 2019 deal with the FCC, T-Mobile promised that it would provide 100 Mbps cellular broadband to 90% of households in the same three-year period, with the rest of T-Mobile’s commitment at 50 Mbps.

That speed commitment is going to be hard for T-Mobile to achieve. The 100 Mbps cellular speed is probably reachable in large cities. PC Magazine conducted its annual test of cellular speeds in 26 cities in the summer of 2020 and found the following average urban cellular speeds:

  • AT&T averaged 103/19 Mbps
  • Verizon averaged 105/22 Mbps
  • T-Mobile averaged 74/26 Mbps

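Measuring those averages (the PC Magazine figures quoted above) against the 100 Mbps download commitment from the merger deal is simple arithmetic, sketched below; the dictionary names are my own shorthand:

```python
# Which measured average download speeds (Mbps) meet the 100 Mbps
# commitment T-Mobile made to the FCC? Figures from the PC Magazine
# 2020 urban tests cited above.

COMMITMENT = 100  # Mbps download promised to 90% of households

avg_download = {"AT&T": 103, "Verizon": 105, "T-Mobile": 74}

shortfall = {carrier: COMMITMENT - speed
             for carrier, speed in avg_download.items()
             if speed < COMMITMENT}
print(shortfall)  # only T-Mobile misses the 100 Mbps mark, by 26 Mbps
```

And these are the best-case urban numbers – the gap only widens outside the cities.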
The AT&T and Verizon average speeds were boosted by millimeter-wave service available in some downtowns. After backing out that outdoor-only gimmick product, which most people can’t reach, the average LTE speeds for those carriers likely look a lot more like T-Mobile’s. The PC Magazine speed tests also verified what has been widely reported – that 5G speeds are currently slower on average than 4G LTE speeds.

The speeds above were measured only in large metropolitan areas, where the majority of cellular customers are within a mile of a cell site. It’s going to be a different speed story in rural America, where the average customer is likely to live several miles from a cellular tower.

There are still huge tracts of rural America that have little or no cellular coverage – and I guess the FCC counts them in the 3% that T-Mobile doesn’t have to cover. My experience driving through rural America is that probably more than 3% of homes have either no cellular coverage or extremely weak coverage and are not going to benefit from a cellular data product from any of the carriers.

My consulting firm has been conducting rural speed tests in communities around the country, and I rarely see a fixed-cellular customer on any of the three carriers with speeds over 20 Mbps – and speeds are often much slower. T-Mobile can make promises to the FCC about delivering 100 Mbps, but none of the cellular carriers are going to build a robust cellular network in rural America that can achieve the same speeds as in cities. They would be crazy to even think about it.

This doesn’t mean that rural homes shouldn’t start inquiring about the T-Mobile fixed cellular product – because it might be the fastest broadband in your neighborhood. T-Mobile has announced a $60 monthly price for customers willing to use autopay. The best news about the product is that data usage will be unlimited – a big relief to customers who have been paying a fortune for capped cellular broadband on a Verizon or AT&T hotspot.

T-Mobile says it currently has 100,000 customers nationwide on the fixed cellular product. It’s hoping to get 500,000 customers by the end of 2021, with a longer-term goal of 7 or 8 million customers within five years.

Theoretically, the broadband options for rural America should be getting better. We’re seeing beta trials with low-orbit satellite and announcements like this one from T-Mobile that promise something better than the telco DSL that barely works in rural places. I’d love to hear from folks about what is available where you live – because I’m not going to believe that T-Mobile will cover 97% of households without some proof. I hope my skepticism is misplaced.

I’m Looking to Hire a New Consultant

I’m looking to hire an Associate Consultant. This is a starting consulting position that will work directly with me. I spend interesting days on a wide variety of projects. My primary work is helping communities and ISPs look at opening new broadband markets. I also help ISPs find funding. I work with states and foundations in developing broadband policies. And I work on a number of interesting projects each year that are hard to categorize – I help clients solve problems. I also spend a lot of time responding to RFPs or writing proposals to provide consulting services.

This is a position with big growth potential in both knowledge and earnings. I’m looking for somebody who is willing to learn the intricacies of the broadband industry – you’ll be working with an industry insider and pro who knows a lot about almost all aspects of the industry.

The two traits I value the most are the ability to write clearly and the ability to tackle complex spreadsheets. I realize that’s an uncommon pair of talents for one person, but I run an uncommon business. I promise the right individual an interesting workday.

You’ll find a more detailed job description here.