Broadening the USF Funding Base

The funding mechanism that pays for the Universal Service Fund is broken. The USF is funded from fees added to landline telephones, cell phones, and the large business data connections that are still billed using telco special access products (T1s and larger circuits). The USF fee has now climbed to an exorbitant monthly tax of 20% on the portion of those services that the FCC deems to be interstate. This equates to a monthly fee of a dollar or more for every landline phone and cellphone (the amount charged varies by carrier).
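
To make the math concrete, here is a minimal sketch of how the fee lands on a bill. The bill amount and the interstate share below are invented for illustration; only the roughly 20% contribution factor comes from the discussion above.

```python
# Hypothetical illustration of the USF fee calculation described above.
# The bill amount and interstate share are invented placeholders; only the
# ~20% contribution factor is taken from the text.

monthly_bill = 50.00        # hypothetical cell phone bill
interstate_share = 0.10     # hypothetical fraction of the bill deemed interstate
contribution_factor = 0.20  # USF contribution factor (~20%)

usf_fee = monthly_bill * interstate_share * contribution_factor
print(f"USF fee on a ${monthly_bill:.2f} bill: ${usf_fee:.2f} per month")
# -> roughly a dollar per month, in line with the amounts described above
```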

The funding mechanism made sense when it was originally created. The fee at that time was assessed on landlines and was used to build and strengthen landline service in rural America. When the USF fee was introduced, the nationwide penetration rate of landlines in urban America was over 98%, and the reasoning was that those with phone service ought to be charged a small fee to help bring phone service to rural America. The concept behind universal service is that everybody in the country is better off when we're all connected to the communications network.

However, over time the use of the Universal Service Fund has changed drastically, and this money is now the primary mechanism the FCC uses to pay for the expansion of rural broadband. This pot of money was used to fund the original CAF II programs for the big telcos and the A-CAM program for the smaller ones. It's also the source of the Mobility Fund, which is used to expand rural cellular coverage.

Remember the BDAC? That's the Broadband Deployment Advisory Committee that was created by Chairman Ajit Pai when he first took the reins at the FCC. The BDAC was split into numerous subcommittees that looked at specific topics. Each subcommittee issued a report of recommendations on its topic, and since then little has been heard from them publicly. But the BDAC subcommittees are still meeting and churning out recommendations.

The BDAC subcommittee tasked with creating a State Model Code has suggested broadening the funding base for the USF. This is the one subcommittee that is not making recommendations to the FCC but rather suggesting ideas that states ought to consider. The subcommittee has suggested that states establish a fee, similar to the federal USF fee, and use the proceeds to expand broadband in each state. Many states have already done something similar and have created state Universal Service Funds.

The recommendation further suggests that states tax anybody who benefits from broadband. This would include not just ISPs and customers of ISPs, but also the big users of the web like Netflix, Google, Amazon, and Facebook. The reasoning is that those who benefit from broadband ought to help pay to expand broadband to everybody. The BDAC's recommended language has been modified a few times because the original language was so broad that almost everybody in the country would have been subject to the tax, and we've learned over the years that taxation language needs to be precise.

This is not the first time that this idea has been floated. Many have suggested to the FCC in the past that USF funding should be expanded to include broadband customers. Just as telephone customers were charged to fund the expansion of the telephone network, it makes sense to tax broadband customers to expand broadband. But this idea has always been shot down because, early in the life of the Internet, politicians in DC latched onto the idea of not taxing the Internet. That made sense at the time, when we needed to protect the fledgling ISP industry – but the concept is now quaintly obsolete since Internet-related companies are probably collectively the world's biggest industry and hardly need shielding from taxation.

AT&T is a member of this BDAC subcommittee and strongly supports the idea. However, AT&T's motivations are suspect since the company might be the biggest recipient of state USF funds. We saw AT&T lobbyists hijack the state broadband grant program in California and grab all of the money that would have been used to build real rural broadband in the state. The big carriers have an outsized influence in statehouses due to decades of lobbying, so there is a concern that they support this idea for their own gain rather than to spread broadband. We just saw AT&T lobbyists at the federal level sneak in language that keeps the e-Connectivity grants from being used to compete with them.

But no matter how tainted the motivations of those on the BDAC subcommittee, this is an idea with merit. It's hard to find politicians anywhere who don't think we should close the broadband gap, and it's clear that it's going to take some government support to make that work. Currently there are a number of state broadband grant programs, but these programs generally rely on annual allocations from the legislature – something that is routinely used as a bargaining chip against other legislative priorities. None of these grant programs have allocated enough money to make a real dent in the broadband shortfalls in their states. If states are going to help solve the broadband gap they need to come up with a lot more money.

Setting up state USF funds with a broad funding base is one way to help close the rural broadband divide. This needs to be done in such a way that the money is used to build the fiber infrastructure needed to guarantee broadband for the rest of the century – such funds will be worthless if the money is instead siphoned into the pockets of the big telcos. It makes sense to assess the fees on a wider base, and I can't see any reasonable objection to charging not only broadband customers but also big broadband-reliant companies like Netflix, Google, Amazon, and Facebook. The first state to try this will get a fight from those companies, but hopefully the idea of equity will win, since it's traffic from these companies that is driving the need for better broadband infrastructure.

The Huge CenturyLink Outage

At the end of December CenturyLink had a widespread network outage that lasted over two days. The outage disrupted voice and broadband service across the company’s wide service territory.

Probably the most alarming aspect of the outage is that it knocked out the 911 systems in parts of fourteen states. It was reported that calls to 911 might get a busy signal or a recording saying that "all circuits are busy". In other cases, 911 calls were routed to the wrong 911 center. Some jurisdictions responded to the 911 problems by sending out emergency text messages to citizens providing alternate telephone numbers to dial during an emergency. The 911 service outages prompted FCC Chairman Ajit Pai to call CenturyLink and to open a formal investigation into the outage.

I talked last week to a resident of a small town in Montana who said that the outage was locally devastating. Credit cards wouldn't work for most of the businesses in town, including at gas stations. Businesses like hotels that rely on cloud software for daily operations were unable to function. Bank ATMs weren't working. Customers with CenturyLink landlines had spotty service and mostly could not make or receive phone calls. Worse yet, cellular service in the area largely died, meaning that CenturyLink must have been supplying the broadband circuits supporting the cellular towers.

CenturyLink reported that the outage was caused by a faulty network management card in a Colorado data center that was "propagating invalid frame packets across devices". It took the company a long time to isolate the problem, and the final fix involved rebooting much of the network electronics.

Every engineer I've spoken to about this says that in today's world it's hard to believe that it would take two days to isolate and fix a network problem caused by a faulty card. Most network companies operate a system of alarms that instantly notifies them when any device or card is having problems. Further, complex networks today are generally built with significant redundancy that allows troubled components to be isolated in order to stop the kind of cascading outage that occurred in this case. The engineers all said that it's almost inconceivable for a single component like a card in a modern network to cause such a huge problem. While network centralization can save money, few companies route their whole network through choke points – there are a dozen different strategies to create redundancy and protect against this kind of outage.
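
As a purely illustrative sketch (not any carrier's actual monitoring system), the kind of alarm loop the engineers describe boils down to something like this: poll every device on a short interval and flag a failure immediately rather than letting it propagate. The device names and the health check below are hypothetical placeholders.

```python
import time

# Toy sketch of a network monitoring loop -- not any vendor's NMS. A real
# system would poll SNMP counters or streaming telemetry from each card and
# trigger rerouting around anything that fails its checks.

DEVICES = ["core-card-1", "core-card-2", "edge-router-1"]

def is_healthy(device: str) -> bool:
    """Placeholder health check; returns False for one simulated faulty card."""
    return device != "core-card-2"

def poll_network() -> list[str]:
    """Return the list of devices currently failing their health check."""
    return [d for d in DEVICES if not is_healthy(d)]

if __name__ == "__main__":
    for _ in range(3):                 # a real NOC loop would run continuously
        for device in poll_network():
            # In practice this would page an engineer and isolate the card,
            # not just print a message.
            print(f"ALARM: {device} failed its health check")
        time.sleep(1)                  # poll interval (seconds)
```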

Obviously none of us knows any of the facts beyond the short notifications issued by CenturyLink at the end of the outage, so we can only speculate about what happened. Hopefully the FCC inquiry will uncover the facts – and it's important that it does, because it's always possible that the cause of the outage is something that others in the industry need to be concerned about.

I’m only speculating, but my guess is that we are going to find that the company has not implemented best network practices in the legacy telco network. We know that CenturyLink and the other big telcos have been ignoring the legacy networks for decades. We see this all of the time when looking at the conditions of the last mile network, and we’ve always figured that the telcos were also not making the needed investments at the network core.

If this outage was caused by outdated technology and legacy network practices, then such outages are likely to recur. Interestingly, CenturyLink also operates one of the more robust enterprise cloud services in the country. That business got a huge shot in the arm through the merger with Level 3, with new management saying that all of their future focus is going to be on the enterprise side of the house. I have to think this outage didn't touch that network much and was more likely confined to the legacy network.

One thing is for sure: this outage is making CenturyLink customers look for an alternative. A decade ago the local government in Cook County, Minnesota – the northernmost county in the state – was so frustrated by continued prolonged CenturyLink network outages that it finally built its own fiber-to-the-home network and found alternate routing into and out of the county. I talked to one service provider in Montana who said they've been inundated after this recent outage by businesses looking for an alternative to CenturyLink.

We have become so reliant on the Internet that major outages are unacceptable. Much of what we do every day relies on the cloud. The fact that this outage extended to cellular outages, a crash of 911 systems, and the failure of credit card processing demonstrates how pervasive the network is in the background of our daily lives. It's frightening to think that there are poorly maintained legacy telco networks that can still cause these kinds of widespread problems.

I'm not sure what the fix is for this problem. The FCC has supposedly washed its hands of responsibility for broadband networks – so it might not be willing to tackle any meaningful solutions to prevent future network crashes. Ultimately the fix might be the one found by Cook County, Minnesota – communities finding their own network solutions that bypass the legacy networks.

How Bad is the Digital Divide?

The FCC says that approximately 25 million Americans living in rural areas don't have access to an ISP product that would be considered broadband – currently defined as 25/3 Mbps. That number comes out of the FCC's mapping efforts using data supplied by ISPs.

Microsoft tells a different story. They say that as many as 163 million Americans do not use the Internet at speeds that the FCC considers broadband. Microsoft might be in the best position of anybody in the industry to understand actual broadband performance because the company can see data speeds for every customer that updates Windows or Microsoft Office – that's a huge percentage of all computer users in the country and covers every inch of it.

Downloading a big software update is probably one of the best ways possible to measure actual broadband performance. Software updates tend to be large files, and the Microsoft servers will transmit the files at the fastest speed a customer can accept. Since the software updates are large files, Microsoft gets to see the real ISP performance – not just the performance for the first minute of a download. Many ISPs use a burst technology that downloads relatively fast for the first minute or so, but then slows for the rest of a download – a customer’s true broadband speed is the one that kicks in after the burst is finished. The burst technology has a side benefit to ISPs in that it inflates performance on standard speed tests – but Microsoft gets to see the real story.
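
A rough sketch of that distinction, assuming nothing about how Microsoft actually instruments its downloads: time a large file transfer but ignore the bytes delivered during an initial burst window, so the reported figure reflects sustained speed rather than the first-minute burst. The URL is hypothetical.

```python
import time
import urllib.request

def measure_sustained_throughput(url, burst_window=60, chunk_size=64 * 1024):
    """Download a file and return sustained throughput in Mbps.

    Bytes received during the first `burst_window` seconds are ignored, so an
    ISP's initial speed burst does not inflate the result. Returns None if the
    download finishes before the burst window ends.
    """
    start = time.monotonic()
    post_burst_bytes = 0
    post_burst_start = None

    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            if time.monotonic() - start >= burst_window:
                if post_burst_start is None:
                    post_burst_start = time.monotonic()
                post_burst_bytes += len(chunk)

    if post_burst_start is None:
        return None
    duration = time.monotonic() - post_burst_start
    return (post_burst_bytes * 8) / (duration * 1_000_000)  # bits -> Mbps

# Hypothetical usage -- any large, publicly hosted file would do:
# mbps = measure_sustained_throughput("https://example.com/large-update.bin")
# print(f"Sustained speed: {mbps:.1f} Mbps")
```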

I’ve ranted about the FCC’s broadband statistics many times. There are numerous reasons why the FCC data is bad in rural America. Foremost, the data is self-reported by the big ISPs who have no incentive to tell the FCC or the public how poorly they are doing. It’s also virtually impossible to accurately report DSL speeds that vary from customer to customer according to the condition of specific copper wires and according to distance from the DSL core router. We also know that much of the reporting to the FCC represents marketing speeds or ‘up-to’ speeds that don’t reflect what customers really receive. Even the manner of reporting to the FCC, by Census block, distorts the results because when a few customers in a block get fast speeds the FCC assumes that everyone does.
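<br>
To illustrate the census-block distortion with invented numbers: if a single home in a block is reported with a fast product, block-level aggregation counts every home in that block as served, no matter what the neighbors actually get.

```python
# A minimal sketch of how block-level reporting can mask underserved homes.
# The block ID and speeds are invented; the real Form 477 data is far more
# complex, but the aggregation problem is the same: one fast home makes the
# whole census block look served.

homes = [
    {"block": "370630001001", "speed_mbps": 250},  # one fiber customer
    {"block": "370630001001", "speed_mbps": 3},    # DSL neighbors in the same block
    {"block": "370630001001", "speed_mbps": 1},
]

BROADBAND_THRESHOLD = 25  # FCC download definition, Mbps

block_max = {}
for home in homes:
    block = home["block"]
    block_max[block] = max(block_max.get(block, 0), home["speed_mbps"])

served_blocks = {b for b, speed in block_max.items() if speed >= BROADBAND_THRESHOLD}
served_homes = sum(1 for h in homes if h["speed_mbps"] >= BROADBAND_THRESHOLD)

print(f"Blocks counted as served: {len(served_blocks)} of {len(block_max)}")
print(f"Homes actually at 25 Mbps or better: {served_homes} of {len(homes)}")
```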

To be fair, the Microsoft statistics measure the speeds customers are actually achieving, while the FCC is trying to measure broadband availability. The Microsoft data includes households that elect to buy slower broadband products to save money. However, there are not 138 million Americans who purposefully buy slow broadband (the difference between the 163 million and 25 million figures). The Microsoft numbers tell us that actual speeds in the country are far worse than described by the FCC – for roughly half of us, slower than 25/3 Mbps. That is a sobering statistic, and it doesn't just reflect that rural America is getting poor broadband, but also that many urban and suburban households aren't achieving 25/3 Mbps either.

I've seen many real-life examples of what Microsoft is telling us. At CCG Consulting we do community broadband surveys, and we sometimes see whole communities where the speeds customers achieve are lower than the speeds advertised by the ISPs. We often see far more households claim to have no broadband or poor broadband than would be expected from the FCC mapping data. We constantly see residents in urban areas complain that broadband with a relatively fast advertised speed seems slow and sluggish.

Microsoft reported their findings to the FCC, but I expect the FCC to ignore them, because they are a drastic departure from the narrative that the FCC is telling Congress and the public. I wrote a blog just a few weeks ago describing how the FCC is claiming that big ISPs are delivering the speeds they market. Deep inside the recent reports the FCC admitted that DSL often wasn't up to snuff – but the Microsoft statistics mean that a lot of cable companies and other ISPs are also under-delivering.

In my mind the Microsoft numbers invalidate almost everything we think we know about broadband in the country. We are setting national broadband policy and goals based upon false numbers – and not numbers that are a little off, but ones that are largely a fabrication. We have an FCC that is walking away from broadband regulation because it has painted a false narrative that most households in the country have good broadband. It would be a lot harder for politicians to allow broadband deregulation if the FCC admitted that over half of the homes in the country aren't achieving the FCC definition of broadband.

The FCC has been tasked by Congress to find ways to improve broadband in areas that are unserved or underserved – with those categories being defined by the FCC maps. The Microsoft statistics tell us that there are huge numbers of underserved households, far higher than the FCC is recognizing. If the FCC was to acknowledge the Microsoft numbers, they’d have to declare a state of emergency for broadband. Sadly, the FCC has instead doomed millions of homes from getting better broadband by declaring these homes as already served with adequate broadband – something the Microsoft numbers say is not true.

The current FCC seems hellbent on washing its hands of broadband regulation, and the statistics it uses to describe the industry provide the cover needed to do so. To be fair, this FCC didn't invent the false narrative – it's been in place since the creation of the national broadband maps in 2009. I, and many others, predicted back then that allowing the ISPs to self-report performance would put us right where we are today – with statistics that aren't telling the true story. Microsoft has now pulled back the curtain – but is there anybody in a position of authority willing to listen to the facts?

FCC Urban Rate Survey

The FCC collects retail prices annually from urban carriers for landline telephone and broadband services. These prices are used to determine benchmark rates for rural areas for incumbent local exchange rate-of-return carriers, incumbent price-cap carriers receiving CAF Phase II support, recipients of the Rural Broadband Experimental grants, and winners of the recent Connect America Fund Phase II Auction.

I find it ironic that the FCC says they no longer regulate broadband, yet they still define maximum broadband rates allowed for various classes of carriers. The fact is that there are still numerous ways that the FCC is regulating broadband and since many of these mandates come from Congress the FCC will never be able to back out of broadband regulations entirely.

The FCC publishes spreadsheets summarizing the rates they collected. The benchmark rates for voice define the highest and lowest rates allowable for the affected carriers. Starting in 2019 the lowest rate that can be charged for residential voice is $26.98 and the highest is $51.61.

The following table shows the residential broadband rates listed by AT&T in North Carolina, where I live. The rates listed are non-discounted rates, and many customers pay less due to bundling or to negotiating a lower rate. It is striking to me that AT&T charges $70 per month for a 10/1 Mbps connection on DSL and also for a 100/100 Mbps connection on fiber. This is one of the issues that has rural customers up in arms – they pay high prices for less performance, particularly considering that they often receive only a fraction of the published speeds shown in the table. It's also worth noting that AT&T has a monthly data cap on every product other than their symmetrical gigabit product.

Download (Mbps)   Upload (Mbps)   Data Cap (GB)   Price   Technology
3                 0.384           150             $56     DSL
5                 1               1000            $60     DSL
5                 5               1000            $60     FTTP
6                 0.512           150             $61     DSL
10                1               1000            $70     DSL
18                1.5             1000            $70     DSL
25                25              1000            $70     DSL
25                5               1000            $70     FTTP
25                25              1000            $60     FTTP
50                50              1000            $70     DSL
50                10              1000            $70     DSL
50                50              1000            $70     FTTP
75                20              1000            $70     DSL
100               20              1000            $70     DSL
100               100             1000            $70     FTTP
300               300             1000            $90     FTTP
1000              1000            Unlimited       $100    FTTP

The benchmarks for broadband are extremely high and it’s doubtful that many carriers are even trying to charge the rates shown in the table below. There are separate rate caps calculated for Alaska and the rest of the US.

Download (Mbps)   Upload (Mbps)   Data Cap (GB)   2019 U.S. ($)   2019 AK ($)
4                 1               200             66.12           113.19
4                 1               Unlimited       70.76           119.06
10                1               200             72.31           121.54
10                1               Unlimited       77.30           127.75
25                3               200             77.65           129.52
25                3               Unlimited       82.66           135.75
25                5               200             78.49           129.78
25                5               Unlimited       83.50           136.01
50                5               Unlimited       100.85          153.64
100               10              Unlimited       106.23          161.16
250               25              Unlimited       128.69          203.67
500               50              Unlimited       148.35          223.87
1000              100             Unlimited       162.33          232.38

This is one of those exercises the FCC must go through that seems largely meaningless. The FCC sets a very high rate cap for carriers that participate in various FCC subsidy programs – but realistically it's unlikely that many carriers would want to charge more than $100.85 for a 50/5 Mbps connection, and if they did, customers would have a legal recourse. What's more valuable from this exercise is seeing the list prices of the larger urban ISPs – something that's getting harder to find online.

Trusting Big ISP Data

The FCC has finally come to grips with the fact that big ISPs are supplying bad data to the various FCC mapping efforts that are then used to distribute FCC funding and to set national policies. The latest mapping snafu comes from a one-time data collection from the cellular carriers last year showing rural cellular coverage. These maps were to be used to establish a new federal fund called Mobility Fund II, which will distribute $4.53 billion for the expansion of 4G cellular coverage to rural parts of the country that have little or no cellular coverage.

The big cellular companies have been lying about their cellular coverage for years. If you look at the nationwide 4G LTE coverage maps from AT&T and Verizon you'd think that they have cellular coverage virtually everywhere except in areas like deserts and mountains. But anybody living or traveling in rural America knows better. You don't have to drive very far off the main highways to hit areas that never see a bar of cellular coverage. And even where there is coverage, it's often still 3G or even older technology.

When the FCC collected data for the Mobility Fund II awards, the big carriers stuck to this same flawed mapping data. It turns out that overclaiming rural cellular coverage will keep funding from going to the smaller cellular companies that still serve many parts of rural America. Luckily the FCC effort included a challenge process, and the FCC was flooded with challenges showing that cellular coverage is far worse than the big carriers' maps claim. There were so many challenges that the FCC put the Mobility Fund II award process on hold until it can sort them out.

This is just one of the mapping efforts from the FCC that have been used to award billions of dollars of funding over the last decade. The FCC relied on mapping data from the big telcos to establish the areas that were eligible for the billions of dollars of CAF II funding.

Since rural areas served by the biggest telcos have been neglected for years, and since the big telcos deployed very little rural DSL outside of towns, it's not hard to identify huge swaths of rural areas that have little or no broadband. But the big telcos' broadband coverage data contains a ton of inaccuracies. For example, there are numerous smaller rural towns listed in the telco databases as having decent broadband, when the reality on the ground is broadband speeds of a few Mbps at best. It looks like the big telcos often reported marketing speeds rather than actual speeds. This inaccuracy has stopped others from seeking federal grants and loans to upgrade such towns.

I fear that rural broadband mapping is on the verge of the next crisis. As a blogger I am contacted a lot by folks in rural America describing their broadband situation. I’ve heard enough stories to convince me that the big telcos have made only a half-hearted effort at implementing CAF II. I think many homes that should have seen CAF II broadband upgrades will see zero upgrades while many others will get upgraded to speeds that don’t meet even the measly CAF II goal of 10/1 Mbps.

The big telcos are not likely to come clean about having pocketed CAF II funding rather than spending every penny on upgrades, so they are going to claim that the CAF II areas have been upgraded regardless of the actual situation on the ground. Rural households that didn't see the promised upgrades will then be counted by the FCC as having better broadband. That will make these areas off limits to future federal funding to fix what the telcos botched. We already see the newest federal grant programs carrying a requirement that no more than 10% of the homes covered by federal funding can have broadband today; as the sketch below illustrates, falsified mapping means many homes without broadband will be deemed covered, and it will be a massive challenge for somebody else to get funding to help such areas. These communities will be harmed twice – once by the telcos that aren't upgrading speeds and again by the inaccurate mapping that will stop others from getting funding assistance to fix the problem.
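
As a purely hypothetical illustration of how that 10% screen interacts with bad maps (the home counts below are invented), consider a simple eligibility check:

```python
# Invented example of the 10% eligibility screen described above. The home
# counts and the function name are hypothetical; the point is that homes
# *reported* as served -- accurately or not -- count against the cap.

def grant_eligible(total_homes: int, homes_reported_served: int,
                   max_served_share: float = 0.10) -> bool:
    """Return True if an area passes a 'no more than 10% served' screen."""
    return homes_reported_served / total_homes <= max_served_share

# A rural area where 300 of 1,000 homes are falsely mapped as served:
print(grant_eligible(1000, 300))   # False -- the area is screened out
# The same area with accurate mapping showing 50 homes actually served:
print(grant_eligible(1000, 50))    # True -- it would qualify for funding
```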

The big telcos and carriers have huge incentives to lie about rural broadband coverage. None of the big telcos or cellular carriers want to spend any of their own money in rural areas, but they love the revenues they receive from a captive rural customer base that pays high prices for poor broadband. The big companies are fighting hard to preserve these revenues, which means they don't want anybody else to get funding to improve broadband. To make matters worse, the big telcos continue to eliminate technicians and maintenance budgets in rural America, making it nearly impossible for customers to get repairs and service.

I unfortunately don't have any easy solution for the problem of crappy mapping. Perhaps the FCC could entertain challenges to the broadband maps in the same way it is accepting challenges in the Mobility Fund II process. I know a number of rural communities that would make the effort to create accurate broadband maps if doing so might bring them better broadband.

The FCC Looks at 911

The FCC recently released its tenth annual report to Congress on the collection and distribution of 911 fees nationwide. The report includes a number of interesting statistics, a few of which are listed below.

But first I'd like to look backwards a bit, because we now take 911 for granted, but it hasn't always been so. 911 has been implemented during my adult lifetime. The idea of having an emergency phone number was first introduced in 1967 by Lyndon Johnson's Commission on Law Enforcement. AT&T selected the 9-1-1 digits the following year. An independent telco, the Alabama Telephone Company, leaped on the concept and introduced 911 in Haleyville, Alabama in 1968 – but it then took decades for implementation nationwide since this was deemed a local issue to be handled by local governments. I recall the introduction of 911 in the DC suburbs in the mid-70s, accompanied by a flurry of radio, newspaper and TV ads to inform the public of the new safety service. There were major metropolitan areas like the Chicago suburbs that didn't get 911 until the early 1980s.

911 service has been enhanced over the years. For example, by 2015 96% of homes in the US were covered by E-911 (enhanced) where the 911 operator knows the caller’s location according to the phone number for landlines or by using triangulation of cell sites for mobile phones. Currently 911 systems are upgrading to NG911 (next generation) that ties 911 systems into broadband to be able to relay text messages, photos and videos as part of the 911 process.

Some of the interesting statistics from the FCC report:

  • In 2017 almost $3 billion was collected in 911 fees to fund local 911 efforts. The total cost to provide 911 was reported at $4.8 billion, with 911 services in many states also funded partially by tax revenues.
  • States collect 911 fees in different ways, including flat rates per telephone or cellular line, a percentage of telecommunications bills, and flat rates per subscriber. Fees vary widely, ranging from $0.20 per residential landline in Arizona to $3.34 per cell phone in West Virginia. Some states charge even more for business landlines.
  • Most states use the 911 fees to fund 911 service, but six states – including Montana, New Jersey, New York, Rhode Island and West Virginia – use some of their 911 fees for non-safety purposes or simply deposit them into the state's general fund. In total, $284 million was diverted from collected 911 fees.
  • Thirty-five states, Puerto Rico and the District of Columbia have begun the process of upgrading to NG911.
  • Sixteen states have deployed statewide Emergency Services IP Networks (ESInets) for exclusive use of public safety agencies.
  • Thirty states, Guam, Puerto Rico and the US Virgin Islands have not taken any steps for cybersecurity for 911 centers (PSAPs).
  • There are 5,232 PSAPs in the country. These range from tiny centers in sheriff stations in rural counties to massive 911 centers in major metropolitan areas. For example, Washington DC has one PSAP while there are 586 in Texas.
  • 1,381 PSAPs now have the ability to communicate with the public by text message. Another 1,103 PSAPs were expected to implement that capability in 2018.
  • There were over 39,000 operators employed to take 911 calls in 2017.
  • Only 44 states reported 911 call volumes and in those states there were over 211 million calls to 911. Over 70% of calls now come from cellular phones.

I know it’s easy to hate regulation, but without it we wouldn’t have a 911 system that works so well. People in most of the country feel a lot safer knowing they can dial 911 and get help when needed.

Telecom Predictions for 2019

It’s that time of year when I look forward at what the next year might bring to the industry. I see the following as the biggest telecom trends for 2019:

5G Will Not Save the World (or the Industry). This will be the year when we will finally stop seeing headlines about how 5G will transform society. There will be almost no actual introduction of 5G in networks, but we’ll still see numerous press releases by the big ISPs crowing about fictional 5G achievements.

CAF II Buildout Nearly Complete, but Few Notice. The CAF II upgrades will not have the impact hoped for by the FCC. Many areas that should have gotten speed increases to at least 10/1 Mbps will get something less, but nobody will officially monitor or note it. Households that buy the upgrades to 10/1 will still feel massively underserved since those speeds are already seriously obsolete.

People Will Wonder Why They Bought 5G Cellphones and 802.11ax Routers. The wireless carriers will begin charging premium prices for 5G-capable cellular phones even though there will be no 5G cell sites deployed. Households will upgrade to 802.11ax WiFi routers without realizing that there are no compatible devices in the home. Both sets of customers will feel cheated since there will be zero improvement in performance. Yet we'll still see a few articles raving about the performance of each technology.

FCC Will Continue to Work Themselves out of the Regulatory Business. The current FCC will continue on the path to deregulate the large carriers to the fullest extent possible. They will continue to slant every decision in the direction of the big ISPs while claiming that every decision helps rural broadband.

Rural America Will Realize that Nobody is Coming to Help. I predict that hundreds of rural communities will finally realize that nobody is bringing them broadband. I expect many more communities to begin offering money for public/private partnerships as they try desperately to not fall on the wrong side of the broadband divide.

Broadband Prices Start to Climb. 2019 will be the first year that the world will notice the big ISP strategy to significantly increase broadband prices. We saw the first indication in November when Charter increased bundled broadband prices by $5 per month – the biggest broadband price increase in my memory. All the big ISPs are hoping to push broadband prices to $90 within 5 – 7 years.

Corporate Lobbyists Will Drive Policy. In 2018 there were numerous FCC decisions that came straight from the pens of telecom lobbyists. In 2019 those lobbyists will drive state and federal telecom legislation and FCC decisions.

Comcast and Charter Continue to Eat into Cellular Market. These two cable companies will quietly, yet significantly, begin eating into the cellular markets in urban areas. I still don't expect a major reaction from the cellular companies, but by 2020 we should start seeing cellular prices take another tumble.

Household Bandwidth Usage Will Continue to Grow. There will be no slowdown in the growth of household broadband usage as homes add many more bandwidth-capable devices. Another few million customers will cut the cable TV cord and ratchet up bandwidth usage. Online programming will routinely offer 4K video, and we'll see the first commercial 8K video online.

We’ll See First Significant Launches of LEO Satellites. There will be little public notice since the early market entries will not be selling rural broadband but will be supporting corporate WANs, cellular transport and the development of outer space networks between satellites.

25 New Online Programmers Emerge. There will be a flood of new online programming options as numerous companies jump into the market. We won't see many failures this year, and possibly none, but within a few years the market reality will drive out companies that can't gain enough market share.

Transport Price Pressure Tightens. Anybody selling transport to cellular companies will see big pressure to lower prices. Those who ignore the pressure will find out that the carriers are willing to build fiber to bypass high costs.

Big Companies Will Get Most New Spectrum. The biggest ISPs and cellular carriers will still gobble up the majority of new spectrum, meaning improved spectrum utilization for urban markets while rural America will see nearly zero benefits.

FCC Says Big ISPs Delivering the Speeds they Market

The FCC recently released the reports from its speed test program for both 2017 and 2018. The reports summarize the results of the FCC's Measuring Broadband America (MBA) program, which samples the actual performance of broadband customers by installing measuring devices at their homes. This program began in 2011 and these are the 7th and 8th reports from the program. These used to be issued as separate reports, but they are now released along with a number of other FCC reports in one large annual filing. The link to the reports can be found here. The 2017 report begins at page 349 of the document and the 2018 report on page 463.

These tests are given to volunteer households of large ISPs only – ISPs that together cover 80% of all broadband customers in the country. The list includes the big cable companies, the big telcos, and the satellite broadband providers.

The primary conclusion of both reports is that "For most of the major broadband providers that were tested, measured speeds were 100% of advertised speeds or better between the peak hours (1 p.m. to 11 p.m. local time)."

Frankly, that conclusion is impossible for me to believe and indicates that there is something in this testing program that is different than the reported experience by many customers in the real world. Consider all of the following:

  • It's possible that the FCC is somehow doctoring the speed data, or at least not reporting all of the data it gathers. Ars Technica reports that SamKnows, the firm doing the measuring for these tests, said it has been collecting data from between 6,000 and 10,000 homes during the time of these tests. But the reports base their data on about 4,500 locations. This is an FCC that seems averse to reporting things it doesn't like, so there is certainly a chance that there is selective editing of the data used to create the report.
  • It's clear that the reported users in these test results are not from rural America. My experience over the last decade is that virtually nobody in rural America is receiving the advertised broadband speeds. It's virtually impossible for a rural DSL customer to get the advertised speeds since they live far away from the core DSLAM modems that provide broadband. It's worth noting that both reports admit that satellite broadband underperforms.

My experience comes from working extensively in rural America across the country. When we do broadband studies we ask households to take speed tests so we can see actual performance. Admittedly speed tests have issues and are not as accurate as the measuring done by SamKnows, whose devices likely connect directly to the incoming broadband signal at the modem, while most households today use WiFi, which affects self-administered speed test results. But in the many rural speed tests we've seen households perform, it's rare to see a rural customer getting the speed they are paying for, and often they get just a tiny fraction of that speed, with results sometimes barely better than dial-up.

In general I trust speed tests, because we also do broadband studies for larger towns, and sometimes the speed tests show good performance by the ISP. For example, we recently studied a city with about 40,000 homes, and for the most part Comcast was delivering speeds that often exceeded the advertised speeds. This makes me believe that the major speed test sites, while not perfect, are not terrible and can be taken to represent a whole community.

However, I’ve also studied larger communities where a major ISP underperforms across the board. I’ve rarely seen DSL meet advertised speeds for the majority of customers in a community. And I’ve studied communities where the cable company was slower than advertised for everybody.

The FCC results are also hard to believe because we know from the press that there are whole communities where a major ISP underperforms. One example is the long-running battle in upstate New York, where Charter has been delivering speeds at a fraction of the advertised speeds – the performance was so poor that the State is trying to kick Charter out of the state.

I have similar anecdotal evidence at my own house. My ISP is also Charter. They currently tell me that my speed ought to be 200 Mbps but I’m getting about 135 Mbps. Before the recent upgrade I was also getting less than what they said. I’m not unhappy with the 135 Mbps, but if my house was part of the FCC test it would show a connection getting only 2/3 of the advertised speeds.

The Ars Technica article cited above is worth reading because they dig deeper into the data. I must admit that I got stopped at the first page of each report, where they said that the large ISPs are mostly delivering the speeds they advertise, because I know that for much of the country that is not true. That makes me suspect that the data is doctored somehow. Perhaps the results mostly come from larger communities where the speeds are okay. Maybe the FCC is excluding poor test results. Perhaps the ISPs know which homes are being measured and give them special attention. I don't know the details of how the report was generated, but I have too much experience in the real world to accept the conclusion that big ISPs deliver the speeds they advertise.

Deregulating Text Messaging

“This is one of the oddest dockets I’ve ever seen”. That’s roughly quoting myself several times over the last year as I read some of the things that the current FCC is up to. I find myself saying that again as I read the FCC’s recent docket that proposes to classify SMS text messaging as a Title I information service. Their stated reason for the reclassification is that it will make it easier to fight text message spam, and that stated reason is where the FCC loses me.

Text message spam is a real thing; I've gotten some annoying text spam over the last year and I'd sure hate to see my texting inbox get polluted with crap like my email inbox. However, I doubt you'll find any technologist in the industry who will tell you that the way to fight spam of any kind is by waving a magic wand and changing the way that something is regulated. The way you fight spam is to put barriers in place to detect and block it – and that is something that only the carriers that control the flow inside a communications path can do. It's the approach the FCC itself just pushed to try to stop robocalling – by demanding that the telephone industry find a solution.

Yet here sits a docket that blindly declares that reclassifying texting as an information service will somehow dissuade bad actors from sending spam text messages. I’m pretty sure that those bad actors don’t really care about the differences between Title I and Title II regulation.

One of the interesting things about this filing is that past FCCs have never definitively said how texting is regulated. Over the years the industry has come to assume that it’s regulated under Title II just like a telephone call – because functionally that’s all a text message is, a telephone call made using texted words rather than a voice call.

To some extent this docket is the first time the FCC has ever officially addressed the regulatory nature of text messaging. In the past the FCC made rulings about texting that implied a regulatory scheme, but it never officially put texting into the Title II category. Now it wants to remove texting from Title II authority – the first time we've ever been told definitively that text is already a Title II service. Here are some of the FCC's past treatments of the regulatory nature of text messages:

  • In 1994 the FCC ruled that systems that store and forward telecommunications messages, like SMS texting, are 'interconnected' services, which at that time were clearly regulated under Title II. But there was no specific statement at the time that texting was a Title II service.
  • The Telecommunications Act of 1996 defined a telecommunications service for the first time – a service that uses telephones and the PSTN to communicate. The 1996 Act didn't mention texting, but texting clearly fits that definition.
  • In 2003 the FCC declared that text messages were ‘calls’ when the agency implemented the Telephone Consumer Protection Act, which was the same treatment given to other Title II telephone services.
  • In 2007 the FCC included texting as one of the Title II services for which cellular carriers must allow roaming.
  • In 2011 USAC began requiring the inclusion of text revenues among the Title II interstate revenues used to assess monies owed to the Universal Service Fund.

All of these regulatory actions implied that texting is a Title II service, although that was never explicitly stated until now, when the FCC wants to reclassify it to be an information service. Reclassification doesn’t pass the ‘quack like a duck test’ because telephone calls and anything like them fit squarely as Title II services. Texting is clearly a type of telephone call and any person on the street will tell you that a text message from a cellphone is just like a phone call using text rather than voice.

Unfortunately, the only conclusion I can draw from this docket is that the FCC has an ulterior motive since their stated reasons for wanting to reclassify texting are pure bosh. There seem to be no obvious reasons for the reclassification. There are no parties in the industry, including the cellular carriers, that have been clamoring for this change. Further, the change will have the negative impact of further shrinking the Universal Service Fund – and expanding rural broadband is supposedly the number one goal of this FCC.

This is disturbing for somebody who has followed regulation for forty years. By definition, regulatory agencies are not supposed to push for changes without first opening an industry-wide discussion about the pros and cons of any suggested changes. Regulators are not supposed to hide the motives for their ideas behind false premises.

The only justification for the FCC's proposed ruling that I can imagine is that the FCC wants to kill all Title II regulation. It seems they are on a mission to eliminate Title II as a regulatory category to make it hard for future FCCs to reregulate broadband or to bring back network neutrality.

If that’s their real agenda, then we ought to have an open discussion and ask if we ought to eliminate Title II regulation – that’s how it’s supposed to work. The rules establishing the FCC call for a process where the agency floats new ideas to the world so that all interested parties can weigh in. The FCC is not ready to face the backlash from openly trying to kill Title II regulation, so instead of an open debate we are seeing a series of ridiculous attempts to chip quietly away at Title II regulation without overtly saying that’s their agenda.

In my opinion the time when we ought to stop regulating telephone services is getting closer as technology changes the way that we communicate. But that time is not here and there is still room for monopoly abuse of text messaging. There are a number of examples over the last decade where carriers have blocked text messages – sometimes when they disagreed with the content.

I’m disappointed to have an FCC that is using regulatory trickery to achieve their agenda rather than having a bold FCC that is willing to have the public debate that such a decision deserves. Telephone and related services like text messaging were regulated for many reasons and we ought to examine all of the pros and cons before deregulating them.

I'm guessing that this FCC wants to kill Title II regulation without ever having to tell the public that's their agenda. I think they want to deregulate text messaging and then point to that deregulation as the precedent to justify deregulating all Title II services, without having to suffer the criticism that is sure to come when the public realizes this closes the door on net neutrality.

2.5 GHz – Spectrum for Homework

As part of the effort to free up mid-band spectrum, the FCC is taking a fresh look at the 2.5 GHz spectrum band. This band of spectrum is divided into 33 channels; the lower 16 channels are designated as EBS (Educational Broadband Service), with the remainder as BRS (Broadband Radio Service).

The EBS band was first granted to educational institutions in 1963 under the designation ITFS (Instructional Television Fixed Service) and was used to transmit educational videos within school systems. It became clear that many schools were not using the spectrum, and the FCC gave schools the authority to lease excess capacity on the spectrum for commercial use. In urban markets the spectrum was leased to networks like HBO, Showtime and the Movie Channel, which used the spectrum to deliver content after the end of the school day. In the late 1990s the spectrum was combined with MMDS in an attempt to create a wireless cable TV product, but this use of the spectrum never gained commercial traction.

In 1998 the FCC allowed cellular companies to use the leased spectrum for the new 3G cellular service, and in the same year it stopped issuing new licenses for the spectrum band. Companies like Craig McCaw's Clearwire leased the spectrum to deliver competitive cellular service in many urban areas. In 2005 the FCC cemented this use by allowing the spectrum to be used for two-way mobile and fixed data.

Today the technology has improved to the point where the spectrum could help to solve the homework gap in much of rural America. The spectrum can be used in small rural towns to create hot spots that are tied directly to school servers. The spectrum can also be beamed for about 6 miles from tall towers to reach remote students. The spectrum has nearly the same operating characteristics as the nearby 2.4 GHz WiFi band, meaning that long-distance connections require line-of-sight, so the spectrum is more useful in areas with wide-open vistas than in places like Appalachia.
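
For a sense of why tall towers and clear line-of-sight matter at this distance, here is a small sketch using the textbook free-space path loss formula; the 6-mile figure comes from the paragraph above, and everything else is just the standard formula rather than any specific radio design.

```python
import math

def free_space_path_loss_db(distance_km: float, frequency_ghz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_GHz) + 92.45."""
    return 20 * math.log10(distance_km) + 20 * math.log10(frequency_ghz) + 92.45

# A 6-mile (~9.7 km) link at 2.5 GHz, assuming a completely clear path:
distance_km = 6 * 1.609
loss_db = free_space_path_loss_db(distance_km, 2.5)
print(f"Free-space loss over {distance_km:.1f} km at 2.5 GHz: {loss_db:.1f} dB")
# ~120 dB even with nothing in the way; real links also need Fresnel-zone
# clearance and antenna gain, which is why terrain like Appalachia is a problem.
```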

A group of educational organizations including the Catholic Technology Network, the National EBS Association, the Wireless Communications Association International and the Hispanic Information and Telecommunications Network petitioned the FCC to expand the EBS network and to grant new EBS licenses to fully cover the country. The FCC has been considering a plan that would strengthen the educational use of the spectrum and which would also auction the rest of the spectrum for use as wireless broadband.

The use of the spectrum for rural educational uses could be transformational. Rural students could get a small dish at their homes, like those used for the fixed wireless deployed by WISPs. Students would then have a direct connection to the school system's servers for doing homework. Interestingly, this would not provide a home with regular Internet access, other than whatever the schools might grant for links needed to do homework.

The disposition of the spectrum band is complicated by the fact that Sprint holds much of the spectrum under long-term lease. Sprint holds licenses to use more than 150 MHz of the spectrum in the top 100 markets in the country, which currently provides them with enough spectrum to simultaneously support both 4G LTE and 5G. The speculation is that the FCC is working on a plan to free up some of this spectrum as a condition to the merger of Sprint and T-Mobile.

This is the only current spectrum band where the FCC is envisioning different urban and rural uses, with rural parts of the country able to use the spectrum to connect to students while in urban areas the spectrum is used to support 5G. This divided use was only made possible by the historic educational component of the spectrum. If the FCC tries to give all of this spectrum to the cellular carriers they’d have to reclaim the 2,200 licenses already given to school systems – something they are politically unwilling to tackle.

However, this solution points to a wider solution for rural residential broadband. The FCC could order the same type of rural/urban bifurcation for many other bands of spectrum that are used primarily in urban settings. We need to find creative ways to use idle spectrum, and this spectrum band provides a roadmap that ought to be applied to other swaths of spectrum.

Freeing the spectrum for full use by rural education offers big potential, but also creates challenges for rural school systems which will have to find the money to build and deploy wireless networks for homework. But solving the rural homework gap is compelling and I’m sure many school districts will tackle the issue with gusto.