Is it Time to Tax the Internet?

There is a bill advancing in the Maryland legislature that would tax Internet advertising. The law, if enacted, would collect taxes on online advertising sold by Google, Facebook, and others. The tax would immediately be challenged in court due to our long history of not taxing things related to the Internet.

The idea of not taxing the Internet began with the Internet Tax Freedom Act in 1998. That law grandfathered taxes on Internet access already imposed in 13 states but prohibited such taxes elsewhere. The law also prohibited local governments from imposing taxes on electronic commerce. The law imposed a 3-year moratorium on such taxes, which was extended eight times until the tax prohibition was made permanent by the Trade Facilitation and Trade Enforcement Act in 2015.

There are other precedents for not taxing electronic commerce. For example, the Supreme Court ruled in 1967 that requiring remote vendors to collect sales taxes would impose an undue constraint on interstate commerce. The prohibition against sales taxes for out-of-state sellers was strengthened in 1992 by the Supreme Court’s ruling in Quill Corporation v. North Dakota, which prohibited a state from collecting sales taxes unless a business had a physical presence in the state. These prohibitions were assumed to also apply to sales made over the Internet when the seller was located in a different state than the buyer.

However, such rulings change over time, and in 2018 the Supreme Court reversed the older decisions in South Dakota v. Wayfair, Inc. The Court effectively ruled that states can require remote sellers of any kind, including online sellers, to collect state sales taxes. Since that ruling, many states have imposed sales taxes on Internet sales.

Opponents of the tax on advertising say that the proposed law would violate the intent of the Internet Tax Freedom Act to not tax electronic commerce. The application of a tax to advertising is different enough from a sales tax that the Wayfair ruling won’t automatically apply. If the law is passed, it will likely end up before the Supreme Court. To a non-lawyer like me, the issue seems similar enough to sales taxes that there is a good chance that a tax on advertising would be allowed. Even if it is allowed, such a tax faces the significant challenge of identifying the advertising revenue that applies to a given state. For example, what portion of nationwide advertising for Ford trucks would apply to Maryland?

The Maryland law raises anew the question of whether it’s time to tax the Internet. The original Internet Tax Freedom Act was an attempt by Congress to protect the fledgling broadband business. It’s debatable whether the growth of the Internet would have been slowed had a few dollars of taxes been applied to broadband bills, as were applied to landline telephone bills. But the new Internet companies were the darlings of Wall Street, and Congress decided to keep broadband products free from taxation.

The broadband tax that’s most needed is a surcharge on broadband to help fund the FCC’s Universal Service Fund. Many states also have similar funds. The USF is currently funded by fees charged to landlines and cellphones. The current purpose of the fund is to promote broadband in places that don’t have it. The fund will be paying for the $20.4 billion RDOF grants for rural broadband and the $9 billion 5G Fund grants to improve rural cellular coverage. It’s silly that we aren’t charging a small fee on broadband users to help pay for rural broadband – everybody in the country benefits when there is broadband everywhere. I can’t fathom a justification for having landline users pay for rural broadband but not broadband customers.

Congress is often mystifying. I can understand the initial ban against Internet taxes, although I don’t believe such taxes would have hampered the explosive growth of broadband. But I can’t think of any justification in 2015 for making the ban on Internet taxes permanent. Considering the huge problems that lack of rural broadband just caused during the COVID-19 crisis, there is no justification for not increasing the funding for the Universal Service Fund, and the easiest way to do so is to tax broadband customers.

The Battle over Encryption

There has been a tug-of-war between the US government and tech companies on the issue of encryption since the Clinton administration. The latest attack on encryption is the EARN IT Act (Eliminating Abusive and Rampant Neglect of Interactive Technologies).

In the 1990s the battle over encryption started with the release of Pretty Good Privacy (PGP), software that brought end-to-end encryption to the public. The Clinton administration proposed the ‘Clipper Chip’, which would have built a backdoor into communications devices so that law enforcement could decrypt messages when necessary. That idea was quashed by a coalition from the right and left that didn’t want the government spying on private communications.

The idea of killing encryption died down because most big web platforms chose transport layer encryption. In this commonly used encryption method, a web company encrypts traffic transmitted across its platform. This stops messages from being read by anybody in the middle of a transmission. Most big web companies, with the notable exception of Apple, will allow law enforcement access to the content of a transport-layer-encrypted message, because the web company holds the encryption keys and can decrypt any message carried on its platform.
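The key point is who holds the keys. Below is a minimal sketch of provider-held encryption, written in Python using the third-party cryptography package; the symmetric Fernet cipher here is just a stand-in for the provider-side encryption the big platforms actually use, and all names are illustrative:

```python
# A minimal sketch of provider-held encryption (not the actual TLS
# protocol), assuming 'pip install cryptography'.
from cryptography.fernet import Fernet

# The platform generates and keeps the key on its own servers.
platform_key = Fernet.generate_key()
platform = Fernet(platform_key)

# A user's message is protected while crossing the platform...
ciphertext = platform.encrypt(b"meet me at noon")

# ...but because the platform holds the key, the platform (or law
# enforcement with a subpoena) can always recover the plaintext.
print(platform.decrypt(ciphertext))  # b'meet me at noon'
```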

The encryption conversation flared up again in 2013 when Edward Snowden released proof that the NSA and other intelligence agencies were routinely decrypting traffic on platforms like Google, Facebook, Microsoft, and Yahoo. This meant that the government was routinely decrypting messages sent across the web without going through the normal process of obtaining a subpoena for cases that warranted investigation. According to Snowden, the government was decrypting almost everything sent over the big public platforms.

This led to a renewed interest in end-to-end encryption. This differs from transport layer encryption in that messages are encrypted at the end-user level instead of by the big web company. In end-to-end encryption, a message is encrypted on a sender’s device, sent to the recipient in an unreadable format, and then decoded on the recipient’s device. There are several ways to make this work, but under none of them can the message be read by anybody else, including the big web companies that might be transmitting it. Law enforcement can still figure out the content of messages, but they have to work hard at it and gain access to the sender’s or the receiver’s device.
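For contrast, here is a minimal end-to-end sketch, again in Python, using the third-party PyNaCl library; the names are illustrative and this is only one of the several approaches mentioned above:

```python
# A minimal end-to-end encryption sketch, assuming 'pip install pynacl'.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; the private
# keys never leave the devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet me at noon")

# The platform relays 'ciphertext' but holds no key that opens it.

# Only Bob, holding his private key, can decrypt.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet me at noon'
```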

The general public has been pressuring the big web companies to incorporate end-to-end encryption into normal communication, and a number of them have considered doing so. Google pledged to incorporate end-to-end encryption into Gmail a few years ago but still hasn’t done so. There are platforms like Wickr and Signal that offer end-to-end encryption, but the major platforms have not yet implemented it.

The EARN IT Act would tackle end-to-end encryption by tying encryption methods to the Section 230 protections that are key to operating big web platforms. Section 230 was created in the Telecommunications Act of 1996 and says that online services can’t be held responsible for end-user content they didn’t create. This is what allows Facebook, Twitter, Reddit, and Google to host content created by end-users that might violate various local laws. Without Section 230 protection, most privacy lawyers say that web companies couldn’t function since they would be sued by anybody who disagrees with posted content.

The EARN IT Act would remove the blanket immunity from Section 230, and web companies would ‘earn’ Section 230 protections by providing backdoors for government surveillance. The EARN IT Act has ostensibly good intentions in that it addresses child sexual abuse. Web companies would have to certify annually that they are following whatever monitoring guidelines the government determines or they would lose their Section 230 immunity. Those guidelines aren’t even listed in the EARN IT Act and would be determined periodically by a 19-member commission.

The EARN IT Act is a backdoor way to eliminate end-to-end encryption, because Google would lose Section 230 protections if it couldn’t certify that all Gmail can be decrypted. It’s likely such a certification would force web companies to spend a fortune to monitor every message on their platforms before it is posted or transmitted – putting the web companies in the uncomfortable position of spying on everything said on their platforms. As usual, Congress is trying to achieve something without the courage to say outright what they mean. The goal of the EARN IT Act is to outlaw end-to-end encryption, but that goal is instead wrapped up inside a law that says it would fight child pornography.

The Proposed 5G Fund

The FCC is seeking public comments in a Notice of Proposed Rulemaking on how to determine the coverage areas and the timing for the new $9 billion 5G Fund. The money for the 5G Fund will come out of the Universal Service Fund. The 5G Fund is aimed at bringing cellular coverage to rural places that don’t have coverage today and will award the money using a reverse auction. The FCC is proposing to award $8 billion in the first round of auctions with $1 billion awarded later.

The FCC’s attempt to spend this money already has a checkered past. The FCC tried to award $4.5 billion of this same funding in 2019 under the name of Mobility Fund II. When preparing for that reverse auction, the FCC asked existing cellular carriers to provide maps showing existing cellular coverage. It turned out that the maps provided by Verizon, T-Mobile, and US Cellular badly overstated coverage, and smaller cellular carriers cried foul. The smaller carriers claimed that the overstatement of coverage was meant to steer funding opportunities away from smaller cellular companies. It felt eerily familiar to watch Frontier and a few other big telcos recently make similar last-minute claims about their broadband coverage for the RDOF grants.

The FCC eventually agreed with the small carriers and canceled the auction last year. The $4.5 billion in funding from 2019 was augmented by an additional $4.5 billion and reconstituted as the 5G Fund.

The FCC is asking for comments on two different options for awarding the money. The first option would award the funds in 2021 based upon the best current cellular coverage maps available. This option would only award money to areas that have never had 3G or 4G coverage. The second option would delay the auction until 2023, by which time the FCC is hoping for better maps through a process they have labeled as the Digital Opportunity Data Collection initiative.

The need for this fund is further complicated by the T-Mobile / Sprint merger. One of the merger commitments made by T-Mobile is to cover 99% of the people in the country, including 90% of those living in rural areas, with 5G at data speeds of at least 50 Mbps within 6 years of the merger.

There doesn’t seem to be any logical way the FCC can award this money in 2021. By definition, they’d be awarding grant funding using maps that the FCC openly acknowledges are badly flawed. Maybe even more importantly, at this early date the FCC can’t know where T-Mobile plans to build coverage over the next 6 years. If the FCC proceeds now, it will almost surely be spending money to cover areas that T-Mobile is already on the hook to serve. And by using flawed maps, the FCC will almost certainly miss areas that need service that T-Mobile will not be serving.

The T-Mobile merger agreement also raises a serious issue about the size of the 5G Fund. The Fund was set at $9 billion before T-Mobile agreed to cover a lot of the areas that were proposed for funding in 2019. Isn’t the $9 billion now too high since T-Mobile will be covering many of these areas?

This raises a bigger policy question. Does the FCC really want to spend $9 billion to cover the last 1% of the US population with cellular when a much larger percentage of rural homes don’t have workable home broadband? Shouldn’t some of this money now be repurposed to fund rural broadband in light of the T-Mobile agreement to cover 99% of people with cellular coverage?

Finally, FCC Chairman Ajit Pai never misses a chance to overhype 5G. In the announcement for the NPRM the Chairman was quoted as saying, “5G promises to be the next leap in broadband technology, offering significantly increased speeds and reduced latency. The 5G Fund for Rural America focuses on building out 5G networks in areas that likely would otherwise go unserved. It’s critical that Americans living in rural communities have the same opportunities as everybody else.”

What the Chairman and the carriers are never going to say out loud is that 5G is an urban technology. All of the coolest features of 5G only work when cell sites are close together. The areas covered by these grants contain the most rural cell sites in the US, each serving only a few people at any given location. Low-density sites gain almost no extra advantage from 5G, so they will effectively act like 4G LTE sites forever. It’s even unlikely that a cellular carrier would bother using extra spectrum at a cell site with only a few customers. Such cell sites need only the basic 4G LTE coverage and spectrum bands, and it’s unlikely that these areas will get true 5G, regardless of the 5G name the FCC has attached to the funding mechanism.

The Upload Crisis

Carriers continue to report on the impact of COVID-19 on their networks. One of the more interesting statistics that caught my eye was Comcast’s report that upload traffic on its network is up 33% since March 1. Comcast joins the rest of the big ISPs in saying that their networks are handling the increased traffic volumes.

By ‘handling’ the volumes they mean that their networks are not crashing and shutting down. But I think there is a whole lot more to these headlines than what they are telling the public.

I want to start with an anecdote. I was talking to a client who is working at home along with her husband and two teenagers. The two adults are trying to work from home and the two kids are supposed to be online keeping up with schoolwork. Each of them needs to create a VPN to connect to their office or school servers. They are also each supposed to be connecting to Zoom or other online services for various meetings, webinars, or classes.

These functions all rely on using the upload path to the Internet. The family found out early in the crisis that their broadband connection did not provide enough upload speed to create more than one VPN at a time or to join more than one video call. This has made their time working at home into a major hassle because they are being forced to schedule and take turns using the upload link. This is not working well for any of them since the family has to prioritize the most important connections while other family members miss out on expected calls or classes.

The family’s upload connection is a choke point in the network and is seriously limiting their ability to function during the stay-at-home crisis. But the story goes beyond that. We all recall times in the past when home Internet bogged down in the evenings when everybody in the neighborhood was using broadband to watch videos or play games. Such slowdowns occurred when the download data path into the neighborhood didn’t deliver enough bandwidth to satisfy everybody’s requests. When that download path hit maximum usage, the network responded by giving everybody a little less bandwidth than they were requesting – and everybody in the neighborhood got a degraded connection, with pixelating video and websites that lost their connections.

The same thing is now happening with the upload links, but the upload path is a lot more susceptible to overload. For technologies like coaxial cable networks or telephone DSL, the upload path leaving the neighborhood is a lot smaller than the download path into the area. As an example, the upload link on a coaxial network is set to be no more than 10% of the total bandwidth allowed for the neighborhood. It takes a lot more usage to overload the download path into the neighborhood since that path is so much larger. On the upload path, homes are now competing for a much smaller data path.
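Some back-of-the-envelope math shows how quickly that small upload pool fills. The numbers below are illustrative assumptions, not figures from any specific cable network:

```python
# A rough sketch of upload contention on a shared neighborhood node.
# All numbers are illustrative assumptions.

node_capacity_mbps = 1_000    # total shared bandwidth for the node
upload_share = 0.10           # upload capped at ~10% of the total
upload_pool_mbps = node_capacity_mbps * upload_share  # 100 Mbps

per_session_mbps = 3          # rough upstream demand of a VPN or video call

max_sessions = int(upload_pool_mbps // per_session_mbps)
print(f"Upload pool: {upload_pool_mbps:.0f} Mbps")
print(f"Simultaneous upload sessions before saturation: {max_sessions}")
# With a few hundred homes sharing a node and multiple users per home,
# roughly 33 concurrent sessions exhausts the upload path.
```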

Consider the difference in the way that homes use the download path compared to the new way we’re all using uploading. On the download side, networks get busy mostly due to streaming video. Services like Netflix stay ahead of demand by downloading content that will be viewed five minutes into the future. By doing so, the neighborhood download network can have cumulative delays of as much as five minutes before the video streams collapse and stop working. The very nature of streaming creates a buffer against failure – sort of a network insurance policy.
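To see why that buffer acts as insurance, here is a toy model of the behavior described above, with illustrative numbers:

```python
# A toy model of a streaming buffer: the player pre-downloads ~5
# minutes of video, so a network slowdown drains the buffer instead
# of stopping playback. Numbers are illustrative.

buffer_seconds = 300.0   # ~5 minutes of video already downloaded

for second in range(600):
    delivered = 0.0 if 100 <= second < 340 else 1.0  # a 4-minute stall
    buffer_seconds += delivered - 1.0                # playback drains 1 s/s
    if buffer_seconds <= 0:
        print(f"Stream collapses at second {second}")
        break
else:
    print("The buffer absorbed the entire slowdown; playback never stopped")
```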

Homes are not using the upload links in the same way. Connecting to a school server, a work server, or a video chat service creates a virtual private network (VPN) connection. A VPN connection grabs and dedicates some minimum amount of bandwidth to the user even during times when the person might not be uploading anything. A VPN carves out a small dedicated path through the upload broadband connection provided by the ISP. There is no buffer like there is with downloading of streaming video – when the upload path gets full, there’s no room for anybody else to connect.

The nearest analogy to this situation harkens back to traditional landline telephone service. We all remember times, like after 9/11, when you couldn’t make a phone call because all of the circuits were busy. That’s what’s happening with the increased use of VPNs. Once the upload path from the neighborhood is full of VPNs, nobody else is going to be able to grab a VPN connection until somebody ‘hangs up’.

Residential customers have historically valued download speeds over upload speeds, and ISPs have configured their networks accordingly. Many technologies allow an ISP to balance upload and download traffic, and ISPs can ease upload congestion by providing a little more bandwidth on the upload stream. Unfortunately for cable companies, the current DOCSIS standards don’t allow them to provide more than 10% of bandwidth on the upload side – so their ability to rebalance is limited.

As I keep hearing these stories from real users I am growing less and less impressed by the big ISPs saying that everything is well and that their networks are handling the increased load. I think there are millions of households struggling due to inadequate upload speeds. It’s true, as the big ISPs are reporting, that the networks are not crashing – but the networks are not providing the connections people want to make. No big ISP is going to admit this to their stockholders – but I bet a lot of those stockholders already understand this first-hand from having troubles trying to work from home.

FCC Ignoring Consumer Broadband Complaints

One of the best aspects of broadband regulation was that a consumer was always able to file a complaint against an ISP with the FCC, and the complaint process generally resolved disputes between customers and carriers. If customers had legitimate complaints about billing or service, a complaint sent to the FCC generally solved the issue; if the carrier was in the right, the FCC sided with the carrier and asked them to explain the applicable laws or rules to the customer involved. This complaint process was the ultimate backstop for people who had tried every other avenue for resolving a dispute.

But this all changed with the Restoring Internet Freedom order, in which the FCC voted to kill net neutrality and Title II regulation of broadband. After that order, the FCC stopped intervening in broadband complaints from customers. They now forward complaints to carriers but don’t insist that problems are resolved.

Jon Brodkin wrote an article about this last November in which he documented a case where Frontier was billing $10 per month for a router to a customer who had purchased his own FiOS router before Frontier acquired the property. The company insisted that the customer pay the fee for a router that the customer clearly owns. Even after a complaint was filed at the FCC on the issue, Frontier wouldn’t change its position. The FCC did nothing about the complaint – the agency forwarded the complaint to Frontier and considered the issue settled.

In the past, the FCC would have looked at the facts, which in this case any person off the street would have resolved in favor of the customer. If the FCC got too many complaints on the same issue, they would pressure an ISP to change their practices.

It’s conceivable that the FCC no longer has the power to resolve complaints and just doesn’t want to say so publicly. When the agency voided its ability to regulate broadband, it likely also voided its ability to intervene on any topic related to broadband – the agency effectively gelded itself.

As Brodkin points out, the FCC isn’t being truthful about the complaint process. They told US Rep. Mike Quigley (D-Ill.) that they forward complaints to the Federal Trade Commission, but it turns out they only forward complaints that the FTC asks about – not most complaints.

The FCC has informed some consumers that they have the option to file a formal complaint. This is a process that costs $235 and ensures that the agency will at least look at the issue. It is the process normally used to resolve pole attachment complaints and similar disputes between carriers. A formal complaint initiates the equivalent of a legal proceeding, which the average person would probably find difficult to navigate – there are specific procedural rules and a legal process of filing documents and pleadings on a pre-determined schedule. A formal complaint that doesn’t follow these processes and protocols would likely be tossed as non-responsive.

Unfortunately, paying this fee for a formal complaint still might not do any good since the FCC no longer has jurisdiction over a broadband billing dispute or other broadband issues. The resolution of a formal complaint might result in nothing more than an FCC ruling that the customer should have gone to the FTC instead of the FCC.

There are other ramifications of the Restoring Internet Freedom order. When the FCC killed its ability to regulate broadband, it also theoretically voided the states’ ability to regulate broadband. State regulatory commissions have always had a complaint process similar to the FCC’s, but since the law of the land is that broadband is no longer regulated, consumers can’t take these complaints to a state commission. The only current recourse for a consumer is to go to the FTC. Unfortunately, the FTC polices bad behavior by all corporations, and so the agency only opens an investigation when there are numerous complaints against a specific ISP on a specific topic. The FTC does not intervene in or try to resolve individual consumer complaints.

I don’t think it has registered with the general public that broadband is unregulated. This means that consumers are on their own when ISPs harm them, and no government agency can intervene on their behalf. There is no better example than the one that Brodkin highlighted – Frontier feels safe in mistreating a customer even under the eye of regulators, and even when it is blatantly wrong. To Frontier, keeping the erroneous $10 in monthly billing is obviously more important than doing the right thing by a customer – and there seems to be nothing a customer can do other than perhaps find somebody in the press to highlight their story.

Windstream Adding YouTube TV

Windstream announced earlier this year that it is now offering YouTube TV to customers as an alternative to its traditional cable TV offering. The company has not yet fully ditched its traditional Kinetic TV offering, but this is a first step towards doing so. As more small cable operators look at the math of staying in the TV business, I’m expecting we’ll see a lot more ISPs considering the same transition. There are a lot of implications for converting traditional cable TV to a streaming service.

Regulatory. While regulation of traditional cable TV isn’t a massive burden, all regulatory requirements disappear with a conversion to a streaming service like YouTube TV. There are several annual FCC filings required of cable operators that would disappear. If a cable operator is paying local franchise fees, it can avoid the monthly reporting of customers and revenues to local tax authorities.

Taxation. The biggest external change from such a conversion is that the cable operator no longer has to collect and remit local franchise fees assessed on cable service, which vary across the country between 3% and 6%. The cost of collecting taxes and fees and of dealing with tax authorities disappears for the cable operator.
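To put the stakes in perspective, here is some illustrative franchise-fee arithmetic; the bill size, fee rate, and subscriber count are assumptions, not figures from any specific town:

```python
# Illustrative franchise-fee math; all inputs are assumptions.
monthly_cable_bill = 70.00       # typical cable bill
franchise_rate = 0.05            # franchise fees run 3% to 6%
subscribers_in_town = 4_000

annual_fees = monthly_cable_bill * franchise_rate * subscribers_in_town * 12
print(f"${annual_fees:,.0f} per year flows to the local government")
# -> $168,000 per year that would vanish if cable TV converts to streaming.
```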

The biggest implication of this change is that local communities could see franchise fees dry up overnight. I would expect a cable provider like Windstream to withdraw from and cancel their franchise agreements if they fully adopt YouTube TV. If the primary cable provider in a town makes this conversion, franchise fee payments stop immediately. Franchise fees are an important part of balancing local government budgets, particularly in smaller towns.

Cancelling franchise agreements also means that all of the local obligations that come with a cable franchise disappear. The cable provider would no longer provide a PEG channel to show local government meetings and other local content. Any subsidies for local government iNets that bring connectivity to city halls and schools would also disappear.

Operational. There are huge operational savings for ISPs that make this conversion. Most of my clients that offer cable TV tell me that 60% or more of calls to customer service are about the cable product. Eliminating traditional cable means reducing customer service calls and reducing truck rolls.

Getting out of the traditional cable business also means getting out of the settop box business. There is a huge operational savings from not having to keep a settop box inventory and keeping boxes operational. Installations get much easier when there is no settop box to connect.

Broadcast Fees. There are also implications for the larger cable market. Online services like YouTube TV are not required to comply with FCC rules on channel lineups, and they can offer whatever packages they negotiate with programmers. This means many networks will no longer be carried and will lose the revenue for every customer that makes this conversion. This becomes cord cutting at the corporate level, and as 200-channel lineups shrink to 70 channels, a lot of monthly fees to programmers evaporate.

If you look at the YouTube TV line-up, you’ll see the most popular networks. For example, the service includes the primary Discovery channel, but not all of the ancillary Discovery channels that come with a traditional TV subscription. This is true throughout the line-up as the service concentrates on the most-popular channels only.

Profitability. The biggest change is to profitability. I expect that if Windstream fully calculated the cost of being in the cable business, it would show no margin or a negative margin. All of the ancillary costs of extra truck rolls, dealing with settop boxes, tracking and reporting franchise fees and taxes, etc. can add up.

I don’t know what YouTube TV will pay to a cable provider like Windstream, but it can’t be much – no more than a few pennies on the dollar. Nobody would make this transition to get rich from the commission fees, but rather to avoid the costs and the hassles of remaining in the traditional cable business.

There They Go Again

The FCC issued the 2020 Broadband Deployment Report on April 20. It’s a self-congratulatory document that says that the state of broadband in the country is improving rapidly and that the FCC is doing a great job. I had a hard time making it past the second paragraph of the report which summarized the state of broadband in the country. Consider the following:

The number of Americans lacking access to fixed terrestrial broadband service at 25/3 Mbps continues to decline, going down by more than 14% in 2018 and more than 30% between 2016 and 2018. 

The FCC has no factual basis for this statement because it doesn’t know the number of households in the US that don’t have access to 25/3 Mbps broadband. The numbers cited are based upon the Form 477 data collected from ISPs, which everybody in the country, including the FCC, has acknowledged is full of errors. The FCC has proposed moving to a new method of data collection that will produce maps based upon drawing polygons, which it hopes will fix the rural broadband reporting problem.

I’ve been working all over the country with rural counties, and I have yet to encounter a rural county where the Form 477 coverage of 25/3 broadband is not overstated. In county after county, we find places where the big telcos and/or WISPs exaggerate the broadband speeds available and the coverage areas for their faster broadband products. The reporting problem is getting worse rather than improving, as witnessed by a recent Frontier filing to the FCC claiming the company has improved speeds to 25/3 Mbps in 16,000 rural Census blocks since June 30, 2019. This claim was made by a company that just went into bankruptcy and which the whole rural industry knows is not spending a dime on rural infrastructure. There were similar claims made by the other big telcos in a proceeding that was to determine the areas eligible for FCC grant funding.

The number of Americans without access to 4G Long Term Evolution (LTE) mobile broadband with a median speed of 10/3 Mbps fell approximately 54% between 2017 and 2018. 

There have been a lot of rural cell sites upgraded from 3G to 4G as the big cellular carriers move to mothball 3G technology. However, any quantification of the improvement of cellular broadband coverage is suspect due to blatantly erroneous reporting by the big cellular carriers. In 2019, when the FCC went to award grant funding to upgrade rural cellular coverage, it discovered that Verizon, T-Mobile, Sprint, and US Cellular had significantly overstated rural cellular coverage in an attempt to steer grant funds away from smaller cellular carriers. The FCC reacted by yanking that grant program and delaying it, in what is now called the 5G Fund. It’s hard to believe that the FCC would try to quantify the improvement in 4G coverage between 2017 and 2018 without acknowledging that this was the coverage that was badly overstated by the cellular carriers.

AT&T, Sprint, T-Mobile, and Verizon are also rapidly expanding their 5G capability, with 5G networks in aggregate now covering the majority of the country’s population, especially in urban areas, and more live launches planned for 2020.

The FCC clearly buys the 5G hype from the cellular companies which are claiming widespread 5G coverage. The cellular companies have introduced new spectrum into their 4G LTE environment, and the cellular marketers have labeled this as 5G. Much of the first wave of new spectrum being used is in lower frequency bands such as 600 MHz for T-Mobile and 850 MHz for AT&T. These lower frequency bands don’t carry as much data as higher frequencies and won’t be delivering faster broadband. However, new spectrum bands improve the chances of grabbing a channel to get the data speeds that 4G was already supposed to be delivering.
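A rough Shannon-capacity sketch shows why the width of a channel matters far more than the generation label attached to it. The channel sizes and signal-to-noise ratio below are illustrative assumptions:

```python
# Why channel width dominates: the Shannon limit C = B * log2(1 + SNR).
# Channel widths and SNR are illustrative assumptions.
from math import log2

def capacity_mbps(channel_mhz: float, snr_db: float) -> float:
    snr_linear = 10 ** (snr_db / 10)
    return channel_mhz * log2(1 + snr_linear)  # MHz x bits/Hz = Mbps

# A narrow slice of low-band spectrum (e.g., ~10 MHz at 600 MHz)...
print(f"10 MHz channel:  {capacity_mbps(10, 20):,.0f} Mbps ceiling")
# ...versus a wide millimeter-wave channel (e.g., ~400 MHz).
print(f"400 MHz channel: {capacity_mbps(400, 20):,.0f} Mbps ceiling")
```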

5G will not arrive until the carriers begin implementing the new features described in the 5G specifications. For now, none of the important new 5G features have made it to the market. So, contrary to the FCC telling the public that 5G is nearly everywhere, the truth is that it is not yet anywhere in the country. I’ll be curious in a few years to see how the annual FCC reports on broadband describe the actual introduction of 5G features. It’s likely they’ll parrot whatever language the cellular marketers spin by then.

This opening pat on the back is followed by page after page of broadband statistics that are based upon the lousy Form 477 reporting from ISPs. There is almost no statistic in this report that is entirely trustworthy.

This report is unfortunate in many ways. The FCC feels compelled to exaggerate broadband coverage so that they can’t be forced to try to fix broadband gaps. The sad aspect is that the statistics cited in this report are used to determine which parts of rural America are eligible for broadband grants – and this report is largely a fairy tale. It would have been nice if the summary had acknowledged that the FCC knows its data is faulty – something it has openly recognized in other dockets. Instead, the FCC chose to spin this fanciful tale of rapidly improving broadband that does little more than provide cover for the FCC to not have to fix rural broadband.

Funding Middle-mile Fiber

A decade ago, there were a lot of federal grants given to build middle-mile fiber. That’s the fiber that connects communities and that provides a path between a community and a connection to the Internet. Ideally, backbone fiber also provides a diverse route with ring electronics so that if one of the fibers serving a community is cut, the broadband connection to the community keeps working.

It’s not as easy to find grants for backbone fiber today. For instance, the $16.4 billion RDOF grant coming later this year is aimed at bringing last-mile fiber to remote places in the country but doesn’t let an applicant file for money to build just the backbone fiber needed to reach those same remote communities. It’s almost as if the FCC thinks that most of America is somehow now within reach of a reliable connection to the Internet.

A new network called Project THOR recently launched in northwest Colorado that is purely a backbone project and that shows the continued need for middle-mile fiber. Project THOR is a consortium of 14 communities that came together because they regularly suffered major broadband outages any time there was a middle-mile fiber cut in the region or an electronics problem at CenturyLink, the backbone provider for that entire section of the state. Network outages can be devastating and mean non-functional 911 centers, hospitals with no broadband, city governments that are crippled, and business districts that can no longer take credit cards or use the Internet.

The cities and towns in the region selected Mammoth Networks to create and operate a new middle-mile fiber network. The initial network is cobbled together mostly from dark fiber leased from the Colorado DOT and other networks like Strata, plus lit fiber from CenturyLink, Comcast, and Zayo. The plan is to eventually replace the lit fiber with dark or newly constructed fiber. Mammoth oversaw the construction of lateral fibers inside the communities and also designed and implemented the electronics network. The State of Colorado Department of Local Affairs funded the lateral construction and half of the equipment purchases through a broadband grant.

The communities are free to use the network in any way they see fit. The Project THOR network terminates at a meet-me center created in each community. Several of the communities on the new network have already built fiber-to-the-home and the new network provides Internet redundancy. Other communities located the meet-me room at a hospital or other critical facility so that they’d see an immediate benefit from the network.

Project THOR brings two advantages to the region. First, the network is designed to carry up to 400 Gbps – much more capacity than any existing fiber in the region. Mammoth Networks was also able to string together routes that provide diversity for each city to protect against fiber cuts. A single fiber cut on the Project THOR routes won’t interrupt service to any of the member communities.

There was no better evidence of the effectiveness of Project THOR than when a CenturyLink fiber outage hit the region a few days after Project THOR was activated. On April 10, there was a 6.5-hour outage, and because of Project THOR, the 911 PSAP in Aspen, hospitals in Granby and Kremmling, and the city governments in Aspen and Glenwood Springs stayed operational – but would have lost broadband service without Project THOR. The Project THOR route was the only network to stay functional in the region during the outage.

It’s common knowledge that the large incumbent telcos haven’t put any money into last-mile broadband in rural areas – but the same thing is true for middle-mile fiber. What’s most amazing about Project THOR is that CenturyLink could easily be providing much of the same redundancy and quality of service that the new network offers. However, the company doesn’t seem interested in making the needed investments in diverse fiber routes or the associated electronics.

There are huge areas of the country that suffer from inadequate middle-mile fiber routes. It would be great if there were a grant program aimed specifically at middle-mile fiber. The need is there because existing middle-mile fibers are often not adequate for today’s bandwidth needs and are definitely not ready for the increased bandwidth needs of the future. Most incumbent middle-mile fiber has little redundancy, leading to regular Internet outages. It’s also not unusual to find relatively ancient electronics on middle-mile routes in rural areas.

Project THOR is an example of cities that banded together to fix a common issue – in this case, regular and extended Internet outages. In an ideal world, the incumbents would fix such issues because it’s the right thing to do. However, the lack of capital spending on rural broadband affects middle-mile fiber as much as it impacts last-mile fiber – both are inadequate in most rural areas.

An Update on Robocalling

The FCC has continued to push for a solution to reduce nuisance robocalls. A solution is badly needed. YouMail tracks nationwide robocalls and publishes the results in its Robocall Index. The company estimates that there were over 4.1 billion robocalls in March 2020, or about 12.5 calls for every telephone subscriber in the country. According to the index, the number of robocalls peaked in October of last year with an astounding 5.6 billion calls that month. Not all robocalls are bad. For example, when my daughter was in high school we got robocalls reminding parents of things we likely wouldn’t otherwise have heard about.
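As a quick sanity check on the figures above (the subscriber count is implied by the two cited numbers, not a figure from YouMail):

```python
# Backing out the subscriber base implied by the YouMail figures.
robocalls_march_2020 = 4.1e9
calls_per_subscriber = 12.5

implied_subscribers = robocalls_march_2020 / calls_per_subscriber
print(f"Implied US phone subscribers: {implied_subscribers/1e6:.0f} million")
# -> roughly 330 million, i.e., about one phone per person.
```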

One of the services our firm offers is telephone surveys for marketing purposes, and my callers report that a significant percentage of the people we call won’t pick up, since our calls always show up as a long-distance call from out of the region. People have gotten leery about calls from numbers they don’t recognize.

The industry is now scheduled to implement a STIR/SHAKEN solution by June 2021. STIR stands for Secure Telephony Identity Revisited and creates a trust score for originating calls, which represents the level of assurance that the originating carrier has in the legitimacy of the call. SHAKEN (Signature-based Handling of Asserted information using toKENs) is a set of guidelines for carriers that terminate calls, defining how they should react to the STIR information that has crossed the network.

It’s worth noting that this new technology doesn’t block any robocalls, although I’ve seen the general press assume that’s how the industry will be handling robocalls. Instead, the STIR/SHAKEN protocols are a classification system in which originating carriers attest to the validity of calls. For example, there is a STIR code for full attestation, where the originating service provider authenticates that the call came from one of its customers and that the customer was authorized to use the phone number used to make the call. Carriers that receive calls with that full attestation will deliver the calls like normal to customers.
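Concretely, the attestation travels with the call as a small signed token called a PASSporT (defined for SHAKEN in RFC 8588). Here is a sketch of what the payload looks like; the phone numbers, timestamp, and origid below are made up for illustration:

```python
# An illustrative SHAKEN PASSporT payload (per RFC 8588).
# The phone numbers, timestamp, and origid are invented examples.
import json

shaken_claims = {
    "attest": "A",                    # 'A' = full attestation by the carrier
    "dest": {"tn": ["12025551001"]},  # called number(s)
    "iat": 1588000000,                # when the token was issued
    "orig": {"tn": "12025550100"},    # calling number
    "origid": "de305d54-75b4-431b-adb2-eb6b9e546014",  # carrier-assigned ID
}
# In practice, the originating carrier signs this payload and the
# terminating carrier verifies the signature before deciding how to
# label the call for the called party.
print(json.dumps(shaken_claims, indent=2))
```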

For calls with lesser levels of authentication, carriers will decide how to handle the calls with customers. I’m already seeing some of this on my AT&T cellphone. I’ve received at least a dozen calls in the last few months where AT&T labels the Caller ID as ‘SPAM Call’. I haven’t answered any of these calls and I guess I should, just to see if they are robocalls or something else. There is a good description of the mechanics of the STIR/SHAKEN process at the bandwidth.com website that’s not too technical.

There is still one big flaw in the new protocols in that STIR/SHAKEN only works for Voice over IP calls. Such calls are already fully digital and the protocols are able to layer on the STIR attestation data along with delivering a call. However, traditional TDM phone calls won’t work with the new protocols since there is no easy way to transmit additional data about the call for use by terminating telephone companies.

There are still a lot of traditional TDM phones in the world. It’s been estimated that 40% of residential landlines and almost half of business telephone lines are still using TDM technology. Everybody with a copper telephone line from a telco likely still uses traditional TDM voice technology. The business/government market still uses a lot of TDM technology, using either individual telephone lines, traditional Centrex service, or PBXs that use channelized PRIs to carry voice traffic. Almost no international calls, except calls from Canada, will use the STIR protocol.

None of the calls originating on traditional phone switches will carry the STIR attestation information. There is a huge concern from rural telephone companies which fear that calls from their customers will be labeled as spam. Businesses that use older phone technology are going to have the same concerns. There are general guidelines that tell the companies that terminate and deliver calls to customers how to deal with the various STIR protocols. But there is nothing to stop a telco from being cautious and labeling all calls without the STIR designation as spam. The calls would still be delivered, but how many people will answer such calls?

There is also a concern that robocallers will simply originate calls through TDM phone systems. By doing so, they might still get calls delivered if telcos allow other TDM calls through. Robocalling is likely to work like email spam – robocallers will keep changing tactics to defeat the STIR protocols.

Small telcos have a dilemma. Most of them have lost over half of their customers over time and have lost half of the calling volume from the customers that remain. There is no easy or affordable path for somebody originating calls on a copper network to convert them to VoIP. Rural America has already suffered for years from long-distance routing schemes that fail to deliver calls to rural markets. Rural telephony gets really screwed if calls originating from rural areas are also no longer trusted.

The FCC knows about these limitations and is moving forward anyway. Robocalls are a huge problem and the FCC has been under fire from Congress to do something, so they acted. But there is still a lot of risk that much of our phone calling is going to get gummed up with the cutover to STIR/SHAKEN – particularly calls coming from rural America.

The Evolution of 5G

Technology always evolves and I’ve been reading about where scientists envision the evolution of 5G. The first generation of 5G, which will be rolled out over the next 3-5 years, is mostly aimed at increasing the throughput of cellular networks. According to Cisco, North American cellular data volumes are growing at a torrid 36% per year, and even faster than that in some urban markets where the volumes of data are doubling every two years. The main goal of first-generation 5G is to increase network capacity to handle that growth.
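Those two growth figures are consistent with each other, as a quick calculation shows:

```python
# A 36% annual growth rate implies a doubling time of about 2.25
# years, so urban markets doubling every two years are growing only
# modestly faster than the North American average.
from math import log

doubling_years = log(2) / log(1.36)
print(f"Doubling time at 36%/year: {doubling_years:.2f} years")
```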

However, if 5G is deployed only for that purpose we won’t see the giant increases in speed that the public thinks is coming with 5G. Cisco is predicting that the average North American cellular speed in 2026 will be around 70 Mbps – a far cry from the gigabit speed predictions you can find splattered all over the press.

There is already academic and lab work looking into what is being labeled 6G. That work uses terahertz spectrum and promises to potentially deliver wireless speeds of up to 1 terabit per second. I’ve already seen a few articles touting this as a giant breakthrough, but the articles didn’t mention that the effective distance for this spectrum can be measured in a few feet – this will be an indoor technology and will not be the next cellular replacement for 5G.

This means that, to some degree, 5G is the end of the line in terms of cellular delivery. This is likely why the cellular carriers are gobbling up as much spectrum as they can. That spectrum isn’t all needed today but will be needed by the end of the decade. The cellular carriers will use every spectrum block now to preserve the licenses, but the heavy lifting for most of the spectrum being purchased today will come into play a decade or more from now – the carriers are playing the long game so that they aren’t irrelevant in the not-too-distant future.

This doesn’t mean that 5G is a dead-end, and the technology will continue to evolve. Here are a few of the ideas being explored in labs today that will enhance 5G performance a decade from now:

  • Large Massive Network MIMO. This means expanding the density and capacity of cellular antennas to simultaneously handle multiple spectrum bands. We need much better antennas if we are to get vastly greater data volumes into and out of cellular devices. For now, data speeds on cellphones are being limited by the capacity of the antennas (see the capacity sketch after this list).
  • Ultra Dense Networks (UDN). This envisions the end of cell sites as we think about them today. This would come first in urban networks, where there will be a hyper-dense deployment of radio devices that would likely incorporate small cells, WiFi routers, femtocells, and M2M gateways. In such an environment, cellphones can interact with the cloud rather than with a traditional cell site. This eliminates the traditional cellular standard of one cell site controlling a transaction. In a UDN network, a cellular device could connect anywhere.
  • Device-to-Device (D2D) Connectivity. The smart 5G network in the future will let nearby devices communicate with each other without having to pass traffic back and forth to a data hub. This would move some cellular transactions to the edge, and would significantly reduce logjams at data centers and on middle-mile fiber routes.
  • A Machine-to-Machine (M2M) Layer. A huge portion of future web traffic will be communications between devices and the cloud. This research envisions a separate cellular network for such traffic that maximizes M2M communications separately from traffic used by people.
  • Use of AI. Smart networks will be able to shift and react to changing demands and will be able to shuffle and share network resources as needed. For example, if there is a street fair on a street that usually carries vehicle traffic, the network would smartly reconfigure to recognize the changing demand for connectivity.
  • Better Batteries. None of these improvements will come along until there are better ‘lifetime’ batteries that can allow devices to use more antennas and process more data.
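Here is the capacity sketch promised above: an idealized MIMO model showing why antenna counts matter. Real networks fall well short of this Shannon bound, and the antenna counts, channel width, and SNR are illustrative assumptions:

```python
# Idealized MIMO capacity: C = min(Nt, Nr) * B * log2(1 + SNR).
# Antenna counts, channel width, and SNR are illustrative.
from math import log2

def mimo_capacity_mbps(n_tx: int, n_rx: int, bandwidth_mhz: float,
                       snr_db: float) -> float:
    snr_linear = 10 ** (snr_db / 10)
    return min(n_tx, n_rx) * bandwidth_mhz * log2(1 + snr_linear)

# A 2x2 handset link versus an 8x8 link over the same 20 MHz channel:
print(f"2x2: {mimo_capacity_mbps(2, 2, 20, 20):,.0f} Mbps ceiling")
print(f"8x8: {mimo_capacity_mbps(8, 8, 20, 20):,.0f} Mbps ceiling")
```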

Wireless marketing folks will be challenged to find ways to describe these future improvements to the 5G network. If the term 6G becomes associated with terahertz spectrum, marketers are going to have to find something other than a ‘G’ term to over-hype the new technologies.