Mapping Cellular Data Speeds

AT&T recently filed comments in FCC Docket 19-195, the proceeding looking at overhauling broadband mapping, outlining the company's proposal for reporting wireless data speeds. I think a few of their recommendations are worth noting.

4G Reporting. Both AT&T and Verizon support reporting on 4G cellular speeds using a 5 Mbps download and 1 Mbps upload test with a cell edge probability of 90% and a loading of 50%. Let me dissect that recommendation a bit. First, this means that a customer has a 90% chance of being able to make a data connection at the defined edge of a cell tower's coverage range.

The more interesting reporting requirement is the 50% loading factor. This means the reported coverage area would meet the 5/1 Mbps speed requirement only when a cell site is 50% busy with customer connections. Loading is something you rarely see the cellular companies talk about. Cellular technology is like most other shared-bandwidth technologies in that a given cell site shares its bandwidth among all users. A cell site that barely meets the 5/1 Mbps speed threshold when it's 50% busy is going to deliver significantly lower speeds as the cell site gets busier. We've all experienced degraded cellular performance at rush hour – the normal peak time for many cell sites. This reporting requirement is a good reminder that cellular data speeds vary during the day according to how many people are using a cell site – something the cellular companies never bother to mention in their many ads talking about their speeds and coverage.
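
To make the loading math concrete, here's a minimal sketch of shared cellular capacity. It's my own illustration, not anything from the AT&T filing: the 250 Mbps sector capacity is an assumed number, and treating "loading" as simply the count of active users is a simplifying assumption.

```python
# Illustrative only: a toy model of shared cellular capacity.  The capacity
# figure and the idea that "loading" equals the number of active users are
# simplifying assumptions for this sketch, not anything from the AT&T filing.

SECTOR_CAPACITY_MBPS = 250.0   # assumed total downlink capacity of one sector
USERS_AT_50_PCT_LOAD = 50      # assumed active users when the site is 50% loaded

def per_user_speed(active_users: int, capacity_mbps: float = SECTOR_CAPACITY_MBPS) -> float:
    """Split the shared sector capacity evenly among currently active users."""
    return capacity_mbps / max(active_users, 1)

for active in (25, 50, 75, 100):
    load_pct = 100 * active / (2 * USERS_AT_50_PCT_LOAD)
    print(f"{load_pct:5.0f}% loaded ({active:3d} users) -> ~{per_user_speed(active):4.1f} Mbps each")

# Output: 25% -> 10.0 Mbps, 50% -> 5.0 Mbps, 75% -> 3.3 Mbps, 100% -> 2.5 Mbps.
# A site that just meets 5/1 Mbps at 50% loading falls well short at rush hour.
```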

The recommended AT&T maps would show areas that meet the 5/1 Mbps speed threshold, with no requirement to report faster speeds. I find this recommendation surprising because Opensignal reports that average 4G LTE speeds across America are as follows:

              2017          2018
AT&T          12.9 Mbps     17.87 Mbps
Sprint         9.8 Mbps     13.9 Mbps
T-Mobile      17.5 Mbps     21.1 Mbps
Verizon       14.9 Mbps     20.9 Mbps

I guess that AT&T favors the lowly 5/1 Mbps threshold since that will show the largest possible coverage area for wireless broadband. While many AT&T cell sites provide much faster speeds, my guess is that most faster cell sites are in urban areas and AT&T doesn’t want to provide maps showing faster speeds such as 15 Mbps because that would expose how slow their speeds are in most of the country. If AT&T offered faster speeds in most places, they would be begging to show multiple tiers of cellular broadband speeds.

Unfortunately, maps using the 5/1 Mbps criteria won’t distinguish between urban places with fast 4G LTE and more rural places that barely meet the 5 Mbps threshold – all AT&T data coverage will be homogenized into one big coverage map.

About the only good thing I can say about the new cellular coverage maps is that if the cellular companies report honestly, we’re going to see the lack of rural cellular broadband for the first time.

5G Broadband Coverage. I don't think anybody will be shocked that AT&T (and the other big cellular companies) don't want to report 5G coverage. Although they are spending scads of money touting their roll-out of 5G, they think it's too early to tell the public where they have coverage.

AT&T says that requiring 5G reporting at this early stage of the new technology would reveal sensitive information about cell site location. I think customers who pony up extra for 5G want to know where they can use their new expensive handsets.

AT&T wants 5G coverage to fall under the same 5/1 Mbps coverage maps, even though the company is touting vastly faster speeds using new 5G phones.

It's no industry secret that most 5G deployment announcements are made largely for public relations purposes. For example, AT&T is loudly proclaiming the number of major cities that now have 5G, but this filing shows that they don't want the public to know the small areas that can participate in these early market trials.

If 5G is a reasonable substitute for landline broadband, then the technology should not fall under the cellular reporting requirements. Instead, the cellular carriers should be required to show where they offer speeds exceeding 10/1 Mbps, 25/3 Mbps, 100/10 Mbps, and 1 Gbps. I'm guessing a 5G map using these criteria would largely show a country that has no 5G coverage – but we'll never know unless the FCC forces the wireless companies to tell the truth. I think that people should be cautious about spending extra for 5G-capable phones until the cellular carriers are honest with them about 5G coverage.

The FCC’s 15th Annual Broadband Deployment Report

The FCC just released its most recent annual report on the state of US broadband. This report is mandated by Section 706 of the Telecommunications Act of 1996 which requires the FCC to “determine whether advanced telecommunications capability is being deployed to all Americans in a reasonable and timely fashion”. The FCC concludes in this latest report that broadband deployment is reasonable and that actions taken by this Commission are helping to close the broadband gap.

I take exception to several findings in this latest report. First, everybody in the country now understands that the FCC's conclusions are based upon dreadfully inaccurate 477 data reported by ISPs. There have been numerous studies undertaken at the state and local levels showing that the FCC maps undercount households without broadband. Even USTelecom, the group mostly representing the largest telcos, showed that the FCC maps in Missouri and Virginia classified 38% of rural homes as served when in fact they were unserved. Microsoft has been gathering credible data showing that well over 150 million people aren't connecting at the FCC's defined broadband speed of 25/3 Mbps.

For the FCC to draw any conclusions based upon inaccurate 477 data is ridiculous. A few years ago the FCC could have claimed to not understand the extent to which their data is flawed, but they’ve been shown extensive evidence that the 477 data is incredibly bad, and yet they still plowed forward in this report pretending that statistics based upon 477 data have any meaning. There is not one number in this report that has even the slightest amount of credibility and the FCC knows this.

With the knowledge that the FCC now has about the inaccuracy of their data, this FCC should have humbly admitted that they don’t know the number of households that don’t have broadband. The report could have discussed ways that the Commission is trying to fix the bad data and described steps they have taken to improve rural broadband. But for this report to lead off with a claim that the number of homes without broadband fell by 18% in 2018 is a joke – there is zero chance that’s an accurate statistic. This report should have stated that external analysis has shown that the state of broadband is a lot worse than what they’ve reported in prior annual reports.

I also take exception to the opening statement of the report where the FCC claims that its top goal is “closing the digital divide and bringing the educational, healthcare, social, and civic benefits of connectivity to all Americans seeking broadband access.” This FCC’s top goal is clearly to eliminate regulatory rules that create any obligations for the largest carriers. This FCC already completely deregulated broadband – something an agency would never do if their goal was to improve broadband access. Most of the major dockets that have been approved by this FCC have made it easier for the big carriers to deploy 5G or to otherwise avoid any regulatory burdens.

It’s insulting to the American people for the agency to state that their top goal is improving broadband when their actions show that their priorities are elsewhere. Regulatory agencies are not supposed to engage in propaganda, and this document reeks of self-promotion.

Finally, this report trots out the oft-repeated message that broadband is improving because of this FCC’s effort to remove barriers to broadband investment. I don’t think Chairman Pai makes a speech or writes an opinion that doesn’t bring up this disproved argument. We know by now that those without broadband fall into two categories – rural homes that don’t have access to a broadband connection and urban households that can’t afford broadband. The big telcos aren’t spending any of their cash to solve these two problems.

There has been a lot of fiber built in recent years. AT&T built fiber to pass 12 million homes as a condition of its merger with DirecTV – an effort the company announced it completed this past summer. Verizon has been building fiber to bolster their cellular network, including an expansion of small cell sites – largely as a way to reduce their reliance on paying others for transport. These fiber efforts have nothing to do with the repeal of net neutrality or the ending of broadband regulation. Chairman Pai probably ought to cut back on making this claim, because his real legacy is that he's emboldened the big cable companies to begin regularly increasing broadband rates since there's no threat of regulatory oversight. Chairman Pai and his light-touch regulation will get the credit when broadband costs $100 per month a few years from now.

NFL City Broadband

Every few years a large city takes a hard look at the broadband issue and considers building a citywide fiber network to make their city more competitive. A few years ago, San Francisco took a hard look at the issue. Before then, cities like Seattle, Baltimore, Cleveland, and others considered fiber networks.

The latest city that might be joining the fray is Denver, where fiber proponents are pushing the City Council to put an initiative on the 2020 ballot that would opt the city out of statewide restrictions on municipal participation in finding fiber solutions. Numerous smaller communities in Colorado have already held ballot initiatives that allowed their cities to opt out of the restrictions. Some of those cities have gone on to build fiber networks and others are now studying the issue.

If such a ballot initiative passed it would not necessarily mean that Denver would be considering building a fiber network. Instead, this would remove the restrictions created by a law sponsored by the big incumbent telephone and cable companies that requires a referendum before a city can even have a serious conversation about fiber.

A lot of people probably wonder why a large city would consider building a fiber network. It turns out that many cities have sizable pockets without adequate broadband. There are places in every big city where the cable companies never provided service – often apartment buildings in poor neighborhoods. I've written several blogs about studies showing that AT&T redlined DSL deployment and that numerous poor neighborhoods still can only get DSL at speeds of 3 Mbps or less. I can't remember anymore who made the estimate, but I recall a paper published six or seven years ago that estimated that there were as many people in cities with no good broadband option as there are in rural America.

Even where cities have broadband, the big cities still have digital deserts where whole neighborhoods barely subscribe to broadband because of cost. Buffalo, NY identified a huge homework gap and found that many students there didn't have broadband at home. After some investigation, the city found numerous neighborhoods where only 30 – 40% of residents could afford broadband. Buffalo has begun a program to provide free home WiFi for students, with the first deployment to cover 5,500 homes.

There have been several recent studies that have shown that affordability has become the number one reason why homes don’t have broadband. That issue is about to intensify as all of the big cable companies are starting to raise broadband rates annually. The big cable companies are also tamping down on special pricing that lets many homes get broadband for an affordable rate for a few years. Cities are recognizing that they have to find ways to solve the digital divide because they can see a huge difference between neighborhoods with and without broadband.

No NFL city has yet tackled building a fiber network to everybody, and perhaps none of them ever will. Building a fiber network of that magnitude is expensive and cities like San Francisco and Seattle got estimates of price tags over $1 billion to provide fiber everywhere. All big cities also already have some neighborhoods with fiber, making it harder to justify building fiber everywhere.

However, every big city has neighborhoods with poor broadband options and neighborhoods suffering from a huge homework gap and digital divide because of affordability. I expect more cities are going to tackle initiatives like the one undertaken in Buffalo to find ways to get broadband to those who can’t afford big-ISP prices.

Many cities are restricted from taking a serious look at broadband solutions because of statewide legal restrictions. The Colorado legislation that requires a referendum just to consider a broadband solution is typical of these laws. There are twenty-two states with some sort of restriction on municipal broadband which is intended to stop the cities in those states from looking for solutions.

The bottom line is that the only solutions for the digital divide and the homework gap are going to have to come locally. And that means that cities must be free to look for broadband solutions for neighborhoods that lack broadband options. There have been enough studies demonstrating that students without home broadband underperform those with broadband in the home. I have no idea if the City Council in Denver is willing to at least tackle the ballot initiative to allow them to talk about the issue – but if they don't, then their poorer neighborhoods are doomed to remain at a huge disadvantage to the rest of the city.

A New National Broadband Plan?

Christopher Terry recently published an article for the Benton Institute arguing that the National Broadband Plan has failed. The plan was initiated by Congress in 2009, which instructed the FCC to develop a plan to make sure that every American had access to broadband within a decade. The article details the many spectacular ways that the plan has fallen short.

In my opinion, the National Broadband Plan never had the slightest chance of success because it didn’t have any teeth. Congress authorized the creation of the plan as a way for politicians to show that they were pro-broadband. The plan wasn’t much more than a big showy public relations stunt. Congress makes symbolic votes all of the time and this was just another gesture that demonstrated that Congress cared about broadband and that also served to quiet broadband proponents for a few years. If Congress cared about broadband they would have followed up the plan with a vote to force the FCC to implement at least some aspects of the plan.

I have no doubt that my post-mortem of the effort will offend those who worked to develop the plan. I know that several people who worked on the plan still prominently display that fact on their resumes a decade later. I'm sure that working on the plan was an exhilarating process, but at the end of the day, the effort must be measured in terms of its success. The folks that created the plan, and the rest of the country, were duped by the FCC.

The FCC never had the slightest interest in adopting the big recommendations of the plan. There is probably no better evidence of this than when the Tom Wheeler FCC awarded $11 billion to the big telcos in the CAF II process – an award that couldn't have been more antithetical to the National Broadband Plan. For those who follow FCC dockets, there are dozens of examples over the last decade where the FCC sided with the big carriers instead of with better rural broadband.

The fact is that the US government doesn’t do well with grandiose plans and lofty long-term goals. Government agencies like the FCC mostly implement things that are mandated by Congress – and even then they often do the bare minimum. Even without the National Broadband Plan, the FCC already has a Congressional mandate to make certain that rural broadband is equivalent to urban broadband – and we annually see them do a song and dance to show how they are complying with this mandate while they instead largely ignore it.

This is not to say that broadband plans are generically bad. For example, the state of Minnesota developed its own set of broadband goals, with the most prominent goal of defining broadband in the state as connections of at least 100 Mbps. The state has implemented that goal when awarding broadband grants, and unlike the FCC, the state has awarded grant funding to build real rural broadband solutions. They’ve refused to spend money on technologies that deliver speeds that the state doesn’t consider as broadband.

I fully expect to hear a plea to develop a new plan and I hope that most of the folks who are working for better broadband ignore any such effort. Compared to ten years ago there are now a lot of organizations working for better broadband. Hundreds of rural communities have created citizen broadband committees looking for a local solution. There are county governments all over the country making grants to help lure ISPs to serve their county. Statewide groups are working to solve the digital divide and the homework gap. There are a lot of people actively advocating for real broadband solutions.

These advocates don’t need a national goal document to tell them what they want. By now, communities understand good broadband in the simplest form – it’s something their community either has or doesn’t have. Communities now understand the digital divide and the homework gap. Wasting federal dollars to create a new National Broadband Plan wouldn’t move any community one inch closer to better broadband, and I hope we resist the temptation to go down that path.

Perverting the FCC Comment Process

In a recent article, BuzzFeed dug into the issue of the FCC receiving millions of bogus comments in the last two rounds of the net neutrality docket. During the 2015 net neutrality comment period, the agency received over 4 million comments. Many of these were legitimate, such as the flood of comments prompted by HBO's John Oliver, who urged people to comment in favor of net neutrality.

When the new FCC wanted to reverse the original net neutrality order they had to again open up the docket for public comment. This second time the FCC got over 20 million comments. The comments were so voluminous that the FCC website crashed in May 2017.

There were fake comments filed on both sides of the issue. On the pro-net neutrality side were 8 million nearly identical comments tied to email addresses from FakeMailGenerator.com. There were another million comments from people with @pornhub.com email addresses. On the anti-net neutrality side, BuzzFeed identified several organizations that uploaded millions of comments using names, addresses, and email addresses that came from a major data breach. These fake comments were generated on behalf of real people who had no idea their names were being used in the FCC proceeding. The fake filings included comments from people who had died and anti-net neutrality comments from a few Democrats in the House of Representatives who clearly were pro-net neutrality.

While the FCC's net neutrality dockets received the largest number of fake comments, fake comments are being filed in other FCC dockets, and fake comments are also being submitted on legislation at state legislatures.

As somebody who often comments on FCC dockets, the fake comments give me heartburn. Flooding a docket with fake comments makes it likely that legitimate comments are not read or considered. What might be the most interesting thing about the net neutrality docket is that in both cases it was clear the FCC Commissioners had already decided how they were going to vote – so the fake comments had no real impact. But most FCC dockets are not partisan. For example, there were a lot of fake comments filed in the docket considering changes to the rules for cable cards – the devices that allow people to avoid paying for the cable company's set-top boxes. That kind of docket is not partisan and is more typical of the kinds of issues that the FCC has to wrangle with.

Hopefully, legal action will be taken against the bad actors identified in the net neutrality filings. Several companies have been formed for the express purpose of generating large volumes of comments in government dockets. There is nothing wrong with working with organizations to generate comments to politicians. It's almost the definition of the First Amendment if AARP galvanizes members to comment against changes in Social Security. But it's a perversion of democracy when fake comments are generated to try to influence the political process.

Fighting this issue was not made any easier when the current FCC under Ajit Pai ignored public records requests in 2017 from parties that wanted to look deeper into the underlying fake comments. After a lawsuit was filed, the FCC eventually responded to the public records requests, which led to investigations like the one described in the BuzzFeed article.

There are probably ways for the FCC and other agencies to restrict the volume of fake comments. For example, the FCC might end the practice of allowing large quantities of comments to be filed in bulk. But federal agencies have to be careful not to kill legitimate comments. It's not unusual for an organization to encourage members to file, and they often do so using the same language in multiple filings.
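
Purely as a hypothetical illustration (not anything the FCC actually does), here is a small sketch of how an agency could flag bulk-identical filings without discarding them: normalize each comment's text, hash it, and count how many submissions share the same fingerprint. The sample comments are invented for the example.

```python
# Hypothetical sketch: grouping identical comment text so bulk campaigns can be
# identified (and perhaps weighted or verified) without discarding them.
from collections import Counter
import hashlib
import re

def normalize(text: str) -> str:
    """Lowercase and strip punctuation/extra whitespace so trivial edits still match."""
    collapsed = re.sub(r"\s+", " ", text.lower())
    return re.sub(r"[^a-z0-9 ]+", "", collapsed).strip()

def fingerprint(text: str) -> str:
    return hashlib.sha256(normalize(text).encode()).hexdigest()

# Invented example filings, not real FCC comments.
comments = [
    "I support keeping the current rules.",
    "I support keeping the current rules!",
    "Please repeal the current rules.",
    "I support   keeping the current rules.",
]

counts = Counter(fingerprint(c) for c in comments)
for c in comments:
    print(f"filed {counts[fingerprint(c)]}x: {c!r}")
```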

This is another example of how technology can be used for negative purposes – in essence, the FCC was hacked in these dockets. As long as there is a portal for citizens to make comments it’s likely that there will be fake comments made. Fake comments are often being made outside the government process and fake reviews are a big problem for web sites like Amazon and Yelp. We need to find a way to stop the fake comments from overwhelming the real comments.

Using Wireless Backhaul

Mike Dano of Light Reading reports that Verizon is considering using wireless backhaul to reach as many as 20% of small cell sites. Verizon says they will use wireless backhaul for locations where they want to provide 5G antennas but can't get fiber easily or affordably. The article cites an example of using wireless backhaul to provide connectivity where it's hard to get the rights-of-way to cross railroad tracks.

This prompts me today to write about the issues involved with wireless backhaul. Done well it can greatly expand the reach of a network. Done poorly it can degrade performance or cause other problems. This is not an anti-Verizon blog because they are one of the more disciplined carriers in the industry and are likely to deploy wireless backhaul the right way.

Dano says that Verizon has already addressed one issue that is of concern today to municipalities that are seeing small cell deployments. Cities are worried about small cell devices that are large and unsightly. There are already pictures on the web of small cells gone awry where a mass of different electronics are pole-mounted to create an unsightly mess. Verizon describes their solution as integrated, meaning that no additional external antennas are needed – implying that the backhaul is likely using the same frequencies being used to reach customers. The small cell industry would do well to take heed of Verizon’s approach. It looks like courts are siding with municipalities in terms of being able to dictate aesthetic considerations for small cells.

Another issue to consider is the size of the wireless backhaul link. For instance, if Verizon uses millimeter-wave backhaul there is a limitation today of roughly a 1-gigabit link over 2 miles or a 2-gigabit link over about a mile. The amount of bandwidth and the distance between transmitters differ according to the frequency used – but none of the wireless backhaul technologies deliver as much bandwidth as fiber. Verizon has been talking about supplying 10-gigabit links to cell sites using next-generation PON technology, and wireless backhaul is going to be far less robust than fiber. This is likely not an issue today when many cell sites use less than 2 gigabits of bandwidth. However, as the amount of broadband used by cellular networks keeps doubling every few years, it might not take long for many cell sites to outgrow a wireless backhaul link.

The primary issue with wireless backhaul is the bandwidth dilution that comes from feeding multiple wireless sites from one fiber connection. Consider an example where one cell site is fiber-fed with a 10-gigabit fiber backhaul. If that site then makes 2-gigabit wireless connections to four other cell sites, each of the five sites effectively tops out at 2 gigabits of usage. The bandwidth of the four secondary sites is limited by the 2-gigabit link feeding each one, and the core site loses whatever bandwidth is being used by the other sites.
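
Here's a quick sketch of that arithmetic (my own illustration, not Verizon's engineering), using the 10-gigabit fiber feed and 2-gigabit wireless links from the example; treating all four wireless links as fully loaded is an assumption for the worst case.

```python
# Sketch of the bandwidth-dilution arithmetic: one fiber-fed site relays
# wireless backhaul to four other sites.  Link sizes come from the example in
# the text; assuming the wireless links run full is the worst-case scenario.

FIBER_FEED_GBPS = 10.0        # fiber backhaul at the core site
WIRELESS_LINK_GBPS = 2.0      # each wireless backhaul link
SECONDARY_SITES = 4

# Each secondary site can never use more than its wireless link delivers.
secondary_cap = WIRELESS_LINK_GBPS

# The core site keeps whatever the secondary sites aren't using.
worst_case_core = FIBER_FEED_GBPS - SECONDARY_SITES * WIRELESS_LINK_GBPS

print(f"Each secondary site tops out at {secondary_cap:.0f} Gbps")
print(f"Core site left with {worst_case_core:.0f} Gbps if all four links run full")
```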

That's probably an unrealistic example because today most cell sites use less than 2 gigabits of bandwidth. Verizon's use of 10-gigabit fiber backhaul moves them ahead of the rest of the industry, where cell sites typically have 1- to 5-gigabit backhaul connections today. The weaknesses of wireless backhaul become a lot more apparent when the wireless network begins at a site that only has a 1- or 2-gigabit fiber connection.

I'm sure that over time Verizon plans to build additional fiber to relieve network congestion. Their use of wireless backhaul is going to push off the need for fiber by a decade or more and is a sensible way to preserve capital today.

The issues with wireless backhaul are far more critical for carriers that don't have Verizon's deep pockets, fiber networks, or discipline. It's not hard today to find wireless networks that have overdone wireless backhaul. I've talked to numerous rural customers who are buying fixed wireless links from WISPs that deliver only a few Mbps of bandwidth. Some of these customers are getting low speeds because they live too far from the transmitting tower. Sometimes speeds are low because a WISP oversold the local antenna and is carrying more customers than the technology can comfortably serve.

But many rural wireless systems have slow speeds because of overextended wireless backhaul. In many cases in rural America, there are no fiber connections available for fixed wireless transmitters, which are often installed on grain elevators, water towers, church steeples or tall poles. I’ve seen networks that are making multiple wireless hops from a single gigabit fiber connection.

I've also seen preliminary designs for wireless 'mesh' networks where pole-mounted transmitters will beam wireless broadband into homes. Every wireless hop in these networks cuts the bandwidth in half at both radio sites (as bandwidth is split and shared). If you feed a mesh wireless network with a gigabit of bandwidth, by the fifth transmitter in the chain only about 62 Mbps of raw bandwidth remains (and even that overstates real throughput, since it doesn't account for overhead). It's not hard to do the math to see why some rural wireless customers only see a few Mbps of bandwidth.
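
Here's that hop-by-hop arithmetic as a small sketch, using the simple halve-per-hop rule described above and ignoring overhead:

```python
# Sketch of the halve-per-hop rule for a wireless mesh fed with 1 Gbps.
# Real radios lose additional capacity to protocol overhead and interference,
# so these raw numbers are optimistic.

feed_mbps = 1000.0
bandwidth = feed_mbps
for site in range(1, 6):
    print(f"transmitter {site}: ~{bandwidth:6.1f} Mbps raw")
    bandwidth /= 2  # each hop splits the remaining bandwidth in half

# transmitter 1: 1000.0, 2: 500.0, 3: 250.0, 4: 125.0, 5: 62.5 Mbps
```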

I’m sure that Verizon understands that many of the cell sites they serve today wirelessly will eventually need fiber, and I’m sure they’ll eventually build the needed fiber. But I also expect that there will be networks built with inadequate wireless backhaul that will barely function at inception and that will degrade over time as customer demand grows.

The Fight Over Retransmission Consent

There has been a quiet legislative battle brewing in Congress all year over renewing STELAR (the Satellite Television Extension and Localism Act Reauthorization), a bill that comes up for renewal every five years. The original bill in 1988 was intended to make sure that rural customers got access to major network television. We lived in a different world in 1988, and it was a technical challenge for satellite providers to get access to local network affiliates (ABC, CBS, FOX, NBC, and PBS) across the country and to broadcast those signals into the appropriate local markets. The original legislation allowed the satellite companies to import network channels from other markets. Without the law, numerous rural markets in 1988 would have lost the ability to watch the major networks.

Of course, Congress loves to tack riders onto legislation, and the STELA legislation became the primary vehicle for updating the retransmission rules for all broadcasters and cable companies. The fight over the renewal has been fierce this year since retransmission consent is the biggest point of contention between broadcasters and cable TV providers.

Retransmission fees have exploded over the last decade. The average local network station now typically charges cable companies a fee of $3 – $4 per viewer for the right to retransmit their content on cable networks. Not too many years ago this was done for free.

It’s not hard to understand the motivation for the broadcast industry. Advertising revenues are in freefall due to cord-cutting and due to the proliferation of web advertising using Google and Facebook. Retransmission fees are a way for broadcasters to fill the coffers and replace that lost revenue. Interestingly, though, most of the retransmission revenue ends up at the big corporations that own the network channels. I’ve talked to local network station owners who say that their corporate parents suck away most of the retransmission revenues in the form of fees to continue with the network franchise. At the end of the day, most of the retransmission revenues end up with the parent companies of ABC (Disney), CBS (CBS Corporation), FOX (FOX Corporation), and NBC (Comcast).

There is no question that retransmission fees are hurting the public because they have been one of the primary drivers (along with sports programming) of ongoing substantial rate increases. The average cable subscriber is now paying between $12 and $15 per month for the right to view network channels on their cable system. These are the fees that many cable companies have been hiding in something like a 'broadcast fee' to allow them to still advertise a low price for basic cable.

As with many of the most contentious issues, the fight is largely about money. With the current number of cable customers around 85 million, retransmission fees are generating $12 to $14 billion per year. However, if you read the comments from the two sides of the issue you would think the argument was entirely about protecting the consumer. Many of the arguments being made are about stopping blackouts – which occur when broadcasters and cable companies can't agree on the fees and conditions for buying programming. If the issue were really about the consumer then Congress would be talking about capping retransmission fees or at least limiting the annual increases.
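
As a back-of-the-envelope check of that revenue estimate, here's the arithmetic using the subscriber count above and the $12 to $15 per month fee range mentioned earlier:

```python
# Back-of-the-envelope check of annual retransmission revenue, using the
# figures cited in the text: roughly 85 million cable subscribers paying
# $12-$15 per month for the four major network channels.

SUBSCRIBERS = 85_000_000
MONTHLY_FEE_RANGE = (12, 15)   # dollars per subscriber per month

for fee in MONTHLY_FEE_RANGE:
    annual_billions = SUBSCRIBERS * fee * 12 / 1e9
    print(f"${fee}/month -> about ${annual_billions:.1f} billion per year")

# $12/month -> ~$12.2B/yr; $15/month -> ~$15.3B/yr, which brackets the
# $12 to $14 billion estimate in the text.
```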

To some degree, the issue transcends cord-cutting. Anybody buying an online service that includes the major networks is also paying some version of these same fees. That's one of the primary reasons why the prices of the online TV-equivalent services have been rising. Online services have more flexibility because they are not required to carry any specific programming. However, once a service decides to carry the four major networks, it is somewhat at the mercy of the broadcasters. As an example, I currently subscribe to PlayStation Vue, which carries the same local network affiliates that I would get from Charter. One has to imagine that the fees charged to PlayStation Vue are similar to what is being charged to the local cable company.

One way to know this is a huge issue is that the industry has created organizations focused on this single topic. For example, most of the cable companies other than Comcast (which is on the opposite side of this issue) have created the American Television Alliance to lobby against certain provisions of the bill. If you look at their website it appears to be a consumer-friendly site, but the members are largely the big cable companies. It's a lobbying organization created solely to influence this legislation.

The legislation was introduced in July as the Modern Television Act of 2019. The five-year clock on the last reauthorization expires soon, but in 2014 the renewal passed many months after the expiration date, and it doesn't look likely that the latest bill will pass on time.

Keeping an Eye on the Future

The IEEE, the Institute of Electrical and Electronics Engineers, has been issuing a document annually that lays out a roadmap to make sure that the computer chips that drive all of our technologies are ready for the future. The latest such document is the 2019 Heterogeneous Integration Roadmap (HIR). The purpose of the document is to encourage the needed research and planning so that the electronics industry creates interoperable chips that anticipate the coming computer needs while also functioning across multiple industries.

This is particularly relevant today because major technologies are heading in different directions. Fields like 5G, quantum computing, AI, IoT, gene splicing, and self-driving vehicles are all pursuing different technology solutions that could easily result in specialized one-function chips. That’s not necessarily bad, but the IEEE believes that all technologies will benefit if chip research and manufacturing processes are done in such a way as to accommodate a wide range of industries and solutions.

IEEE uses the label of ‘heterogeneous integration’ to describe the process of creating a long-term vision for the electronics industry. They identify this HIR effort as the key technology going forward that is needed to support the other technologies. They envision a process where standard and separately manufactured chip components can be integrated to produce the chips needed to serve the various fields of technology.

The IEEE has created 19 separate technical working groups looking at specific topics related to HIR. This list shows both the depth and breadth of the IEEE effort. Working groups in 2019 include:

Difficult Challenges

  • Single chip and multichip packaging (including substrates)
  • Integrated photonics (including plasmonics)
  • Integrated power devices
  • MEMS (miniaturization)
  • RF and analog mixed signals

Cross Cutting Topics

  • Emerging research materials
  • Emerging research devices
  • Interconnect
  • Test
  • Supply chain

Integrated Processes

  • SiP
  • 3D + 2.5D
  • WLP (wafer level packaging)

Packaging for Specialized Applications

  • Mobile
  • IoT and wearable
  • Medical and health
  • Automotive
  • High performance computing
  • Aerospace and defense

Just a few years ago many of the specific technologies were not part of the HIR process. The pace of technological breakthroughs is so intense today that the whole process of introducing new chip technology could easily diverge. The IEEE believes that taking a holistic approach to the future of computing will eventually help all fields as the best industry practices and designs are applied to all new chips.

The effort behind the HIR process is substantial since various large corporations and research universities provide the talent needed to dig deeply into each area of research. I find it comforting that the IEEE is working behind the scenes to make sure that the chips needed to support new technologies can be manufactured efficiently and affordably. Without this effort the cost of electronics for broadband networks and other technologies might skyrocket over time.

Is Telephony a Natural Monopoly?

For my entire career, I’ve heard it said that telecommunications is a natural monopoly. That was the justification for creating monopoly exchange boundaries for telcos and for issuing exclusive franchise agreements for cable companies. This historic reasoning is why the majority of Americans in urban areas are still stuck with duopoly competition that is trending towards a cable monopoly.

I worked for Southwestern Bell pre-divestiture and they were proud of their monopoly. Folks at Ma Bell thought the telephone monopoly was the best possible deal for the public, and they constantly bragged about the low rates for a residential telephone line, usually something less than $15 per month. But when you looked closer, the monopoly was not benefitting the average household. Long distance was selling for 12 to 25 cents per minute, and a major percentage of households had phone bills over $100 per month.

I’ve been doing some reading on the history of the telephone industry and found some history I never knew about – and which is different than what Ma Bell told employees for 100 years.

Alexander Graham Bell was granted patents for the telephone in 1876. During the 18-year life of the original patents, Bell Telephone held a monopoly on telephone service. Bell Telephone mostly built to large businesses and rich neighborhoods, and the country still predominantly communicated via telegraph. Bell Telephone was not considered much of a success. By 1894 there were still fewer than 5 telephones in the country per 1,000 population, and an average of only 37 calls per day per 1,000 people.

As soon as the patents expired, numerous competitors entered the market. They built to towns that Bell Telephone had ignored but also built competing networks in many Bell Telephone markets. By the end of 1896, 80 competitors had grabbed 5% of the total telephone market. By 1900 there were 3,000 competitive telephone companies.

By 1907 the competitors had grabbed 51% of the national market and had also driven down urban telephone rates. AT&T’s returns (AT&T had officially become the name of Bell Telephone) had dropped from 46% annually in the late 1800s to 8% by 1906. After 17 years of monopoly, the country had only 270,000 telephones. After 13 years of competition there were over 6 million phones in the country.

The death of telephone competition started when Theodore Vail became president of AT&T in 1907. By 1910 the company was buying competitors and lobbying for a monopoly. Federal regulators stepped in to slow AT&T's purchases of telephone companies after Vail tried to buy Western Union.

In a compromise reached with the federal government, AT&T agreed to stop buying telcos and to interconnect with independent telephone companies to create one nationwide network. That compromise was known as the Kingsbury Commitment. Vail used this compromise to carve out monopoly service areas by only agreeing to interconnect with companies that would create exchange boundaries and further agree not to compete in AT&T exchanges. With almost the opposite result from what federal regulators had hoped for, the Kingsbury Commitment resulted in a country carved into AT&T monopoly telephone service areas.

From that time forward federal regulators supported the new monopoly borders, cementing the arrangement with the Communications Act of 1934. State regulators liked the monopolies because they were easier to regulate – state regulation turned into rate-making procedures that raised rates on businesses to keep residential rates low. AT&T thrived in this environment because they were guaranteed a rate of return, regardless of performance.

The history of telephone service shows that the industry is not a natural monopoly. A natural monopoly is one where a single provider can produce lower rates than can be achieved by allowing competition. Competing networks forced lower telephone rates at the turn of the last century, and after the establishment of the AT&T monopoly we saw monopoly abuse through high long distance rates that didn't drop until MCI challenged the status quo. Today we have a world full of multiple wires and networks and the idea of a natural monopoly is no longer considered valid. Unfortunately, many of the vestiges of the regulations that protect the big telcos are still in place and still create hurdles to unfettered competition.

Shame on the Regulators

It's clear that even before the turn of this century the big telcos had largely walked away from maintaining and improving residential service. The evidence for this is the huge number of neighborhoods that are stuck with older copper technologies that haven't been upgraded. The telcos made huge profits over the decades in these neighborhoods and should not have been allowed to walk away from their customers.

In the Cities. Many neighborhoods in urban areas still have first- or second-generation DSL over copper with top speeds of 3 Mbps or 6 Mbps. That technology had a shelf life of perhaps seven years and is now at least fifteen years old.

The companies that deployed the most DSL are AT&T and CenturyLink (formerly Qwest). The DSL technology should have been upgraded over time by plowing profits back into the networks. This happened in some neighborhoods, but as has been shown in several detailed studies in cities like Cleveland and Dallas, the faster DSL was brought to more affluent neighborhoods, leaving poorer neighborhoods, even today, with the oldest DSL technology.

The neighborhoods that saw upgrades got DSL speeds between 15 Mbps and 25 Mbps. Many of these neighborhoods eventually saw speeds as fast as 50 Mbps using a technology that bonds two 25 Mbps DSL circuits. There are numerous examples of neighborhoods with 50 Mbps DSL sitting next to ones with 3 Mbps DSL.

Verizon used a different tactic and upgraded neighborhoods to FiOS fiber. But this was also done selectively. Verizon doesn't seem to have redlined as much as AT&T, but instead built FiOS only where the construction cost was lowest.

In Europe, the telcos decided to compete with the cable companies and have upgraded DSL over time, with the fastest DSL today offering speeds as fast as 300 Mbps. There is talk from DSL vendors about ways to goose DSL up to gigabit speeds (but only for short distances). The telcos here basically stopped looking at better DSL technology after the introduction of VDSL2 at least fifteen years ago.

By now the telcos should have been using profits to build fiber. AT&T has done this using the strategy of building little pockets of fiber in every community near existing fiber splice points. However, the vast majority of rural households served by AT&T are not being offered fiber, and AT&T said recently that they have no plans to build more fiber. CenturyLink built fiber past nearly 1 million homes a few years ago, but that also seems like a dead venture going forward. By now, in 2019, each of these telcos should have been deep into building fiber across the urban neighborhoods of their whole service areas. Had they done so, they would not be getting clobbered so badly by the cable companies, which are taking away millions of DSL customers every year.

Rural America. The big telcos started abandoning rural America as much as thirty years ago. They’ve stopped maintaining copper and have not voluntarily made any investments in rural America for a long time. There was a burst of rural construction recently when the FCC gave them $11 billion to improve rural broadband to 10/1 Mbps – but that doesn’t seem to be drawing many rural subscribers.

It’s always been a massive challenge to bring the same speeds to rural America that can be provided in urban America. This is particularly so with DSL since the speeds drop drastically with distance. DSL upgrades that could benefit urban neighborhoods don’t work well in farmland. But the telcos should have been expanding fiber deeper into the network over time to shorten loop lengths. Many independent telephone companies did this the right way and they were able over time to goose rural DSL speeds up to 25 Mbps.

The big telcos should have been engaging in a long-term plan to continually shorten rural copper loop lengths. That meant building fiber, and while shortening loop lengths they should have served households close to fiber routes with fiber. By now all of the small towns in rural America should have gotten fiber.

This is what regulated telcos are supposed to do. The big telcos made vast fortunes in serving residential customers for many decades. Regulated entities are supposed to roll profits back into improving the networks as technology improves – that’s the whole point of regulating the carrier of last resort.

Unfortunately, the industry got sidetracked by competition from CLECs. This competition first manifested in competition for large business customers. The big telcos used that competition to convince regulators they should be deregulated. Over time the cable companies provided real residential competition in cities, which led to the de facto total deregulation of the telcos.

In Europe, the telcos never stopped competing in cities because regulators didn't let them quit. The European telcos have upgraded copper speeds to levels that customers still find attractive, but the telcos all admit that the next upgrade needs to be fiber. In the US, the big telcos exerted political pressure to gain deregulation at the first hint of competition. US telcos folded and walked away from their customers rather than fighting to maintain revenues.

Rural America should never have been deregulated. Shame on every regulator in every state that voted to deregulate the big telcos in rural America. Shame on every regulator that allowed companies like Verizon to palm off their rural copper to companies like Frontier – a company that almost by definition cannot succeed.

In rural America the telcos have a physical network monopoly and the regulators should have found ways to support rural copper rather than letting the telcos walk away from it. We know this can be done by looking at the different approaches taken by the smaller independent telephone companies. These small companies took care of their copper and most have now taken the next step to upgrade to fiber to be ready for the next century.