Mapping Cellular Data Speeds

AT&T recently filed comments in Docket 19-195, the docket considering changes to broadband mapping, outlining the company’s proposal for reporting wireless data speeds to the FCC. I think a few of their recommendations are worth noting.

4G Reporting. Both AT&T and Verizon support reporting on 4G cellular speeds using a 5 Mbps download and 1 Mbps upload test with a cell edge probability of 90% and a loading of 50%. Let me dissect that recommendation a bit. First, this means that a customer has a 90% chance of being able to make a data connection at the defined edge of a cell tower’s coverage range.

The more interesting reporting requirement is the 50% loading factor. This means the reported coverage area would meet the 5/1 Mbps speed requirement only when a cell site is 50% busy with customer connections. Loading is something you rarely see the cellular companies talk about. Cellular technology is like most other shared bandwidth technologies in that a given cell site shares bandwidth with all users. A cell site that barely meets the 5/1 Mbps data speed threshold when it’s 50% busy is going to deliver significantly slower speeds as the cell site gets busier. We’ve all experienced degraded cellular performance at rush hours – the normal peak times for many cell sites. This reporting requirement is a good reminder that cellular data speeds vary during the day according to how many people are using a cell site – something the cellular companies never bother to mention in their many ads talking about their speeds and coverage.
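To make the loading math concrete, here’s a back-of-the-envelope sketch (my own simplified model, not anything from the AT&T filing) of how per-user speed falls as a cell site gets busier:

```python
# Simplified shared-bandwidth model: a cell sector has a fixed pool of
# downlink capacity that all active users divide equally. The capacity
# figure below is hypothetical, chosen only to illustrate the shape.
def per_user_speed_mbps(cell_capacity_mbps, active_users):
    """Average speed each active user sees when capacity is shared equally."""
    if active_users == 0:
        return cell_capacity_mbps
    return cell_capacity_mbps / active_users

capacity = 100  # Mbps, hypothetical sector capacity
# A site sized to just meet 5 Mbps per user at 50% loading (20 of a
# possible 40 active users) falls below that threshold as loading rises.
for users in (10, 20, 30, 40):
    print(f"{users} active users -> {per_user_speed_mbps(capacity, users):.1f} Mbps each")
```

The numbers are made up, but the shape of the curve is the point: a site that just meets 5 Mbps per user at 50% loading delivers half that when it fills up.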

The recommended AT&T maps would show areas that meet the 5/1 Mbps speed threshold, with no requirement to report faster speeds. I find this recommendation surprising because Opensignal reports that average US 4G LTE speeds were as follows:

Carrier     2017         2018
AT&T        12.9 Mbps    17.87 Mbps
Sprint      9.8 Mbps     13.9 Mbps
T-Mobile    17.5 Mbps    21.1 Mbps
Verizon     14.9 Mbps    20.9 Mbps

I guess that AT&T favors the lowly 5/1 Mbps threshold since that will show the largest possible coverage area for wireless broadband. While many AT&T cell sites provide much faster speeds, my guess is that most faster cell sites are in urban areas and AT&T doesn’t want to provide maps showing faster speeds such as 15 Mbps because that would expose how slow their speeds are in most of the country. If AT&T offered faster speeds in most places, they would be begging to show multiple tiers of cellular broadband speeds.

Unfortunately, maps using the 5/1 Mbps criteria won’t distinguish between urban places with fast 4G LTE and more rural places that barely meet the 5 Mbps threshold – all AT&T data coverage will be homogenized into one big coverage map.

About the only good thing I can say about the new cellular coverage maps is that if the cellular companies report honestly, we’re going to see the lack of rural cellular broadband for the first time.

5G Broadband Coverage. I don’t think anybody will be shocked that AT&T (and the other big cellular companies) don’t want to report 5G coverage. Although they are spending scads of money touting their roll-out of 5G, they think it’s too early to tell the public where they have coverage.

AT&T says that requiring 5G reporting at this early stage of the new technology would reveal sensitive information about cell site location. I think customers who pony up extra for 5G want to know where they can use their new expensive handsets.

AT&T wants 5G coverage to fall under the same 5/1 Mbps coverage maps, even though the company is touting vastly faster speeds using new 5G phones.

It’s no industry secret that most 5G deployment announcements are made for public relations purposes. For example, AT&T is loudly proclaiming the number of major cities that now have 5G, but this filing shows that they don’t want the public to know the small areas that can participate in these early market trials.

If 5G is a reasonable substitute for landline broadband, then the technology should not fall under the cellular reporting requirements. Instead, the cellular carriers should be forced to show where they offer speeds exceeding 10/1 Mbps, 25/3 Mbps, 100/10 Mbps, and 1 Gbps. I’m guessing a 5G map using these criteria would largely show a country that has no 5G coverage – but we’ll never know unless the FCC forces the wireless companies to tell the truth. I think that people should be cautious about spending extra for 5G-capable phones until the cellular carriers are honest with them about 5G coverage.

Is Telephony a Natural Monopoly?

For my entire career, I’ve heard it said that telecommunications is a natural monopoly. That was the justification for creating monopoly exchange boundaries for telcos and for issuing exclusive franchise agreements for cable companies. This historic reasoning is why the majority of Americans in urban areas are still stuck with duopoly competition that is trending towards a cable monopoly.

I worked for Southwestern Bell pre-divestiture, and the company was proud of its monopoly. Folks at Ma Bell thought the telephone monopoly was the best possible deal for the public and constantly bragged about the low rates for a residential telephone line, usually something less than $15 per month. But when you looked closer, the monopoly was not benefitting the average household. Long distance was selling for 12 to 25 cents per minute, and a large percentage of households had phone bills over $100 per month.

I’ve been doing some reading on the history of the telephone industry and found some history I never knew about – and which is different than what Ma Bell told employees for 100 years.

Alexander Graham Bell was granted his patents for telephone service in 1876. During the 18-year life of the original patents, Bell Telephone held a monopoly on telephone service. Bell Telephone mostly built to large businesses and to rich neighborhoods, and the country still predominantly communicated via telegraph. Bell Telephone was not considered much of a success. By 1894 there were still fewer than 5 telephones per 1,000 population, and only an average of 37 calls per day per 1,000 people.

As soon as the patents expired, numerous competitors entered the market. They built to towns that Bell Telephone had ignored but also built competing networks in many Bell Telephone markets. By the end of 1896, 80 competitors had grabbed 5% of the total telephone market. By 1900 there were 3,000 competitive telephone companies.

By 1907 the competitors had grabbed 51% of the national market and had also driven down urban telephone rates. AT&T’s returns (AT&T had officially become the name of Bell Telephone) had dropped from 46% annually in the late 1800s to 8% by 1906. After 17 years of monopoly, the country had only 270,000 telephones. After 13 years of competition there were over 6 million phones in the country.

The death of telephone competition started when Theodore Vail became president of AT&T in 1907. By 1910 the company was buying competitors and lobbying for a monopoly scenario. Federal regulators stepped in to slow AT&T’s purchase of telephone companies after Vail tried to buy Western Union.

In a compromise reached with the federal government, AT&T agreed to stop buying telcos and to interconnect with independent telephone companies to create one nationwide network. That compromise was known as the Kingsbury Commitment. Vail used the compromise to carve out monopoly service areas by agreeing to interconnect only with companies that would create exchange boundaries and further agree not to compete in AT&T exchanges. With almost the opposite result from what federal regulators had hoped for, the Kingsbury Commitment left the country carved into AT&T monopoly telephone service areas.

From that time forward federal regulators supported the new monopoly borders, cementing the arrangement with the Communications Act of 1934. State regulators liked the monopolies because they were easier to regulate – state regulation turned into rate-making procedures that raised rates on businesses to keep residential rates low. AT&T thrived in this environment because it was guaranteed a rate of return, regardless of performance.

The history of telephone service shows that the industry is not a natural monopoly. A natural monopoly is one where a single provider can produce lower rates than can be achieved by allowing competition. Competing networks forced lower telephone rates at the turn of the last century. After the establishment of the AT&T monopoly we saw monopoly abuse through high long distance rates that didn’t drop until MCI challenged the monopoly status quo. Today we have a world full of multiple wires and networks, and the idea of a natural monopoly is no longer considered valid. Unfortunately, many of the vestiges of the regulations that protect the big telcos are still in place and still create hurdles to unfettered competition.

Shame on the Regulators

It’s clear that even before the turn of this century the big telcos largely walked away from maintaining and improving residential service. The evidence for this is the huge number of neighborhoods that are stuck with older copper technologies that haven’t been upgraded. The telcos made huge profits over the decades in these neighborhoods and should not have been allowed to walk away from their customers.

In the Cities. Many neighborhoods in urban areas still have first or second-generation DSL over copper with top speeds of 3 Mbps or 6 Mbps. That technology had a shelf life of perhaps seven years and is now at least fifteen years old.

The companies that deployed the most DSL are AT&T and CenturyLink (formerly Qwest). The DSL technology should have been upgraded over time by plowing profits back into the networks. This happened in some neighborhoods, but as has been shown in several detailed studies in cities like Cleveland and Dallas, the faster DSL was brought to more affluent neighborhoods, leaving poorer neighborhoods, even today, with the oldest DSL technology.

The neighborhoods that saw upgrades got DSL speeds between 15 Mbps and 25 Mbps. Many of these neighborhoods eventually saw speeds as fast as 50 Mbps using a technology that bonded two 25 Mbps DSL circuits. There are numerous examples of neighborhoods with 50 Mbps DSL sitting next to ones with 3 Mbps DSL.

Verizon used a different tactic and upgraded neighborhoods to FiOS fiber. But this was also done selectively. Verizon doesn’t seem to have redlined as much as AT&T; instead, it built FiOS only where the construction cost was the lowest.

In Europe, the telcos decided to compete with the cable companies and have upgraded DSL over time, with the fastest DSL today offering speeds as fast as 300 Mbps. DSL vendors are also talking about ways to goose DSL up to gigabit speeds (but only over short distances). The telcos here basically stopped looking at better DSL technology after the introduction of VDSL2 at least fifteen years ago.

By now the telcos should have been using profits to build fiber. AT&T has done this using the strategy of building little pockets of fiber in every community near existing fiber splice points. However, the vast majority of rural households served by AT&T are not being offered fiber, and AT&T said recently that it has no plans to build more fiber. CenturyLink built fiber past nearly 1 million homes a few years ago, but that also seems like a dead venture going forward. By now, in 2019, each of these telcos should have deployed fiber deep into urban neighborhoods across their whole service areas. Had they done so they would not be getting clobbered so badly by the cable companies, which are taking away millions of DSL customers every year.

Rural America. The big telcos started abandoning rural America as much as thirty years ago. They’ve stopped maintaining copper and have not voluntarily made any investments in rural America for a long time. There was a burst of rural construction recently when the FCC gave them $11 billion to improve rural broadband to 10/1 Mbps – but that doesn’t seem to be drawing many rural subscribers.

It’s always been a massive challenge to bring the same speeds to rural America that can be provided in urban America. This is particularly so with DSL since the speeds drop drastically with distance. DSL upgrades that could benefit urban neighborhoods don’t work well in farmland. But the telcos should have been expanding fiber deeper into the network over time to shorten loop lengths. Many independent telephone companies did this the right way and they were able over time to goose rural DSL speeds up to 25 Mbps.
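The distance sensitivity described above can be sketched with a toy model (the numbers here are hypothetical, chosen for illustration, and not from any vendor’s rate table):

```python
# Toy illustration of why shortening copper loops matters: DSL sync
# speed falls quickly with loop length, so pushing fiber deeper into
# the network raises the speed available at each home.
def approx_dsl_speed_mbps(loop_kft, max_speed_mbps=25.0):
    """Hypothetical model: speed halves for every ~3,000 feet of copper."""
    return max_speed_mbps * 0.5 ** (loop_kft / 3.0)

for loop_kft in (1, 3, 6, 12):  # loop length in thousands of feet
    print(f"{loop_kft},000 ft loop -> ~{approx_dsl_speed_mbps(loop_kft):.1f} Mbps")
```

Under this made-up decay rate, a home on a 12,000-foot rural loop gets a small fraction of the speed available on a short urban loop – which is why the independent telcos that shortened loops were able to push rural DSL to 25 Mbps.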

The big telcos should have been engaging in a long-term plan to continually shorten rural copper loop lengths. That meant building fiber, and while shortening loop lengths they should have served households close to fiber routes with fiber. By now all of the small towns in rural America should have gotten fiber.

This is what regulated telcos are supposed to do. The big telcos made vast fortunes in serving residential customers for many decades. Regulated entities are supposed to roll profits back into improving the networks as technology improves – that’s the whole point of regulating the carrier of last resort.

Unfortunately, the industry got sidetracked by competition from CLECs. This competition first manifested in competition for large business customers. The big telcos used that competition to convince regulators they should be deregulated. Over time the cable companies provided real residential competition in cities, which led to the de facto total deregulation of telcos.

In Europe, the telcos never stopped competing in cities because regulators didn’t let them quit. The telcos have upgraded copper to speeds that customers still find attractive, but the telcos all admit that the next upgrade needs to be fiber. In the US, the big telcos exerted political pressure to gain deregulation at the first hint of competition. US telcos folded and walked away from their customers rather than fighting to maintain revenues.

Rural America should never have been deregulated. Shame on every regulator in every state who voted to deregulate the big telcos in rural America. Shame on every regulator who allowed companies like Verizon to palm off their rural copper to companies like Frontier – a company that cannot succeed, almost by definition.

In rural America the telcos have a physical network monopoly and the regulators should have found ways to support rural copper rather than letting the telcos walk away from it. We know this can be done by looking at the different approaches taken by the smaller independent telephone companies. These small companies took care of their copper and most have now taken the next step to upgrade to fiber to be ready for the next century.

The Market Uses for CBRS Spectrum

Spencer Kurn, an analyst for New Street Research, recently reported on how various market players plan to use the 3.5 GHz CBRS spectrum recently approved by the FCC. I described the FCC’s order in a recent blog. As a quick refresher, this is a large swath of spectrum; the FCC has approved 80 MHz of it for public use and will be auctioning 70 MHz in 2020.

Cellular Bandwidth. Kurn notes that Verizon plans to use the new spectrum to beef up 4G bandwidth now and eventually 5G. Verizon plans to use the spectrum in dense markets and mostly outdoors. Cable companies like Comcast and Charter that have entered the wireless business are also likely to use the spectrum in this manner.

I’ve been writing for a while about the crisis faced by cellular networks. In urban areas they are seeing broadband usage double almost every two years, and keeping up with that growth is a huge challenge. It’s going to require a combination of new spectrum, more cell sites (mostly small cells), and the improvements that come with 5G, such as frequency slicing.

It’s interesting that Verizon only sees this as an outdoor solution, but that makes sense because this spectrum is close in characteristics to the existing WiFi bands and will lose most of its strength in passing through a wall. It also makes sense that Verizon will only do this in metro areas where there is enough outdoor traffic for the spectrum to make a difference. I’ve seen several studies that say that the vast majority of cellular usage happens indoors in homes, businesses, and schools. But this spectrum still becomes one more piece of the solution to help relieve the pressure on urban cell sites.

For this to be of use the spectrum has to be built into cellular handsets. Apple recently announced that they are building the ability to receive Band 48 of CBRS into their new models. They join the Samsung Galaxy S10 and the Google Pixel 3 with the ability to use the spectrum. Over time it’s likely to be built into many phones, although handset manufacturers are always cautious because adding new spectrum bands to a handset increases the draw on the batteries.

Point-to-Multipoint Broadband. Numerous WISPs and other rural ISPs have been lobbying for the use of the spectrum since it can beef up point-to-multipoint broadband networks. These are networks that put a transmitter on a tower and then beam broadband to a dish at a subscriber’s premises. This technology is already widely in use, mostly with the 2.4 GHz and 5.0 GHz WiFi spectrum. Layering on CBRS will beef up the broadband that can be delivered over a customer link.

It will be interesting to see how that works in a crowded competitive environment. I am aware of counties today where half a dozen WISPs all use WiFi spectrum and the interference degrades network performance for everybody. The FCC has named five SAS administrators that will monitor bandwidth usage and interference. The FCC rules don’t allow for indiscriminate deployment of public CBRS spectrum, and we’ll have to see how interference problems are dealt with.

One interesting player in the space will be AT&T, which intends to layer the frequency onto its fixed wireless product. AT&T widely used the technology to meet its CAF II buildout requirements and has mostly used PCS spectrum to meet the FCC requirement to deliver at least 10/1 Mbps speeds to customers. Adding the new spectrum should significantly increase rural customer speeds – at least for those within a few miles of AT&T towers.

Cable Company Edge-out. The most interesting new players considering the market are the cable companies. Kurn believes that the big cable companies will use the spectrum to edge out and serve rural customers with fixed wireless around their existing cable networks. He says the cable networks could theoretically pass 6 – 7 million new homes if this is deployed everywhere. This is an ideal application for a cable company because they typically have fiber fairly close to the edge of their service areas. The point-to-multipoint wireless product operates best when the radios are fiber-fed, and cable companies could deliver a product in the 50-100 Mbps range where they have line-of-sight to customers.

We’ve already seen one cable company tackle this business plan. Midco was awarded $38.9 million in the CAF II reverse auctions to deploy 100 Mbps broadband in Minnesota and the Dakotas. Midco is going to need this spectrum, and probably even more, to deliver 100 Mbps to every customer. Their deployment is not really an edge-out, though; the company plans to build networks that will cover entire rural counties with fixed wireless broadband.

Trusting Big Company Promises

When AT&T proposed to merge with Time Warner in 2016, attorneys at the Justice Department argued against the merger and said that the combined company would have too much power since it would be both a content provider and a content purchaser. Justice Department lawyers and various other antitrust lawyers warned that the merger would result in rate hikes and blackouts. AT&T counterargued that they are good corporate citizens and that the merger would be good for consumers.

In retrospect, it looks like the Justice Department lawyers were right. Soon after the merger, AT&T raised the prices for DirecTV and its online service DirecTV Now by $5 per month. The company raised the rates on DirecTV Now again in April of this year by $10 per month. AT&T accompanied the price increases with a decision to no longer negotiate promotional prices with TV customers. In the first two quarters of this year DirecTV lost over 1.3 million customers as older pricing packages expired and the company insisted that customers move to the new prices. AT&T says they are happy to be rid of customers that were not contributing to their bottom line.

In July of this year, CBS went dark for 6.5 million DirecTV and AT&T U-verse cable customers. AT&T said that CBS wanted too much money to renew a carriage deal. The two companies resolved the blackout in August.

Meanwhile, AT&T and Dish Network got into a dispute in late 2018 which resulted in turning off HBO and Cinemax on Dish Network. This blackout has carried into 2019 and the two sides still have not resolved the issue. The dispute cost Dish a lot of customers when the company was unable to carry Game of Thrones. Dish says that half of its 334,000 customer losses in the fourth quarter of 2018 were due to not having Game of Thrones.

I just saw headlines that AT&T is headed towards a rate fight with ESPN and warns there could be protracted blackouts.

It’s hard to fully fault any one of the AT&T decisions since they can be justified to some degree as smart business practices. But that’s how monopoly abuses generally work. AT&T wants to pay as little as possible when buying programming from others and wants to charge as much as possible when selling content. In the end, it’s consumers who pay for the AT&T practices – something the company had promised would not happen just months before the blackouts.

Programming fights don’t have to be so messy. Consider Comcast which is also a programmer and the biggest cable TV company. Comcast has gotten into a few disputes over programming, particularly with regional sports programming. In a few of these disputes, Comcast was leveraging its programming power since it also owns NBC and other programming. But these cases mostly got resolved without blackouts.

Regulators are most worried about AT&T’s willingness to allow prolonged blackouts because during blackouts the public suffers. Constantly increasing programming costs have caused a lot of angst for cable TV providers, and yet most disputes over programming don’t result in turning off content. AT&T is clearly willing to flex its corporate muscles since it is operating from a position of power in most cases, as either an owner of valuable content or as one of the largest buyers of content.

From a regulatory perspective this raises the question of how the government can trust the big companies that have grown to have tremendous market power. The Justice Department sued to challenge the AT&T and Time Warner merger even after the merger was approved. That was an extraordinary suit that asked to undo the merger. The Justice Department argued that the merger was clearly against the public interest. The courts quickly ruled against that suit and it’s clear that it’s nearly impossible to undo a merger after it has occurred.

The fact is that companies with monopoly power almost always eventually abuse that power. It’s incredibly hard for a monopoly to decide not to act in its own best interest, even if those actions are considered as monopoly abuses. Corporations are made up of people who want to succeed and it’s human nature for people to take any market advantages their corporation might have. I have to wonder if AT&T’s behavior will make regulators hesitate before the next big merger. Probably not, but AT&T barely let the ink dry on the Time Warner merger before doing things they promised they wouldn’t do.

The Digital Redlining of Dallas

In 2018 Dr. Brian Whitacre, an economist from Oklahoma State University, looked in detail at the broadband offered by AT&T in Dallas County, Texas. It’s an interesting county in that it includes all of the City of Dallas as well as wealthy suburban areas. Dr. Whitacre concluded that AT&T has engaged for years in digital redlining – providing faster broadband only in the more affluent parts of the area.

Dr. Whitacre looked in detail at AT&T’s Form 477 data provided to the FCC at the end of 2017. AT&T reports the technology used in each census block as well as the ‘up-to’ maximum speed offered in each census block.

AT&T offers three technologies in Dallas County:

  • Fiber-to-the-home with market speeds up to 1 Gbps download. AT&T offers fiber in 6,287 out of 23,463 census blocks (26.8% of the county). The average maximum speed offered in these census blocks in late 2017, according to the 477 data, was 300 Mbps.
  • VDSL, which brings fiber deep into neighborhoods, and which in Dallas offers speeds as fast as 75 Mbps download. AT&T offers this in 10,399 census blocks in Dallas (44.3% of the county). AT&T lists census blocks with maximum speeds of 18, 24, 45, and 75 Mbps. The average maximum speed listed in the 477 data is 56 Mbps.
  • ADSL2 or ADSL2+, which is one of the earliest forms of DSL and is mostly deployed from central offices. The technology theoretically delivers speeds up to 24 Mbps, but speeds decrease rapidly for customers more than a mile from a central office. AT&T still uses ADSL2 in 6,777 census blocks (28.9% of the county). They list the maximum speeds of various census blocks at 3, 6, 12, and 18 Mbps. The average speed of all ADSL2 census blocks is 7.26 Mbps.

It’s worth noting before going further that the above speed differences, while dramatic, don’t tell the whole story. The older ADSL technology has a dramatic drop in customer speeds with distance, and speeds are also influenced by the quality of the copper wires. Dr. Whitacre noted that he had anecdotal evidence that some of the homes listed as having 3 Mbps or 6 Mbps might have speeds under 1 Mbps.

Dr. Whitacre then overlaid the broadband availability against poverty levels in the county. His analysis started by looking at census blocks that have at least 35% of households below the poverty level. In Dallas County, 6,777 census blocks have poverty rates of 35% or higher.

The findings were as follows:

  • Areas with high poverty were twice as likely to be served by ADSL – 56% of high-poverty areas versus 24% of other parts of the city.
  • VDSL coverage was also roughly 2:1 with 25% of areas with high poverty served by VDSL while 48% of the rest of the city had VDSL.
  • Surprisingly, 19% of census blocks with high poverty were served with fiber. I’m going to conjecture that this might include large apartment complexes where AT&T delivers one fiber to the whole complex – which is not the same product as fiber-to-the-home.
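For anyone curious what this kind of overlay analysis looks like in practice, here is a minimal sketch using made-up census block records (the field names and figures are my own illustration, not the actual Dallas dataset):

```python
# Sketch of the overlay Dr. Whitacre describes: tag each census block
# with its best AT&T technology (from Form 477-style data) and its
# poverty rate, then compare technology counts in high-poverty blocks
# versus the rest. All records below are invented examples.
from collections import Counter

blocks = [
    {"block": "480570001001000", "tech": "ADSL2", "poverty_rate": 0.41},
    {"block": "480570001001001", "tech": "VDSL",  "poverty_rate": 0.12},
    {"block": "480570001001002", "tech": "Fiber", "poverty_rate": 0.08},
    {"block": "480570001001003", "tech": "ADSL2", "poverty_rate": 0.38},
]

def tech_shares(blocks, high_poverty=0.35):
    """Count technologies separately for high-poverty and other blocks."""
    groups = {"high_poverty": Counter(), "other": Counter()}
    for b in blocks:
        key = "high_poverty" if b["poverty_rate"] >= high_poverty else "other"
        groups[key][b["tech"]] += 1
    return groups

shares = tech_shares(blocks)
print(shares["high_poverty"])  # technology counts in high-poverty blocks
print(shares["other"])         # technology counts everywhere else
```

Run against the full county dataset, this kind of cross-tab produces exactly the comparisons in the bullets above.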

It’s worth noting that the findings are somewhat dated and rely upon 477 data from November 2017. AT&T has not likely upgraded any DSL since then, but they have been installing fiber in more neighborhoods over the last two years in a construction effort that recently concluded. It would be interesting to see if the newer fiber also went to more affluent neighborhoods.

I don’t know that I can write a better conclusion of the findings than the one written by Dr. Whitacre: “The analysis for Dallas demonstrates that AT&T has withheld fiber-enhanced broadband improvements from most Dallas neighborhoods with high poverty rates, relegating them to Internet access services which are vastly inferior to the services enjoyed by their counterparts nearby in the higher-income Dallas suburbs…”

This study was done as a follow-up to work done earlier in Cleveland, Ohio and this same situation can likely be found in almost every large city in the country. It’s not hard to understand why ISPs like AT&T do this – they want to maximize the return on their investment. But this kind of redlining is not in the public interest and is possibly the best argument that can be made for regulating broadband networks. We regulated telephone companies since 1932, and that regulation resulted in the US having the best telephone networks in the world. But we’ve decided to not regulate broadband in the same way, and until we change that decision we’re going to have patchwork networks that create side-by-side haves and have-nots.

Are You Paying to Spy on Yourself?

Geoffrey A. Fowler of the Washington Post recently engaged a data expert to track everything going on behind the scenes with his iPhone. What he found was surprising since Apple touts itself as a company that doesn’t invade user privacy. The various apps on his phone were routinely handing out his personal data on a scale that shocked him.

Fowler’s information was being gathered by trackers. This is software built directly into apps, and it is different from the ad-tracking cookies that we pick up from websites. App makers deliberately build trackers into apps, and a user can’t get rid of them without getting rid of the app.

Most apps on his phone had these trackers. That included apps from Microsoft OneDrive, Intuit’s Mint, Nike, Spotify, The Washington Post, and the Weather Channel. Some apps came with numerous trackers. His food delivery app DoorDash included nine separate trackers. Third parties must be paying to share app space, because the DoorDash app included trackers for Facebook and Google – those two companies know every time that app is used to order food.

Almost none of these apps disclosed the nature of what they were tracking. When first loaded, most apps ask for somewhat generic permission to track certain user data but don’t disclose the frequency and the extent to which they will gather data from a user.

This issue has relevance beyond privacy concerns because the apps on Fowler’s phone could collectively use as much as 1.5 gigabytes of data per month on his phone. Industry statistics show that the fastest-growing segment of Internet traffic is machine-to-machine communication, and these app trackers make a significant contribution to that traffic. Put bluntly, a lot of machine-to-machine traffic is either being used to back up files or to spy on us.

This has to be concerning to people who are still on measured cellular data plans. This unintended usage can cost real money and a user can end up paying to have trackers spy on them. Our cellphones are generating broadband usage without our knowledge, and mostly without our explicit permission. I’ve had months where I’ve barely roamed with my cellphone and still have seen more than a gigabyte of usage – I now understand where it’s probably coming from.

PCs and tablets have the same problems, with the data tracking coming more from marketing cookies that are loaded when we visit web sites. I scrub these cookies from my computer routinely. My desktop is only used for work and I still find 40 – 100 cookies every week. One of my blogs last year mentioned a guy who had gone on vacation for a month and was shocked when he returned and discovered that his home network had used several gigabytes of data in his absence.

There are ways to block the trackers on your phone, but this mostly involves deleting apps or turning off permissions in your privacy settings, which largely means the apps won’t work. You can also take steps to disguise your data by passing everything through a VPN, but that doesn’t stop the data from being transmitted.

The phone manufacturers are complicit in this tracking. I just got a new Samsung Galaxy, and my new phone came with over 300 apps – most for services I don’t use like Facebook, Spotify, and a ton of others. These various companies must have paid Samsung (or perhaps AT&T) to include their apps and their trackers. I’ll be spending a few days deleting or disabling most of these apps. I find it creepy that Facebook follows me even though I stopped using the site several years ago. And unlike when I download a new app, I didn’t have the opportunity to allow or deny permission to the many apps on my new phone – I assume AT&T gave that permission.

It might be a generational thing, but it bothers me to have companies reaping my personal data without my permission, without disclosing what they are gathering, and how they are using it. I know young people who are not bothered by tracking and assume that this is just a part of being connected.

The other big concern is that the tracking apps are contributing to the capacity problems on cellular networks. I just saw last week that the average US cellphone now uses about 6 GB of data per month. If trackers are pushing out even half a gigabyte per month in usage, that is a significant contributor to swamped cellular networks. Cellphone companies are working furiously to keep ahead of the demand, and it must be maddening to cellular network engineers to know that 15% – 20% of network usage is being created behind the scenes by app trackers rather than by actions taken by users.
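The share math above is easy to sanity-check. A minimal sketch, using only the figures quoted in the text (the 6 GB monthly average and the 0.5 – 1.5 GB range of tracker traffic):

```python
# Back-of-the-envelope check of what share of monthly cellular usage
# trackers could represent. The 6 GB/month average and the 0.5-1.5 GB
# tracker range come from the figures quoted in the text.
avg_monthly_gb = 6.0
for tracker_gb in (0.5, 1.0, 1.5):
    share = tracker_gb / avg_monthly_gb
    print(f"{tracker_gb:.1f} GB of tracker traffic on "
          f"{avg_monthly_gb:.0f} GB total = {share:.0%}")
```

A full 1.5 GB of tracker traffic would be 25% of the average monthly usage; the 15% – 20% figure corresponds to roughly 1 GB per month.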

In an ideal world, this is something regulators would investigate and establish rules for. Apps like DoorDash shouldn’t be allowed to install a Facebook tracker on your phone without asking for specific and explicit permission. All trackers should have to disclose the exact information they gather about a user and the frequency of that tracking. Unfortunately, this FCC has walked away from any regulatory role in this area. Congress could address the issue – something that European regulators are considering – but this doesn’t seem to be high on anybody’s radar.

How Smart are Promotional Rates?

I think the big ISPs are recognizing the impact that special promotional rates have on their bottom line. Promotional pricing refers to the low rates that cable companies offer to new customers to pry them away from the competition. Over the years promotional rates have also become the tool that cable companies use to retain customers. Most customers understand that they have to call the cable company periodically to renegotiate rates – and the big ISPs have routinely given customers a discount to keep them happy.

We’re finally seeing some changes with this practice. When Charter bought Time Warner Cable they found that Time Warner had over 90,000 ‘special’ pricing plans – they routinely negotiated separately with customers when they bought new service or renegotiated prices. Charter decided to end the practice and told most former Time Warner customers that they had to pay the full price at the end of their current contract period.

We’ve seen the same thing with AT&T and DirecTV. The company decided last year to eliminate the special discounts on DirecTV and DirecTV Now. When the discount period ends for those products the company moves rates to the full list price and refuses to renegotiate. The practice cost AT&T almost a million customers in the first quarter of this year alone. But AT&T says they are glad to be rid of customers that don’t contribute to the bottom line of the company. I’ve seen where the CEOs of other big ISPs like Comcast have said that they are considering changes to these practices.

At CCG we routinely examine customer bills from incumbent ISPs as part of the market research we do for ISPs entering new markets. While our examination of customer bills has never amounted to a statistically valid sample, I can report that the vast majority of bills we see carry at least some level of discount. In some markets it’s rare to find a customer bill with no discount.

The discounts must accumulate to a huge loss of revenue for the big ISPs. The big ISPs all know that one of the only ways they are going to be profitable in the future is to raise broadband rates every year. The growth of broadband customers overall is slowing nationwide since most homes have broadband, although Charter and Comcast are still enjoying the migration of customers off DSL. The ISPs are continuing to lose revenues and margins as they lose cable and landline voice customers. Most US markets are seeing increased competition in broadband services for businesses and large MDUs. There’s not much left other than to raise residential broadband rates if the big ISPs want to satisfy the revenue growth expected by Wall Street.

If the big ISPs phased out promotional discounts it would probably equate to a 5% to 10% revenue increase. This is something that is becoming easier for a cable company to do. Many of them have already come to grips with cord cutting, and many are no longer fighting to keep cable customers. Cable companies are also less worried over time about customers leaving them to go back to DSL – a choice that is harder for consumers to make as the household need for broadband continues to climb.
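To see why phasing out discounts could move revenue by 5% to 10%, a rough sketch helps; the share of discounted customers and the discount size below are hypothetical illustrations, not figures from any ISP:

```python
# Rough sketch of the revenue math behind a promo-discount phase-out:
# if some fraction of subscribers carries a discount, removing it lifts
# blended revenue by roughly (discounted share) x (average discount).
# Inputs here are hypothetical, not figures from any actual ISP.
def revenue_uplift(discounted_share, avg_discount):
    """Fractional revenue increase from eliminating promo discounts."""
    # Blended revenue with discounts, as a fraction of full list price
    blended = (1 - discounted_share) + discounted_share * (1 - avg_discount)
    return 1 / blended - 1

# e.g. 30% of customers on a 20% discount -> about a 6% uplift
print(f"{revenue_uplift(0.30, 0.20):.1%}")
```

Under these assumptions the uplift lands squarely in the 5% to 10% range; a larger discounted share or deeper discounts push it toward the high end.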

Most ISPs won’t make a loud splash about killing discounts but will just quietly change policies. After a few years, I would expect customer expectations will reset after they realize that they can no longer extract discounts by threatening to drop service.

I’ve always advised my fiber overbuilder clients not to play this game. I ask clients if they really want to fight hard to win the slice of the market made up of customers who will change ISPs for a discount. Such customers flop back and forth between ISPs every two years, and in my opinion, companies are better off without them. Churn is expensive, and it’s even more expensive when an ISP provides a substantial discount to stop a customer from churning. Not all of my clients agree with this philosophy, but if the big ISPs stop providing promotional discounts, the pressure on competitors to do the same will lessen over time.

This is certainly a practice I’d love to see slip into history. I’ve never liked it as a customer because I despise the idea of having to play the game of renegotiating with an ISP every few years. I’ve also hated this as a consultant. Too many times I’ve seen clients give away a huge amount of margin through these practices, giving away revenue that is needed to meet their forecasts and budgets. It’s dangerous to let marketing folks determine the bottom line because they’ve never met a discount they don’t like – particularly if they can make a bonus for selling or retaining customers.

Is AT&T the 800-pound Gorilla?

For years it’s been understood in the industry that Comcast is the hardest incumbent to compete against. They are still a cable company, and many people dislike cable companies, but Comcast has been the most formidable competitor. The company is reported to have the highest gross margins on cable TV and might be one of the few companies still making a significant profit on cable. Much of that is due to their extensive programming holdings – it’s easier to make money on cable when you own your own programming. Comcast has also been the best in the industry at creating bundles to lock in customers – bundling things like smart home service and, more recently, cellular service.

But the new 800-pound Gorilla in the industry might be AT&T. The company seems to be finally shaking out of the transition period from integrating their purchase of Time Warner. It can be argued that the programming that came from that merger – things like HBO, CNN, and blockbuster movies – will make AT&T a more formidable competitor than Comcast.

AT&T will be launching its new streaming service, AT&T TV, next month. The company already has one of the largest streaming services with DirecTV Now. It’s been rumored that the streaming service will start at a price around $18 per month – an amazingly low price considering that HBO retails for $15 online today. The company is trying to coax more money out of the millions of current HBO subscribers. This pricing also will lure customers to drop HBO bought from cable companies and instead purchase it online.

AT&T has also been building fiber for the last four years and says that they now pass 20 million homes and businesses. They recently announced the end of the big fiber push and will likely now concentrate on selling to customers in that big footprint. The company is one of the more aggressive marketers and has sent somebody to my door several times in the last year. That’s a sign of a company that is working hard to gain broadband subscribers.

The one area where AT&T is still missing the boat is in not bundling broadband and cellular service. AT&T is still number one in the country in cellular customers, with almost 160 million at the end of the second quarter. For some reason, they have never tried to sell bundles into that large customer base.

AT&T has most recently been having a customer purge at DirecTV. For years that business bought market share by offering prices significantly below landline cable TV. Over the last year, the company has been refusing to renew promotional pricing deals and is willing to let customers walk. In the first quarter of this year alone the company lost nearly one million customers. The company says they are not unhappy to see these customers leave since they weren’t contributing to the bottom line. This is a sign of a company that is strengthening its position by stripping away the cost of dealing with unprofitable customers.

AT&T has also pushed a few net neutrality issues further than other incumbents. As a whole, the industry seems to be keeping a low profile with issues that are identified as net neutrality violations. There is speculation that the industry doesn’t want to stir up public ire on the topic and invite a regulatory backlash if there is a change in administration.

AT&T widely advertised to its cellular customers earlier this year that the company would not count DirecTV Now usage against cellular or landline data caps. The same will likely be true for AT&T TV. Favoring one’s own service over the competition is clearly one of the things that net neutrality was intended to stop. Since there are data caps on both cellular and AT&T landline products, the move puts Netflix and other streaming services at a competitive disadvantage. That disadvantage will grow over time as more landline customers hit the AT&T data caps.

AT&T has made big mistakes in the past. For instance, they poured a fortune into promoting 50 Mbps DSL instead of pushing for fiber a decade sooner. They launched their cable TV product just as that market peaked. The company seemed to lose sight of all landline and fiber-based products for a decade when everything the company did was for cellular – I remember a decade ago having trouble even finding mention of the broadband business in the AT&T annual report.

We’ll have to wait a few years to see if a company like AT&T can reinvent itself as a media giant. For now, it looks like they are making all of the right moves to take advantage of their huge resources. But the company is still managed by the same folks who were managing it a decade ago, so we’ll have to see if they can change enough to make a difference.

Cellular Broadband Speeds – 2019

Opensignal recently released their latest report on worldwide cellular data speeds. The company examined over 139 billion cellphone connections in 87 countries in creating this latest report.

South Korea continues to have the fastest cellular coverage in the world with an average download speed of 52.4 Mbps. Norway is second at 48.2 Mbps and Canada third at 42.5 Mbps. The US was far down the list in 30th place with an average download speed of 21.3 Mbps. Our other neighbor Mexico had an average download speed of 14.9 Mbps. At the bottom of the list are Iraq (1.6 Mbps), Algeria (2.1 Mbps) and Nepal (4.4 Mbps). Note that these average speeds represent all types of cellular data connections including 2G and 3G.

Cellular broadband speeds have been improving rapidly in most countries. For instance, in the 2017 report, Opensignal showed South Korea at 37.5 Mbps and Norway at 34.8 Mbps. The US in 2017 was in 36th place at only 12.5 Mbps.

Earlier this year Opensignal released their detailed report about the state of mobile broadband in the United States. This report looks at speeds by carrier and also by major metropolitan area. The US cellular carriers have made big strides just since 2017. The following table compares download speeds and latency for 4G LTE by US carrier for 2017 and 2019.

                  2019                     2017
            Download    Latency      Download    Latency
AT&T        17.8 Mbps   57.8 ms      12.9 Mbps   63.8 ms
Sprint      13.9 Mbps   70.0 ms       9.8 Mbps   70.1 ms
T-Mobile    21.1 Mbps   60.6 ms      17.5 Mbps   62.8 ms
Verizon     20.9 Mbps   62.6 ms      14.9 Mbps   67.3 ms

Speeds are up across the board. Sprint increased speeds over the two years by more than 40%. Latency for 4G is still relatively high. For comparison, fiber-to-the-home networks have latency in the range of 10 ms and coaxial cable networks have latency between 25 and 40 ms. The poor latency in cellular networks is one of the reasons why browsing the web on a cellphone seems so slow. (The other reason is that cellphone browsers focus on graphics rather than speed.)
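The per-carrier gains are easy to compute from the 2017 and 2019 download figures in the table above; a quick sketch:

```python
# Percentage download-speed gains by carrier between the 2017 and
# 2019 Opensignal figures quoted in the table above.
speeds_mbps = {            # (2017, 2019) average 4G LTE download
    "AT&T":     (12.9, 17.8),
    "Sprint":   ( 9.8, 13.9),
    "T-Mobile": (17.5, 21.1),
    "Verizon":  (14.9, 20.9),
}
for carrier, (old, new) in speeds_mbps.items():
    print(f"{carrier:9s} +{(new - old) / old:.0%}")
```

Sprint and Verizon both gained roughly 40%, AT&T close to that, while T-Mobile improved the least in percentage terms from the fastest 2017 starting point.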

Cellular upload speeds are still slow. In the 2019 tests, the average upload speeds were AT&T (4.6 Mbps), Sprint (2.4 Mbps), T-Mobile (6.7 Mbps) and Verizon (7.0 Mbps).

Speeds vary widely by carrier and city. The fastest cellular broadband market identified in the 2019 tests was T-Mobile in Grand Rapids, Michigan with an average 4G speed of 38.3 Mbps. The fastest upload speed was provided by Verizon in New York City at 12.5 Mbps. Speeds vary by market for several reasons. First, the carriers don’t deploy the same spectrum everywhere in the US, so some markets have less spectrum than others. Markets vary in speed due to the state of upgrades – at any given time cell sites are at different levels of software and hardware upgrades. Finally, markets also vary by cell tower density and markets that serve more customers for each tower are likely to be slower.

Many people routinely take speed tests for their home landline broadband connection. If you’ve not taken a cellular speed test it’s an interesting experience. I’ve always found that speeds vary significantly with each speed test, even when run back-to-back. As I was writing this blog I took several speed tests that varied in download speed between 12 Mbps and 23 Mbps (I use AT&T). My upload speeds also varied, with a top speed of 3 Mbps and one test that couldn’t maintain the upload connection and measured 0.1 Mbps. While landline broadband connections maintain a steady connection to an ISP, a cellphone establishes a new connection every time you download, and speeds can vary depending on the cell site, the channel your phone connects to, and the overall traffic at the cell site at the time of connection. Cellular speeds can also be affected by temperature, precipitation, and all of the other factors that make wireless coverage a bit squirrelly.
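A handful of back-to-back tests is enough to see the spread. A small sketch, using hypothetical readings that roughly match the 12 – 23 Mbps range described above:

```python
# Summarize the variation across a few back-to-back cellular speed
# tests. The readings below are hypothetical, chosen only to roughly
# match the 12-23 Mbps download range described in the text.
import statistics

download_mbps = [12, 15, 19, 21, 23]
mean = statistics.mean(download_mbps)
spread = max(download_mbps) - min(download_mbps)
print(f"mean {mean:.1f} Mbps, spread {spread} Mbps")
```

A spread nearly as large as the mean itself is exactly why any single cellular speed test says little about the connection you’ll get an hour later.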

It’s going to be a few years until we see any impact on speed test results from 5G. As you can see by comparing to other countries, the US still has a long way to go to bring 4G networks up to snuff. One of the most interesting aspects of 5G is that speed tests might lose some of their importance. With network slicing, a cell site will size a data channel to meet a specific customer need. Somebody downloading a large software update should be assigned a bigger data channel with 5G than somebody who’s just keeping up with sports scores. It will be interesting to see how Opensignal accounts for that kind of slicing.