Is the Public Buying the 5G Hype?

T-Mobile recently commissioned a survey, conducted by HarrisX, that looks in detail at how the public feels about pending new technologies. The company expects to repeat the survey quarterly to track how public perceptions of technology change over time.

As you would expect, a significant number of the questions in the poll were about 5G. I’m sure that T-Mobile’s motivation for conducting the survey is that they are one of the few companies in the industry not hyping 5G. They expect 5G to start creeping into the industry in 2020 and then take as much as a decade to become a widespread reality.

The survey started by asking if respondents had heard of various new technologies. The 5G hype isn’t fully pervasive yet, with 57% having heard of the technology. For other technologies: Internet of Things – 29%; machine learning – 26%; virtual reality – 83%; artificial intelligence – 78%; cloud computing – 52%; and blockchain – 19%.

One of the most interesting findings is when the public expects to see 5G in the marketplace. Of those who have heard of 5G, 29% thought it was already here in late 2018, 35% more think they’ll see 5G in 2019, and another 25% expect 5G in 2020. This response has to reflect the flood of marketing hype and press releases extolling 5G. The public has been inundated for several years by articles and press releases declaring that 5G is going to solve our broadband problems by delivering huge data speeds wirelessly everywhere.

When asked more specifics about 5G, 64% were somewhat excited or very excited about 5G in general. They were also somewhat or very excited about the following attributes of 5G:

  • faster upload and download speeds – 92%
  • wider network coverage – 91%
  • higher quality streaming video – 85%
  • higher quality voice calls – 89%
  • less lag time on mobile devices – 90%
  • more reliable mobile connections – 93%
  • greater number of connected devices – 80%
  • smart city data sensors – 68%
  • driverless vehicles – 50%
  • virtual reality in the work environment – 59%
  • smart energy grids – 75%
  • supercharged IoT – 64%
  • expanded use of drones – 47%
  • next generation artificial intelligence – 59%
  • telehealth – 68%
  • remote surgery – 59%
  • real time language translation – 72%
  • replacement of landline broadband connections – 75%
  • replacement of traditional cable TV – 75%

Interestingly, only 27% of total respondents thought that 5G would have a big influence on their daily life.

In a finding I find disturbing, 65% of respondents think 5G will have a positive impact on rural America, even though the biggest 5G proponents admit that 5G is going to be hard to justify in low-density areas. It’s not hard to understand this belief, since I’ve seen numerous articles that make this claim. 79% think 5G will have a positive impact in cities.

When asked which companies would be leaders in 5G, the unsurprising responses included Verizon (43%), AT&T (36%), Apple (43%), Samsung (35%) and T-Mobile (20%). However, there were surprises on this list including Amazon (24%), Comcast (12%), Google (36%), Facebook (12%), Microsoft (34%) and Dish Network (5%).

The public believes that 5G is going to bring price increases. 84% said they thought that 5G would result in higher cellular service prices. 77% said they thought 5G would lead to higher cable TV prices (this has me scratching my head). 81% said they thought 5G would lead to higher prices for home broadband – but wouldn’t increased competition for home broadband bring lower prices? 86% expect the prices for smartphones to be higher.

Overall, the survey shows an unrealistic public perception of when we’ll see the benefits of 5G. It’s not hard to understand this misperception since there are untold articles making it sound like we’re on the verge of a 5G revolution. I’m guessing this might have been one of the motivations for T-Mobile to sponsor the survey, since they are one of the most realistic voices in the industry talking about the 5G timeline. It will be interesting to see what the public thinks in a few years after very little 5G has actually been implemented. But perhaps I’m just being overly skeptical since the big carriers like AT&T are now extolling their 4G LTE product as 5G – maybe the public will buy it.

Minnesota Sues Comcast

Lori Swanson, the Attorney General of Minnesota, sued Comcast on December 21 seeking refunds for all customers who were harmed by the company’s alleged violation of the state’s Prevention of Consumer Fraud Act and Uniform Deceptive Trade Practices Act. The complaint details the sort of practices that we’ve come to expect from most of the big cable companies – and hopefully it serves as a warning to smaller ISPs that might be following similar practices. It’s an interesting read.

The most significant dollar complaint is that Comcast has defrauded customers about the true nature of two fees – the ‘Regional Sports Network Fee’ and the ‘Broadcast TV’ fee. These two fees, which now total $18.25 per month, are part of every cable package and are not optional, but Comcast does not mention them when advertising its cable products. Further, Comcast customer service has repeatedly told the public that these fees are mandated by the government and are a tax that is not set by Comcast.

Comcast only started charging these two fees separately in 2014, but the size of these line items has skyrocketed on bills. In recent years the company has put a lot of its annual rate increases into these fees, allowing it to continue to advertise low prices. The Regional Sports fee passes along the cost of Fox Sports North, and perhaps other regional sports networks. The Broadcast TV fee includes the amounts that Comcast pays local affiliate stations for ABC, CBS, FOX and NBC.

Interestingly, Comcast was previously sued over this same issue and settled the case without a verdict. As part of that suit the company promised to fix the problems, but they continued into 2017. In a pleading that is sure to displease company employees, Comcast threw its customer service reps under the bus and blamed the issue on them. Comcast argues that breaking out these fees makes it easier for customers to know what they are paying for – but the complaint cites numerous examples where new customers were surprised at the size of the first bill they received from the company.

The complaint also says that the company often misrepresents the fees for equipment rental such as cable set-top boxes, digital adapters and broadband modems. For some packages these fees add 30% to the cost of the product and are not fully disclosed to customers.

The complaint also says that Comcast routinely adds unwanted fees to customer bills. Customers who are visited by a Comcast field technician, who visit a business office, or who buy from a Comcast door-to-door salesperson are often surprised to see additional products added to their bills. The complaint blames this on the practice of paying commissions to employees for sales.

The complaint notes that Comcast is well aware of these issues. The company settled an FCC complaint about the same practices in 2016, and late last year it made refunds to more than 20,000 customers in Massachusetts over similar issues.

It’s not hard to verify some of these issues. If you go to the Comcast website you’ll find that it’s almost impossible to find the real cost of their cable and broadband products. The company constantly advertises low-priced specials that don’t mention the extra programming fees or the equipment fees.

This is a cautionary tale for smaller ISPs that compete with Comcast or other large cable companies. It’s always tempting to advertise cheap special prices in response to big cable company advertising. I know many smaller cable providers that have also separated out the sports and broadcast fees and who are not always fully forthcoming about equipment charges and other fees. It’s hard to watch customers leave who are lured by falsely advertised low prices – but most small ISPs have elected to deal with customers fairly as a way to differentiate themselves from the big companies.

The Huge CenturyLink Outage

At the end of December CenturyLink had a widespread network outage that lasted over two days. The outage disrupted voice and broadband service across the company’s wide service territory.

Probably the most alarming aspect of the outage is that it knocked out the 911 systems in parts of fourteen states. It was reported that calls to 911 might get a busy signal or a recording saying that ‘all circuits are busy’. In other cases, 911 calls were routed to the wrong 911 center. Some jurisdictions responded to the 911 problems by sending out emergency text messages giving citizens alternate telephone numbers to dial during an emergency. The 911 service outages prompted FCC Chairman Ajit Pai to call CenturyLink and to open a formal investigation into the outage.

I talked last week to a resident of a small town in Montana who said that the outage was locally devastating. Credit cards wouldn’t work for most of the businesses in town, including at gas stations. Businesses like hotels that rely on cloud software for daily operations were unable to function. Bank ATMs weren’t working. Customers with CenturyLink landlines had spotty service and mostly could not make or receive phone calls. Worse yet, cellular service in the area largely died, meaning that CenturyLink must have been supplying the broadband circuits supporting the cellular towers.

CenturyLink reported that the outage was caused by a faulty networking management card in a Colorado data center that was “propagating invalid frame packets across devices”. It took the company a long time to isolate the problem, and the final fix involved rebooting much of the network electronics.

Every engineer I’ve spoken to about this says that in today’s world it’s hard to believe that it would take two days to isolate and fix a network problem caused by a faulty card. Most network companies operate a system of alarms that instantly notify them when any device or card is having problems. Further, complex networks today are generally built with significant redundancy that allows the isolation of troubled components in order to stop the kind of cascading outage that occurred in this case. The engineers all said that it’s almost inconceivable for a single component like a card in a modern network to cause such a huge problem. While network centralization can save money, few companies route their whole network through choke points – there are a dozen different strategies to create redundancy and protect against this kind of outage.
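
As a conceptual sketch only – the device names, thresholds and health-check logic below are invented for illustration, not CenturyLink’s actual systems – the kind of watchdog the engineers are describing is not complicated:

```python
import random

ERROR_THRESHOLD = 0.01  # alarm if more than 1% of a card's frames are invalid

def check_card_health(card_id):
    """Simulated health poll; a real NOC would read SNMP error counters."""
    total = 100_000
    # Pretend one hypothetical management card has gone bad.
    invalid = 5_000 if card_id == "colorado-mgmt-7" else random.randint(0, 50)
    return {"total_frames": total, "invalid_frames": invalid}

def isolate_card(card_id):
    """Administratively disable the card so traffic reroutes around it."""
    print(f"ISOLATED {card_id}; traffic shifted to a redundant path")

def watchdog(cards):
    # Poll every card; alarm and isolate anything exceeding the threshold.
    for card in cards:
        stats = check_card_health(card)
        error_rate = stats["invalid_frames"] / stats["total_frames"]
        if error_rate > ERROR_THRESHOLD:
            print(f"ALARM: {card} invalid-frame rate {error_rate:.1%}")
            isolate_card(card)

watchdog(["denver-core-1", "colorado-mgmt-7", "seattle-edge-3"])
```

The point of the sketch is the design principle: detection and isolation are automated so a bad card is cut out of the network in seconds, not days.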

Obviously none of us knows any of the facts beyond the short notifications issued by CenturyLink at the end of the outage, so we can only speculate about what happened. Hopefully the FCC inquiry will uncover the facts – and it’s important that it does, because it’s always possible that the cause of the outage is something that others in the industry need to be concerned about.

I’m only speculating, but my guess is that we are going to find that the company has not implemented best network practices in the legacy telco network. We know that CenturyLink and the other big telcos have been ignoring the legacy networks for decades. We see this all of the time when looking at the conditions of the last mile network, and we’ve always figured that the telcos were also not making the needed investments at the network core.

If this outage was caused by outdated technology and legacy network practices, then such outages are likely to recur. Interestingly, CenturyLink also operates one of the more robust enterprise cloud services in the country. That business got a huge shot in the arm through the merger with Level 3, with new management saying that all of their future focus is going to be on the enterprise side of the house. I have to think that this outage didn’t much touch that network and was instead confined to the legacy network.

One thing is for sure: this outage is making CenturyLink customers look for alternatives. A decade ago the local government in Cook County, Minnesota – the northern-most county in the state – was so frustrated by continued prolonged CenturyLink network outages that it finally built its own fiber-to-the-home network and found alternate routing into and out of the county. I talked to one service provider in Montana who said they’ve been inundated after this recent outage by businesses looking for an alternative to CenturyLink.

We have become so reliant on the Internet that major outages are unacceptable. Much of what we do every day relies on the cloud. The fact that this outage extended to cellular outages, a crash of 911 systems and the failure of credit card processing demonstrates how pervasive the network is in the background of our daily lives. It’s frightening to think that there are poorly maintained legacy telco networks that can still cause these kinds of widespread problems.

I’m not sure what the fix is for this problem. The FCC has supposedly washed its hands of responsibility for broadband networks – so it might not be willing to tackle any meaningful solutions to prevent future network crashes. Ultimately the fix might be the one found by Cook County, Minnesota – communities finding their own network solutions that bypass the legacy networks.

How Bad is the Digital Divide?

The FCC says that approximately 25 million Americans living in rural areas don’t have access to an ISP product that would be considered broadband – currently defined as 25/3 Mbps. That number comes out of the FCC’s mapping efforts using data supplied by ISPs.

Microsoft tells a different story. They say that as many as 163 million Americans do not use the Internet at speeds that the FCC considers broadband. Microsoft might be in the best position of anybody in the industry to understand actual broadband performance because the company can see data speeds for every customer that updates Windows or Microsoft Office – a huge percentage of all computer users, spread across every inch of the country.

Downloading a big software update is probably one of the best ways possible to measure actual broadband performance. Software updates tend to be large files, and the Microsoft servers will transmit the files at the fastest speed a customer can accept. Since the software updates are large files, Microsoft gets to see the real ISP performance – not just the performance for the first minute of a download. Many ISPs use a burst technology that downloads relatively fast for the first minute or so, but then slows for the rest of a download – a customer’s true broadband speed is the one that kicks in after the burst is finished. The burst technology has a side benefit to ISPs in that it inflates performance on standard speed tests – but Microsoft gets to see the real story.
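
Here is a minimal sketch of that kind of measurement in Python. The test-file URL and the 60-second burst window are assumptions for illustration – this is not Microsoft’s actual methodology, just a demonstration of how counting only the post-burst portion of a large download reveals the sustained speed:

```python
import time
import urllib.request

# Hypothetical test file; any file large enough to outlast the burst works.
TEST_URL = "https://example.com/large-test-file.bin"
BURST_WINDOW = 60        # seconds; assumed length of the ISP's speed burst
CHUNK_SIZE = 64 * 1024

def measure_speeds(url):
    """Return (naive_mbps, sustained_mbps) for a full-file download."""
    start = time.monotonic()
    total_bytes = 0
    burst_bytes = 0  # bytes delivered while still inside the burst window
    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(CHUNK_SIZE)
            if not chunk:
                break
            total_bytes += len(chunk)
            if time.monotonic() - start < BURST_WINDOW:
                burst_bytes = total_bytes
    elapsed = time.monotonic() - start
    if elapsed <= BURST_WINDOW:
        raise ValueError("file too small to outlast the burst window")
    naive_mbps = total_bytes * 8 / elapsed / 1e6
    # Sustained speed counts only bytes delivered after the burst ended.
    sustained_mbps = (total_bytes - burst_bytes) * 8 / (elapsed - BURST_WINDOW) / 1e6
    return naive_mbps, sustained_mbps

naive, sustained = measure_speeds(TEST_URL)
print(f"naive average: {naive:.1f} Mbps, sustained: {sustained:.1f} Mbps")
```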

I’ve ranted about the FCC’s broadband statistics many times. There are numerous reasons why the FCC data is bad in rural America. Foremost, the data is self-reported by the big ISPs who have no incentive to tell the FCC or the public how poorly they are doing. It’s also virtually impossible to accurately report DSL speeds that vary from customer to customer according to the condition of specific copper wires and according to distance from the DSL core router. We also know that much of the reporting to the FCC represents marketing speeds or ‘up-to’ speeds that don’t reflect what customers really receive. Even the manner of reporting to the FCC, by Census block, distorts the results because when a few customers in a block get fast speeds the FCC assumes that everyone does.
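
To see why the Census block methodology matters, here is a toy example with made-up household speeds (not actual Form 477 data) showing how reporting the fastest speed in a block counts everyone in the block as served:

```python
# Hypothetical speeds (Mbps) for the households in one census block.
# Two homes near the DSLAM get fast service; the rest do not.
block_speeds = [150, 100, 8, 6, 5, 4, 3, 3]

BROADBAND_THRESHOLD = 25  # FCC download definition, Mbps

# Block-level reporting: if any home in the block can get the reported
# speed, the whole block is counted as served.
block_counted_as_served = max(block_speeds) >= BROADBAND_THRESHOLD

# Household-level reality: the share of homes actually at the threshold.
share_actually_served = sum(
    s >= BROADBAND_THRESHOLD for s in block_speeds) / len(block_speeds)

print(f"block counted as served: {block_counted_as_served}")  # True
print(f"homes actually served: {share_actually_served:.0%}")  # 25%
```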

To be fair, the Microsoft statistics measure the speeds customers are actually achieving, while the FCC is trying to measure broadband availability. The Microsoft data includes any households that elect to buy slower broadband products to save money. However, there are not 138 million Americans who purposefully buy slow broadband (the difference between the 163 million and 25 million figures). The Microsoft numbers tell us that actual speeds in the country are far worse than described by the FCC – and for half of us slower than 25/3 Mbps. That is a sobering statistic, and it doesn’t just reflect that rural America is getting poor broadband, but also that many urban and suburban households aren’t achieving 25/3 Mbps.

I’ve seen many real-life examples of what Microsoft is telling us. At CCG Consulting we do community broadband surveys, and we sometimes see whole communities where the speeds customers achieve are lower than the speeds advertised by the ISPs. We often see a lot more households claim to have no broadband or poor broadband than would be expected from the FCC mapping data. We constantly see residents in urban areas complain that broadband with a relatively fast advertised speed seems slow and sluggish.

Microsoft reported their findings to the FCC, but I expect the FCC to ignore them, since they are a drastic departure from the narrative that the FCC is telling Congress and the public. I wrote a blog just a few weeks ago describing how the FCC is claiming that big ISPs are delivering the speeds that they market. Deep inside the recent reports the FCC admitted that DSL often wasn’t up to snuff – but the Microsoft statistics mean that a lot of cable companies and other ISPs are also under-delivering.

In my mind the Microsoft numbers invalidate almost everything that we think we know about broadband in the country. We are setting national broadband policy and goals based upon false numbers – and not numbers that are a little off, but rather ones that are largely a fabrication. We have an FCC that is walking away from broadband regulation because it has painted a false narrative that most households in the country have good broadband. It would be a lot harder for politicians to allow broadband deregulation if the FCC admitted that over half of the homes in the country aren’t achieving the FCC definition of broadband.

The FCC has been tasked by Congress to find ways to improve broadband in areas that are unserved or underserved – with those categories being defined by the FCC maps. The Microsoft statistics tell us that there are huge numbers of underserved households, far higher than the FCC is recognizing. If the FCC was to acknowledge the Microsoft numbers, they’d have to declare a state of emergency for broadband. Sadly, the FCC has instead doomed millions of homes from getting better broadband by declaring these homes as already served with adequate broadband – something the Microsoft numbers say is not true.

The current FCC seems hellbent on washing their hands of broadband regulation, and the statistics they use to describe the industry provide the needed cover for them to do so. To be fair, this current FCC didn’t invent the false narrative – it’s been in place since the creation of the national broadband maps in 2009. I, and many others predicted back then that allowing the ISPs to self-report performance would put us right where we seem to be today – with statistics that aren’t telling the true story. Microsoft has now pierced the veil to see behind the curtain – but is there anybody in a position of authority willing to listen to the facts?

FCC Urban Rate Survey

The FCC collects retail prices annually from urban carriers for landline telephone and broadband services. These prices are used to determine benchmark rates for rural areas for incumbent local exchange rate-of-return carriers, incumbent price-cap carriers receiving CAF Phase II support, recipients of the Rural Broadband Experimental grants, and winners of the recent Connect America Fund Phase II Auction.

I find it ironic that the FCC says they no longer regulate broadband, yet they still define maximum broadband rates allowed for various classes of carriers. The fact is that there are still numerous ways that the FCC is regulating broadband and since many of these mandates come from Congress the FCC will never be able to back out of broadband regulations entirely.

The FCC publishes spreadsheets summarizing the rates they collect. The benchmark rate for voice defines the highest and lowest rates allowable by the affected carriers. Starting in 2019 the lowest rate that can be charged for residential voice is $26.98 and the highest is $51.61.

The following table shows the residential broadband rates listed by AT&T in North Carolina, where I live. The rates listed are non-discounted rates, and many customers pay less due to bundling or to negotiating a lower rate. It is striking to me that AT&T charges $70 per month both for a 10/1 Mbps connection on DSL and for a 100/100 Mbps connection on fiber. This is one of the issues that has rural customers up in arms – they pay high prices for less performance, particularly considering that they often only receive a fraction of the published speeds shown in the table. It’s also worth noting that AT&T has a monthly data cap on every product other than their symmetrical gigabit product.

Download (Mbps)   Upload (Mbps)   Data Cap (GB)   Price   Technology
3                 0.384           150             $56     DSL
5                 1               1000            $60     DSL
5                 5               1000            $60     FTTP
6                 0.512           150             $61     DSL
10                1               1000            $70     DSL
18                1.5             1000            $70     DSL
25                25              1000            $70     DSL
25                5               1000            $70     FTTP
25                25              1000            $60     FTTP
50                50              1000            $70     DSL
50                10              1000            $70     DSL
50                50              1000            $70     FTTP
75                20              1000            $70     DSL
100               20              1000            $70     DSL
100               100             1000            $70     FTTP
300               300             1000            $90     FTTP
1000              1000            unlimited       $100    FTTP

The benchmarks for broadband are extremely high and it’s doubtful that many carriers are even trying to charge the rates shown in the table below. There are separate rate caps calculated for Alaska and the rest of the US.

Download (Mbps)   Upload (Mbps)   Capacity Allowance (GB)   2019 U.S. ($)   2019 AK ($)
4                 1               200                       66.12           113.19
4                 1               Unlimited                 70.76           119.06
10                1               200                       72.31           121.54
10                1               Unlimited                 77.30           127.75
25                3               200                       77.65           129.52
25                3               Unlimited                 82.66           135.75
25                5               200                       78.49           129.78
25                5               Unlimited                 83.50           136.01
50                5               Unlimited                 100.85          153.64
100               10              Unlimited                 106.23          161.16
250               25              Unlimited                 128.69          203.67
500               50              Unlimited                 148.35          223.87
1000              100             Unlimited                 162.33          232.38

This is one of those exercises that the FCC must go through that seems largely meaningless. They set a really high rate cap for those that participate in various FCC subsidy programs – realistically it’s unlikely that many carriers would want to charge more than $100.85 for a 50/5 Mbps connection, but if they did, customers would have legal recourse. What’s more valuable from this exercise is seeing the list prices of the larger urban ISPs – something that’s getting harder to find online.

Trusting Big ISP Data

The FCC has finally come to grips with the fact that big ISPs are supplying bad data to the various FCC mapping efforts that are then used to distribute FCC funding and to set national policies. The latest mapping snafu comes from a one-time data collection from the cellular carriers last year showing rural cellular coverage. These maps were to be used to establish a new federal fund called the Mobility Fund II, which will distribute $4.53 billion for the expansion of 4G cellular coverage to rural parts of the country that have little or no cellular coverage.

The big cellular companies have been lying about their cellular coverage for years. If you look at the nationwide 4G LTE coverage maps from AT&T and Verizon you’d think that they have cellular coverage virtually everywhere except in areas like deserts and mountains. But anybody living or traveling in rural America knows better. It’s not hard to drive very far off main highways and hit areas that never see a bar of cellular coverage. And even where there is coverage, it’s still often 3G or even older technology.

When the FCC collected data for the Mobility II funding the big carriers stuck to this same flawed mapping data. It turns out that overclaiming rural cellular coverage will keep funding from going to the smaller cellular companies that still serve in many parts of rural America. Luckily the FCC effort included a challenge process and the FCC was flooded with challenges showing that cellular coverage is far worse than is claimed by the big carrier maps. There were so many challenges that the FCC put the Mobility II award process on hold until they can sort it out.

This is just one of the mapping efforts from the FCC that have been used to award billions of dollars of funding over the last decade. The FCC relied on mapping data from the big telcos to establish the areas that were eligible for the billions of dollars of CAF II funding.

Since rural areas served by the biggest telcos have been neglected for years, and since the big telcos deployed very little rural DSL outside of towns, it’s not hard to identify huge swaths of rural areas that have little or no broadband. But the big telco broadband coverage data contains a ton of inaccuracies. For example, there are numerous smaller rural towns listed in the telco databases as having decent broadband when the reality on the ground is broadband speeds of a few Mbps at best. It looks like the big telcos often reported marketing speeds rather than actual speeds. This inaccuracy has stopped others from seeking federal grants and loans to upgrade such towns.

I fear that rural broadband mapping is on the verge of the next crisis. As a blogger I am contacted a lot by folks in rural America describing their broadband situation. I’ve heard enough stories to convince me that the big telcos have made only a half-hearted effort at implementing CAF II. I think many homes that should have seen CAF II broadband upgrades will see zero upgrades while many others will get upgraded to speeds that don’t meet even the measly CAF II goal of 10/1 Mbps.

The big telcos are not likely to come clean about having pocketed CAF II funding rather than spending every penny on upgrades, so they are going to claim that the CAF II areas have been upgraded regardless of the actual situation on the ground. Rural households that didn’t see the promised upgrades will then be counted by the FCC as having better broadband. That will make these areas off limits to future federal funding to fix what the telcos botched. We already see the newest federal grant programs requiring that no more than 10% of the homes covered by federal funding can have broadband today. Because of the falsified mapping, many homes without broadband are going to be deemed covered, and it will be a massive challenge for somebody else to get funding to help such areas. These communities will be harmed twice – once by the telcos that aren’t upgrading speeds and again by the inaccurate mapping that will stop others from getting funding to fix the problem.

The big telcos and carriers have huge incentives to lie about rural broadband coverage. None of the big telcos or cellular carriers want to spend any of their own money in rural areas, but they love the revenues they receive from a captive rural customer base that pays high prices for poor broadband. The big companies are fighting hard to preserve these revenues, which means they don’t want anybody else to get funding to improve broadband. To make matters worse, the big telcos continue to eliminate technicians and maintenance budgets in rural America, making it nearly impossible for customers to get repairs and service.

I unfortunately don’t have any easy solution for the problem of crappy mapping. Perhaps the FCC could entertain challenges to the broadband maps in the same way they are accepting challenges in the Mobility II process. I know a number of rural communities that would make the effort to create accurate broadband maps if this might bring them better broadband.

The Future of Video Streaming

I predict that we are going to see a huge shake-out in the online video market over the next few years. The field of OTT providers is already crowded. There are providers that offer some version of the programming offered on traditional cable TV like Sling TV, DirecTV Now, Playstation Vue, Hulu Plus, YouTube TV, fuboTV and Layer3 TV. There are also numerous providers with unique content like Netflix, Amazon Prime, CBS All Access, HBO Go, and more than 100 others.

The field is going to get more crowded this year. Disney is planning a Netflix competitor later this year that will include Disney’s vast library of content including unique content from Marvel, Lucasfilm, 21st Century Fox and Pixar.

AT&T also plans to offer a unique-content platform that includes the vast library of content it acquired through the merger with Time-Warner along with the content from HBO.

Apple has finally been creating unique content that it will start showing sometime this year. Amazon has stepped up the creation of unique content. Comcast is planning a launch with the unique content it owns through NBC Universal and Illumination Studios.

But the biggest news is not that there will be more competitors – it’s that each of the creators of unique content is intending to only offer their content on their own platform. This is going to transform the current online landscape.

The big loser might be Netflix. While the company creates more unique content than anybody else in the industry, it has benefited tremendously from outside content. I happen to watch a lot of the Marvel content and my wife sometimes refers to Netflix as the Marvel network – but that content will soon disappear from Netflix. Disney owns the Star Wars franchise. NBC Universal (Comcast) recently purchased the rights to Harry Potter. CBS owns the Star Trek franchise. AT&T owns Game of Thrones. Amazon bought the rights to develop more Middle Earth (Lord of the Rings) content. Is Netflix going to be as attractive if it’s unable to carry appealing external content in addition to its own unique content?

Each of the major content owners is aiming to capitalize on their most valuable content. For example, the industry buzz is that there are numerous new Star Trek efforts underway and that CBS All Access will become all Star Trek, all of the time. Each of these content owners is making similar plans to best monetize their content.

This looks like it is going to turn into a content arms race. That means more content than ever for the viewing public. But it also means that a household that wants to watch a range of the most popular content is going to need numerous monthly subscriptions. I think 2019 is going to be the year when the monthly cost of online content starts climbing to rival the cost of traditional cable.

My family is probably fairly typical for cord cutters. We watch local channels, traditional cable networks and sports through Playstation Vue. We have subscriptions to Netflix, Amazon Prime and Hulu. During the year we add and subtract networks like ESPN Plus, CBS All Access, HBO NOW and several others. And we also buy individual TV shows and movies that aren’t included in these various platforms.

I’m not unhappy with our array of content. Each of our three family members gets to watch the content they want. We’re each free to use the devices we like and watch at times that are convenient.

The number one reason cited for cord cutting is to save money. I’m pretty certain that as a family we already aren’t saving anything compared to what content cost us before we went online. However, saving money was not our primary reason for going online. Looking forward, I suspect that we’ll probably add some of the new content this year, such as Disney, so our costs are likely to keep climbing.

A few years ago there was a lot of speculation about where the industry is headed. A lot of people thought that the Amazon super-aggregator model was the future, and Amazon is doing well by reselling dozens of unique content platforms under its name brand. However, it looks like the industry is now headed in the opposite direction where each big corporate owner of unique content is going to want to extract the maximum value by selling directly to the public.

I have to wonder what this all means for the public. Will the high cost of buying numerous online packages dissuade many from cutting the cord? It’s also confusing trying to find what you want to watch with so many different sources of content that are in separate silos. It’s going to be interesting to see these industry giants battling each other for eyeballs.

ISP Economy of Scale

I’ve worked for numerous small communities over the last few years and the first question they always ask me is if their community is large enough to support a standalone fiber ISP business. What they really want to know is if they can somehow operate their own local ISP and still have affordable broadband rates.

The question about how big an ISP must be (in terms of customers) is really a question about economy of scale. The textbook definition of economy of scale is a business where costs decrease per unit through increased output. The ISP business is clearly an economy of scale business since the cost per customer decreases as an ISP adds more customers.

The example I usually use to demonstrate this is to look at the cost of paying for a general manager for an ISP business. The cost of the general manager is what economists would call a fixed cost – it doesn’t vary as you add or lose customers. However, the amount of the general manager salary that must be covered by each customer gets smaller as the customer base grows larger.
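
A quick back-of-the-envelope calculation makes the point. The salary figure below is an assumption chosen purely for illustration:

```python
# Assumed fixed cost: a $120,000 per year general manager salary.
GM_SALARY = 120_000

# The same fixed cost spread over different customer counts.
for customers in (500, 1_000, 2_000, 10_000, 25_000):
    monthly_cost = GM_SALARY / customers / 12
    print(f"{customers:>6} customers -> ${monthly_cost:.2f} per customer per month")
```

At 500 customers the GM alone costs each customer $20 per month; at 25,000 customers the same salary costs each customer only 40 cents.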

A large part of the costs of operating an ISP are fixed like the GM salary. Costs like operating a business office, doing the needed accounting, buying a billing system, etc. are largely fixed. The largest fixed cost is often the debt service required to pay off the cost of building the network.

I have done hundreds of business plans for communities of all sizes and I have developed a few rules of thumb for operating a traditional ISP – one that has the expected number of employees, charges normal industry rates and has to finance the cost of their network.

  • It’s hard to justify a new standalone ISP with fewer than 2,000 customers.
  • Economy of scale kicks in at that point and the business gets more efficient on a per-customer cost basis until an ISP reaches between 20,000 and 25,000 customers.
  • After 20,000 customers the cost curve reaches relative stasis – adding customers increases efficiency but also drives additional fixed costs. For example, companies find they need to hire extra accountants, and they might need back-office positions for functions like personnel or benefits management.
  • At some fairly large size, say 100,000 customers, ISPs tend to become less efficient. Large companies tend to become bureaucratic, hire significant middle management and become less functionally efficient as they centralize functions and put them into silos.

There are ways to defeat the economy of scale curve to some extent, and I have clients who used the following strategies to be more efficient than other ISPs of the same size:

Reduce Costs. There are various ways to spend less on needed functions than the average ISP.

  • Use existing excess capacity. A city or an electric company can open an ISP without having to spend money on a business office since they already operate one. An existing business like an electric company or a telco might be able to use underutilized customer service reps or line technicians without having to hire a whole new staff for a new ISP venture. A utility or city might be able to use the existing billing systems from their water or electric utilities to support the ISP functions.
  • Reduce Expectations. Small ISPs can often save a lot of money by reducing customer expectations. For instance, they might elect to not offer repair service on evenings and weekends unless there is a real emergency. Small ISPs also don’t need to offer all of the bells and whistles of larger ISPs and can have a simple product offering. A small ISP might elect to not offer cable TV, which for small ISPs usually has a negative margin. This concept can only be taken so far and works best in communities where an ISP is competing against a giant ISP.

Avoid Costs. It’s not easy to avoid the cost of being an ISP, but it can be done.

  • Financing from Other Revenues. I have a client with only 600 potential customers that is successful because they decided to fund their network using property taxes rather than paying debt from ISP revenues. This allows them to have broadband rates far lower than surrounding communities by avoiding ISP debt payments. I know municipal and cooperative power companies that have raised electric rates to help cover the cost of an ISP business since that’s what their customers wanted.

Grow Efficiently. The most obvious strategy to beat the economy of scale curve is to get more customers in an efficient manner.

  • Increase Penetration Rates. For a small ISP the difference between a 50% and a 70% penetration rate can be dramatic. Many ISPs become complacent once they generate enough cash to pay the bills while the smart ones continue to sell hard every year to maximize the customers in their footprint.
  • Expand to Another Market. Most small telcos figured this out years ago and operate a CLEC business outside their regulated footprint.
  • Partnering. There are several examples today of cities that have banded together to create a larger ISP. There are numerous cities and electric cooperatives that are partnering with telcos to gain the economy of scale. We are still seeing small telcos getting purchased by larger ones.

AT&T is Not Launching Mobile 5G

AT&T recently took the next step in the 5G hype race by announcing that it is releasing the first mobile 5G device. The announcement was made at the end of the year to make good on past AT&T promises that the company would launch mobile 5G in 2018. The company can now say that it beat Verizon and Sprint to the market.

The AT&T announcement is referring to the device they are calling a puck. It’s a small Netgear modem that is being touted as a 5G mobile hotspot. The puck is based upon at least a few aspects of the 3GPP NR standard, allowing AT&T to claim it’s 5G. AT&T has not been fully forthcoming about how the device works. Where available the device will supposedly grab bandwidth from AT&T’s 5G cellular network – but since the 5G network is mostly still imaginary, in most places it will grab signal from the existing 4G LTE network. Within a home the puck will transmit WiFi, just like any other WiFi router.

There is no real product here. For at least three months AT&T will be giving away the puck and service for free to selected users. After that they’ve said the pricing will be $499 for the puck plus $70 monthly for bandwidth with an incredibly stingy 15 GB data cap. My prediction is that this product never makes it to market because it’s hard to envision anybody in an urban area willing to pay $70 a month for such a small amount of bandwidth. The only market for the puck is possibly a few early adopters with money to burn who want to be able to say they owned the first 5G device.
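
Some quick arithmetic shows how little bandwidth that cap really buys. The streaming and household-usage figures below are rough assumptions for illustration:

```python
MONTHLY_FEE = 70   # dollars, AT&T's announced price
DATA_CAP_GB = 15   # AT&T's announced cap

HD_STREAM_GB_PER_HOUR = 3    # rough assumption for HD video
TYPICAL_HOUSEHOLD_GB = 250   # rough assumption for a cord-cutting home

print(f"cost per GB under the cap: ${MONTHLY_FEE / DATA_CAP_GB:.2f}")               # $4.67
print(f"hours of HD streaming the cap allows: {DATA_CAP_GB / HD_STREAM_GB_PER_HOUR:.0f}")  # 5
print(f"share of typical monthly usage covered: {DATA_CAP_GB / TYPICAL_HOUSEHOLD_GB:.0%}")  # 6%
```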

This announcement sets a new low for 5G hype. What I found most disturbing is that dozens of news sites picked up the story and basically spit back the AT&T press release and called it news. Those dozens of articles give the public the impression that 5G mobile is right around the corner, which is exactly what AT&T intended – they want the public to equate 5G and the AT&T brand name together. To be fair, there are several industry articles that didn’t buy into the AT&T hype.

The AT&T announcement also made this sound like a breakthrough technology by implying that it will deliver faster cellular speeds. A lot is needed before there is a faster 5G cellular network. First, AT&T would need to install 5G transmitters on residential streets, requiring them to build neighborhood fiber networks. For the puck to work with millimeter wave spectrum, AT&T would need to put a small antenna on the outside of a home to receive the signal, since millimeter wave bandwidth won’t pass through the walls of a home. A network that delivers residential millimeter wave cellular bandwidth is nearly identical to a network that would deliver 5G fixed broadband.

AT&T is not taking any of those needed steps. In fact, AT&T’s CTO Andre Fuetsch spent the fall repeatedly taking potshots at Verizon’s 5G deployment, saying that Verizon is making a mistake chasing the ‘fixed’ 5G market.

To further deflate this announcement, AT&T’s CFO John Stephens recently told AT&T investors not to expect any 5G revenues in 2019. He admitted it will take many years until there are enough 5G phones in the market to make a noticeable difference in revenues. It seems the only cellular carrier being truthful about 5G is T-Mobile, which says it will begin introducing some 5G characteristics into its cell sites starting in 2020.

The bottom line is that AT&T just announced the release of a WiFi router that works off their 4G LTE network, but which supposedly will incorporate at least some aspects of the 3GPP NR standard. The company isn’t planning to charge for the product and it’s hard to envision anybody buying hotspot bandwidth at the prices they announced. But AT&T got what they wanted, which was dozens of news articles declaring that AT&T was the first to market with mobile 5G. I bet a decade from now that’s exactly what the Wikipedia article on 5G will say – and that’s all AT&T was really shooting for.

The FCC Looks at 911

The FCC recently released its tenth annual report to Congress on the collection and distribution of 911 fees nationwide. The report includes a number of interesting statistics, a few of which are listed below.

But first I’d like to look backwards a bit, because we now take 911 for granted, but it hasn’t always been so. 911 has been implemented during my adult lifetime. The idea of having an emergency phone number was first introduced in 1967 by Lyndon Johnson’s Commission on Law Enforcement. AT&T selected the 9-1-1 digits the following year. An independent telco, the Alabama Telephone Company, leaped on the concept and introduced 911 in Haleyville, Alabama in 1968 – but it then took decades for implementation nationwide since this was deemed a local issue to be handled by local governments. I recall the introduction of 911 in the DC suburbs in the mid-70s, accompanied by a flurry of radio, newspaper and TV ads to inform the public of the new safety service. There were major metropolitan areas like the Chicago suburbs that didn’t get 911 until the early 1980s.

911 service has been enhanced over the years. For example, by 2015 96% of homes in the US were covered by E-911 (enhanced) where the 911 operator knows the caller’s location according to the phone number for landlines or by using triangulation of cell sites for mobile phones. Currently 911 systems are upgrading to NG911 (next generation) that ties 911 systems into broadband to be able to relay text messages, photos and videos as part of the 911 process.

Some of the interesting statistics from the FCC report:

  • In 2017 almost $3 billion was collected in 911 fees to fund local 911 efforts. The total cost to provide 911 was reported at $4.8 billion, with 911 services in many states also funded partially by tax revenues.
  • States collect 911 fees in different ways, including flat rates per telephone or cellular line, a percentage of telecommunications bills, and flat rates per subscriber. Fees vary widely, ranging from $0.20 per residential landline in Arizona to $3.34 per cell phone in West Virginia. Some states charge even more for business landlines.
  • Most states use the 911 fees to fund the 911 service, but six states – Montana, New Jersey, New York, Rhode Island and West Virginia – use some of their 911 fees to fund non-safety purposes or even route them into the state’s general fund. In total, $284 million was diverted from collected 911 fees.
  • Thirty-five states, Puerto Rico and the District of Columbia have begun the process of upgrading to NG911.
  • Sixteen states have deployed statewide Emergency Services IP Networks (ESInets) for exclusive use of public safety agencies.
  • Thirty states, Guam, Puerto Rico and the US Virgin Islands have not taken any steps to implement cybersecurity for 911 centers (PSAPs).
  • There are 5,232 PSAPs in the country. These range from tiny centers in sheriff stations in rural counties to massive 911 centers in major metropolitan areas. For example, Washington DC has one PSAP while there are 586 in Texas.
  • 1,381 PSAPs now have the ability to communicate with the public by text message, and another 1,103 PSAPs will be implementing that capability in 2018.
  • There were over 39,000 operators employed to take 911 calls in 2017.
  • Only 44 states reported 911 call volumes and in those states there were over 211 million calls to 911. Over 70% of calls now come from cellular phones.

I know it’s easy to hate regulation, but without it we wouldn’t have a 911 system that works so well. People in most of the country feel a lot safer knowing they can dial 911 and get help when needed.