Windstream Turns Focus to Wireless

Windstream CEO Tony Thomas recently told investors that the company plans to emphasize wireless technology over copper going forward. The company has used point-to-point wireless to serve large businesses for several years, and more recently has been using fixed point-to-multipoint wireless technology to satisfy some of its CAF II build-out requirements.

Thomas says that the fixed wireless technology blows away what can be provided over the old copper plant with DSL. In places with flat and open terrain like Iowa and Nebraska, the company is seeing rural residential broadband speeds as fast as 100 Mbps with wireless – far faster than can be obtained with DSL.

Thomas also said that the company is interested in fixed 5G deployments, similar to what Verizon is now starting to deploy – putting 5G transmitters on poles to serve nearby homes. He says the company is interested in the technology in places where it is ‘fiber rich’. While Windstream serves a lot of extremely rural locations, it also serves a significant number of towns and small cities in its incumbent service areas that might be good candidates for 5G.

The emphasis on wireless deployments puts Windstream on the same trajectory as AT&T. AT&T has made it clear to the FCC numerous times that the company would like to tear down rural copper wherever it can and serve customers with wireless. AT&T’s approach differs in that AT&T will be using its licensed cellular spectrum and 4G LTE in rural markets, while Windstream would use unlicensed spectrum like various WISPs.

This leads me to wonder if Windstream will join the list of big telcos that will largely ignore their existing copper plant moving into the future. Verizon has done its best to sell rural copper to Frontier and seems to be largely ignoring its remaining copper plant – it’s the only big telco that didn’t even bother to chase the CAF II money that could have been used to upgrade rural copper.

The new CenturyLink CEO made it clear that the company has no desire to make any additional investments that will earn ‘infrastructure returns’, meaning investments in last-mile networks, both copper and fiber. Frontier hasn’t said that it wants to stop supporting copper, but the company is clearly cash-stressed and is widely reported to be skipping needed upgrades and repairs to rural copper networks.

The transition from copper to wireless is always scary for a rural area. It’s great that Windstream can now deliver speeds up to 100 Mbps to some customers. However, the reality of wireless networks is that there are always some customers who are out of reach of the transmitters. These customers may face physical impediments, such as sitting in a valley or behind a hill, out of line-of-sight from the towers. Or customers might simply live too far from a tower, since every wireless technology only works for some fixed distance from a tower, depending upon the specific spectrum being used.

It makes no sense for a rural telco to operate two networks, and one has to wonder what happens to the customers who can’t get the wireless service when the day comes that the copper network gets torn down. This has certainly been one of the concerns at the FCC when considering AT&T’s requests to tear down copper. The current FCC has relaxed the hurdles for tearing down copper, so this situation is bound to arise. In the past the telcos had carrier-of-last-resort obligations for anybody living in the service area. Will they be required to somehow get a wireless signal to the customers who fall between the cracks? I doubt that anybody will force them to do so. It’s not far-fetched to imagine customers living within a regulated telco’s service area who can’t get telephone or broadband service from the telco.

Customers in these areas also have to be concerned with the future. We have wide experience that the current wireless technologies don’t last very long. We’ve seen electronics wear out and become functionally obsolete within seven years. Will Windstream and the other telcos chasing the wireless technology path dedicate enough capital to constantly replace electronics? We’ll have to wait for that answer – but experience says that they will cut corners to save money.

I also have to wonder what happens to the many parts of the Windstream service areas that are too hilly or too wooded for the wireless technology. As the company becomes wireless-oriented, will it ignore the parts of the company stuck with copper? I just recently visited some rural counties that are heavily wooded, and which were told by local Windstream staff that the upgrades they’ve already seen on copper (which did not seem to make much difference) were the last upgrades they might ever see. If Windstream joins the list of other big telcos that ignore rural copper, then these networks will die a natural death from neglect. The copper networks of all of the big telcos are already old, and it won’t take much neglect to push them into a final death spiral.

Can Cable Fight 5G?

The big cable companies are clearly worried about 5G. They look at the recently introduced Verizon 5G product and they understand that they are going to see something similar over time in all of their metropolitan markets. Verizon is selling 5G broadband – currently at 300 Mbps, but promised to get faster in the future – for $70 standalone or $50 for those with Verizon cellular.

This is the nightmare scenario for the cable companies because they have finally grown to the point where they approach a near monopoly in most markets. They have competed successfully with DSL and quarter after quarter have been taking DSL customers from the telcos. In possibly the last death knell for DSL, both Comcast and Charter recently increased the speeds of their base products to at least 200 Mbps. Those speeds make it hard for anybody to justify buying DSL at 50 Mbps or slower.

The big cable companies have started to raise broadband rates to take advantage of their near-monopoly situation. Charter just recently raised bundled broadband prices by $5 per month – the biggest broadband price increase I can remember in a decade or more. Last year a major Wall Street analyst advised Comcast that their basic broadband price ought to be $90.

But now comes fixed 5G. It’s possible that Verizon has found a better bundle than the cable companies because of the number of households that already have cellphones. It’s got to be tempting to homes to buy fast broadband for only $50 per month in a bundle.

This fixed 5G competition won’t come overnight. Verizon is launching 5G in urban markets where it already has fiber. Nobody knows how fast Verizon will really roll out the product, due mostly to distrust earned by a string of earlier Verizon hype about 5G. But over time fixed 5G will hit markets. Assuming Verizon is successful, others will follow it into the market. I’m already seeing some places where companies like American Tower are building 5G ‘hotels’ at poles – vaults large enough to accommodate several 5G providers at the same location.

We got a clue recently about how the cable companies might fight back against 5G. A number of big cable companies like Comcast, Charter, Cox and Midco announced that they will be implementing the new 10 Gbps technology upgrade from CableLabs. These cable companies just recently introduced gigabit service using DOCSIS 3.1. It looks like the cable companies will fight against 5G with speed. It sounds like they will advertise speeds far faster than the 5G speeds and try to win the speed war.

But there is a problem with that strategy. Cable systems with the DOCSIS 3.1 upgrade can clearly offer gigabit speeds, but in reality cable networks aren’t ready or able to deliver that much speed to everybody. Fiber networks can easily deliver a gigabit to every customer, and with an electronics upgrade can offer 10 Gbps to everybody, as is happening in parts of South Korea. But cable networks have an inherent weakness that makes gigabit speeds problematic.

Cable networks are still shared networks, and all of the customers in a node share the bandwidth. Most cable nodes are still large, with 150 – 300 customers in each neighborhood node, and some with many more. If even a few customers start to really use gigabit speeds, then the speed for everybody else in the node deteriorates. That’s the issue that caused cable networks to bog down in the evenings a decade ago. Cable companies fixed the problem then by ‘splitting’ the nodes, meaning that they built more fiber to reduce the number of homes in each node. If the cable companies want to really start pushing gigabit broadband, and even faster speeds, then they face that same dilemma again and will need another round, or even two rounds, of node splits.
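To make the node-split arithmetic concrete, here is a back-of-envelope sketch. The node capacity and concurrency figures are illustrative assumptions, not vendor specifications:

```python
def per_home_peak_share(node_capacity_mbps: float, homes_in_node: int,
                        concurrent_fraction: float) -> float:
    """Average bandwidth left for each actively downloading home at peak."""
    active_homes = max(1, round(homes_in_node * concurrent_fraction))
    return node_capacity_mbps / active_homes

# Illustrative assumptions (not vendor specs): ~5,000 Mbps of usable downstream
# per DOCSIS 3.1 node, with 10% of homes downloading heavily at the same moment.
for homes in (300, 150, 75):  # each halving represents one round of node splits
    share = per_home_peak_share(5000, homes, 0.10)
    print(f"{homes:3d} homes/node -> ~{share:,.0f} Mbps per active home")
```

Under these assumptions, each round of node splits roughly doubles the bandwidth available per active home – which is why gigabit marketing eventually forces more fiber deeper into the network.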

For now I have serious doubts about whether Comcast and Charter are even serious about their gigabit products. Comcast gigabit today costs $140 plus $10 for the modem. The prices are lower in markets where the company is competing against fiber, and customers can also negotiate contract deals to get the gigabit price closer to $100. Charter has similar pricing – in Oahu, where there is competition, it offers a gigabit for $105, and its prices elsewhere seem to be around $125.

Both of these companies are setting gigabit prices far above Google Fiber’s $70 gigabit. The current cable company gigabit is not a serious competitor to Verizon’s $50 – $70 price for 300 Mbps. I have a hard time thinking the cable companies can compete on speed alone – it’s got to be a combination of speed and price. The cable companies can compete well against 5G if they are willing to price a gigabit at the $70 Verizon 5G price and then use their current $100+ price for 10 Gbps. That pricing strategy will cost them a lot of money in node upgrades, but they would be smart to consider it. The biggest cable companies have already admitted that their ultimate network needs to be fiber – but they’ve been hoping to milk the existing coaxial networks for another decade or two. Any work they do today to reduce node size would be one more step towards an eventual all-fiber network – and could help to stave off 5G.

It’s going to be an interesting battle to watch, because if we’ve learned anything in this industry it’s that it’s hard to win customers back after you lose them. The cable companies currently have most of the urban broadband customers and they need to act now to fight 5G – not wait until they have lost 30% of the market.

Putting Skin in the Game for Broadband

Recently, Anne Hazlett, the Assistant to the Secretary for Rural Development at the USDA was quoted in an interview with Telecompetitor saying, “We believe the federal government has a role (in rural broadband), but we also need to see skin in the game from states and local communities because this is an issue that really touches the quality of life in rural America”.

This is a message that I have been telling rural communities for at least five years. Some communities are lucky enough to be served by an independent telco or an electric cooperative that is interested in expanding into fiber broadband. However, for most of rural America there is nobody that will bring the broadband they need to survive as a community.

Five years ago this message was generally not received well because local communities didn’t feel enough pressure from citizens to push hard for a broadband solution. But the world has changed and now I often hear that lack of broadband is the number one concern of rural counties and towns with poor broadband. We now live in a society where broadband has grown to become a basic necessity for households similar to water and electricity. Homes without broadband are being left behind.

When I’m approached today by a rural county, one of the first questions I ask them is if they have considered putting money into broadband. More and more rural areas are willing to have that conversation. In Minnesota I can think of a dozen counties that have decided they will pledge $1 million to $6 million to get broadband to the unserved parts of their county – these are pledges to make outright grants to help pay for the cost of a fiber network.

States are also starting to step up. Just a few years ago there were only a few states with grant programs to help jump-start rural broadband projects. I need to start a list to get a better count, but there are now at least a dozen states that either have or are in the process of creating a state broadband grant program.

I don’t want to belittle any of the state broadband grant programs, because any state funding for broadband helps to bring broadband to places that would otherwise not get it. But all of the state broadband grant programs are far too small. Most of the existing state grant programs allocate between $10 million and $40 million annually toward solving a broadband problem that I’ve seen estimated at $40 – $60 billion nationwide. The grants are nice and massively appreciated by the handful of customers who benefit from each grant – but this doesn’t really fit into the category of putting skin in the game at the state level.

The federal programs are the same way. The current e-Connectivity program at $600 million sounds like a lot of assistance for broadband. But this money is not all grants and a significant amount of it will be loans that have to be repaid. Even if this was 100% grant money, if the national cost to bring rural fiber is $60 billion, then this year’s program would help to fund 1% of the national broadband shortfall – all we need to do is to duplicate the program for a century to solve the broadband deficit. If this program was to be spread evenly across the country, it’s only $12 million per state.
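The arithmetic behind these claims is easy to check; every figure below comes from the paragraph above except the 50-state count used for the per-state average:

```python
program_dollars = 600e6   # this year's e-Connectivity funding
national_need = 60e9      # estimated cost of bringing fiber to rural America

print(f"Share of the shortfall funded: {program_dollars / national_need:.0%}")      # 1%
print(f"Years at this pace to finish:  {national_need / program_dollars:.0f}")      # 100
print(f"Spread evenly over 50 states:  ${program_dollars / 50 / 1e6:.0f} million")  # $12 million
```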

For many years we’ve been debating if government ought to help in funding rural broadband. In some ways it’s hard to understand why we are having this debate since in the past the country quickly got behind the idea of the government helping to fund rural electricity, rural telephony and rural roads. It seemed obvious that the whole country benefits when these essential services are brought to everybody. I’ve never seen any criticism that those past programs weren’t successful – because the results of these efforts were instantly obvious.

There is nobody anywhere asking governments to outright pay for broadband networks – although some local governments are desperate enough to consider this when there is no other solution. Building rural fiber – which is what everybody wants – is expensive and putting skin in the game means helping to offset enough of the cost in order to enable a commercial provider to make a viable business plan for fiber.

I wrote a blog in December that references a study done by economists at Purdue who estimate that the benefit of rural fiber is around $25,000 per household. I look at the results of the study and think it’s conservative – but even if the results are a little high, this ought to be all the evidence we need to justify governments at all levels putting more skin in the game.

When I see a rural county with a small population talking about pledging millions of dollars towards getting broadband I see a community that is really putting skin in the game, because that is a major financial commitment. For many counties this will be the largest amount of money they have ever spent for anything other than roads. By contrast, a state grant program of $20 million per year when the state budget might be $20 billion is barely acknowledging that broadband is a problem in their state.

I’m sure I’m going to hear back from those who say I’m being harsh on the state and federal grant programs, and that any amount of funding is helpful. I agree, but if we are going to solve the broadband problem it means putting skin in the game – and by definition that means finding enough money to put a meaningful dent in the problem. To me, that’s what skin in the game means.

Small Fiber Builders Making an Impact

The research firm RVA, LLC conducted a study for the Fiber Broadband Association looking at the number of homes and businesses that are now passed and/or served with fiber. The numbers show that smaller fiber providers are collectively having a big impact on the industry.

RVA found that as of September 2018 there were 18.4 million homes with fiber, up from 15 million a year earlier. To put that into perspective, at the end of 2017 there were just over 126 million US households, meaning that fiber has now made it into over 14% of US homes. What’s most impressive about that finding is that 2.7% of homes got fiber in that one-year period. The number of fiber households had been creeping up slowly over the decade, but the speed of deployment is accelerating.

RVA also looked at passings and says that 39.2 million homes, or 31%, are now passed with fiber. Comparing the 18.4 million fiber customers to the 39.2 million passings shows a fiber penetration rate of 47%. RVA also says that 1.6 million homes are passed by two fiber providers – no doubt in markets like Kansas City, Austin and the Research Triangle in North Carolina where Google and the incumbents both built fiber. RVA shows that, when accounting for homes that have no broadband, fiber networks are achieving a 60% penetration rate.
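The quoted percentages are easy to re-derive from the raw counts in the study:

```python
homes_with_fiber = 18.4e6   # September 2018
prior_year_fiber = 15.0e6   # a year earlier
homes_passed = 39.2e6
us_households = 126e6       # end of 2017

print(f"Fiber share of all households:  {homes_with_fiber / us_households:.1%}")                       # 14.6%
print(f"Homes newly reached in a year:  {(homes_with_fiber - prior_year_fiber) / us_households:.1%}")  # 2.7%
print(f"Penetration where fiber passes: {homes_with_fiber / homes_passed:.1%}")                        # 46.9%
```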

RVA says there are over 1,000 smaller fiber providers in the country and quantifies their overall market share as follows: smaller telcos (10.3%), fiber overbuilders (6.4%), cable companies (5.5%), municipalities (3.7%), real estate development integrators (1.1%) and electric cooperatives (0.5%).

In 2018 the small providers built to 29% of the new homes passed, with the rest built by four Tier 1 providers. RVA didn’t identify these big providers, but clearly the biggest fiber builder right now is AT&T. The company has built fiber to over 10 million passings in the past four years and says it will reach about 14 million passings by mid-2019. A lot of the AT&T fiber passings come from an aggressive plan to build to MDUs (apartment and condominium complexes). However, the company is also making fiber available to homes within close range of the numerous existing neighborhood fiber POPs built to serve its larger fiber customers.

The other big fiber builder right now is Altice. The company announced a little over a year ago that it plans to build fiber across the footprints from its Cablevision and Suddenlink acquisitions – nearly 8 million passings. The company seems to be fulfilling that promise, with a flurry of press releases in 2018 talking about active fiber deployments. Altice is currently trying to sell off some of its European fiber networks to lighten its debt load and presumably raise the cash needed to complete the US fiber build.

Most other large providers have more modest fiber plans. We know that the CenturyLink fiber expansion that was hot news just two years ago is likely now dead. Verizon is now putting its effort into fixed 5G wireless. The big cable companies all build fiber in new subdivisions but have all committed to DOCSIS 3.1 on their existing cable networks.

Looking forward a few years and most of the new fiber is likely to come from smaller providers. AT&T hasn’t announced any plans past the 2019 schedule and by then will have effectively passed all of the low-hanging fruit within range of its existing fiber network. Altice says it will take until at least 2022 to finish its fiber construction. There are no other big companies with announced plans to build fiber.

All of this is good news for the US households lucky enough to get fiber. It’s always been industry wisdom that the industry wouldn’t develop gigabit applications until there are enough fiber households to make it economically viable. While most customers on fiber probably are subscribing to speeds less than a gigabit, there ought to finally be enough gigabit fiber customers nationwide to create a gigabit market.


The Pushback Against Smart Cities

If you follow the smart city movement in the US you’ll quickly see that Kansas City, Missouri touts itself as the nation’s smartest city. The smart city movement got an early launch there when the City was announced as the first major market for Google Fiber. That gigabit fiber network attracted numerous small tech start-ups and the City also embraced the idea of being a technology leader.

The city’s primary smart city venture so far has been to bring smart city technology to a 54-block area downtown. But this area covers only about 1% of the total area of the City. The City is currently contemplating expanding the smart city effort into the neglected east side neighborhoods near downtown. This is an area with boarded-up storefronts and vacant lots, and the hope is that the smart city investment will kick-start economic development there.

So far the primary smart city applications include smart parking, smart intersections, smart water meters and smart streetlights. The city also installed video surveillance cameras along the 2.2-mile downtown corridor. The existing deployment also includes public WiFi provided through 25 kiosks placed throughout the smart city neighborhood. As of last fall there had been a reported 2.7 million log-ins to the WiFi network.

In the east side expansion WiFi will take on a more significant role since it’s estimated that only 40% of the residents in that area have home broadband today – far below the national average of 85%. The city is also looking to implement a rapid transit bus line into the east side as part of the smart city expansion.

The new expansion into the east side is slated to have more surveillance, including new features like gunshot detectors. Fears have been publicly voiced that this system could be used to disadvantage the largely minority population of the area.

The biggest hurdle to expanding smart city services is money. The initial deployment was done through a public-private partnership. The city contributed $3.7 million, which it largely borrowed. Sprint, which manages the WiFi network, contributed about $7 million, and Cisco invested $5 million. The cost to expand the smart city everywhere has been estimated at half a billion dollars.
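A rough scaling check puts those funding numbers in perspective; the inputs come from the figures above, with the pilot's 1% area share taken from the earlier paragraph:

```python
city_contribution = 3.7e6
sprint_contribution = 7.0e6
cisco_contribution = 5.0e6
pilot_cost = city_contribution + sprint_contribution + cisco_contribution

pilot_area_share = 0.01  # the 54-block pilot covers roughly 1% of the city
naive_citywide = pilot_cost / pilot_area_share

print(f"Pilot cost:             ${pilot_cost / 1e6:.1f} million")      # $15.7 million
print(f"Naive citywide scaling: ${naive_citywide / 1e9:.2f} billion")  # $1.57 billion
```

Naive per-area scaling lands well above the quoted half-billion estimate, presumably because dense downtown blocks cost far more to equip than the rest of the city – either way, it is a huge sum.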

It is the public-private partnerships that bring a troublesome aspect to the smart city concept. It’s been reported that Sprint collects data from those who log in to the free WiFi network – information like home zip code and results of Internet searches. It’s also been reported that Sprint can track people who have once subscribed to the service, even if they don’t log in. Sprint won’t say how it collects and uses customer data – but as we are learning throughout the tech world, it is the monetization of customer data that fuels many ISPs and online services.

There is also growing public concern about surveillance cameras. It’s starting to become clear that Americans don’t want to be tracked by cameras, especially now with the advent of decent facial recognition technology. We saw Seattle have to tear down a similar surveillance network before it ever went into service. We’re seeing huge pushback in Toronto about a proposed smart city network that includes surveillance.

We only have to look at China to see an extreme example of the misuse of this technology. The country is installing surveillance in public places and in retail areas and tracks where people are and what they do. China has carried this to such an extreme that they are in the process of implementing a system that calculates a ‘citizen score’ for every person. The country goes so far as to notify employers of even minor infractions of employees like jaywalking.

It’s going to be an uphill battle, perhaps one that can never be won, for US cities to implement facial recognition tracking. People don’t want the government to be tracking where they are and what they do every time they go out into public. The problem is magnified many times when private companies become part of the equation. As much as the people in Kansas City might not fully trust the City, they have far less reason to trust an ISP like Sprint. Yet smart city networks are so expensive that it’s hard to see them being built without private money – and those private partners want a chance to get a return on their investment.

Is the Public Buying the 5G Hype?

T-Mobile recently released a survey, conducted by HarrisX, that looks in detail at how the public feels about pending new technologies. The company expects to repeat the survey quarterly to track how public perceptions of technology change over time.

As you would expect, a significant number of the questions in the poll were about 5G. I’m sure that T-Mobile’s motivation for conducting the survey comes from being one of the few companies in the industry that is not hyping 5G. T-Mobile expects 5G to start creeping into the industry in 2020 and then to take as much as a decade to become a widespread reality.

The survey started by asking if respondents had heard of various new technologies. The 5G hype isn’t fully pervasive yet with 57% having heard of the technology. For other technologies: Internet of Things – 29%; machine learning – 26%; virtual reality – 83%; artificial intelligence – 78%; cloud computing – 52% and blockchain – 19%.

One of the most interesting responses in the survey is the public’s expectation of when they’ll see 5G in the marketplace. Of those who have heard of 5G, 29% thought it was already here in late 2018. Another 35% think they’ll see 5G in 2019 and 25% more expect 5G in 2020. This response has to reflect the flood of marketing hype and press releases extolling 5G. The public has been inundated for several years by articles and press releases declaring that 5G is going to solve our broadband problems by delivering huge data speeds wirelessly everywhere.

When asked more specifics about 5G, 64% were somewhat excited or very excited about 5G in general. They were also somewhat or very excited about the following attributes of 5G: faster upload and download speeds – 92%; wider network coverage – 91%; higher quality streaming video – 85%; higher quality voice calls – 89%; less lag time on mobile devices – 90%; more reliable mobile connections – 93%; greater number of connected devices – 80%; smart city data sensors – 68%; driverless vehicles – 50%; virtual reality in the work environment – 59%; smart energy grids – 75%; supercharged IoT – 64%; expanded use of drones – 47%; next generation artificial intelligence – 59%; telehealth – 68%; remote surgery – 59%; real time language translation – 72%; replacement of landline broadband connections – 75%; replacement of traditional cable TV – 75%.

Interestingly, only 27% of total respondents thought that 5G would have a big influence on their daily life.

In a response I find disturbing, 65% of respondents think 5G will have a positive impact on rural America. Even the biggest 5G proponents admit that 5G is going to be hard to justify in low-density areas. It’s not hard to understand this belief because I’ve seen numerous articles that make this claim. 79% think 5G will have a positive impact in cities.

When asked which companies would be leaders in 5G, the unsurprising responses include Verizon (43%), AT&T (36%), Apple (43%), Samsung (35%) and T-Mobile (20%). However, there were surprises on this list including Amazon (24%), Comcast (12%), Google (36%), Facebook (12%), Microsoft (34%) and Dish Networks (5%).

The public believes that 5G is going to bring price increases. 84% said they thought that 5G would result in higher cellular service prices. 77% thought 5G would lead to higher cable TV prices (this has me scratching my head). 81% thought 5G would lead to higher prices for home broadband – but wouldn’t increased competition for home broadband bring lower prices? 86% expect the prices for smartphones to be higher.

Overall, the survey shows an unrealistic public perception about when we’ll see the benefits of 5G. It’s not hard to understand this misperception since there are untold articles making it sound like we’re on the verge of a 5G revolution. I’m guessing this might have been one of the motivations for T-Mobile to sponsor the survey, since T-Mobile is one of the most realistic voices in the industry talking about the 5G timeline. It will be interesting to see what the public thinks in a few years after very little 5G has actually been implemented. But perhaps I’m just being overly skeptical, since big carriers like AT&T are now extolling their 4G LTE product as 5G – maybe the public will buy it.

Minnesota Sues Comcast

Lori Swanson, the Attorney General of Minnesota, sued Comcast on December 21, seeking refunds to all customers who were harmed by the company’s alleged violations of the state’s Prevention of Consumer Fraud Act and Uniform Deceptive Trade Practices Act. The complaint details the sort of practices that we’ve come to expect from most of the big cable companies – and hopefully this serves as a warning to smaller ISPs that might be following similar practices. It’s an interesting read.

The most significant dollar complaint is that Comcast has defrauded customers about the true nature of two fees – the ‘Regional Sports Network Fee’ and the ‘Broadcast TV’ fee. These two fees now total $18.25 per month. The fees are part of every cable package and are not optional for customers, but Comcast does not mention them when advertising its cable products. Further, Comcast customer service has repeatedly told the public that these fees are mandated by the government – some sort of tax that is not set by Comcast.

Comcast only started charging separately for these two fees in 2014, but the size of these line items has skyrocketed on bills. In recent years the company has put a lot of the annual rate increases into these fees, allowing the company to continue to advertise low prices. The Regional Sports fee passes along the cost of Fox Sports North, and perhaps other regional sports. The Broadcast TV fee includes the amounts that Comcast pays local affiliate stations for ABC, CBS, FOX and NBC.

Interestingly, Comcast was previously sued over this same issue and settled the case without a verdict. As part of that settlement the company promised to fix the problems, but they continued into 2017. In a pleading that is sure to displease company employees, Comcast threw its customer service reps under the bus and blamed the issue on them. Comcast argues that breaking out these fees makes it easier for customers to know what they are paying for – but the complaint cites numerous examples where new customers were surprised at the size of the first bill they received from the company.

The complaint also says that the company often misrepresents the fees for equipment rentals such as cable set-top boxes, digital adapters and broadband modems. The complaint says that for some packages these fees add 30% to the cost of the product and are not fully disclosed to customers.

The complaint also says that Comcast routinely adds unwanted fees to customer bills. Customers who are visited by a Comcast field technician, who visit a business office, or who buy from a Comcast door-to-door salesperson are often surprised to see additional products added to their bills. The complaint blames this on the practice of paying employees commissions for sales.

The complaint notes that Comcast is well aware of these issues. The company settled an FCC complaint about the same issues in 2016 and late last year made refunds to more than 20,000 customers in Massachusetts over these same issues.

It’s not hard to verify some of these issues. If you go to the Comcast website you’ll find that it’s almost impossible to find the real cost of their cable and broadband products. The company constantly advertises low-priced specials that don’t mention the extra programming fees or the equipment fees.

This is a cautionary tale for smaller ISPs that compete with Comcast or other large cable companies. It’s always tempting to advertise cheap special prices in response to big cable company advertising. I know many smaller cable providers that have also separated out the sports and broadcast fees and who are not always fully forthcoming about equipment charges and other fees. It’s hard to watch customers leave who are lured by falsely advertised low prices – but most small ISPs have elected to deal with customers fairly as a way to differentiate themselves from the big companies.

How Bad is the Digital Divide?

The FCC says that approximately 25 million Americans living in rural areas don’t have access to an ISP product that would be considered broadband – currently defined as 25/3 Mbps. That number comes out of the FCC’s mapping efforts using data supplied by ISPs.

Microsoft tells a different story. They say that as many as 163 million Americans do not use the Internet at speeds that the FCC considers broadband. Microsoft might be in the best position of anybody in the industry to understand actual broadband performance because the company can see data speeds for every customer that updates Windows or Microsoft Office – that’s a huge percentage of all computer users in the country and covers every inch of the country.

Downloading a big software update is probably one of the best ways possible to measure actual broadband performance. Software updates tend to be large files, and the Microsoft servers will transmit the files at the fastest speed a customer can accept. Since the software updates are large files, Microsoft gets to see the real ISP performance – not just the performance for the first minute of a download. Many ISPs use a burst technology that downloads relatively fast for the first minute or so, but then slows for the rest of a download – a customer’s true broadband speed is the one that kicks in after the burst is finished. The burst technology has a side benefit to ISPs in that it inflates performance on standard speed tests – but Microsoft gets to see the real story.
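The burst effect described above is easy to see in numbers. Here's a minimal sketch (not Microsoft's actual method, and with invented sample figures) showing how measuring only the first minute of a download flatters a connection, while measuring the remainder reveals the sustained speed:

```python
# Illustrative sketch: given timestamped cumulative byte counts sampled
# during a large download, compare the apparent speed of the initial
# "burst" window against the sustained speed for the rest of the transfer.
# All numbers here are hypothetical, not real measurements.

def speeds_mbps(samples, burst_seconds=60):
    """samples: list of (elapsed_seconds, cumulative_bytes), in order."""
    burst = [s for s in samples if s[0] <= burst_seconds]
    t_burst, bytes_burst = burst[-1]
    burst_speed = (bytes_burst * 8) / t_burst / 1e6      # Mbps during burst
    t_end, bytes_end = samples[-1]
    sustained = ((bytes_end - bytes_burst) * 8) / (t_end - t_burst) / 1e6
    return burst_speed, sustained

# Hypothetical download: 60 MB delivered in the first minute,
# then only 10 MB per minute after the burst window closes.
samples = [(60, 60_000_000), (120, 70_000_000), (180, 80_000_000)]
burst, sustained = speeds_mbps(samples)
# burst ≈ 8 Mbps; sustained ≈ 1.3 Mbps
```

A one-minute speed test would report the 8 Mbps burst figure, while a long software update experiences something closer to the sustained number – which is the gap the Microsoft data exposes.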

I’ve ranted about the FCC’s broadband statistics many times. There are numerous reasons why the FCC data is bad in rural America. Foremost, the data is self-reported by the big ISPs who have no incentive to tell the FCC or the public how poorly they are doing. It’s also virtually impossible to accurately report DSL speeds that vary from customer to customer according to the condition of specific copper wires and according to distance from the DSL core router. We also know that much of the reporting to the FCC represents marketing speeds or ‘up-to’ speeds that don’t reflect what customers really receive. Even the manner of reporting to the FCC, by Census block, distorts the results because when a few customers in a block get fast speeds the FCC assumes that everyone does.
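The Census-block distortion in particular is worth making concrete. A hedged illustration, with invented speeds: if even one household in a block gets a fast connection, the whole block is counted as served at that speed, no matter what the neighbors actually get.

```python
# Sketch of the Census-block reporting distortion described above.
# The block composition and speeds are invented for the example.

def fcc_block_speed(reported_speeds_mbps):
    """FCC-style view: the block is credited with the best speed
    reported anywhere inside it."""
    return max(reported_speeds_mbps)

# One fiber customer at 100 Mbps; nine neighbors on 3 Mbps DSL.
block = [100] + [3] * 9
fcc_view = fcc_block_speed(block)                 # block counts as 100 Mbps
median_reality = sorted(block)[len(block) // 2]   # most households see 3 Mbps
```

Under this aggregation the block appears fully served at 100 Mbps even though the typical household falls far below the 25/3 Mbps broadband definition.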

To be fair, the Microsoft statistics measure the speeds customers are actually achieving, while the FCC is trying to measure broadband availability. The Microsoft data includes any households that elect to buy slower broadband products to save money. However, the gap between 163 million and 25 million – roughly 140 million people – can’t be explained by households purposefully buying slow broadband. The Microsoft numbers tell us that actual speeds in the country are far worse than the FCC describes – and that for half of us they are slower than 25/3 Mbps. That is a sobering statistic, and it doesn’t just reflect that rural America is getting poor broadband, but also that many urban and suburban households aren’t achieving 25/3 Mbps.

I’ve seen many real-life examples of what Microsoft is telling us. At CCG Consulting we do community surveys for broadband and we sometimes see whole communities where the achieved speeds for customers are lower than the speeds advertised by the ISPs. We often see a lot more households claim to have no broadband or poor broadband than would be expected from the FCC mapping data. We constantly see residents in urban areas complain that broadband with a relatively fast advertised speed seems slow and sluggish.

Microsoft reported their findings to the FCC, but I expect the FCC to ignore their story. This is a drastic departure from the narrative that the FCC is telling Congress and the public. I wrote a blog just a few weeks ago describing how the FCC is claiming that big ISPs are delivering the speeds that they market. Deep inside the recent reports the FCC admitted that DSL often wasn’t up to snuff – but the Microsoft statistics mean that a lot of cable companies and other ISPs are also under-delivering.

In my mind the Microsoft numbers invalidate almost everything that we think we know about broadband in the country. We are setting national broadband policy and goals based upon false numbers – not numbers that are a little off, but numbers that are largely a fabrication. We have an FCC that is walking away from broadband regulation because it has painted a false narrative that most households in the country have good broadband. It would be a lot harder for politicians to allow broadband deregulation if the FCC admitted that over half of the homes in the country aren’t achieving the FCC definition of broadband.

The FCC has been tasked by Congress to find ways to improve broadband in areas that are unserved or underserved – with those categories being defined by the FCC maps. The Microsoft statistics tell us that there are huge numbers of underserved households, far higher than the FCC is recognizing. If the FCC was to acknowledge the Microsoft numbers, they’d have to declare a state of emergency for broadband. Sadly, the FCC has instead doomed millions of homes from getting better broadband by declaring these homes as already served with adequate broadband – something the Microsoft numbers say is not true.

The current FCC seems hellbent on washing its hands of broadband regulation, and the statistics it uses to describe the industry provide the needed cover to do so. To be fair, this current FCC didn’t invent the false narrative – it’s been in place since the creation of the national broadband maps in 2009. I, and many others, predicted back then that allowing the ISPs to self-report performance would put us right where we seem to be today – with statistics that aren’t telling the true story. Microsoft has now pulled back the curtain – but is there anybody in a position of authority willing to listen to the facts?

Trusting Big ISP Data

The FCC has finally come to grips with the fact that big ISPs are supplying bad data to the various FCC mapping efforts that are then used to distribute FCC funding and to set national policies. The latest mapping snafu comes from a one-time data collection from the cellular carriers last year showing rural cellular coverage. These maps were to be used to establish a new federal fund called the Mobility Fund II, which will distribute $4.53 billion for the expansion of 4G cellular coverage to rural parts of the country that have little or no cellular coverage.

The big cellular companies have been lying about their cellular coverage for years. If you look at the nationwide 4G LTE coverage maps from AT&T and Verizon you’d think that they have cellular coverage virtually everywhere except in areas like deserts and mountains. But anybody living or traveling in rural America knows better. It’s not hard to drive very far off main highways and hit areas that never see a bar of cellular coverage. And even where there is coverage, it’s still often 3G or even older technology.

When the FCC collected data for the Mobility II funding the big carriers stuck to this same flawed mapping data. It turns out that overclaiming rural cellular coverage will keep funding from going to the smaller cellular companies that still serve in many parts of rural America. Luckily the FCC effort included a challenge process and the FCC was flooded with challenges showing that cellular coverage is far worse than is claimed by the big carrier maps. There were so many challenges that the FCC put the Mobility II award process on hold until they can sort it out.

This is just one of the mapping efforts from the FCC that have been used to award billions of dollars of funding over the last decade. The FCC relied on mapping data from the big telcos to establish the areas that were eligible for the billions of dollars of CAF II funding.

Since rural areas served by the biggest telcos have been neglected for years, and since the big telcos deployed very little rural DSL outside of towns it’s not hard to identify huge swaths of rural areas that have little or no broadband. But the big telco broadband coverage data contains a ton of inaccuracies. For example, there are numerous smaller rural towns that are listed in the telco databases as having decent broadband, when the reality on the ground is broadband speeds of a few Mbps at best. It looks like the big telcos often reported marketing speeds rather than actual speeds. This inaccuracy has stopped others from seeking federal grants and loans to upgrade such towns.

I fear that rural broadband mapping is on the verge of the next crisis. As a blogger I am contacted a lot by folks in rural America describing their broadband situation. I’ve heard enough stories to convince me that the big telcos have made only a half-hearted effort at implementing CAF II. I think many homes that should have seen CAF II broadband upgrades will see zero upgrades while many others will get upgraded to speeds that don’t meet even the measly CAF II goal of 10/1 Mbps.

The big telcos are not likely to come clean about having pocketed CAF II funding rather than spending every penny to make upgrades, so they are going to claim that the CAF II areas have been upgraded regardless of the actual situation on the ground. Rural households that didn’t see the promised upgrades will then be counted by the FCC as having better broadband. That will make these areas off limits to future federal funding to fix what the telcos botched. We already see the newest federal grant programs requiring that no more than 10% of the homes covered by federal funding have broadband today. Because of the falsified mapping, many homes without broadband are going to be deemed to be covered, and it will be a massive challenge for somebody else to get funding to help such areas. These communities will be harmed twice – once by the telcos that aren’t upgrading speeds and again by the inaccurate mapping that will stop others from funding assistance to fix the problem.

The big telcos and carriers have huge incentives to lie about rural broadband coverage. None of the big telcos or cellular carriers want to spend any of their own money in rural areas, but they love the revenues they receive from a captive rural customer base that pays high prices for poor broadband. The big companies are fighting hard to preserve these revenues, which means they don’t want anybody else to get funding to improve broadband. To make matters worse, the big telcos continue to eliminate technicians and maintenance budgets in rural America, making it nearly impossible for customers to get repairs and service.

I unfortunately don’t have any easy solution for the problem of crappy mapping. Perhaps the FCC could entertain challenges to the broadband maps in the same way they are accepting challenges in the Mobility II process. I know a number of rural communities that would make the effort to create accurate broadband maps if this might bring them better broadband.

The Future of Video Streaming


I predict that we are going to see a huge shake-out in the online video market over the next few years. The field of OTT providers is already crowded. There are providers that offer some version of the programming offered on traditional cable TV like Sling TV, DirecTV Now, Playstation Vue, Hulu Plus, YouTube TV, fuboTV and Layer3 TV. There are also numerous providers with unique content like Netflix, Amazon Prime, CBS All Access, HBO Go, and more than 100 others.

The field is going to get more crowded this year. Disney is planning a Netflix competitor later this year that will include Disney’s vast library of content including unique content from Marvel, Lucasfilm, 21st Century Fox and Pixar.

AT&T also plans to offer a unique-content platform that includes the vast library of content it acquired through the merger with Time-Warner along with the content from HBO.

Apple has finally been creating unique content that it will start showing some time this year. Amazon has stepped up the creation of unique content. Comcast is planning a launch with the unique content it owns through NBC Universal and Illumination Studios.

But the biggest news is not that there will be more competitors – it’s that each of the creators of unique content is intending to only offer their content on their own platform. This is going to transform the current online landscape.

The big loser might be Netflix. While the company creates more unique content than anybody else in the industry, it has benefited tremendously from outside content. I happen to watch a lot of the Marvel content and my wife sometimes refers to Netflix as the Marvel network – but that content will soon disappear from Netflix. Disney owns the Star Wars franchise. NBC Universal (Comcast) recently purchased the rights to Harry Potter. CBS owns the Star Trek franchise. AT&T owns Game of Thrones. Amazon bought the rights to develop more Middle Earth (Lord of the Rings) content. Is Netflix going to be as attractive if it is unable to carry attractive external content in addition to its own unique content?

Each of the major content owners is aiming to capitalize on their most valuable content. For example, the industry buzz is that there are numerous new Star Trek efforts underway and that CBS All Access will become all Star Trek, all of the time. Each of these content owners is making similar plans to best monetize their content.

This looks like it is going to turn into a content arms race. That means more content than ever for the viewing public. But it also means that a household that wants to watch a range of the most popular content is going to need numerous monthly subscriptions. I think 2019 is going to become the year when the monthly cost of online content starts climbing to rival the cost of traditional cable.

My family is probably fairly typical for cord cutters. We watch local channels, traditional cable networks and sports through Playstation Vue. We have subscriptions to Netflix, Amazon Prime and Hulu. During the year we add and subtract networks like ESPN Plus, CBS All Access, HBO NOW and several others. And we also buy individual TV shows and movies that aren’t included in these various platforms.

I’m not unhappy with our array of content. Each of our three family members gets to watch the content they want. We’re each free to use the devices we like and watch at times that are convenient.

The number one reason cited for cord cutting is to save money. I’m pretty certain that as a family we already aren’t saving anything compared to what content cost us before we went online. However, saving money was not our primary reason for going online. Looking forward, I suspect we’ll probably add some of the new content this year, such as Disney, so our costs are likely to keep climbing.

A few years ago there was a lot of speculation about where the industry is headed. A lot of people thought that the Amazon super-aggregator model was the future, and Amazon is doing well by reselling dozens of unique content platforms under its name brand. However, it looks like the industry is now headed in the opposite direction where each big corporate owner of unique content is going to want to extract the maximum value by selling directly to the public.

I have to wonder what this all means for the public. Will the high cost of buying numerous online packages dissuade many from cutting the cord? It’s also confusing trying to find what you want to watch with so many different sources of content that are in separate silos. It’s going to be interesting to see these industry giants battling each other for eyeballs.