Big ISP Risk Trends

BDO Global is an international firm that specializes in accounting, tax and financial services. They monitor a number of industries and recently published their 2018 Telecommunications Risk Factor Survey – their fourth annual survey. The survey asks the 60 largest telecom companies worldwide about the issues that are decreasing or increasing their market risk.

While the report reflects issues faced by the giant telecom companies (a group that certainly includes AT&T, Comcast and Verizon), the findings are important because issues affecting the big telecom companies eventually filter downward to affect the rest of the industry.

There are areas where the big telecom companies are seeing less risk:

  • Reduced Regulatory Risk. Companies are seeing less regulation, something that is particularly true in this country. Since big telecom companies tend towards being monopolies, less regulation generally translates into higher prices and better financial performance.
  • Expansion of Markets. Big telecom companies worldwide are expanding into new product lines. Some are doing this through big mergers, such as the one between Charter and Time Warner Cable. Others are moving into new fields – for example, all the large US ISPs have purchased companies to help them compete with Google for advertising revenues. The big US cable companies are now entering the cellular business. The quad play has now entered our vocabulary.

However, the big companies also see growing risk in some areas including:

  • Increased Competition. US cable companies becoming cellular providers is a great example, and they are putting downward price pressure on all US cellular providers. But competition is coming from many directions – companies like Netflix, Skype, VoIP providers and many others are chipping away at services traditionally provided by telcos.
  • Fast Arrival of New Technologies. The big telcos expressed concern about how quickly new competitive threats are able to make it to market. They see upcoming threats from new technologies like 5G and new satellite broadband networks. They see the proliferation of on-line content. They are generally concerned that new technologies are making it to the market more quickly than in the past and can quickly gain significant market share.
  • Interest Rates. Telecoms are expecting higher future interest rates. This is a big concern since telecoms generally carry a lot of debt and are more susceptible to rising rates than many other industries. Big telecoms have been borrowing heavily for mergers and acquisitions and often finance capital expansion – both of which create long-term debt. The big telcos are worried that higher interest rates will restrict their capacity to grow.
  • Access to Financing. Big telcos see a tightening of the credit markets due to a general tightening of the banking industry, but also due to their own performance. Many telcos are seeing lower margins per customer due to cord cutting, the continued drop of landlines and increased cellular competition. They foresee bankers less willing to extend the same levels of debt they’ve had available in the recent past.
  • Foreign Currency Exchange Rates. Telcos that work in multiple countries, like AT&T, are concerned with fluctuating currency exchange rates which can quickly turn international profits upside down.

Overall the big global telecom companies are doing well today. The BDO risk assessment asks them to look a few years into the future. The overall worries of the industry as a whole are a little lower in 2018 than in 2017. However, the industry is still far from rosily optimistic and foresees some dark clouds looming over the future horizon.

If their fears come true, this represents both increased risk and increased opportunity for smaller competitive ISPs. Certainly a tightening of lending and higher interest rates hurt smaller companies even more than the big ones. However, anything that forces the big companies to slow down or retract opens up more competitive options for the small companies that compete against them.

This is an interesting look into the near future and something that is not much discussed publicly. One has to assume that the big telcos have their own internal economic forecasts – although I’ve never seen much evidence that they look forward past the next few quarterly Wall Street earnings announcements.

New York Ousts Charter

The New York Public Service Commission voted on Friday to oust Charter from the state. They are revoking the approval of Charter’s acquisition of Time Warner Cable in 2016 due to the company failing to meet the requirements of that merger. The PSC has given Charter 60 days to present a plan for divesting its New York properties and to subsequently leave the state. Charter announced almost immediately that they will appeal the decision, so expect a big ugly court fight.

The Commission’s order provides the justification for this drastic measure, citing the following reasons:

  • The company’s repeated failures to meet deadlines;
  • Charter’s attempts to skirt obligations to serve rural communities;
  • Unsafe practices in the field;
  • Its failure to fully commit to its obligations under the 2016 merger agreement; and
  • The company’s purposeful obfuscation of its performance and compliance obligations to the Commission and its customers.

One of the biggest items under contention is Charter’s agreement to extend its network to 145,000 unserved and underserved residential housing units within four years of the merger. Charter claims that it is meeting that commitment, but the PSC says that a lot of the passings counted by Charter were in places like New York City where the company already had an obligation under local franchise agreements to connect customers. The PSC’s merger requirement specified that Charter would reach beyond its current network boundaries to add suburban and rural customers within reasonable range of the Charter network.

The PSC accuses Charter of lying to the PSC and the public about meeting its merger obligations. They say the company has repeatedly falsely advertised and told customers that it is exceeding its commitments to the state. Now that this is likely going to end up in court, the facts will be made clear, and it’s likely that the PSC’s facts are correct or they wouldn’t have taken this extraordinary step.

I can only remember a few cases during my career where a state regulatory body disenfranchised a telco or cable company. The few cases I recall were based upon criminal behavior of the company owners. Cities have sometimes cancelled a cable TV franchise, but usually it’s been due to the cable company being nearly dead or bankrupt and the city wanting to be able to legally tear down unused cables.

It’s been routine practice for big ISPs to not fully meet the commitments they make during merger negotiations with regulators. They generally take a weak stab at meeting commitments, but they’ve never fretted about not fully complying since the usual recourse against them is fines – or something more drastic like what’s being done in this case. It may sound cynical, but I think big companies do the math and gladly accept fines when that’s cheaper than meeting a commitment.

The NY PSC order focused on Charter’s failure to meet merger conditions, but there is an older history of dispute between the PSC and the company. In upstate New York the PSC accused Time Warner Cable (and eventually Charter) of defrauding the public by providing old and obsolete cable modems that were not capable of achieving the advertised broadband speeds. In 2013 Time Warner Cable promised the NY PSC it would fix the problem, but the commission sued the company (now Charter) in 2017 after it was shown that most of the old cable modems were still in service – although the cable company had subsequently begun advertising even faster speeds. It turns out that Time Warner / Charter was not only failing to replace old modems as promised, but was still recirculating the obsolete modems back into service for new broadband customers.

I have no idea of how the courts might rule on this case because the suggested remedy of kicking Charter out of the state is unprecedented. I’d love to hear of any similar cases if readers know of them – but I can’t recall a state regulatory commission trying to kick a major ISP out of their state.

Obviously Charter could have avoided all of this by complying with the requirements of the merger. But I’m sure that an internal decision was made that the capital required to meet those conditions was more than the company was willing to spend. The company didn’t help its case if it lied to the PSC about meeting the commitments. It’s clear that Charter was derelict in the earlier case of replacing obsolete cable modems, and in that case they were clearly bad corporate citizens. It takes a lot for a regulator to decide to oust a regulated company, and I guess Charter crossed that line.

The FCC and the Digital Divide

The current FCC Chairman Ajit Pai talks a lot about his commitment to solving the digital divide and to bring broadband to everybody in the country. Chairman Pai has now made numerous visits to rural America and to poor communities and has repeatedly promised that this FCC is on board with finding broadband solutions for everyone. Yet there are numerous actions by this FCC that tell a different story.

Redefining Broadband. Last year the FCC considered changing the definition of broadband – a change which would have drastically lowered the count of households without good broadband. The FCC suggested that 10/1 Mbps cellular broadband is equivalent to a 25/3 Mbps landline connection. This change would have reclassified millions of homes as having access to broadband and would have instantly ‘solved’ a huge portion of the digital divide without changing anybody’s broadband. The FCC is required by Congressional edict to set policies that bring broadband to all, and its solution was to unilaterally declare that millions of homes served only by cellular broadband needed no further FCC assistance.

The public and the industry rebelled against this suggestion and the FCC backed down. However, the FCC is required by Congress to examine the availability of broadband every year, so it will have annual opportunities to redefine broadband and recalibrate the way we count those on the wrong side of the digital divide. One only has to talk to a rural household trying to run its home broadband from a tethered cellphone to understand the ridiculousness of this idea. The high cost, low data caps, slow speeds and high latency make cellular broadband an extremely expensive and unsatisfactory substitute for landline broadband. There are many people who elect to use only cellular data, but that’s not the same thing as assuming that a cellphone connection can provide enough broadband for the typical home.

Lifeline Program. This FCC seems to be trying to eliminate or greatly restrict the Lifeline program. It’s clear that Chairman Pai would like the program to go away completely and the FCC has been whittling away at the program.

First, they stopped accepting new applications for carriers that want to join the Lifeline program. I know of two municipalities that planned to expand their broadband networks to thousands of low-income homes and offer $10 – $20 broadband that would have been enabled by the $9.95 monthly Lifeline subsidy. They were dissuaded when the FCC made it clear they were not likely to approve new Lifeline providers.

The FCC also changed the rules to make it hard or impossible for MVNOs (wireless resellers) to receive Lifeline subsidies. These companies were the primary proponents and sellers of low-cost cellular phones and data plans for low-income customers. For example, there are MVNOs that provide a low-function phone and a limited amount of voice and data to the homeless for the $9.95 reimbursement from the Lifeline fund. There have been numerous testimonials about how these phones have improved the quality of life for the homeless by providing them with access to social services and allowing them to make phone calls or send texts. Blocking these carriers from Lifeline kills this kind of initiative.

The FCC also eliminated the additional $25 per month from the Lifeline program that was available to low-income Native Americans living on tribal lands. Eliminating this subsidy while also restricting Lifeline funds to facilities-based carriers is having the effect of making cellphones unaffordable in some of the poorest places in the country. Even the big cellular companies like AT&T and Verizon opposed this change to the Lifeline fund.

Eliminated Title II Regulation. Perhaps the most damaging change the FCC made was to eliminate all FCC regulation of broadband by eliminating Title II regulation. This FCC order is referred to as the net neutrality order, but there are a number of aspects of the order that have nothing to do with net neutrality.

The FCC removed itself as the watchdog on all aspects of broadband including pricing, data caps, disclosure of practices and policies, etc. The FCC instead shuttled broadband issues to the Federal Trade Commission – an agency that can punish companies which badly abuse the public, but which cannot set proactive policies.

We are poised to see big future increases in broadband prices. That’s the only way that the big monopoly ISPs can continue their historic revenue growth. The big ISPs have hit a wall with slowing numbers of new broadband customers and sinking cable TV and telephone revenues. Rising broadband prices will do more harm to universal service than any other policy. One Wall Street analyst last year suggested that Comcast’s basic broadband price ought to be $90 – something that would drive millions of homes from landline broadband. The FCC has removed itself as a broadband regulator, meaning that the big cable monopolies are going to be free to do what monopolies do and raise rates to maximize profits. Even if the FCC never directly regulated broadband prices, it had many other ways to pressure big ISPs to act responsibly – but it has given away its regulatory authority and any regulatory leverage is gone.

Fiber in Apartment Buildings

For many years a lot of my clients with fiber networks have avoided serving large apartment buildings. There were two primary reasons for this. First, there have always been issues with getting access to buildings. Landlords control access to their buildings and some landlords have made it difficult for a competitor to enter their building. I could write several blogs about that issue, but today I want to look at the other historical challenge to serving apartments – the cost of rewiring many apartment buildings has been prohibitive.

There are a number of issues that can make it hard to rewire an apartment building. Some older buildings have concrete floors and plaster walls and are hard to drill for wires. A landlord might have restrictions due to aesthetics and not want to see any visible wiring. A landlord might not allow adequate access to equipment for installations or repairs, particularly after dark. A landlord might not have a safe space for an ISP’s core electronics or have adequate power available.

But assuming that a landlord is willing to allow in a fiber overbuilder, and is reasonable about aesthetics and similar issues, many apartment owners now want fiber since their tenants are asking for faster broadband. And there are new approaches, not available in the past, that can make serving apartments cost-effective.

G.Fast has come of age and the equipment is now affordable and readily available from several vendors. A number of telcos have been using the technology to improve broadband speeds in apartment buildings. The technology works by using frequencies higher than DSL over the existing telephone copper in the building. That copper wire is mostly owned by the landlord, who can generally grant multiple ISPs access to the telephone patch panel.

CenturyLink reports speeds over 400 Mbps using G.Fast, enabling a range of broadband products. The typical deployment brings fiber to the telecom communications space in the building, with jumpers made to the copper wire for customers wanting faster broadband. Telcos are reporting that G.Fast delivers good broadband up to about 800 feet, which is more than adequate for most apartment buildings.

Calix now also offers a G.Fast variant that works over coaxial cable. This is generally harder to use because it’s harder to get access to the coaxial home runs to each apartment. Typically an ISP would need access to all of the coaxial cable in a building to use this G.Fast variation. But it’s worth investigating since it increases speeds to around 500 Mbps and extends distances to 2,000 feet.
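
For a rough sense of where these distance limits bite, here is a minimal Python sketch that checks a copper run against the figures cited above; the thresholds are simply the numbers from this post, not vendor specifications.

```python
def gfast_viability(run_feet: int, medium: str = "twisted_pair") -> str:
    """Rough viability check using the figures cited in this post:
    roughly 400 Mbps out to ~800 feet over telephone copper, and roughly
    500 Mbps out to ~2,000 feet for the coaxial variant. Real-world
    speeds depend heavily on the age and quality of the wiring."""
    limits = {
        "twisted_pair": (800, 400),   # (max run in feet, approximate Mbps)
        "coax": (2000, 500),
    }
    max_feet, mbps = limits[medium]
    if run_feet <= max_feet:
        return f"viable over {medium}: roughly {mbps} Mbps at {run_feet} ft"
    return f"doubtful: {run_feet} ft exceeds the ~{max_feet} ft figure for {medium}"

# A 600-foot riser run is well inside the twisted-pair range; a longer
# run might still work over the coax variant.
print(gfast_viability(600))
print(gfast_viability(1500, "coax"))
```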

Millimeter Wave Microwave. A number of companies are using millimeter wave radios to deliver bandwidth to apartment buildings. This is not using the 5G standard, but current radios can deliver two gigabits for about one mile or one gigabit for up to two miles. The technology is mostly being deployed in larger cities to avoid the cost of laying urban fiber, but there is no reason it can’t be used in smaller markets where there is line-of-sight from an existing tower to an apartment building. The radios are relatively inexpensive with a pair of them costing less than $5,000.

It’s an interesting model in that the broadband must be extended to customers from the rooftop rather than from the basement. The typical deployment runs fiber from the rooftop radio down through risers and out to the apartment units.
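
The economics are easy to sketch in rough terms. The radio-pair price comes from the paragraph above, while the building size and in-building fiber cost are hypothetical placeholders:

```python
# Rough per-unit economics for a millimeter wave link to an apartment roof.
# The $5,000 radio-pair figure comes from the paragraph above; the building
# size and in-building fiber cost are hypothetical placeholders.
radio_pair_cost = 5_000
riser_fiber_cost = 15_000   # hypothetical cost to run fiber down the risers
units = 100                 # hypothetical apartment count

capex_per_unit = (radio_pair_cost + riser_fiber_cost) / units
link_gbps = 2               # roughly 2 Gbps at one mile, per the figures above
mbps_per_unit = link_gbps * 1_000 / units

print(f"Capex per unit: ${capex_per_unit:,.0f}")
print(f"Shared link capacity per unit: {mbps_per_unit:.0f} Mbps")
# Since broadband is heavily oversubscribed, the effective speed each
# tenant experiences would be much higher than the per-unit share.
```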

The good news with stringing fiber in apartments is that wiring technology is much improved. There are now several different fiber wiring systems that are easy to install, and which are unobtrusive by hiding fiber along the corners of the ceiling.

Many ISPs are finding that the new wiring systems alone are making it viable to string fiber in buildings that were too expensive to serve just a few years ago. If you’ve been avoiding apartment buildings because they’re too hard to serve, you might want to take another look.

Tackling Hidden Fees

The topic of hidden fees on telecom bills was in the news recently when AT&T tripled the administrative charge on its cellular bills – a change that nets them $800 million annually in new bottom line. Consumer Reports recently launched a campaign called “What’s The Fee?” that is identifying and tackling hidden fees from big corporations like ISPs, airlines and banks. Their advocacy branch, Consumers Union, launched a web site to identify hidden fees and started a petition drive to notify the big companies that many of their customers are unhappy with these fees. Consumers Union says they get more complaints about Comcast on this issue than about any other corporation.

I’ve written in the past about the hidden fees that ISPs put onto their bills. I think they use these fees for a number of reasons:

  • The hidden fees disguise the true price of their products. The big cable companies widely advertise cable prices that don’t include the fees, without telling the public that the fees can’t be avoided. They might advertise a $69 cable package that actually costs over $90.
  • The big cable companies have increased the rates for the hidden fees at a much faster pace than the increases in the ‘basic’ published rates for cable TV. This disguises rate increases by holding down the published rates for cable TV.
  • The hidden fees put pressure on competitors. Any competitor to the big ISPs that wants to publish true rates is at a disadvantage when customers compare their true rate to the deceptive basic rates of the cable companies that don’t include the hidden fees. My clients wrestle with this issue all of the time – should they be honest with customers and look more expensive, or should they mimic the hidden fee structure so that their pricing is more easily compared?

What are the hidden fees? Let’s look at Comcast:

  • Broadcast TV Fees. This fee supposedly covers the cost of the retransmission fees paid to the over-the-air networks like ABC, CBS, FOX and NBC. Comcast charged $1.50 for this fee in 2015 and it’s now up to $7.75. Comcast doesn’t mention on bills that they own NBC. Comcast already charges all customers a substantial fee for basic TV that far exceeds the cost of buying this programming.
  • Regional Sports Fee. This fee is now up to $6.75 per month in many markets (varies somewhat around the country). This fee supposedly compensates for the various regional sports networks. What Comcast fails to mention is that they now own the majority of regional sports networks, including a big pile they are getting due to the AT&T / Time Warner merger. This fee was $1 in 2015.
  • Settop Box and Cable Modems. While these are not hidden fees, these charges are supposedly set to recover the cost of the hardware. But in recent years Comcast has jacked up these fees significantly, to the point that I would consider a big portion of them to also be hidden fees. The charge for a cable modem is now $11. The company charges $9.95 for the first settop box and $7.75 for additional ones. Just a few years ago these fees were around $5. In both cases it’s likely that the settop box or cable modem costs Comcast $100 or less.
  • HD Fee. Comcast no longer charges separately for this, but I still see this on the bills from some of the other cable companies. This fee was established years ago when HD was a new technology, but today practically every channel is HD.

The Comcast fees have gotten so large that they could add $25 per month to the advertised price of a cable / broadband package. There is an open class-action lawsuit against Comcast that is seeking damages for customers who were charged these fees when they purchased advertised products that didn’t mention the fees.
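
To make the arithmetic concrete, here is a quick Python sketch that totals the Comcast fees listed above against a hypothetical advertised package price:

```python
# Fee amounts are the Comcast figures cited above; the advertised package
# price is hypothetical, and not every customer pays every fee (which is
# why the post cites $25 per month as a typical total).
advertised_price = 69.00
fees = {
    "Broadcast TV fee": 7.75,
    "Regional sports fee": 6.75,
    "Cable modem rental": 11.00,
    "First settop box": 9.95,
}
actual_price = advertised_price + sum(fees.values())
markup = actual_price / advertised_price - 1
print(f"Advertised: ${advertised_price:.2f}  Actual: ${actual_price:.2f}  "
      f"({markup:.0%} above the advertised rate)")

# If a modem costs Comcast $100 or less, the $11 monthly rental recovers
# the hardware cost in well under a year.
print(f"Modem payback: {100 / fees['Cable modem rental']:.1f} months")
```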

What is most perplexing is that regulators have been quiet on the topic, even though just about everything to do with these fees is deceptive. Comcast swears that it provides full disclosure about these fees and that customers are not deceived, but one has to read some truly fine print on the company’s web site when ordering a cable product to understand that these fees will be added to the advertised price.

Relying on Cellular Broadband (Part II)

One of my recent blogs talked about the reliability of cellular data as a substitute for wireline broadband. Almost immediately I had an example of a wireless outage shoved in my face. I was in Phoenix at an all-day meeting. When I left at about 4:00 I tried my Uber app and it wasn’t working. The app cycled through but would not find a driver. This was inconvenient because I was standing in the 100-degree sun, so I immediately looked for shade. I tried a few more times. Giving up on Uber I tried Lyft and got the same results. Now I’m figuring a data outage, but since Android phones are sometimes squirrelly, to be safe I rebooted my phone.

That didn’t work, and I was left standing in the heat waiting for a ride to my hotel, which was 20 miles away. Uber, Lyft and taxis were out of the question. Luckily my voice service was still working, so I called my wife, who ordered an Uber for me. But had she not been available I’m not sure how I would have gotten to my hotel. I’m picturing the huge number of other people this also inconvenienced. How many people landed at an airport and couldn’t get a ride? How many people were driving and suddenly lost access to their mapping software? How many businessmen were traveling and couldn’t read or respond to email?

When I got back to a landline connection I looked at the AT&T outage website and it was lit up like a Christmas tree. It looked like the east coast was totally out, but almost every other NFL city also showed an outage. Phoenix, which I knew to be out, didn’t even show on the map as having a problem, and it’s possible that the whole nationwide AT&T network had a data outage. A few days later I checked and AT&T had said nothing about the cause of the outage. Their outage website shows a 17-hour outage that day, without specifying the extent or the reason for the outage.

There is obviously something shoddy in the AT&T national network if an event of any kind can knock out the whole nationwide data network for that long. It’s hard to believe that the company would not have redundant backup for every critical system that is needed to keep the network functioning. There are only a few possible explanations. Possibly some critical component of data routing failed, such as their DNS system that routes Internet traffic for cellphones. The company might also have gone too far with software defined networking and created some new points of failure that could affect the whole network. Or the company had a major fiber cut that feeds the site of one of those key network systems. There is no excuse for any of these possibilities, and a company with nearly 160 million customers ought to have redundancy for every critical component of their wireless network.
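
I can only speculate about the cause, but the underlying design principle is simple: never depend on a single component. As an illustration, here is a minimal Python sketch that probes several independent public DNS resolvers – the kind of sanity check that would detect when one critical system has failed. The resolver addresses and hostname are just examples.

```python
import socket
import struct

def resolver_answers(resolver_ip: str, hostname: str, timeout: float = 2.0) -> bool:
    """Send a minimal DNS A-record query and report whether the resolver
    answered without error. Just enough protocol to act as a health probe."""
    # Header: ID, flags (recursion desired), 1 question, 0 other records.
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    qname = b"".join(bytes([len(p)]) + p.encode() for p in hostname.split(".")) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.settimeout(timeout)
            s.sendto(header + question, (resolver_ip, 53))
            reply, _ = s.recvfrom(512)
        # RCODE is the low 4 bits of the second flags byte; 0 means no error.
        return len(reply) >= 12 and (reply[3] & 0x0F) == 0
    except OSError:
        return False

# Probe several independent public resolvers; a resilient design should
# never depend on any single one of them.
for resolver in ["8.8.8.8", "1.1.1.1", "9.9.9.9"]:
    status = "OK" if resolver_answers(resolver, "example.com") else "FAILED"
    print(f"{resolver}: {status}")
```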

I contrast this to the hundreds of companies I know with landline broadband networks. All of my clients worry about total network failure and they work hard to avoid it. Unless they are geographically isolated, most of my clients have redundant routes between their network and the Internet. They generally have redundancy of key routers and switches to keep critical functions operational. Most of my clients have almost no outages that are not caused in the last mile. Local broadband networks are always susceptible to cable cuts in the last mile. But those cuts, by design, only knock out customers who are ‘downstream’ from the cut. It’s becoming extremely rare for my clients to have a total network outage, and if they do they usually take steps to stop it from happening a second time.

The press is in love with wireless right now and there are dozens of articles every month declaring how wireless is our future. Cellphones are going to become blazingly fast and 5G will fill in the gaps where cellular isn’t good enough. I’ve written enough blogs about this that you probably know that I think we are still a number of years away from seeing such wireless technologies.

But this outage makes me wonder about whether people will ever fully trust wireless technologies if they are operated by the big ISPs. The big ISPs are cavalier about network outages and they seem to suppose that their customers will just accept them. If my ISP clients had a 17-hour outage they would have taken steps after the outage to make amends with customers. They would have explained the cause of the outage and talked about their plans to make sure that it didn’t happen again. They likely would have given every customer a day’s credit on their bill for the downtime.

It astounds me that something like this outage could happen. If I were the head of AT&T, heads would have rolled after this was fixed. There is no excuse for a company with a $23 billion annual capital budget to have a network that is vulnerable to a widespread outage. The only reason the company could have such outages is that they don’t place value on redundancy. Until the big ISPs can make their wireless networks as reliable as landline networks I will never consider using them for broadband. I can’t see customers sticking with a 5G network that has a 17-hour outage. Broadband is now critical to many of us and I expect outages to be measured in minutes, not in hours or days.

Update on ATSC 3.0

A few months ago the FCC authorized the implementation of equipment using the ATSC 3.0 standard. The industry has known this was coming for several years, which has given TV manufacturers time to start designing the standard into antennas and TV sets.

ATSC 3.0 is the first major upgrade to broadcast TV since the transition to digital signals (DTV) in 2009. This is a breakthrough upgrade to TV since it introduces broadband into the TV transmission signal. The standard calls for transforming the whole over-the-air transmission to IP, which means that broadcasters will be able to mix IP-based services in with normal TV transmissions. This opens up a whole world of possibilities such as providing reliable 4K video through the air, allowing for video-on-demand, providing immersive high-quality audio and greatly improving the broadcast emergency alert system. It also brings the whole array of digital features we are used to from streaming services, like program guides, actor bios and any other kind of added information a station wants to send to customers.

From an economic standpoint this provides a powerful tool for local TV stations to offer an improved and more competitive product. It does complicate the life of any station that elects to sell advanced services because it puts them into the business of selling products directly to the public. Because the signal is IP, stations can sell advanced packages that can only be accessed with a password, like with online TV services. However, this puts local stations into the retail business where they must be able to take orders, collect payments and take calls from customers – something they don’t do today.

It creates an interesting financial dynamic for the TV industry. Today local network stations charge a lot of money for retransmission fees to cable companies for carrying their content. But most of that money passes through the local stations and gets passed up to the major networks like ABC or NBC. ATSC 3.0 is going to allow stations to directly market advanced TV service to customers, and it’s likely that many of these customers will be cord cutters that are lured away from traditional cable due to the advanced ATSC 3.0 services they can buy from their local networks. This puts the local network affiliates directly into competition with their parent networks, and it will be interesting to watch that tug of war.

This also opens up yet one more TV option for customers. FCC rules will still require that anybody with an antenna can receive TV for free over the air. But customers will have an additional option to buy an advanced TV package from the local station. If local stations follow the current industry model they are likely to charge $3 to $5 per month for access to their advanced features, and the jury is still out on how many people are willing to buy numerous channels at that price.

There are other interesting aspects to the new protocol. It allows for more efficient use of the TV spectrum, meaning that TV signals should be stronger and should also penetrate better into buildings. The TV signals will also be available to smartphones equipped with an ATSC 3.0 receiver in their chipsets. This could enable a generation of young viewers who only watch local content on their phones. Station owners also have other options. They could license and allow other content to ride along with their signal. We might see local stations that bundle Netflix in with their local content.

We probably aren’t going to see many ATSC 3.0 devices in the market until next year as TV and other device makers build ATSC 3.0 tuners into their hardware. Like anything this new it’s probably going to take four or five years for this to go mainstream.

It’s going to be an interesting transition to watch because it gives power back to local stations to compete against cable companies. And it provides yet one more reason why people might choose to cut the cord.

Increased Telehealth Funding

On July 11 the FCC announced that they are seeking a new $100 million use of the Universal Service Fund to create a “Connected Care Pilot Program”. The announcement was made in a joint op-ed by FCC Commissioner Brendan Carr and Mississippi Senator Roger Wicker.

Commissioner Carr got interested in the concept when he visited Mississippi six months ago and looked at a telemedicine trial for diabetes patients in the Mississippi Delta. That trial monitored patients 24/7 and drastically reduced the cost of patient care by alerting doctors to problems at an early stage and avoiding costly hospital stays. The trial saved $700,000 per year just by avoiding hospital readmissions, and it’s hard to put a number on the misery and early deaths that were avoided. It’s estimated that if the same monitoring were done for just 20% of the diabetes patients in the state, the annual savings would be $189 million.

In the past the Telemedicine Fund has only been used to support rural brick-and-mortar facilities – rural health clinics and rural hospitals. This new pilot fund will instead be used to fund larger trials that monitor patients in their homes. It’s going to concentrate on programs that will benefit low-income patients, including those on Medicaid and veterans receiving free health care. If approved, the funding will support a handful of projects for a two or three-year period with the goal of measuring the savings.

We already have evidence that medical monitoring works. The Veterans Administration spends an average of $1,600 on patients in its remote monitoring program compared to $13,000 per year for similar patients who have home-based primary care. Another monitoring trial in the northeast showed a net savings of $3.30 for every dollar spent on remote monitoring.
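
Those figures allow a simple back-of-the-envelope check on how much connectivity spending the savings could absorb; the monthly broadband price below is a hypothetical placeholder:

```python
# Per-patient figures from the VA numbers cited above; the monthly
# broadband price is a hypothetical placeholder.
home_primary_care = 13_000   # annual cost per patient, home-based primary care
remote_monitoring = 1_600    # annual cost per patient, remote monitoring
annual_savings = home_primary_care - remote_monitoring

monthly_broadband = 70       # hypothetical fixed cellular connection
annual_connectivity = monthly_broadband * 12

print(f"Savings per patient per year: ${annual_savings:,}")
print(f"Connectivity cost: ${annual_connectivity:,} "
      f"({annual_connectivity / annual_savings:.0%} of the savings)")
```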

The FCC blog on the issue also points out that home monitoring improves health outcomes for patients:

  • A study of 20 remote patient monitoring trials found a 20% reduction in all-cause mortality and a 15% reduction in heart failure-related hospitalizations;
  • The VHA’s remote patient monitoring program resulted in a 25% reduction in days of inpatient care and a 19% reduction in hospital admission;
  • One remote patient monitoring initiative showed a 46% reduction in ER visits, a 53% reduction in hospital admissions, and a 25% shorter length of stay.

For the FCC to get involved means there will be connectivity costs to cover. I envision that a significant share of this program will go towards paying for some kind of broadband connectivity for the patients in the program. In some places there will be decent broadband, but in rural areas this is likely going to mean buying a fixed cellular connection for the monitoring.

Health care costs are out of control in this country, and after more of these trials we’ll hopefully see insurance covering the needed connectivity costs for monitoring programs. If the savings are as large as promised, then insurance companies and everybody else will benefit from monitoring and early detection of problems, compared to the acute care costs incurred when problems have gone too far.

This funding will be voted on at the August FCC meeting. The FCC in May already increased the annual funding for telehealth from $400 million to $571 million. I can’t tell by the press releases if this would be funded by that increase or if this is additive on top of it.

Reclaiming Spectrum

The FCC recently wrote a letter to DISH Network warning the company that it has not complied with the FCC’s build-out requirements for its AWS-4 spectrum and its E and H blocks of 700 MHz spectrum. The warning was more sternly worded than what we normally see from the FCC, and perhaps the agency will take steps to reclaim the spectrum if DISH is unable to meet the required deployment. The company has a long history of sitting on spectrum and delaying its use. They recently told the FCC that they want to use the AWS spectrum to launch a nationwide IoT monitoring network and that they are interested in entering the cellular business with the 700 MHz licenses.

Today’s blog is not about DISH specifically. Instead, I want to talk about the FCC reclaiming spectrum. This is an important issue for rural America because the majority of licensed spectrum sits idle in rural America for a number of reasons. We could go a long way towards fixing the rural broadband problem if the unused spectrum could be reclaimed by the FCC and repositioned for rural use. There are a number of reasons why the spectrum sits idle today.

Coverage Rules. Most FCC licenses come with coverage requirements. For instance, a given spectrum might need to eventually be deployed to cover something like 70% of the households in a license area. That rule allows spectrum holders to deploy spectrum to urban areas and legally ignore the surrounding rural areas.

There is nothing wrong with this from a business perspective. Cellular companies only need to use their full inventory of spectrum in urban areas where most customers live, and the FCC rules should not require deployment of spectrum where nobody will use it. But the coverage rules mean that the spectrum will remain unused in rural areas as long as the primary license holder is serving the urban areas – effectively forever. Since the spectrum is licensed, nobody else can use it. This problem is caused by the way that the FCC licenses spectrum for large geographic areas, while the spectrum buyers are interested in serving only a portion of the license areas.
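
A toy example shows why a household-based coverage test leaves rural areas untouched; the population split below is hypothetical but typical of many license areas:

```python
# Hypothetical license area where most households cluster in the metro
# core, as is typical. The 70% coverage figure echoes the example above.
total_households = 1_000_000
urban_households = 750_000    # hypothetical: 75% live in the metro
coverage_requirement = 0.70

if urban_households / total_households >= coverage_requirement:
    unserved = total_households - urban_households
    print("Build-out rule met by covering the metro alone,")
    print(f"leaving {unserved:,} rural households with idle licensed spectrum.")
```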

Ideally unused spectrum should be made available to somebody else who can make a business case for it. There are several ways to fix this issue. First, licensed holders could be compelled by the FCC to sub-license the spectrum to others where it sits idle. Or the FCC could reclaim the spectrum in unused geographic areas and distribute it to those who will use it.

Deployment Delays. Other spectrum goes unused due to deployment delays by license holders. The DISH Network spectrum is a perfect example. The company bought this spectrum for a use that it was unable to execute. Since the spectrum is valuable, license holders deploy delaying tactics to stop the FCC from reclaiming it. The FCC has largely been derelict in enforcing its own rules and I’m sure that DISH was shocked at the FCC response. DISH probably figured that this would be business as usual and that the FCC would grant them more time as it had done in the past. I have no idea if DISH really intends to deploy an IoT network or go into the cellular business – but those are the kinds of new competitive ventures that the FCC has been publicly asking for, so DISH is telling the FCC exactly what it wants to hear. But it’s likely that DISH just wants another delay until it can find a buyer for its sinking satellite business – somebody who will value the spectrum. Regardless of the reasons, the FCC has ignored its own deployment rules numerous times and granted license holders more time.

Spectrum Speculators. There is a class of investors who buy spectrum with the hopes of selling it or licensing it to somebody else. They will buy spectrum and rig up a bogus use of the spectrum to meet the build-out requirements. I’ve seen wireless links deployed that carry no data but that are intended only to prove to the FCC that the spectrum is being used. The FCC ought to do a better job of identifying the fake deployments that are done only to preserve the license.

There’s no way to know if the letter to DISH signals a change at the FCC and whether it intends to enforce the spectrum rules. Better enforcement alone won’t help rural America if the spectrum gets re-licensed and the same cycle repeats. We need spectrum rules that free up spectrum in rural areas where it sits idle. Perhaps this could be done by requiring license holders to sub-license the spectrum to others where it sits idle. The FCC has said numerous times that wireless technology can be the salvation for rural broadband, yet it allows the most valuable spectrum to sit idle while WISPs are relegated to delivering broadband using a few tiny swaths of unlicensed spectrum. This is not a hard problem to solve, but it requires the will to solve it, and an FCC that won’t cave in to the big spectrum license holders.

Rising Broadband Speeds

For the second year in a row the M-Lab coalition measured broadband speeds in 200 countries. The coalition includes New America’s Open Technology Institute, Google Open Source Research, Princeton University’s PlanetLab and others, with the results compiled by Cable in the UK. The statistics are based upon over 163 million speed tests. The results are available in a spreadsheet and are worth looking at for those who love numbers.

Because the results come from speed tests, the vagaries of those tests must be factored into the results. Hopefully all of the readings used the same speed test, because each speed test on the market uses a different algorithm to calculate speed. For example, the algorithm for speedtest.net, operated by Ookla, discards the fastest 10% and the slowest 30% of the results obtained. Speed tests are overinflated in many instances when ISPs use a burst technology that provides a faster broadband speed for the first minute or two of any web connection. The results can also be lowered by network issues at the customer end, such as an underperforming WiFi network. The bottom line is that any given speed test number must be taken with a grain of salt, but comparing millions of speed test results ought to make for a valid relative comparison.
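
For readers who want to see the idea in code, here is a minimal Python sketch of a trimmed average in the style described above; the exact Ookla algorithm is more involved, and the sample readings are made up:

```python
def trimmed_speed(samples_mbps):
    """Average throughput after discarding the fastest 10% and the slowest
    30% of samples, per the Ookla description above. The real algorithm
    is more involved; this just shows the idea."""
    s = sorted(samples_mbps)
    n = len(s)
    lo = int(n * 0.30)       # drop the slowest 30%
    hi = n - int(n * 0.10)   # drop the fastest 10%
    kept = s[lo:hi]
    return sum(kept) / len(kept)

# Hypothetical per-interval throughput readings (Mbps) from one test run.
readings = [42, 55, 58, 60, 61, 62, 63, 64, 80, 95]
print(f"Reported speed: {trimmed_speed(readings):.1f} Mbps")
```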

Overall the tests show a worldwide increase in broadband speeds of 23% in just one year. However, to put that in perspective, that’s a worldwide increase from only 7.4 Mbps to 9.1 Mbps. It’s more interesting to look at the results from the countries with the fastest and slowest broadband. The top 25 fastest broadband countries on the list increased speeds by 28.9% while the bottom 25 increased by only 7.4%.

The US moved up one slot, from number 21 to number 20 this year – increasing average speeds from 25.0 Mbps to 25.9 Mbps. This is a substantial increase that I think can be attributed to three factors. The most significant is probably that several large cable companies have unilaterally increased base speeds due to the introduction of DOCSIS 3.1. Average speeds also continue to climb as several million customers per year migrate from DSL to cable modems. Finally, we are slowly building fiber to residences and probably added a few million fiber passings last year.

The worldwide broadband leader is Singapore with an average speed of 60.4 Mbps, followed by Sweden, Denmark, Norway, Romania, Belgium and the Netherlands. Romania is interesting because it rose 13 places with a jump in speed from 21.3 Mbps to 38.6 Mbps – they obviously have been implementing a lot of fiber. The biggest drop on the chart is Hong Kong, which fell 10 places on the list as its broadband speeds dropped slightly from 27.2 Mbps to 26.5 Mbps. It wasn’t too many years ago that Hong Kong was way ahead of the US, but that gap has completely closed.

One of the more important things this research shows is that good broadband can be found in North America, most of Europe and some of southeast Asia. Broadband speeds everywhere else are far behind. The gap between the haves and have-nots is growing. The increase in average speeds for the top 100 countries on the list was 5.4 Mbps in one year while the increase for the bottom countries was only 0.4 Mbps.

It’s also worth remembering that speeds differ within each country. In this country we still have millions of rural homes that have no Internet access or access at third-world speeds. The same is likely true around the world, with better broadband in urban areas than in rural areas. It’s also worth remembering that only about 4.1 billion people, or 54% of the world’s population, have access to broadband.

These kinds of statistics are useful because they probably act as a goad to governments that are far down the list to find ways to improve broadband. We know that good Internet brings a huge number of economic and other advantages, and countries with good broadband are implementing new technologies that aren’t going to be available in countries with slow broadband networks.

There is hope for those areas with little or no broadband. Several groups are proposing satellites that can bring broadband everywhere. Endeavors like Google’s Loon are looking at bringing broadband to rural areas across the globe. Hopefully we will see speeds in the third world increase significantly over the next decade. While only in its second year, the work being done by M-Lab is another good measuring stick for governments to measure their progress.