Shame on the Regulators

It’s clear that even before the turn of this century, the big telcos had largely walked away from maintaining and improving residential service. The evidence is the huge number of neighborhoods stuck with older copper technologies that haven’t been upgraded. The telcos made huge profits in these neighborhoods over the decades and ideally should not have been allowed to walk away from their customers.

In the Cities. Many urban neighborhoods still have first- or second-generation DSL over copper with top speeds of 3 Mbps or 6 Mbps. That technology had a shelf life of perhaps seven years and is now at least fifteen years old.

The companies that deployed the most DSL are AT&T and CenturyLink (formerly Qwest). The DSL technology should have been upgraded over time by plowing profits back into the networks. This happened in some neighborhoods, but as detailed studies in cities like Cleveland and Dallas have shown, the faster DSL was brought to more affluent neighborhoods, leaving poorer neighborhoods, even today, with the oldest DSL technology.

The neighborhoods that saw upgrades got DSL speeds between 15 Mbps and 25 Mbps. Many of these neighborhoods eventually saw speeds as fast as 50 Mbps using a technology that bonded two 25 Mbps DSL circuits. There are numerous examples of neighborhoods with 50 Mbps DSL sitting next to ones with 3 Mbps DSL.

Verizon used a different tactic and upgraded neighborhoods to FiOS fiber. But this was also done selectively; Verizon doesn’t seem to have redlined as much as AT&T, but instead built FiOS only where construction costs were lowest.

In Europe, the telcos decided to compete with the cable companies and have upgraded DSL over time, with the fastest DSL today offering speeds as fast as 300 Mbps. DSL vendors are even talking about ways to goose DSL up to gigabit speeds (but only over short distances). The US telcos, by contrast, basically stopped looking at better DSL technology after the introduction of VDSL2 at least fifteen years ago.

By now the telcos should have been using profits to build fiber. AT&T has done some of this by building little pockets of fiber in every community near existing fiber splice points. However, the vast majority of rural households served by AT&T are not being offered fiber, and AT&T said recently that it has no plans to build more. CenturyLink built fiber past nearly 1 million homes a few years ago, but that also seems to be a dead venture going forward. By now, in 2019, each of these telcos should have been deep into bringing fiber to urban neighborhoods across their whole service areas. Had they done so, they would not be getting clobbered so badly by the cable companies, which are taking away millions of DSL customers every year.

Rural America. The big telcos started abandoning rural America as much as thirty years ago. They’ve stopped maintaining copper and have not voluntarily made any investments in rural America for a long time. There was a burst of rural construction recently when the FCC gave them $11 billion to improve rural broadband to 10/1 Mbps – but that doesn’t seem to be drawing many rural subscribers.

It’s always been a massive challenge to bring the same speeds to rural America that can be provided in urban America. This is particularly so with DSL, since speeds drop drastically with distance, as the sketch below illustrates. DSL upgrades that benefit urban neighborhoods don’t work well in farmland. But the telcos should have been expanding fiber deeper into the network over time to shorten loop lengths. Many independent telephone companies did this the right way and were able over time to goose rural DSL speeds up to 25 Mbps.
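To make the distance problem concrete, here is a minimal sketch using rough, illustrative VDSL2 speed-versus-loop-length figures of the kind found in public vendor literature. The exact numbers vary by cable plant and are my assumption, not data from the telcos discussed here.

```python
# Rough, illustrative VDSL2 speed-vs-loop-length figures (ballpark
# numbers, not measurements) showing why shortening copper loops
# with fiber matters so much in rural areas.

vdsl2_mbps_by_km = {0.3: 100, 0.5: 80, 1.0: 50, 1.5: 25, 3.0: 8}

def best_speed(loop_km, profile=vdsl2_mbps_by_km):
    """Approximate attainable speed (Mbps) for a given loop length."""
    for max_km, mbps in sorted(profile.items()):
        if loop_km <= max_km:
            return mbps
    return 1  # long rural loops fall back to a few Mbps at best

for km in (0.4, 1.2, 4.0):
    print(f"{km} km loop: ~{best_speed(km)} Mbps")
```

The takeaway is that every mile of fiber pushed deeper into the network moves whole groups of customers up a speed tier, which is exactly what the independents did.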

The big telcos should have been engaging in a long-term plan to continually shorten rural copper loop lengths. That meant building fiber, and while shortening loop lengths they should have served households close to fiber routes with fiber. By now all of the small towns in rural America should have gotten fiber.

This is what regulated telcos are supposed to do. The big telcos made vast fortunes in serving residential customers for many decades. Regulated entities are supposed to roll profits back into improving the networks as technology improves – that’s the whole point of regulating the carrier of last resort.

Unfortunately, the industry got sidetracked by competition from CLECs. This competition first manifested in competition for large business customers. The big telcos used that competition to convince regulators they should be deregulated. Over time the cable companies provided real residential competition in cities, which led to the de facto total deregulation of telcos.

In Europe, the telcos never stopped competing in cities because regulators didn’t let them quit. Those telcos have upgraded copper to speeds that customers still find attractive, though they all admit that the next upgrade needs to be fiber. In the US, the big telcos exerted political pressure to gain deregulation at the first hint of competition. US telcos folded and walked away from their customers rather than fighting to maintain revenues.

Rural America should never have been deregulated. Shame on every regulator in every state that voted to deregulate the big telcos in rural America. Shame on every regulator that allowed companies like Verizon to palm off their rural copper onto companies like Frontier – a company that cannot succeed, almost by definition.

In rural America the telcos have a physical network monopoly and the regulators should have found ways to support rural copper rather than letting the telcos walk away from it. We know this can be done by looking at the different approaches taken by the smaller independent telephone companies. These small companies took care of their copper and most have now taken the next step to upgrade to fiber to be ready for the next century.

Cable Companies and 10 Gbps

One topic covered extensively at the recent SCTE-ISBE Cable-Tec Expo in New Orleans was the ability of cable networks to deliver 10 Gbps broadband to customers. The fact that this is even being discussed is a testament to the fact that the big ISPs all acknowledge the huge growth in demand for consumer and business broadband in the country.

Most urban cable companies made the upgrade to DOCSIS 3.1 just in the last year or so, which allows them to offer gigabit products to customers. Everybody acknowledges that the need for 10 Gbps products is likely at least a decade away, but now is the time to start the technology research needed to create a product in that timeframe.

It’s been clear for some time that cable companies don’t want to lose the speed battle and are working to compete against the introduction of fiber in urban markets. The number of households passed by fiber continues to grow. AT&T built fiber past millions of homes in the last few years, mostly in small pockets around existing fiber nodes. CenturyLink even built residential fiber for a few years before abandoning the concept to concentrate on building fiber to businesses. It’s not clear who might build urban and suburban fiber next, but the fact that the cable companies are looking at 10 Gbps speeds means they think that somebody will do so.

Other than in some limited cases, most fiber providers are still building networks with 1 Gbps speeds. Verizon is building a 10 Gbps fiber network to supply bandwidth to small cell sites but is not yet using the new technology in the FiOS network. The whole fiber industry is waiting for one of the big ISPs to embrace 10 Gbps products to help pull down equipment prices, but that doesn’t look likely to happen any time soon.

There are significant upgrades needed for the cable industry to offer 10 Gbps speeds. A 10 Gbps downstream data path requires about 1.3 GHz of bandwidth, which is more capacity than all but a handful of cable networks have today. Adding a decent upload data path while still carrying TV channels means that cable systems will need to upgrade to 2 – 3 GHz of bandwidth. That’s a major upgrade and would likely require replacing most or all of the amplifiers and power taps in the outside coaxial network, along with some of the older coaxial cable itself. Upgrading to faster speeds would also mean upgrading headends as well as the millions of DOCSIS modems sitting in customer homes.
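As a sanity check on that 1.3 GHz figure, here’s a quick back-of-the-envelope calculation. The roughly 7.7 bits/s/Hz effective spectral efficiency is my assumption for DOCSIS 3.1 OFDM with high-order QAM after overhead, not a number from the Expo.

```python
# Back-of-the-envelope: RF spectrum needed for a given DOCSIS throughput.
# The ~7.7 bits/s/Hz efficiency is an assumed effective figure for
# DOCSIS 3.1 with high-order QAM, net of overhead -- not a spec value.

def spectrum_needed_mhz(throughput_gbps, bits_per_sec_per_hz=7.7):
    """Return the RF spectrum (MHz) needed to carry the given throughput."""
    return throughput_gbps * 1e9 / bits_per_sec_per_hz / 1e6

for gbps in (1, 10):
    print(f"{gbps} Gbps needs roughly {spectrum_needed_mhz(gbps):,.0f} MHz")
# 10 Gbps -> ~1,299 MHz, consistent with the 1.3 GHz cited above.
```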

I’ve heard speculation that cable companies will consider an upgrade to fiber rather than going to 10 Gbps over DOCSIS. Almost every cable company is now using PON technology when building to greenfield subdivisions. While it’s expensive to build fiber to every home, the CEOs of almost all of the big cable companies have acknowledged that their eventual future is fiber. Altice is already pursuing the upgrade to fiber, and other cable companies will eventually consider it.

There are always skeptics of the need for big bandwidth, and many in the industry scoff at gigabit broadband today as nothing more than a marketing ploy. What the critics ignore is that the world grows into larger bandwidth over time. Residential broadband usage is currently growing at about 21% annually, in terms of both total monthly downloads and desired customer speeds. When gigabit products were first introduced, they were 40 times faster than the then-average broadband product of about 25 Mbps.
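That 21% growth rate compounds faster than intuition suggests. A minimal sketch of the arithmetic, assuming the cited growth rate simply continues (a big assumption over two decades):

```python
import math

# Years for average demand to "grow into" a gigabit at the ~21% annual
# growth rate cited above -- assuming that rate simply holds.

start_mbps, target_mbps, growth = 25, 1000, 0.21
years = math.log(target_mbps / start_mbps) / math.log(1 + growth)
print(f"{start_mbps} Mbps -> {target_mbps} Mbps takes ~{years:.0f} years at {growth:.0%}/yr")
# ~19 years: gigabit headroom is not as far ahead as it looks.
```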

There will inevitably be new uses of bandwidth that will require faster speeds. Just as one example, I saw that Verizon had acquired the products of the augmented reality firm Jaunt. We have all been promised the future ability to hold virtual hologram meetings, and when somebody develops such a product it’s going to sweep the country. When that happens, households will be bouncing up against the gigabit speed limit and asking for more. I also ask the cable companies not to forget my holodeck – I’m still waiting.

Farm Access to Broadband

The US Department of Agriculture has been measuring computer usage on farms and publishes the results every two years in its Farm Computer Usage and Ownership report. The most recently released report, for 2019, was compiled by surveying 20,000 farmers. That is a large sample from the more than 2 million farms in the country.
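For a sense of how much precision a sample that size buys, here is the standard margin-of-error calculation for a surveyed proportion. The 95% confidence level and worst-case 50% proportion are conventional statistical assumptions, not anything stated in the USDA report.

```python
import math

# 95% margin of error for a surveyed proportion, at the worst case p=0.5.
# Standard formula: z * sqrt(p * (1 - p) / n).

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n=20,000: +/- {margin_of_error(20_000):.1%}")
# About +/-0.7%, so the year-over-year shifts below are mostly meaningful.
```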

One of the key findings of the report is that 75% of farms reported having access to the Internet in 2019, up from 73% in 2017. The breakdown of farms by type of connection is as follows:

Connection    2017    2019
Satellite      23%     26%
DSL            28%     22%
Cellphone      19%     18%
Cable          16%     16%
Fiber           9%     12%
Dial-up         3%      3%
Other           2%      3%

There are a few notable highlights in these numbers.

  • First, farms are abandoning rural DSL, as are many other customers. If CAF II upgrades had been done right, the DSL category ought to at least be holding even.
  • I also find it surprising that fixed-wireless isn’t listed as a choice. Fixed wireless is now available in many parts of the country. While many WISPs today offer slow broadband speeds, this category of connections should grow as speeds improve significantly over the next few years.
  • It’s a national shame that 3% of farms are still stuck with dial-up.
  • Far too many farms still use their cellphone for Internet access.

The report is also an interesting way to look at general broadband availability in rural America. For example, a few states have high fiber coverage to farms, such as North Dakota (61%), Montana (39%), and South Dakota (36%). Other states have practically no fiber to farms, such as California and Louisiana at 1%, with Georgia, Michigan, New York, Ohio, Pennsylvania, and South Carolina all below 5%.

The states with the biggest reliance on cellphones for farm broadband include Louisiana (52%), Michigan (37%), and Florida (34%).

The poor penetration rate of real broadband is further evidenced by the way that farmers conduct business. 49% of farmers used a desktop or laptop to conduct business in 2019 while 52% used their cellphone. 24% of farmers buy agricultural inputs over the Internet and only 19% use the Internet to sell their goods.

There has been a lot of press in the last few years talking about how technology is transforming farming. However, these innovations are not coming to farms that are stuck with dial-up, satellite or rural DSL technology.

We’ve seen that better broadband can come to farms by looking at the high fiber coverage of farms in Montana and the Dakotas. That fiber has been built using a combination of subsidies from the Universal Service Fund and low-cost loans from the USDA and cooperative banks. We know how to fix rural broadband – we just don’t have the national will yet to get it done.

Be Wary of 5G Hardware

We’ve now entered the period of 5G competition where the wireless carriers are trying to outdo each other in announcing 5G rollouts. If you believe the announcements, you’d think 5G will soon be everywhere. Device manufacturers are joining the fray and advertising devices that can be used with the early carrier 5G products. Buyer beware – most of what the cellular companies and the manufacturers are hyping as 5G is not yet 5G, and any first-generation hardware you buy today will quickly become obsolete as future 5G upgrades are introduced.

5G Cellular. Cellular carriers are introducing two new spectrum bands – CBRS spectrum and millimeter wave spectrum – as 5G. The use of these bands is not yet technically 5G because the carriers aren’t yet using much of the 5G specifications. The two bands come with another caveat: they are only being used to produce faster outdoor broadband. Customers who live where they can receive the new frequencies, and who use their devices outdoors, might see value in paying extra for the devices and the 5G data plans. Most people are not going to find any value in what these plans offer and should not get sucked into paying for something they can’t get or won’t use.

Cellphone manufacturers are already starting to build the CBRS spectrum into high-end phones. By next year there should be a 5G version of every major cellphone – at a premium price. Within a few years this will be built into every phone, but for now, expect to pay extra.

The question users need to ask is whether faster cellphone data is worth the extra hardware cost and the extra monthly fee that will be charged for 5G browsing. I’ve thought about the cellphone functions that would be improved with faster broadband, and the only one I can come up with is faster downloads of movies or software. Faster broadband is not going to make web browsing any faster on a cellphone. Cellphones have been optimized for graphics, which is why you can scroll easily through a Google map or flip easily between videos on social media. The trade-off for fast graphics is that cellphones aren’t good at other things. Cellphones crawl when trying to process websites not built for mobile or when trying to handle spreadsheets. Faster broadband is not going to make these functions any faster, because the slowness comes from the intrinsic design of the cellphone operating software and can’t be improved with a faster connection.

I also think customers are going to face a huge challenge in getting a straight answer about when CBRS spectrum or millimeter wave spectrum will be available in their local neighborhood. The carriers are in full 5G marketing mode and are declaring whole metropolitan areas to have 5G even if that only means new spectrum is in a few neighborhoods.

Finally, beware that both of these spectrum bands only work outdoors – and that means on foot, not in cars. Millimeter wave spectrum is likely to always be a gimmick. Folks testing it today report that they can lose the connection simply by rotating their phone slightly or by putting their body in the path from the transmitter. CBRS spectrum will be much better behaved.

Laptops. Lenovo has already announced a 5G-capable laptop coming in 2020, and others will surely jump on the bandwagon soon. The big issue with laptops is the same one as with cellphones. In an area with good CBRS coverage it might be reasonable to get a 100 Mbps or faster cellular connection. That is going to tempt users to treat the laptop as if it were on a home broadband connection. However, this is still cellular data supplied on a cellular data plan. Unless the carriers decide to lift data caps, a customer using a CBRS laptop might exhaust their monthly data cap in a day or two. It’s also worth repeating that these are outdoor bands, so only students and others who regularly use computers outdoors are going to find this spectrum potentially useful.
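The data cap arithmetic is stark. A minimal sketch, assuming a hypothetical 50 GB monthly cap and a 100 Mbps CBRS connection (both illustrative numbers, not any carrier’s actual plan):

```python
# How quickly a cellular data cap disappears at CBRS-class speeds.
# The 50 GB cap and 100 Mbps rate are illustrative assumptions.

def hours_to_exhaust(cap_gb, rate_mbps, utilization=1.0):
    """Hours of use before the cap is gone at the given average utilization."""
    gb_per_hour = rate_mbps / 8 / 1000 * 3600 * utilization
    return cap_gb / gb_per_hour

print(f"Full rate: {hours_to_exhaust(50, 100):.1f} hours")
print(f"10% average utilization: {hours_to_exhaust(50, 100, 0.10):.0f} hours")
# A laptop treated like it's on home broadband burns through the cap fast.
```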

5G Hotspots. A 5G hotspot is one that broadcasts bandwidth using millimeter wave spectrum; Sprint is already marketing such a hotspot. This takes us back to the early days of WiFi, when we needed a dongle because the spectrum wasn’t built into desktops or laptops. A 5G hotspot will have the same restriction. One of the primary reasons to consider a millimeter wave hotspot is security – it will be much harder to hack a millimeter wave connection than a WiFi connection. But don’t pick a millimeter wave hotspot for speed, because the connection won’t be any faster than the WiFi 6 routers just hitting the market.

In future years, 5G hotspots might make sense as millimeter wave spectrum is built into more devices. One of the biggest advantages of indoor millimeter wave spectrum is to avoid some of the interference issues inherent in WiFi. I picture the ideal future indoor network to be millimeter wave spectrum used to provide bandwidth to devices like computers and TVs while WiFi 6 is used for everything else. There is likely to be an interesting battle coming in a few years between millimeter wave and WiFi 6 routers. WiFi already has a huge advantage in that battle since the WiFi technology will be included in a lot more devices. For now there won’t be many easy ways to use a 5G millimeter wave hotspot.

Comcast Breaks Promise of Lifetime Prices

Barely a month goes by when I don’t read about a colossal failure of customer service by one of the big ISPs. The latest comes from Comcast, and the company seems to have broken a major promise made to customers.

When Google Fiber announced in 2016 that they were coming to Salt Lake City, Comcast decided to compete by offering ‘lifetime’ prices for various bundles. For example, there was a $120 per month triple play bundle that included broadband, cable TV, and a telephone line. In anticipation of Google coming to the market, Comcast engaged in a door-to-door sales campaign that marketed the lifetime special and other discounts in an attempt to lock down customers before Google Fiber hit the market. Ironically, Google Fiber changed their mind and never made any significant investment in the market.

A lawsuit now alleges that Comcast doorknockers promised customers the lifetime product and backed it up in writing, saying the price would be good for as long as the customer kept the plan. Customers were assured at each step of the process that they were buying a lifetime plan and that rates would never be increased; Comcast customer service reps on the phone repeated the assurance that the prices would be good forever. The lawsuit asserts that as many as 20% of the 200,000 upgrades sold during the sales campaign in Utah were sold as lifetime plans.

As you might expect from the title of this blog, after a few years Comcast raised the prices on the lifetime plans. At that point, Comcast customer service denied any knowledge that these were lifetime rates and said they had never heard of such a plan. Comcast enforced the rate increases, some of which were substantial.

It’s hard to imagine any company selling a guaranteed lifetime price for a bundle that includes cable TV. The cost of buying wholesale programming has been increasing at 10% – 15% annually for many years, so in a decade any lifetime plan would be massively underwater. Additionally, Comcast is now in the mode of annually increasing broadband prices – but that’s probably not something that was being discussed inside the company in 2016.
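To see how quickly a fixed retail price goes underwater, here’s the compounding arithmetic on the programming cost growth cited above:

```python
# Wholesale programming costs compounding at 10-15%/yr over a decade.

for rate in (0.10, 0.15):
    growth = (1 + rate) ** 10
    print(f"At {rate:.0%}/yr, programming costs grow {growth:.1f}x in a decade")
# 2.6x to 4.0x -- a price frozen in 2016 cannot cover that.
```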

It’s not hard to figure out how this could happen in a big corporation. I’m just speculating, but I expect the marketing campaign included an outside sales team. These sales teams get most of their compensation from completed sales and are famous in the industry for making outrageous claims to customers. I always caution my clients about hiring sales companies that bring entire sales teams in from out of state. While these companies will get sales, the worst of them often leave a trail of unhappy customers behind them. I would expect that this sales staff had some role in choosing the message of lifetime rates – something they know they can sell.

However, it had to be more than a rogue sales team pushing the lifetime rates, since Comcast customer service at the time was also telling customers that the plans carried lifetime prices. I’ve talked to several Comcast customer service reps over the years and they describe the customer care process at the company as chaotic. From what they’ve described, it’s not hard to imagine the specific customer care group supporting the sales campaign going along with this effort because they, too, could make sales commissions. Many of the horror stories coming out of Comcast customer care over the years have involved employees engaging in bad behavior to chase sales commissions.

But there also had to be local management buy-in of the plan. I’m sure we’ll never know, but it would be interesting to know if this was strictly a local management decision in Salt Lake City or if there was corporate buy-in. Comcast seems to have overreacted to Google Fiber elsewhere and it’s possible that this was a corporate plan.

This lawsuit highlights the difficulty of operating a huge ISP. Many big companies have seen sales commission plans go awry; inevitably, some employees find ways to maximize bonuses through bad behavior. We saw something similar from Wells Fargo last year, and it’s hard for any giant corporation to strenuously push sales campaigns while also policing employees who take advantage of the plans.

This story offers a few lessons for other ISPs. I am a huge believer in the efficacy of door-to-door sales plans done well. But there are unscrupulous outside firms that will sell anything for a high-enough commission. The best sales plan involves local people trained and managed by an ISP directly. The other lesson is that sales commission plans for non-salespeople must be carefully designed to not promote bad behavior.

Broadband Partnerships

A big part of my daily consulting life is helping entities like municipalities and rural electric cooperatives find ways to enter the broadband business. The majority of these entities consider partnerships as an alternative to becoming an ISP on their own. While no two partnerships are exactly alike, today’s blog looks at the predominant types of partnerships I see being created.

Operator. The simplest kind of partnership is for somebody to build a network and hire an ISP to operate it. Entities like cities are often leery about becoming an ISP and are a lot more comfortable building a fiber network if somebody with a proven track record operates the business.

In this kind of relationship, the operator is largely a vendor for hire and doesn’t invest in the business. The operator gets compensated through management fees for operating the business and often will also be compensated by some sort of profit-sharing.

This arrangement is attractive to some ISPs because they have no financial risks. Since the operator makes no investment, they are not going to have a downside if the business underperforms, other than reduced fees and profit-sharing. Perhaps the biggest advantage to an operator is that their existing business gains economy of scale. The operator can spread overhead costs over the new business and reduce the overheads that apply to other parts of their business.

In this arrangement, the network owner is responsible for all operating costs. Hiring an operator can be a costly alternative since network owners are on the hook to pay the operator even if the business underperforms. This business structure only makes financial sense for a business that will generate enough profit to be able to afford the extra payments to the operator.

Partner Covers Customer Costs. The most common kind of partnership I’m seeing is where one entity builds the core network and a partner pays for the capital cost of adding customers to the network. The biggest variable in this arrangement is how much cost the partner covers. At a minimum, a partner would cover the cost of everything inside the customer premises. At the other extreme is a dark fiber network where the operating partner covers everything from the curb inward, plus the cost of lighting the network.

This arrangement flips the responsibilities compared to the operator model. The ISP that connects the homes is normally responsible for all customer-related operating costs, and in exchange for that risk, it collects all revenues. The network owner gets compensated through some sort of lease and might also benefit at some point from profit sharing.

The stickiest issue in creating this kind of partnership is figuring out how to compensate the network owner during the first few years, until there are sufficient customer revenues to cover the full cost of financing the network. At one extreme, the network owner makes the early debt payments out of pocket; at the other, the operating ISP pays the early debt payments before there are revenues. Either way, this big early cost is one of the biggest challenges in creating this business structure, as the sketch below illustrates.
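A toy model of that early funding gap, with entirely hypothetical numbers (debt service, lease rate, subscriber ramp) chosen only to show the shape of the problem, not drawn from any real deal:

```python
# Toy illustration of the early funding gap in the "partner covers
# customer costs" model. All figures are hypothetical assumptions.

annual_debt_service = 1_200_000   # owner's yearly payment on network debt
lease_per_cust_month = 30         # what the ISP pays the owner per customer
target_customers = 5000
ramp_years = 4                    # linear ramp to full penetration

for year in range(1, ramp_years + 1):
    avg_customers = target_customers * (year - 0.5) / ramp_years
    lease_revenue = avg_customers * lease_per_cust_month * 12
    gap = max(annual_debt_service - lease_revenue, 0)
    print(f"Year {year}: lease ${lease_revenue:,.0f}, shortfall ${gap:,.0f}")
# The shortfall in the first couple of years is what somebody must fund.
```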

Equal Partners. The least common kind of partnership is a true partnership where two entities form a partnership and fund and operate the network as full partners. This structure generally means forming a new corporation where an ISP and a city or a cooperative each own some share of the business. In this arrangement, the partners share in gains or losses.

Who operates the business is open to negotiation and there could be staff provided by both entities or the business could hire all new staff.

I think true partnerships are rare because it’s hard for disparate entities like cities and ISPs to govern a business together. A commercial ISP likely has different goals than a city or an electric cooperative. For example, a commercial ISP might want to flip the business in 7-10 years to realize a full return on their investment while cities and cooperatives are in the business for the long haul. In some states it’s a challenge for a city to own part of a commercial business. Cities can also be unreliable partners since they can change drastically with a change of administration.

It’s also hard to manage this kind of business jointly. Cities and cooperatives typically aren’t as nimble at making the quick business decisions a commercial partner is going to expect. The partners are also likely to have different views on what to do with profits or on issues like setting rates.

I’ve found that a successful partnership requires partners who share the same vision for the business, and it’s hard to find a commercial ISP that can share a vision with a city or a cooperative. Over the years I’ve seen several such partnerships that either became rancorous after only a few years or split apart when it became obvious that the partners had different views on operating the business.

The Myth of 5G and Driverless Cars

A colleague sent me an article that had been published earlier this year in MachineDesign magazine that predicts that driverless cars can’t be realized until we have a ubiquitous 5G network. When looking for the original article on the web I noticed numerous similar articles like this one in Forbes that have the same opinion.

These articles and other similar articles predict that high-bandwidth, low-latency 5G networks are only a few years away. I’m not quite sure who these folks think will invest the $100 billion or more that would likely be required to build such a wireless network along all of the roads in the country. None of the cellular carriers have such grandiose plans, and if they did their stockholders would likely replace a management team that suggested such an investment.

It’s easy to understand how this myth got started. When 5G was first discussed, the cellular companies listed self-driving cars as one of the reasons the government should support 5G. However, over time they’ve all dropped this application from their 5G message and it’s no longer a cellular company talking point.

The idea that 5G is needed for self-driving cars is bolstered by the belief that the computing power of a data center is needed to process the massive amounts of data generated by a self-driving car. That very well may be true, and the current versions of self-driving cars are essentially data centers on wheels that contain several fast computers.

The belief that 5G will enable self-driving cars also comes from the promise of low latency, near that of a direct fiber connection. The folks who wrote these articles envision a massive two-way data transfer happening constantly over 5G for every self-driving car. I can’t imagine they have ever talked to a network engineer about the challenge of maintaining two-way wireless gigabit connections with hundreds of moving cars simultaneously on a freeway at rush hour. It’s hard to envision the small cell site and fiber infrastructure needed to handle that without hiccups. I also don’t know if the authors have driven down many rural roads recently to remind themselves of the huge challenge of implementing rural gigabit 5G.

The talk of using wireless for vehicles also ignores some fundamental issues. Wireless technologies are wonky in the real world. Radio waves do odd things in the wild and every wireless network has dead zones and places where the system mysteriously won’t work the way it’s supposed to. Worse, the dead spots and odd spots move around with changes in temperature, humidity, and precipitation.

Network engineers would also advise that for a critical task like driving at high speed, every vehicle should have a redundant backup connection, meaning a second wireless connection in case the first one has a problem. Anybody who puts critical tasks on a fiber network invests in such redundancy. Hospitals that use broadband as part of a surgical procedure or factories that do precision manufacturing will have a second fiber connection to be safe. It’s hard to imagine a redundant connection for a moving car, since the only place it can come from is the same nearby cell sites that provide the primary connection.

I don’t know how others feel about this, but I’m not about to trust my life to a self-driving car that needs a connection to an external data center to be safe. I know too much about how broadband networks function to believe that 5G networks will somehow always make perfect connections when even fiber networks don’t.

One of the first things that came to my mind when I read these articles was to wonder what happens when there is a fiber outage on the network supporting the 5G cell sites. Do all of the self-driving cars just stop and wait for a broadband signal? I picture a city during an event like the 19-hour CenturyLink fiber outage a year ago and wonder if we are so stupid as to make our transportation systems reliant on external computing and external networks. I sure hope that we are not that dumb.

CoBank Supports Telemedicine

For those who don’t know CoBank, it’s a bank that specializes in loans to telecom and electric cooperatives but has also funded numerous rural fiber projects for other borrowers over the years. In August, CoBank filed comments in FCC Docket 18-213 in support of expanded use of the Universal Service Fund for rural telemedicine. CoBank is a huge supporter of telemedicine and has made substantial grants to telemedicine projects dealing with diabetes management, opioid abuse, prenatal maternity care, and veteran care.

As part of that filing, CoBank discussed a telemedicine trial it had sponsored in rural Georgia. The trial was conducted in conjunction with Perry Health, a software provider, and Navicent Health, a healthcare provider in Macon, Georgia. The trial covered 100 low-income patients with uncontrolled Type 2 diabetes. These patients were on a path towards kidney failure, amputation, loss of vision, and numerous other major related health problems. These are patients who would normally be making numerous emergency room visits and needing other costly medical procedures.

In the trial, the patients were provided with tablets containing Perry Health software that provided for daily interaction between patients and Navicent. Patients were asked to provide daily feedback on how they were sticking to the treatment regimen and provided information like the results of blood sugar tests, the food they ate each day, the amount of daily exercise, etc. The tablet portal also provided for communication from Navicent asking patients how they generally felt and providing recommendations when there was a perceived need.

The results of the trial were hugely positive. Of the 100 patients in the trial, 75% showed a marked improvement in their condition compared to the average diabetes patient. The improvements equated to reduced health care costs of $3,855 per patient per year through fewer doctor visits and fewer emergency room visits. The American Diabetes Association says that patients with Type 2 diabetes incur 2-3 times the normally expected medical costs, which it estimates total $327 billion per year.

Patients in the trial liked the daily interaction which forced them to concentrate on following treatment plans. They felt like their health care provider cared about how they were doing, and that led them to do better. After the trial, Navicent Health expanded the telemedicine plan to hundreds of other patients with Type 2 diabetes, heart failure, and Chronic Obstructive Pulmonary Disease (COPD).

One of the interesting outcomes of the trial was that patients preferred to use cellphones rather than the special tablets. The trial also showed the need for better broadband. One of the challenges of the trial was the effort required by Navicent Health to make sure that each patient had the needed access to broadband. To some degree, using cellphones gives patients easier access to broadband. However, there are plenty of rural areas with poor cellular data coverage, and even where patients can use cellular data, the cost can be prohibitive if heavily used. Landline broadband is still the preferred connection, taking advantage of unlimited WiFi connections to the healthcare portal.

One thing that struck me about this study is that it sounds like it would be equally useful in urban areas. I’ve read that a lot of healthcare costs are due to patients who don’t follow through on a treatment plan after they go home from a procedure. The Navicent Health process could be applied to patients anywhere, since the biggest benefit of the trial looks to be the daily interface between patient and doctor.

The FCC has already pledged to increase funding for the rural medicine component of the Universal Service Fund. However, that funding is restricted. For example, funding can only be granted to rural non-profit health care providers.

Telemedicine has been picking up steam and is seeing exponential growth. But telemedicine still represents only a small percentage of rural healthcare visits. The primary barrier seems to be acceptance of the process and the willingness of health care providers to tackle telemedicine.

The Market Uses for CBRS Spectrum

Spencer Kurn, an analyst for New Street Research, recently reported on how various market players plan to use the 3.5 GHz CBRS spectrum recently approved by the FCC. I described the FCC’s order in this recent blog. As a quick refresher, this is a large swath of spectrum; the FCC has approved 80 MHz for public use and will auction another 70 MHz in 2020.

Cellular Bandwidth. Kurn notes that Verizon plans to use the new spectrum to beef up 4G bandwidth now and eventually 5G. Verizon plans to use the spectrum in dense markets and mostly outdoors. Cable companies like Comcast and Charter that have entered the wireless business are also likely to use the spectrum in this manner.

I’ve been writing for a while about the crisis faced by cellular networks. In urban areas they are seeing broadband usage double almost every two years, and keeping up with that growth is a huge challenge. It’s going to require a combination of new spectrum, more cell sites (mostly small cells), and the improvements that come with 5G, mostly the frequency slicing.

It’s interesting that Verizon only sees this as an outdoor solution, but that makes sense because this spectrum is close in characteristics to the existing WiFi bands and will lose most of its strength passing through a wall. It also makes sense that Verizon will only do this in metro areas where there is enough outdoor traffic for the spectrum to make a difference. I’ve seen several studies saying that the vast majority of cellular usage happens indoors in homes, businesses, and schools. But this spectrum still becomes one more piece of the solution to help relieve the pressure on urban cell sites.

For this to be of use the spectrum has to be built into cellular handsets. Apple recently announced that they are building the ability to receive Band 48 of CBRS into their new models. They join the Samsung Galaxy S10 and the Google Pixel 3 with the ability to use the spectrum. Over time it’s likely to be built into many phones, although handset manufacturers are always cautious because adding new spectrum bands to a handset increases the draw on the batteries.

Point-to-Multipoint Broadband. Numerous WISPs and other rural ISPs have been lobbying for use of the spectrum since it can beef up point-to-multipoint broadband networks. These are networks that put a transmitter on a tower and beam broadband to a dish at the subscriber premises. The technology is already widely deployed, mostly using the 2.4 GHz and 5.0 GHz WiFi bands. Layering on CBRS will beef up the broadband that can be delivered over a customer link.
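One reason CBRS slots in naturally is that, propagation-wise, 3.5 GHz sits between the two WiFi bands WISPs already use. A minimal free-space path loss comparison makes the point; real links also depend on terrain, foliage, and antenna gain, so treat this as illustrative physics only:

```python
import math

# Free-space path loss: CBRS (3.55 GHz) sits between the 2.4 GHz and
# 5.8 GHz bands that fixed-wireless ISPs already use, so its reach is
# broadly similar. Standard formula, illustrative only.

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

for f_mhz in (2400, 3550, 5800):
    print(f"{f_mhz/1000:.2f} GHz at 5 km: {fspl_db(5, f_mhz):.0f} dB")
# Roughly 114, 117, and 122 dB: CBRS reach falls between the WiFi bands.
```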

It will be interesting to see how that works in a crowded competitive environment. I am aware of counties today where half a dozen WISPs all use WiFi spectrum and the interference degrades network performance for everybody. The FCC has named five SAS administrators who will monitor bandwidth usage and interference. The FCC rules don’t allow for indiscriminate deployment of the public CBRS spectrum, and we’ll have to see how interference problems are dealt with.

One interesting player in the space will be AT&T, which intends to layer the frequency onto its fixed wireless product. AT&T used the technology widely to meet its CAF II buildout requirements, mostly using PCS spectrum to meet the FCC requirement to deliver at least 10/1 Mbps to customers. Adding the new spectrum should significantly increase rural customer speeds – at least for those within a few miles of AT&T towers.

Cable Company Edge-out. The most interesting new players considering the market are the cable companies. Kurn believes that the big cable companies will use the spectrum to edge out from their existing cable networks and serve nearby rural customers with fixed wireless. He says the cable networks could theoretically pass 6 – 7 million new homes if this is deployed everywhere. This is an ideal application for a cable company because they typically have fiber fairly close to the edge of their service areas. The point-to-multipoint wireless product operates best when the radios are fiber-fed, and cable companies could deliver a product in the 50-100 Mbps range where they have line-of-sight to customers.

We’ve already seen one cable company tackle this business plan. Midco was awarded $38.9 million in the CAF II reverse auction to deploy 100 Mbps broadband in Minnesota and the Dakotas. Midco is going to need this spectrum, and probably even more, to deliver 100 Mbps to every customer. Their deployment is not really an edge-out, though; the company plans to build networks that will cover entire rural counties with fixed wireless broadband.

Court Upholds Repeal of Net Neutrality

The DC Circuit Court of Appeals ruled on the last day of September that the FCC had the authority to kill Title II regulation and to repeal net neutrality. However, the ruling wasn’t entirely in the FCC’s favor. The agency was ordered to look again at how the repeal of Title II regulation affects public safety. In a more important ruling, the courts said that the FCC didn’t have the authority to stop states and municipalities from establishing their own rules for net neutrality.

This court was ruling on the appeal of the FCC’s net neutrality order filed by Mozilla and joined by 22 states and a few other web companies like Reddit and Etsy. Those appeals centered on the FCC’s authority to kill Title II regulation and to hand broadband regulation to the Federal Trade Commission.

Net neutrality has been a roller coaster of an issue. Tom Wheeler’s FCC put the net neutrality rules in place in 2015. An appeal of that case got a court ruling that the FCC was within its power to implement net neutrality. After a change in administration, the Ajit Pai FCC killed net neutrality in 2017 by also killing Title II regulation. Now the courts have said that the FCC also has the authority to not regulate net neutrality.

The latest court order will set off another round of fighting about net neutrality. The FCC had quashed a California law introducing that state’s version of net neutrality, and this order effectively allows those rules to go into effect. That battle is far from over; there will likely be new appeals against the California rules and similar rules enacted in Washington. It wouldn’t be surprising to see other states enact rules in the coming year, since net neutrality is overwhelmingly popular with voters. It’s possibly the worst of all worlds for big ISPs if they have to follow different net neutrality rules in different states. I think they’d much prefer federal net neutrality rules to different rules in a dozen states.

The repeal of the net neutrality rules only went into effect in June of 2018, and there have been no major violations of the old rules since then. The ISPs were likely waiting for the results of this court ruling and are also wary of a political and regulatory backlash if they start breaking net neutrality principles. The closest thing we had to a big issue was mentioned in this ruling: Verizon had cut off broadband for firemen in California who were fighting wildfires after they exceeded their monthly data caps. It turns out that wasn’t a net neutrality violation, but rather an enforcement issue on a corporate cellular account. But the press on that case was bad enough to prompt the court to require the FCC to take another look at how ISPs treat public safety.

This issue is also far from over politically. Most of the Democratic presidential candidates have come out in favor of net neutrality, and if Democrats win the White House you can expect a pro-net neutrality chairman of the FCC. Chairman Pai believes that by killing Title II regulation, a future FCC will have a harder time putting the rules back in place. But the two court appeals have shown that the courts largely believe the FCC has the authority to implement or not implement net neutrality as it sees fit.

While net neutrality is getting all of the press, the larger issue is that the FCC has washed its hands of broadband regulation. The US is the only major economy in the world that doesn’t regulate the broadband industry. This makes little sense in a country where a large part of the market is still controlled by the cable/telco duopoly, which many argue is quickly becoming a cable monopoly. It’s easy to foresee bad behavior from the big ISPs if they aren’t regulated. We’ve seen the big ISPs increase broadband rates in the last few years, and there is no regulatory authority in the country that can apply any brakes to the industry. The big ISPs are also likely to demand more money out of Google, Facebook, and the big web companies.

The FCC handed off the authority to regulate broadband to the Federal Trade Commission. That means practically no regulation because the FTC tackles a single corporation for bad behavior but does not establish permanent rules that apply to other similar businesses. The FTC might slam AT&T or Comcast from time to time, but that’s not likely to change the behavior of the rest of the industry very much.

There is only one clear path for dealing with net neutrality. Congress can stop future FCC actions and the ensuing lawsuits by passing a clear set of laws that either implements net neutrality or forbids it. However, until there is a Congress and a White House willing to work together to implement such a law, this issue is going to continue to bounce around.

The big ISPs and Chairman Pai argued that net neutrality was holding back broadband investment in the country – a claim that has no basis when looking at the numbers. However, there is definitely an impact on the industry from regulatory uncertainty, and nobody benefits from an environment where subsequent administrations alternately pass and repeal net neutrality. We need to resolve this one way or the other.