Local Government Funding for Fiber

There is a new trend where local governments act as the banker for rural broadband projects. It’s an interesting twist on public/private partnerships and a model that more communities should consider.

Consider these rural broadband projects in Minnesota.

  • First is RS Fiber. This is a new broadband cooperative that serves most of Sibley County and some of Renville County in Minnesota. Bonds were approved to fund 25% of a broadband project and those bonds are backed by the counties, some small cities and also by townships that are getting the fiber. The expectation is that the project will make the bond payments.
  • Next is Swift County, Minnesota. Federated Telephone Cooperative, an existing telephone company, was awarded $4.95 million to build fiber to rural homes in the county. The county approved general obligation bonds of $7.8 million to complete the project, or about 60% of the funding.

Both projects are classic examples of a public-private partnership. In these particular cases the company that will own and operate the network is a cooperative, but the same agreements could have been made with a for-profit telco or some other telecom provider.

These kinds of projects make sense for a number of reasons:

  • The process of approving bond financing is far faster than securing traditional funding for these kinds of projects.
  • Bonds for fiber can be financed over a long period of time – 20 to 30 years – while terms for commercial loans are usually shorter. Just as with a home mortgage, borrowing over a longer period means lower annual debt payments, which is essential to making these projects financially feasible (a quick amortization example follows this list).
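
As a rough illustration of that mortgage analogy, here is a minimal sketch using the standard loan-amortization formula. The $10 million principal and the interest rates are illustrative assumptions, not figures from either Minnesota project.

```python
# Annual payment on a fixed-rate loan: P * r / (1 - (1 + r)^-n)
def annual_payment(principal, annual_rate, years):
    r = annual_rate
    return principal * r / (1 - (1 + r) ** -years)

principal = 10_000_000  # hypothetical amount borrowed for a fiber build

# A shorter commercial loan vs. a long municipal bond (assumed rates)
for label, rate, years in [("12-year commercial loan at 6%", 0.06, 12),
                           ("25-year bond at 4%", 0.04, 25)]:
    print(f"{label}: about ${annual_payment(principal, rate, years):,.0f} per year")
# The longer bond term roughly halves the annual debt payment,
# which is often what makes a rural fiber project feasible.
```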

In both cases the counties and other local government entities have taken on the role of banker. The local governments will have no operational role in running the fiber business (a role they did not want). The counties expect the bond payments to be covered by the fiber project, and since these networks are being built in rural areas with few other broadband alternatives the new fiber ventures should get high customer penetration rates. But if the ventures fail then the local governments are on the hook to cover any shortfalls in the bond payments.

These are both cases of local governments deciding that the need for rural broadband was great enough to risk taxpayer money to get this done. They also decided that the risk of not getting paid is low. The business cases show that even in the worst case the revenues from the projects should cover almost all costs, meaning that the downside risk to the counties is minimal. In the case of RS Fiber, a start-up cooperative, they would not have been able to get any traditional funding without the seed money from the local governments.

This is a model that the rest of rural America should consider. Small ISPs like these cooperatives stand ready to serve a lot of rural America, but they often don’t have the financial wherewithal to do so. In these cases, a public-private partnership with local government as the banker was the only practical way to make this happen.

Everywhere I travel in rural America homeowners and farmers want good broadband. They understand that it’s costly to build fiber to farms and small rural towns. But they also seem willing to help pay to make this work. I think if more rural counties would listen to their constituents they would take a harder look at this model.

Of course, a county needs to do its homework up front and make sure it’s a sound project and that the estimated cost of building the broadband network is accurate. But assuming there is a solid business plan, perhaps the most valuable role a county can tackle is that of banker, helping new broadband builds get off the ground.

A Doubling of Broadband Prices?

In what is bad news for consumers but good news for ISPs, a report by analyst Jonathan Chaplin of New Street Research predicts big increases in broadband prices. He argues that broadband is underpriced. Prices haven’t increased much for a decade while the value of broadband has greatly increased, since it is now vital in people’s lives.

The report is bullish on cable company stock prices because they will be the immediate beneficiary of higher broadband prices. The business world has not really acknowledged the fact that in most US markets the cable companies are becoming a near-monopoly. Big telcos like AT&T have cut back on promoting DSL products and are largely ceding the broadband market to the big cable companies. We see hordes of customers dropping DSL each quarter and all of the growth in the broadband industry is happening in the biggest cable companies like Comcast and Charter.

I’ve been predicting for years that the cable companies will have to start raising broadband prices. These companies are watching cable revenues and voice revenues continue to drop and they will have to make up for those losses. But I never expected the rapid and drastic increases predicted by this report. Chaplin sets the value of basic broadband at $90, which is close to double today’s prices.

The cable industry is experiencing a significant and accelerating decline in cable customers. And they are also facing significant declines in revenues from cord-shaving as customers elect smaller cable packages. But the cable products have been squeezed on margin because of programming price increases and one has to wonder how much the declining cable revenue really hurts their bottom line.

Chaplin reports that the price of unbundled basic broadband at Comcast is now $90 including what they charge for a modem. It’s even higher than that for some customers. Before I left Comcast last year I was paying over $120 per month for broadband since the company forced me to buy a bundle that included basic cable if I wanted a broadband connection faster than 30 Mbps.

Chaplin believes that broadband prices at Comcast will be pushed up to the $90 level within a relatively short period of time. And he expects Charter to follow.

If Chaplin is right one has to wonder what price increases of this magnitude will mean for the public. Today almost 20% of households still don’t have broadband, and nearly two-thirds of those say it’s because of the cost. It’s not hard to imagine that a drastic increase in broadband rates will drive a lot of people to broadband alternatives like cellular data, even though that’s a far inferior substitute.

I also have to wonder what price increases of this magnitude might mean for competitors. I’ve created hundreds of business plans for markets of all sizes, and not all of them look promising. But the opportunities for a competitor improve dramatically if broadband is priced a lot higher. I would expect that higher prices are going to invite in more fiber overbuilders. And higher prices might finally drive cities to get into the broadband business just to fix what will be a widening digital divide as more homes won’t be able to afford the higher prices.

Comcast today matches the prices of any significant cable competitor. For instance, they match Google Fiber’s prices where the companies compete head-to-head. It’s not hard to foresee a market where competitive markets stay close to today’s prices while the rest have big rate increases. That also would invite in municipal overbuilders in places with the highest prices.

Broadband is already a high-margin product and any price increases will go straight to the bottom line. It’s impossible for any ISP to say that a broadband price increase is attributable to higher costs – as this report describes it, any price increases can only be justified by setting prices to ‘market’.

All of this is driven, of course, by the insatiable urge of Wall Street to see companies make more money every quarter. Companies like Comcast already make huge profits and in an ideal world would be happy with those profits. Comcast does have other ways to make money since they are also pursuing cellular service, smart home products and even now bundling solar panels. And while most of the other cable companies don’t have as many options as Comcast, they will gladly follow the trend of higher broadband prices.

Why Isn’t Everybody Cutting the Cord?

Last year at least two million households cut the cord. I’ve seen headlines predicting that as many as 5 million more will do so this year, although that seems too high to me. But both of these numbers are a lot lower than the number of people who say they are going to cut the cord in the coming year. For several years running, various national surveys have shown that 15 million or more households say they want to cut the cord. But year after year they don’t, and today’s blog looks at some of the reasons why.

I think one of the primary reasons people keep traditional cable is that they figure out that they won’t save as much money with cord cutting as they had hoped. The majority of cord cutters say that saving money is their primary motivation, and once they look hard at the actual savings they decide it’s not worth the change.

One issue that surprises a lot of potential cord cutters is the loss of their bundling discount if they are buying programming from a cable company. Big cable companies penalize customers who break the bundle. As an example, consider a customer who buys a $50 broadband product and a $50 cable product in a bundle priced at $80. When the customer drops one of the two products the cable company will charge $50 for the remaining one. That means there is effectively a $20 penalty for breaking the bundle, and far less savings from cutting the cord than expected (the quick calculation below illustrates this).
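
Here is that same arithmetic as a quick sketch, using the hypothetical $50/$50/$80 prices from the example above.

```python
# Hypothetical prices from the example above
bundle_price = 80          # broadband + cable bought together
broadband_alone = 50       # list price for standalone broadband
cable_alone = 50           # list price for standalone cable

savings = bundle_price - broadband_alone                        # what the cord cutter actually saves
lost_discount = (broadband_alone + cable_alone) - bundle_price  # the bundle discount that disappears

print(f"Actual monthly savings from dropping cable: ${savings}")      # $30, not $50
print(f"Bundle discount forfeited (the 'penalty'): ${lost_discount}")  # $20
```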

Households also quickly realize that they need to subscribe to a number of OTT services if they want a wide array of programming choices. If you want to watch the most popular OTT shows that means a $10 subscription to Netflix, an $8.25 per month subscription to Amazon and a Hulu package that starts at $8. If you want to watch Game of Thrones you’ll spend $15 for HBO. And while these packages carry a lot of movies, if you really love movies you’ll find yourself buying them on an a la carte basis.

And OTT options are quickly proliferating. If you want to see the new Star Trek series that means another $5.99 per month for CBS All Access. If your household likes Disney programming that new service is rumored to cost at least another $5 per month.

And none of these options brings you all of the shows you might be used to watching on cable TV. One option to get many of these same networks is to subscribe to Sling TV or PlayStation Vue, with packages that start at $20 per month but can cost a lot more. If you don’t want to subscribe to these services, then buying a whole season of one specific show can easily cost $100.

And then there is sports. PlayStation Vue looks to have the best basic sports package, but that means buying the service plus add-on packages. A serious sports fan is also going to consider buying Fubo. And fans of specific sports can buy subscriptions to Major League baseball, NBA basketball or NHL hockey.

Then there are the other 100 OTT options. There is a whole range of specialty programmers that carry programming like foreign films, horror movies, British comedies and a wide range of other programming. Most of these range from $3 to $7 per month.

There are also hardware costs to consider. Most people who watch a range of OTT programming get a media streaming device like Roku, Amazon Fire, or Apple TV. Customers who want to record shows shell out a few hundred dollars for an OTT DVR. A good antenna to get local programming costs between $30 and $100. It all adds up, as the rough tally below shows.
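
To see how quickly the costs stack up, here is a rough tally using the example prices mentioned above. The hardware figures are assumptions (a mid-range streaming device, a ‘few hundred dollar’ DVR, and the midpoint of the antenna range), not quotes for any specific product.

```python
# Recurring OTT subscriptions (example prices from the text above)
monthly_services = {
    "Netflix": 10.00,
    "Amazon (annual plan / 12)": 8.25,
    "Hulu": 8.00,
    "HBO": 15.00,
    "CBS All Access": 5.99,
    "Sling TV or PlayStation Vue (base)": 20.00,
}

# One-time hardware costs (assumed illustrative figures)
hardware = {
    "Streaming device": 50.00,
    "OTT DVR": 250.00,
    "Antenna": 65.00,
}

print(f"Recurring OTT total: ${sum(monthly_services.values()):.2f} per month")
print(f"Up-front hardware: ${sum(hardware.values()):.2f}")
# The recurring total alone approaches a basic cable bill before adding
# sports packages or a la carte movies.
```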

The other reason I think people don’t cut the cord is that it’s not easy to navigate between the many OTT options. They all have different menus and log-ins and it can be a pain to switch between platforms. And it’s not easy to find what you want to watch, particularly if you don’t have a specific show in mind. It’s hard to imagine that using the many OTT services will get any easier, since they are in competition with each other. It’s hard to ever see them agreeing on a common interface or easy navigation when each platform wants viewers to stay on its platform once logged in.

Finally, none of these combinations gets you everything that’s on cable TV today. For many people cutting the cord means giving up a favorite show or favorite network.

If anything, OTT watching is getting more complicated over time. And if a household isn’t careful they might spend more than their old cable subscription. I’m a cord cutter and I’m happy with the OTT services I buy. But I can see how this option is not for everybody.

 

When Customers Use Their Data

In a recent disturbing announcement, Verizon Wireless said it will be disconnecting service to 8,500 rural customers this month for using too much data on their cellphones. The customers are scattered around 13 states and are a mix of those with unlimited and limited data plans.

Verizon justifies this because these customers are using data where Verizon has no direct cell towers, meaning that these customers are roaming on cellular data networks owned by somebody else. Since Verizon pays for roaming, the company says that these customers are costing more in roaming charges than the company collects in monthly subscription fees.

Verizon may well have a good business case for discontinuing these particular data customers if they are losing money on each customer. But the act of disconnecting them opens up a lot of questions and ought to be a concern to cellular customers everywhere.

This immediately raises the question of ‘carrier of last resort’. This is a basic principle of utility regulation that says that utilities, such as traditional incumbent telephone companies, must reasonably connect to everybody within their service territory. Obviously cellular customers don’t fall under this umbrella since the industry is competitive and none of the cellular companies have assigned territories.

But the lines between cellular companies and telcos are blurring. As AT&T and Verizon take down rural copper they are offering customers a wireless alternative. But in doing so they are shifting these customers from being served by a regulated telco to a cellular company that doesn’t have any carrier of last resort obligations. That means that once converted to cellular, Verizon or AT&T would be free to cut these customers loose at any time and for any reason. That should scare anybody who loses their rural copper lines.

Secondly, this raises the whole issue of Title II regulation. In 2015 the FCC declared that broadband is a regulated service, and that includes cellular data. This ruling brought cable companies and wireless companies under the jurisdiction of the FCC as common carriers. And that means that customers in this situation might have grounds for fighting back against what Verizon is doing. The FCC has the jurisdiction to regulate and to intervene in these kinds of situations if they regulate the ISPs as common carriers. But the current FCC is working hard to reverse that ruling and it’s doubtful they would tackle this case even if it was brought before them.

Probably the most disturbing thing about this is how scary it is for the folks being disconnected. Rural homes do not want to use cellular data as their only broadband connection because it’s some of the most expensive broadband in the world. But many rural homes have no choice, since this is their only alternative for doing the things they need to do with broadband. While satellite data is available almost everywhere, the incredibly high latency on satellite data means that it can’t be used for things like maintaining a connection to a school server to do homework or to a work server to work from home.

One only has to look at rural cellular networks to understand the dilemma many of these 8,500 households might face. The usable distance for a data connection from a cellular tower is only a few miles at best, much like the circles around a DSL hub. It is not hard to imagine that many of these customers actually live within range of a Verizon tower but still roam on other networks.

Cellular roaming is an interesting thing. Every time you pick up your cellphone to make a voice or data connection, your phone searches for the strongest signal available and grabs it. This means that the phones of rural customers who don’t live right next to a tower must choose among competing weaker signals. Customers in this situation might be connected to a non-Verizon tower without it being obvious to them. Most cellphones have a tiny symbol that warns when users are roaming, but since voice roaming stopped being an issue most of us ignore it. And it’s difficult or impossible on most phones to choose which tower to connect to. Many of the customers being disconnected might have always assumed they were using the Verizon network, but largely it’s not something that customers have much control over. The simplified sketch below illustrates this selection behavior.
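
Here is a minimal sketch of that behavior, assuming a phone that simply attaches to the strongest usable signal it can see. The names and the signal threshold are illustrative, not an actual modem algorithm.

```python
from dataclasses import dataclass

@dataclass
class Tower:
    operator: str
    signal_dbm: float   # closer to 0 means a stronger signal

def select_tower(visible_towers, min_usable_dbm=-110):
    """Pick the strongest usable signal, regardless of which carrier owns the tower."""
    usable = [t for t in visible_towers if t.signal_dbm >= min_usable_dbm]
    return max(usable, key=lambda t: t.signal_dbm) if usable else None

# A rural customer a few miles from a Verizon tower may see a stronger
# signal from another carrier's tower and roam without ever noticing.
towers = [Tower("Verizon", -108), Tower("Regional carrier", -95)]
chosen = select_tower(towers)
print(f"Connected to {chosen.operator}" + (" (roaming)" if chosen.operator != "Verizon" else ""))
```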

I just discussed yesterday how we are now in limbo when it comes to regulating the broadband practices of the big ISPs. This is a perfect example of that situation because it’s doubtful that the customers being disconnected have any regulatory recourse for what is happening to them. And that bodes poorly for rural broadband customers in general – just one more reason why being a rural broadband customer is scary.

Broadband Regulation is in Limbo

We have reached a point in the industry where it’s unclear who regulates broadband. I think a good argument can be made that nobody is regulating broadband issues related to the big ISPs.

Perhaps the best evidence of this is a case that is now in the Ninth Circuit Court of Appeals in San Francisco. The case involves a 2014 complaint against AT&T by the Federal Trade Commission based on the way that AT&T throttled unlimited wireless data customers. The issue got a lot of press at the time, when AT&T started restricting data usage in 2011 for customers who hit some arbitrary (and unpublished) monthly data threshold. Customers got shuttled back to 3G and even 2G data speeds and basically lost the ability to use their data plans. The press and the FTC saw this as an attempt by AT&T to drive customers off their grandfathered unlimited data plans (which were clearly not unlimited).

AT&T had argued at the FTC that they needed to throttle customers who use too much data as a way to manage and protect the integrity of their networks. The FTC didn’t buy this argument and ruled against AT&T. As it almost always does, the company appealed the decision. The District Court in California sided with the FTC and AT&T appealed again, which is the current case in front of the Ninth Circuit. AT&T is making some interesting claims in the case, arguing that the Federal Trade Commission’s rules don’t allow the FTC to regulate common carriers.

The FTC’s rules include a ‘common carrier exemption’ that was established in Section 5 of the original FTC Act that created the agency. The exemption recognizes that telecom common carriers are regulated instead by the FCC. There are similar carve-outs in the FTC rules for other industries that are regulated in part by other federal agencies.

The common carrier exemption doesn’t relieve AT&T and other telecom carriers from all FTC regulation – it just means that the FTC can’t intercede in areas where the FCC has clear jurisdiction. But any practices of telecom carriers that are not specifically regulated by the FCC then fall under FTC regulations since the agency is tasked in general with regulating all large corporations.

AT&T is making an interesting argument in this appeal. They argue that since they are now deemed to be a common carrier for their data business under the Title II rules implemented in the net neutrality order, they should be free of all FTC oversight.

But there is an interesting twist to this case because the current FCC filed an amicus brief in the appeal saying that they think that the FTC has jurisdiction over some aspects of the broadband business such as privacy and data security issues. It is this FCC position that creates uncertainty about who actually regulates broadband.

We know this current FCC wants to reverse the net neutrality order, and so they are unwilling right now to tackle any major issues that arise from those rules. In this particular case AT&T’s throttling of customers occurred before the net neutrality decision and at that time the FCC would not have been regulating cellular broadband practices.

But now that cellular broadband is considered a common carrier service it’s pretty clear that the topic is something the FCC has jurisdiction over today. Yet we have an FCC that is extremely reluctant to take on this issue because doing so would give legitimacy to the net neutrality rules it wants to eliminate.

The FCC’s position in this case leads me to the conclusion that, for all practical purposes, companies like AT&T aren’t regulated at all for broadband issues. The prior FCC made broadband a common carrier service and gave themselves the obligation to regulate broadband and to tackle issues like the one in this case. But the new FCC doesn’t want to assert that authority and even goes so far as to argue that many broadband related issues ought to be regulated by the FTC.

This particular case gets a little further muddled by the timing since AT&T’s practices predate Title II regulation – but the issue at the heart of the case is who regulates the big ISPs. The answer seems to be nobody. The FCC won’t tackle the issue and AT&T may be right that the FTC is now prohibited from doing so. This has to be a huge challenge for a court because they are now being asked who is responsible for regulating the case in front of them. That opens up all sorts of possible problems. For example, what happens if the court rules that the FCC must decide this particular case but the agency refuses to do so? And of course, while this wrangling between agencies and the courts is being settled it seems that nobody is regulating AT&T and other broadband providers.

The Next Big Broadband Application

Ever since Google Fiber and a few municipalities began building gigabit fiber networks people have been asking how we are going to use all of that extra broadband capability. I remember a few years ago there were several industry contests and challenges to try to find the gigabit killer app.

But nobody has found one yet and probably won’t for a while. After all, a gigabit connection is 40 times faster than the FCC’s current definition of broadband. I don’t think Google Fiber or anybody else thought that our broadband needs would grow fast enough to quickly fill such a big data pipe. But year after year we all keep using more data, and since the household need for broadband keeps doubling every three years it won’t take too many doublings for some homes to start filling up larger data connections. The rough calculation below shows how that plays out.
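
As a back-of-the-envelope illustration, assuming a household that needs 25 Mbps today (the current FCC definition) and the doubling-every-three-years growth rate mentioned above:

```python
# How long until household need grows past a gigabit, starting from 25 Mbps
# and doubling every three years (the assumption in the text above)?
need_mbps = 25.0
years = 0
while need_mbps < 1000:
    need_mbps *= 2
    years += 3
print(f"About {years} years until a 25 Mbps household need passes 1 Gbps")  # ~18 years
```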

But there is one interesting broadband application that might be the next big bandwidth hog. Tim Cook, the CEO of Apple, was recently on Good Morning America and said that he thinks augmented reality is going to be a far more significant application in the future than virtual reality, and that once perfected it’s going to be something everybody will want.

By now many of you have tried virtual reality. You don a helmet of some kind and are then transported into some imaginary world. The images are in surround-3D and the phenomenon is amazing. And this is largely a gaming application and a solitary one at that.

But augmented reality brings virtual images out into the real world. Movie directors have grasped the idea and one can hardly watch a futuristic show or movie without seeing a board room full of virtual people who are attending a meeting from other locations.

And that is the big promise of augmented reality. It will allow telepresence – the ability for people to sit in their home or office and meet and talk with others as if they were in the same room. This application is of great interest to me because I often travel to hold meetings of just a few hours, and the idea of doing that from my house would add huge efficiency to my business life. Augmented reality could spell the end of the harried business traveler.

But the technology has far more promise than that. With augmented reality people can share other kinds of images. You can share a sales presentation or share videos from your latest vacation with grandma. This ability to share images between people could drastically change education, and some predict that over a few decades augmented reality could make classrooms full of in-person students obsolete. The technology would also fully enable telemedicine. And augmented reality will enhance aging in the home, since shut-ins could still have a full social life.

And of course, the application that intrigues everybody is using augmented reality for entertainment. Taken to the extreme, augmented reality is the Star Trek holodeck. There are already first-generation units that can create a virtual landscape in your living room. It might take a while until the technology gets as crystal clear and convincing as the fictional holodeck, but even having some percentage of that capability opens up huge possibilities for gaming and entertainment.

As the quality of augmented reality improves, the technology is going to require big bandwidth connections with a low latency. Rather than just transmitting a 2D video file, augmented reality will be transmitting 3D images in real time. Homes and offices that want to use the technology are going to want broadband connections far faster than the current 25/3 Mbps definition of broadband. Augmented reality might also be the first technology that really pushes the demand for faster upload speeds since they are as necessary as download speeds in enabling a 2-way augmented reality connection.

This is not a distant-future technology, and a number of companies are working on devices that will bring the first generation of the technology into homes in the next few years. And if we’ve learned anything about technology, once a popular technology is shown to work and there is demand in the marketplace, there will be numerous companies vying to improve it.

If augmented reality were here today, the biggest hurdle to using it would be the broadband connections most of us have. I am certainly luckier than people in rural areas since I have a 60/5 Mbps connection with a cable modem from Charter. But the connection has a lot of jitter and the latency swings wildly, and my upload stream is not fast enough to support 2-way augmented reality.

The economic benefits from augmented reality are gigantic. The ability for business people to easily meet virtually would add significant efficiency to the economy. The technology will spawn a huge demand for content. And the demand to use the technology might be the spur that will push ISPs to build faster networks.

Is 5G Really a Fiber Replacement?

I recently saw a short interview in FierceWireless with Balan Nair, CTO of Liberty Global. In case you haven’t heard of the company, they are the biggest cable company in the world with over 28 million customers.

One of the things he discussed was the practical widespread implementation of 5G gigabit technology. He voiced the same thing I have been thinking for the last year about the economics of deploying 5G. He was quoted as saying, “5G will be a ‘game-changer’ in its superior ability to transfer data, but the technology will not replace fixed-network broadband services anytime soon. The economics just aren’t there. You’re talking about buying hundreds of towers and all of that spectrum. And on the residential end, putting a device outside the window and wiring it back into the home. It’s a question of business model and if you plan on making any money. The economics benefit fixed.”

The big telcos are making a big deal out of 5G, mostly I think to appear cutting-edge to their investors. And I have no doubt that in certain places like dense urban downtowns that 5G might be the best way to speed up gigabit broadband deployment. But I look at what’s involved in deploying the technology anywhere else and I have a hard time seeing the economic case for using 5G to bring fast broadband to the masses.

5G will definitely make an impact in urban downtowns. You might assume that cities already have a great fiber infrastructure, but this often isn’t the case. Look at Verizon’s FiOS deployment strategy in the past – they deployed fiber where the construction was the most cost effective, and that meant suburban areas where they had existing pole lines or conduit. Verizon largely avoided much of the downtowns of eastern cities because the cost per mile of fiber construction was too expensive.

Now, 5G can be deployed from the top of high-rises to reach the many downtown buildings that never got fiber. New York City recently sued Verizon because the company reneged on its promise to build fiber everywhere and there are still 1 million living units in the city that never got fiber broadband. Verizon, or somebody else, is going to be able to use 5G in densely populated cities to bring faster broadband, and as Nair said, this might be a game changer.

But as soon as you get out of downtowns and high-rises the math no longer favors 5G. There are three components of a 5G network that are not going to be cheap in suburbia. First, 5G needs fiber. You might be able to use a little wireless backhaul in a 5G network, but a significant portion of the network must be fiber fed. And in most of the country that fiber is not in place. Deloitte recently estimated that the cost for just the fiber to bring 5G everywhere is $130 billion. There is nobody rushing to make that investment.

5G then needs somewhere to place the transmitters. This is more easily achieved in a downtown where there are many tall rooftops and existing towers. But the short delivery distances for millimeter wave frequencies mean that transmitters need to be relatively close to the end-user. In suburban areas that’s going to mean somehow building a lot of new towers or else placing smaller transmitters on existing poles. We know suburbia hates tall towers and it’s always a struggle to build new ones. And the issues associated with getting access to suburban poles are well documented. An ISP needs to affordably get onto poles and also get fiber to those poles – two expensive and time-consuming challenges.

And then there is the economics of the electronics. Because millimeter wave spectrum is easily disrupted by foliage or any impediments it means that there won’t be too many homes served from any one pole-mounted transmitter. But the 5G revenue stream still has to cover both ends of the radios as well as wiring into the home.

I build a lot of landline business plans and I can’t see this making any economic sense for widespread deployment. In many cases this 5G network might be more expensive and slower to deploy than an all-fiber network.

I instead envision companies using 5G technology to cherry-pick. There will be plenty of places with existing fiber and poles that can be used to serve suburban apartment complexes or business districts. I can see strategic deployment in those areas, with the technology used the same way that Verizon deployed fiber – 5G will be deployed only where it makes sense. But as with FiOS, there are going to be huge areas with no 5G deployment, even in relatively dense suburbia. And the business case for rural America is even bleaker. 5G will find a market niche and will be one more technology tool for bringing faster broadband – where it makes economic sense.

The End of the Free Web

The web model of using advertising revenues to pay those who create content is quickly breaking down and it’s going to drastically change the free web we are all used to. It feels like a lot longer, but the advertising web model has now been operating for only twenty years. Before that people and companies built web sites and posted content they thought was interesting, but nobody got compensated for anything on the web.

But then a few companies like AOL discovered that companies were willing to pay to place advertising on web pages and the web advertising industry was born. Today news articles and other content on the web are plastered with ads of various kinds. And it is these ads that have funded a new industry of web content providers. There are now numerous web magazines and other websites that are largely funded by the revenues from ads. Most of the news articles you read on the web have been funded by ad revenues.

But ad revenues of this kind are disappearing, and that is likely going to mean a major transformation of the web in the near future. Here are some of the main reasons that ad revenues are changing:

  • People have changed the way that they find and read content. Twenty years ago we all had a list of our favorite bookmarked sites and we would peruse those web sites from time to time to catch up on their content. But today the majority of people get their content through an intermediate platform like Facebook, Twitter or Google. These platforms learn about your tastes and they direct articles of interest to you. We no longer search for content, but rather content finds us.
  • And that means that the big platforms like Facebook control the flow of content. A few years ago Facebook responded to user complaints that their feeds were too long and busy by only flowing a percentage of potential content to users. That meant that a person might not see that an old high school friend bought a new puppy, but it also meant that each user on Facebook saw fewer web articles. The impact of this change was dramatic for web publishers, who on average saw an immediate 50% drop in their revenue from Facebook.
  • Meanwhile the big platforms decided that they should keep more advertising revenue and they are now promoting content directly on their platform. For example, Facebook now pays people to create content and Facebook favors this over content created elsewhere – which has further decreased ad revenues.
  • Advertisers have also gotten leery about the web advertising environment. Web advertising has worked through instantaneous automated auctions that match ads to the advertising slots on websites. The highest bids win the most valuable slots, but the automated selling platforms strive to place every ad somewhere on the web (a simplified sketch of this mechanism follows this list). This has resulted in large companies getting grief after finding their ads on unsavory web sites. Big companies were not happy to find that they were advertising on sites promoting racism or radical political views. So the big companies have been redirecting their advertising dollars away from the auction-driven ad system and have instead been placing ads directly on ‘safer’ sites or directly on the big web platforms. Google and Facebook together now collect the majority of web advertising revenue.
  • There has also been huge growth in ad blockers. People use ad blockers in an attempt to block the most obnoxious ads – those that pop up and interrupt reading. But ad blockers also deprive revenue from the very sites a user values most. While only minuscule amounts of money flow from each ad view, it all adds up, and ad blockers are killing huge numbers of views.
  • The last straw is that web browsers are starting to block ads automatically. For example, the new version of Chrome will block many ads by default. Soon anybody using these browsers will be free of most auction-generated ads, but that will also kill even more of the ad revenue that has been paying the people who create the content we want to read.
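
For readers unfamiliar with how those auctions work, here is a much-simplified sketch of the mechanism. Real ad exchanges are far more complicated; the function name, the floor price, and the ‘remnant’ label are illustrative assumptions, not how any particular exchange is built.

```python
def run_ad_auction(bids, floor_price=0.10):
    """bids: advertiser -> bid in dollars for one impression in a page's ad slot."""
    if not bids:
        return None, 0.0, "unfilled"
    winner = max(bids, key=bids.get)
    price = bids[winner]
    # The exchange tries to fill every slot; low bids still get placed as
    # 'remnant' inventory, which is how brand ads can end up on unsavory sites.
    tier = "premium" if price >= floor_price else "remnant"
    return winner, price, tier

print(run_ad_auction({"BrandA": 0.45, "BrandB": 0.30}))   # BrandA wins the slot
print(run_ad_auction({"LowValueAd": 0.02}))               # still placed, as remnant
```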

We are already seeing what this shift means. We are seeing content providers now asking readers to directly contribute to help keep them in business. More drastically we are seeing a lot of the quality content on the web go behind paywalls. That content is only being made available to those willing to subscribe to the content. And we are seeing a drop in new quality content being created since many content creators have been forced to make a living elsewhere.

But the quiet outcome of this is that a huge chunk of web content is going to disappear. This probably means the death of content like “The ten cutest puppies we found this week”, but it also means that writers and journalists that have been compensated from web advertising will disappear from the web. We’ll then be left with the content sponsored by the big platforms like Facebook or content behind paywalls like the Washington Post. And that means the end of the free web that we all love and have come to expect.

Lowering the Official Speed of Broadband

The FCC’s intention to kill net neutrality is getting all of the headlines, but there is another quieter battle going on at the FCC that has even bigger implications for rural America.

The last FCC under Chairman Tom Wheeler raised the definition of broadband in 2015 to 25/3 Mbps, up from the archaic definition of 4/1 Mbps. In doing so the FCC set the speed based upon the way that an average household uses broadband. At the time many people argued that the FCC’s way of measuring broadband need was somewhat contrived – and perhaps it was, because it’s really a challenge to define how much broadband a home needs. It’s not as easy as just adding up the various web connections, as I described in a recent blog.

The FCC is now considering lowering the definition of broadband down to 10/1 Mbps. That would be a dagger to the heart of rural broadband, as I will discuss below.

One only has to look at the big ISPs to see that the FCC is ignoring the realities of the market. The big cable companies have all set minimum broadband speeds above the 25/3 Mbps current FCC broadband definition. Charter’s base broadband product for a new customer is 60 Mbps. Depending upon the market Comcast’s base speeds are 50 Mbps or 75 Mbps. AT&T says they are starting to back out of their DSL business because their fastest U-verse product only has speeds up to 50 Mbps. These big ISPs all get it and they know that customers are only happy with their broadband connection when it works without problems. And providing more speed than 25/3 Mbps is how these companies are satisfying that customer demand.

Unfortunately the FCC’s definition of broadband has huge real life implications. The big urban ISPs won’t change what they are doing, but a lower threshold could kill attempts to improve rural broadband. The FCC has a mandate from Congress to take steps to make sure that everybody in the country has adequate broadband. When the FCC increased the definition to 25/3 Mbps they instantly recognized that 55 million people didn’t have broadband. And that forced them to take steps to fix the problem. Since 2015 there has been a lot of rural broadband construction and upgrades made by cable networks in small town America and the latest estimates I’ve seen say that the number of those without 25/3 Mbps broadband is now down to around 39 million. That’s still a lot of people.

If the current FCC lowers the definition to 10/1 Mbps then many of those 39 million people will instantly be deemed to have broadband after all. That would take the FCC off the hook to try to solve the rural broadband gap. To really show that this is just a political decision, the FCC is also contemplating counting a cellular broadband connection as an acceptable form of broadband. In doing so they will then be able to declare that anybody that can get this new slower speed on a cellphone has an adequate broadband solution.

Of course, when I say this is all just politics there are those that said the same thing when the Wheeler FCC raised the definition to 25/3 Mbps. At that time critics might have been right. In 2015 there were a lot of homes that were happy with speeds less than 25/3 Mbps and that definition might have been a little bit of a stretch for the average home.

But when you take all of the politics out of it, the reality is that the amount of broadband homes need keeps growing. Any attempt to define broadband will be obsolete within a few years as broadband usage continues to double every three years. A home that needed 15 or 20 Mbps of download in 2015 might now easily need more than the 25/3 Mbps threshold – that’s how the math of geometric growth works, as the quick calculation below shows.
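
A quick illustration of that growth, using the doubling-every-three-years assumption and the 2015 starting points mentioned above:

```python
# Household need doubling every three years from a 2015 baseline
for base_2015 in (15, 20):
    need_2018 = base_2015 * 2 ** ((2018 - 2015) / 3)   # one full doubling
    print(f"{base_2015} Mbps in 2015 -> {need_2018:.0f} Mbps by 2018")
# 15 -> 30 Mbps and 20 -> 40 Mbps, both past the 25 Mbps definition.
```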

It is disheartening to see the FCC playing these kinds of political games. They only need to go visit any rural kid trying to do homework to understand that 10/1 Mbps broadband on a cellphone is not broadband. The FCC only needs to go talk to somebody in rural America who can’t take a good-paying work-at-home job because they don’t have good broadband. They only need to go and talk to farmers who are losing productivity due to lack of a good broadband connection. And they only need to talk to rural homeowners who can’t find a buyer for their home that doesn’t have broadband.

This is too critical an economic issue for the country to let the definition of broadband change with the politics of the administration in office. Rather than political haggling over the official definition of broadband we ought to try something new. For example, we could set a goal that rural America ought to have at least half of the average broadband speed available in urban America. Using a metric that people can understand would take the politics out of this, and it’s a metric that companies like Akamai already quantify and measure. The amount of broadband that homes need is a constantly growing figure and pinning it down with one number is always going to be inadequate. So maybe it’s time to remove politics from the issue and make it fact-based (a simple sketch of such a metric follows).
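
A sketch of what that kind of metric might look like, with the urban average speed as a placeholder value rather than a real measurement:

```python
def rural_broadband_target(urban_average_mbps, ratio=0.5):
    """Rural definition of broadband pegged to half the measured urban average."""
    return urban_average_mbps * ratio

urban_average = 60.0   # hypothetical measured urban average download speed
print(f"Rural broadband target this year: {rural_broadband_target(urban_average):.0f} Mbps")
# Recomputed each year from new measurements, the target grows with actual
# usage instead of being frozen by politics.
```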

Maybe Coops are the Answer

I’ve been talking with a lot of rural counties lately and also with rural service providers. For the vast majority of rural broadband projects the biggest roadblock to getting started is almost always funding. Building fiber-to-the-home or even fiber backbones to extend fiber deeper into rural communities is expensive and there are not a lot of funding sources ready to support fiber projects. But there is one business structure that can sometimes make financing a little easier and perhaps it is time for more communities to consider forming a cooperative as a way to get a broadband solution.

Cooperatives are governed under federal law by the Capper-Volstead Act. There are also state laws governing coops that differ a bit from state to state but are mostly the same everywhere. A cooperative is a legal entity owned and controlled by its members, and members generally are also the consumers of its products or services. Cooperatives are typically based on the values of self-help, self-responsibility, community concern, and caring for others. Cooperatives generally aim to provide their goods or services at close to cost, and any excess earnings are generally required by law to be reinvested in the enterprise or returned to individual patrons based on their patronage of the cooperative.

There are several advantages of coops that make them worth considering:

  • Coops are corporations and not municipal entities. Coops ought to be exempt from all of the many state laws that prohibit or discourage municipal ownership of broadband networks. If you’re in a place that makes it hard to create a municipal broadband solution then a cooperative might be a great alternative.
  • Cooperatives don’t have the same profit-motive as privately-owned entities. From a financing perspective this makes them look more like a municipal venture in that a coop is happy with cash flows that cover costs rather than having to also make a profit.
  • Cooperatives often have tax advantages over other kinds of corporations. For example, ‘profits’ from serving their own customers are often free of income tax. This can vary by state, but for the most part cooperatives pay little income tax as long as they focus only on serving their own members.
  • The typical financing sources for broadband are used to working with cooperatives. The RUS, part of the Department of Agriculture, has a long history of lending to cooperatives. CoBank, a bank that is part of the US Farm Credit System, was established specifically to lend to agricultural, electric and telecom cooperatives. While the RUS was tasked a number of years ago to include municipalities under its umbrella, the nuances of that program make it nearly impossible for a municipality to borrow from the agency.
  • Cooperatives have a unique funding source that is not available to anybody else. Coops are allowed to loan excess cash to each other and I’ve seen new coops get low, or even zero interest loans from other cooperatives to help them get started. Many older electric and agriculture coops sit on big cash reserves that they might consider lending – particularly when the new telecom cooperative covers the same member territory.

But as you might expect, there are other issues that present challenges for new cooperatives:

  • Any lender to a new fiber venture is going to want to see some equity put into the venture so that it is not 100% financed. It can be more of a challenge for a cooperative to raise equity than for a commercial company because there is no way to guarantee that the equity will earn a good return or be returned in any reasonable time frame. So an equity drive basically means asking prospective members in the community for money. I’ve seen a few cooperatives get started this way and it can be done – but it’s not easy.
  • Cooperatives are governed by Boards elected from the membership base. Existing coops hire employees to operate the business and these employees provide the technical expertise that makes lenders trust lending to the business. But until a new cooperative is funded and can hire those employees there is a classic chicken-and-egg dilemma in that a lender can’t be positive that the cooperative knows how to operate their business.
  • The local acceptance of the cooperative idea varies by region. In some places in the Midwest a majority of local businesses are cooperatives, but there are other places where there are few if any cooperatives.

There are many situations where a cooperative might be the only reasonable operating structure for a rural area to get the broadband they want. If a community is not finding any solutions from a commercial provider and is unable to provide municipal funding, then a cooperative is well worth considering.