Simultaneous Data Streams

By working all over the country I get to hear a lot of stories about how people use broadband. I’ve noticed that over the last few years the household expectation for broadband performance has changed.

As recently as three or four years ago most households seemed to judge the adequacy of their broadband connection by how well it would handle a video stream from Netflix or other streaming service. Households that couldn’t stream video well were unhappy, but those that could generally felt that their broadband connection was good enough.

Interestingly, much of the perceived improvement in the ability to stream video was not due to better broadband performance. Streaming services like Netflix took steps to improve the performance of their product. Netflix had always buffered their content, meaning that a customer would load the video stream a few minutes ahead of viewing to smooth out the variation in customer broadband connections. They subsequently built some brains into the service so that the compression used for a given stream would vary according to the broadband connection of the customer. They also began caching their content with ISPs so that their signal would be delivered from the ISP’s local network and not from somewhere in the distant cloud.

Streaming quality then became an issue again with the introduction of live streaming sports and other content, and many of the flaws in the video stream became more apparent. I remember trying to watch ESPN online when it was first offered by Sling TV and the experience was miserable – the stream would crash a number of times during a football or basketball game. Live-streaming services have subsequently improved their product to work better with a variety of broadband connections.

Over the last two years I’ve noticed a big change in how households talk about their broadband performance. I haven’t heard anybody mention single-stream video in a few years, and the expectation for a broadband connection now is that it can handle multiple data streams at the same time.

This tells me two things. First, as mentioned above, video streaming has improved to the point where you don’t get interruptions on most broadband connections. But more importantly, households have changed how they use broadband. I think my household is a typical example. The only broadband need we have that is different from many families is that my wife and I both work from home. But other than that, we don’t have atypical broadband demands.

If you go back five years we had perhaps half a dozen devices in our home capable of connecting to the Internet. We rarely demanded a lot of simultaneous broadband. Today we have over 40 Internet-capable devices in our house. While some of them use little or no broadband, we’ve changed how we use broadband. We are cord cutters and routinely stream several videos at the same time while also using the Internet for gaming and schoolwork. We often stream music. Our computers automatically upload files to the cloud and download software updates. Cellphones are connected to the WiFi and there is regular use of FaceTime and other apps that include video streams.

Interestingly, when the FCC established 25/4 Mbps as the definition of broadband they justified the speed by looking at simultaneous uses of multiple broadband services. At that time a lot of critics derided the FCC’s justification since it wasn’t realistic for how most households really used broadband. Perhaps the staff at the FCC was prescient, because their illustrative examples are exactly how a lot of homes use broadband today.

If anything, the FCC’s method was conservative because it didn’t account for the interference that arises in a home network that is processing multiple data streams at the same time. The more streams, the more interference, and it wouldn’t be unusual for a home like ours to experience 20% to 30% overhead in our WiFi network while processing the numerous simultaneous streams.
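To make the arithmetic concrete, here is a minimal sketch of what that contention overhead leaves behind on a rated connection. The overhead percentages are the rough estimates from the paragraph above, not measurements, and the 25 Mbps figure is just the FCC's definitional speed:

```python
# Rough sketch of how WiFi contention overhead eats into a rated connection.
# The overhead fractions are illustrative estimates, not measurements.

def effective_throughput(subscribed_mbps, overhead_fraction):
    """Bandwidth left for actual streams after WiFi contention overhead."""
    return subscribed_mbps * (1 - overhead_fraction)

# A nominal 25 Mbps connection under the 20%-30% overhead range cited above:
print(effective_throughput(25, 0.20))  # about 20 Mbps usable
print(effective_throughput(25, 0.30))  # about 17.5 Mbps usable
```

In other words, a home that nominally meets the 25/4 Mbps definition may have noticeably less capacity available once many simultaneous streams are contending for the same WiFi network.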

Unfortunately, many policy makers are still stuck on the old paradigm. This is the only way they can justify something like the CAF II program that will provide data streams in the 10 Mbps range. They still talk about how that connection will allow a household to watch video or do homework, but they ignore all of the other ways that homes really use broadband today. I know for my home that a 25 Mbps broadband stream is not sufficient and will bog down at various times of the day – so I buy something faster. It’s hard to imagine stepping back to a 10 Mbps connection, because doing so would force us to make hard choices on curtailing our broadband usage.

Comcast Dismantles Data Throttling

On June 11 Comcast announced they had dismantled a congestion management system that had been in place since 2008. This system was used to throttle data speeds for large users of residential data. The company says that their networks are now robust enough that they no longer need to throttle users and that the system wasn’t used at all over the last year.

Comcast implemented the congestion management system in 2008 after it had been caught throttling traffic to and from BitTorrent. The FCC said the throttling was discriminatory and ordered Comcast to cease the practice. Comcast responded to the FCC by introducing the congestion management system that cut back usage for all large residential data users on what Comcast said was a non-discriminatory basis.

At the time Comcast claimed that large data users, who at that time were exchanging video files, were slowing down their network – and they were probably right. The ISP industry has been blindsided twice in my memory by huge increases in demand for bandwidth. The first time was in the 1990s when Napster and many others promoted the exchange of music MP3 files. The same thing happened a decade ago when people started sharing video files – often pirated copies of the latest movies.

To be fair to Comcast, a decade ago the number one complaint about cable company broadband was that speeds bogged down during the evening prime time hours – the time when most customers wanted to use the network. The Comcast throttling was an attempt to lower the network congestion during the busiest evening hours. Comcast says the throttling system is no longer needed since the widespread implementation of DOCSIS and improvements in backhaul have eliminated many of the network bottlenecks.

Comcast now offers gigabit download speeds in many markets. I suspect that they are counting on only a small percentage of their customers buying and using this big bandwidth in a given neighborhood, because a significant number of gigabit users could still swamp an individual neighborhood node. I wonder if the company would reinstitute the throttling system should their network become stressed by some future unexpected surge in broadband traffic. It’s possible that some big bandwidth application such as telepresence could go viral and could swamp their data networks as happened in the past with music files and then video.

Interestingly, the company still maintains customer data caps. Any customer that uses more than 1 terabyte in a month must pay $10 for each extra 50 gigabytes or pay $50 extra to get unlimited data. Comcast never directly said that the data caps were for congestion management, although they often hinted that was the reason for the caps.
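The pricing described above can be sketched out as a simple cost function. Note that the break-even point between the per-block charges and the unlimited add-on falls 250 GB past the cap; the usage figures in the example are hypothetical:

```python
# Sketch of Comcast's stated overage pricing: $10 per extra 50 GB block,
# or a flat $50 add-on for unlimited data. A rational customer pays the
# cheaper of the two; example usage numbers are hypothetical.
import math

def overage_cost(usage_gb, cap_gb=1000, block_gb=50,
                 block_price=10, unlimited_price=50):
    """Cheapest monthly add-on cost for a given amount of usage."""
    if usage_gb <= cap_gb:
        return 0
    blocks = math.ceil((usage_gb - cap_gb) / block_gb)
    return min(blocks * block_price, unlimited_price)

print(overage_cost(900))   # under the cap: $0
print(overage_cost(1100))  # 2 blocks over: $20
print(overage_cost(1600))  # 12 blocks would be $120, so unlimited wins: $50
```

Anything beyond 1,250 GB in a month makes the $50 unlimited option the cheaper choice.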

The official explanation of the data caps has been that heavy users need to pay more since they use the network more. Comcast has always said that they use the revenues from data caps to pay for the needed upgrades for the network. But this seems a little disingenuous from a company that generated $21.4 billion in free cash in 2017 – nearly $1.8 billion per month.

Comcast is not the only ISP that has been throttling Internet traffic. All four major wireless carriers throttle big data users at some point. T-Mobile is the most generous and starts throttling after 50 GB of monthly usage while the other three big wireless carriers throttle after 20-25 GB per month.

A more insidious form of data throttling is the use of bursting technology that provides faster broadband speeds for the first minute or two of any given broadband session. During this first minute customers will get relatively fast speeds – often set at the level of their subscription – but if the session is prolonged past that short time limit then speeds drop significantly. This practice fools customers into thinking that they get the speeds they have subscribed to – which is true for the short duration of the burst – but is not true when downloading a large file or streaming data for more than a minute or two. The carriers boast about the benefits of data bursts by saying they give extra broadband for each request – but they are really using the technology to throttle data for any prolonged data demands.

The Latest on Agricultural IoT

For years we’ve heard about how broadband was needed to support agriculture. However, for many years it was hard to find widespread use of farming technologies that need broadband. Finally, agricultural use of the Internet of Things is spreading rapidly – the research firm Alpha Brown estimates that there were over 250,000 US farmers using IoT technology at the end of 2017.

Alpha Brown says there are 1.1 million farms that could benefit from the technology, with broadband connectivity being a huge limiting factor today. Surveys show that more than half of farmers already say they are interested in deploying the technology. Berg Insight, another firm that tracks the industry, says that there is the potential for as many as 27.4 million sensors being deployed by US agriculture by 2021.

Agricultural sensors mostly rely on the 802.15.4 standard, which defines the operation of low-rate wireless personal area networks (LR-WPANs). Any given sensor doesn’t generate a huge amount of data, but the deployment of multitudes of sensors can require significant bandwidth.
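A back-of-the-envelope calculation shows how small per-sensor traffic adds up at scale. The 100-byte payload and 15-minute reporting interval below are hypothetical assumptions, chosen only to illustrate the arithmetic against Berg Insight's 27.4 million sensor projection:

```python
# Back-of-the-envelope aggregate load from a fleet of low-rate sensors.
# Payload size and reporting frequency are illustrative assumptions;
# individual 802.15.4 radios are slow, but fleets add up.

def aggregate_mbps(sensor_count, bytes_per_report, reports_per_hour):
    """Average aggregate bandwidth, in Mbps, across all sensors."""
    bits_per_second = (sensor_count * bytes_per_report * 8
                       * reports_per_hour / 3600)
    return bits_per_second / 1_000_000

# 27.4 million sensors, each sending a hypothetical 100-byte reading
# every 15 minutes (4 reports per hour):
print(round(aggregate_mbps(27_400_000, 100, 4), 1))  # roughly 24 Mbps average
```

The average is modest, but sensor traffic is bursty and rural backhaul is thin, so the real constraint is getting any reliable connectivity to the farm at all.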

Following are just a few of the agricultural IoT applications already being deployed.

Cattle Farmers and Ranchers. This is the most widespread use of IoT so far. There are numerous IoT applications being used:

  • Moocall is a device that monitors the delivery of calves. It’s a wireless sensor that is strapped to a pregnant cow and that provides an hour’s notice when a cow is ready to give birth.
  • eCow makes a bolus (IoT ‘pill’) that sits in a cow’s stomach for up to five months and which transmits constant readings for temperature and pH.
  • There are several vendors making sensors specific to dairy cows that measure a wide range of biometric data including temperature, animal activity, GPS position, pulse and various biological metrics. Dairy farming has become scientific with farmers treating each cow individually to maximize milk output.

Crop Farming. There are numerous sensors now available for specific crops:

  • Bosch makes a sensor specific to asparagus farming. Asparagus yields depend upon the ground temperature and farmers use a two-sided foil (black on one side, white on the other) to add or deflect heat from the soil. The sensor measures temperature at various depths and notifies the farmer when it’s time to flip the foil.
  • Semios makes sensors that are specific to fruit orchards which measure leaf-wetness, soil moisture, pest presence, and frost monitoring that can be tailored to each specific orchard.
  • TracoVino makes similar sensors designed specifically to monitor grape vines.
  • There are numerous vendors making IoT sensors that measure characteristics of the soil, plants and environment to indicate when irrigation is needed.
  • There are several vendors providing systems to automate and monitor greenhouses.

Drones. Drones are being used for a number of different agricultural tasks:

  • DroneSeed provides drones for forestry management. The drones can identify trees with pest problems and then selectively spray only those trees. The drones also collect data on forest conditions – something that was never easily available in the past. There are also several vendors using drones to plant new trees to reforest empty land and to renew mangrove swamps.
  • Water Conservation. Drones can provide real-time moisture monitoring that can allow farmers to save as much as 90% of irrigation water by only watering where needed. This requires real-time collection of data tied into watering systems.
  • Chemical use. Drones are also reducing the amount of chemicals being applied by monitoring plant health to direct fertilizer or insecticide only where needed.


Now That Net Neutrality is Dead . . .

The FCC’s net neutrality rules expired last week. The process for FCC rule changes requires the agency to take steps like publishing its decisions in the Federal Register; all of those administrative steps have now been taken and the old rules have expired.

The press and social media made a big deal about the end of the administrative process, but the issue is a lot more complicated than that and so today I’ll look at what happens next. Officially the big ISPs are now free to make changes in their policies that were prohibited by net neutrality, but for various reasons they are not likely to do so.

First, 22 states filed a lawsuit against the FCC challenging various aspects of the FCC’s ruling. That suit now resides at the US Circuit Court of Appeals in Washington DC. The big ISPs are unlikely to make any significant changes in policies that might be reversed by the courts. In the past the whole industry has waited out the appeals process on this kind of lawsuit because the Courts might find reason to reverse some or all of the FCC’s actions. The ISPs aren’t legally obligated to wait out the lawsuits, but I’m sure their legal counsel is telling them to do so.

Interestingly, the judges hearing this case also heard the previous appeals associated with net neutrality and are familiar with the issues. This court previously had ruled that the FCC had the authority to use Title II regulation as the way to regulate broadband and net neutrality. I’ve not read any predictions yet of how the courts might rule in this case. But if the FCC had the authority to institute Title II regulation I would think they also have the authority to reverse that decision.

The big ISPs also have to worry about Congress. The Senate voted to reverse the repeal of Title II regulations using the Congressional Review Act (CRA). The issue is not currently slated for a vote in the House of Representatives and it seems clear that there are not enough votes there to reverse the FCC’s decision. But it’s only four months until the next election and there is a chance that the Democrats will win a majority of seats. One would think that net neutrality would be on the list of legislative priorities for a Democratic House since polls show public approval above 80% for net neutrality.

A vote by Congress to implement net neutrality would end the various court cases since the new laws would supersede any actions taken by the FCC on prior rules. It’s been the lack of Congressional action that has been the underlying reason for all of the various FCC actions and lawsuits on the topic over the years – Congress can give the FCC specific direction and the authority to enforce whatever Congress wants done.

There is another wild card in the mix in that numerous states have either passed rules concerning net neutrality or are contemplating doing so. Most of the state laws would restrict the award of state telecom business to vendors that adhere to net neutrality. My guess is that these laws will survive court challenges because states have the authority to determine their purchasing preferences. But realistically these laws might backfire since most ISPs that are large enough to tackle state telecom needs are likely to be in violation of net neutrality. States implementing these rules might find themselves unable to find a suitable telecom vendor.

The most direct state net neutrality law comes from Washington. Their law, which went into effect automatically when the FCC net neutrality rules expired, prohibits ISPs from blocking or throttling home landline or mobile data. It also specifically prohibits paid prioritization. An even more stringent bill was near passage by the California legislature. As I was writing this blog it appears that AT&T lobbyists were successful in derailing that legislation. It’s likely that we’ll see more actions from state legislatures in the coming year.

The FCC stated in the Title II repeal order that states are not allowed to override the FCC order. But as we’ve seen many times in the past at the FCC, there is a constant battle between federal authority and states’ rights, and disputes of this kind are almost always resolved by the courts. There is a long history of battles between FCC authority and states’ rights and over the years both sides have won battles.

The big ISPs hate uncertainty and each of these paths provides a way to reinstate net neutrality. It seems unlikely that the big ISPs will be aggressive with changes until they get a better feel for the resolution of these various challenges to the FCC. Some of the ISPs already had practices that skirted net neutrality rules such as zero-rating of their own content. It seems likely that the ISPs will continue to push around the edges of net neutrality, but it seems unlikely that they will be more aggressive with implementing products and practices that are clear net neutrality violations. The bottom line is that the end of the FCC administrative process was only the beginning of the process and we still have a way to go to get a clear resolution of the issue.

5G Cellular for Home Broadband?

Sprint and T-Mobile just filed a lengthy document at the FCC that describes the benefits of allowing the two companies to merge. This kind of filing is required for any merger that needs FCC approval. The FCC immediately opened a docket on the merger and anybody that opposes the merger can make counterarguments to any of the claims made by the two companies.

The two companies decided to highlight a claim that the combined Sprint and T-Mobile will be able to roll out a 5G network that can compete with home broadband. They claim that by 2024 they could gain as much as a 7% total market penetration, making them the fourth biggest ISP in the country.

The filing claims that their 5G network will provide a low-latency broadband product with speeds in excess of 100 Mbps within a ‘few years’. They claim that customers will be able to drop their landline broadband connection and tether their home network to their unlimited cellular data plan instead. Their filing claims that this will only be possible with a merger. I see a lot of holes that can be poked into this claim:

Will it Really be that Fast? The 5G cellular standard calls for eventual speeds of 100 Mbps. If 5G follows the development path of 3G and 4G, then those speeds probably won’t be fully met until near the end of the next decade. Even if a 5G network can achieve 100 Mbps in ideal conditions there is still a huge challenge to meet those speeds in the wild. The 5G standard achieves 100 Mbps by bonding multiple wireless paths, using different frequencies and different towers to reach a customer. Most places are not receiving true 4G speeds today and there is no reason to think that using a more complicated delivery mechanism is going to make this easier.

Cellphone Coverage is Wonky.  What is never discussed when talking about 5G is how wonky all wireless technologies are in the real world. Distance from the cell site is a huge issue, particularly with some of the higher frequencies that might be used with 5G. More important is local interference and propagation. As an example, I live in Asheville, NC. It’s a hilly and wooded town and at my house I have decent AT&T coverage, but Verizon sometimes has zero bars. I only have to go a few blocks to find the opposite situation where Verizon is strong and AT&T doesn’t work. 5G is not going to automatically overcome all of the topographical and interference issues that affect cellular coverage.

Would Require Significant Deployment of Small Cell Sites. To achieve the 100 Mbps in enough places to be a serious ISP is going to require a huge deployment of small cell sites, and that means the deployment of a lot of fiber. This is going to be a huge hurdle for any wireless company that doesn’t have a huge capital budget for fiber. Many analysts still believe that this might be a big enough hurdle to quash a lot of the grandiose 5G plans.

A Huge Increase in Wireless Data Usage. Using the cellular network to provide the equivalent of landline data means an order-of-magnitude increase in the bandwidth that will be carried by the cellular networks. FierceWireless along with Strategy Analytics recently did a study on how the customers of the major cellular companies use data. They reported that the average T-Mobile customer today uses 18.4 GB of data per month with 5.3 GB on the cellular network and the rest on WiFi. Sprint customers use 18.2 GB per month with 4.4 GB on the cellular networks. Last year Cisco reported that the average residential landline connection used over 120 GB per month – a number that is doubling every three or four years. Are cellular networks really going to be able to absorb a twenty or thirty times increase in bandwidth demand? That would require massive increases in backhaul bandwidth along with huge capital expenditures to avoid bottlenecks in the networks.
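The scale of that gap is easy to work out from the figures cited above. The sketch below uses the study numbers from the text; the 3.5-year doubling period and six-year horizon are illustrative assumptions within the range the text describes:

```python
# The scale gap between average cellular use and average landline use,
# using the figures cited in the text. The doubling period and projection
# horizon are illustrative assumptions.

tmobile_cellular_gb = 5.3  # average monthly use on T-Mobile's cellular network
landline_gb = 120          # Cisco's average residential landline figure

# Moving landline traffic onto cellular means carrying roughly this multiple:
print(round(landline_gb / tmobile_cellular_gb))  # about 23x

def projected_usage(value_gb, years, doubling_period_years):
    """Projected usage if demand doubles every doubling_period_years."""
    return value_gb * 2 ** (years / doubling_period_years)

# 120 GB/month doubling every 3.5 years, six years out:
print(round(projected_usage(120, 6, 3.5)))  # nearly 400 GB/month
```

So the networks would not just have to absorb today's twenty-plus-fold gap; they would have to keep pace with a target that keeps doubling underneath them.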

Data Caps are an Issue.  None of the cellular carriers offers truly unlimited data today. T-Mobile is the closest, but their plan begins throttling data speeds when a customer hits 50 GB in a month. Sprint is stingier and is closer to AT&T and Verizon and starts throttling data speeds when a customer hits 23 GB in a month. These caps are in place to restrict data usage on the network (as opposed to the ISP data caps that are meant to generate revenue). Changing to 5G is not going to eliminate network bottlenecks, particularly if we see millions of customers using cellular networks instead of landline networks. All of the carriers also have a cap on tethering data – making it even harder to use as a landline substitute – T-Mobile caps tethering at 10 GB per month.

Putting it all into Context. To put this into context, John Legere already claims today that people ought to be using T-Mobile as a landline substitute. He says people should buy a multi-cellphone plan and use one of the phones to tether the home network. 4G networks today have relatively high latency and 4G speeds today can reach 15 Mbps in ideal conditions but are usually slower. 4G also ‘bursts’ today and offers faster speeds for the first minute or two and then slows down to a crawl (you see this when you download phone apps). I think we have to take any claims made by T-Mobile with a grain of salt.

I’m pretty sure that the concept of using the merger to create a new giant ISP is mostly a red herring. No doubt 5G will eventually offer an alternative to landline broadband for those homes that aren’t giant data users – but it’s also extremely unlikely that a combined T-Mobile / Sprint could somehow use 5G cellular to become the fourth biggest ISP starting ‘a few years from now’. I think this claim is being emphasized by the two companies to provide soundbites to regulators and politicians who want to support the merger.

Sonic – the Transition from UNEs to Fiber

In my continuing series of writing about interesting competitors, today’s blog is about Sonic, a CLEC and fiber overbuilder working in the San Francisco Bay area and other communities in California. It’s an interesting company because they are the poster child for building a competitive telecom company based upon the rules established by the Telecommunications Act of 1996. That Act required that the large telephone companies unbundle their networks to allow competitors to use their copper lines.

Sonic got started in 1994 as an ISP, then became a CLEC in 2006 and followed the path envisioned by the 1996 Act. This meant collocating electronics in AT&T central offices to provide DSL to customers over unbundled copper loops (UNEs). The company found a receptive customer base since they offered faster broadband than AT&T’s at an affordable price. They grew to be collocated in 200 AT&T central offices around the Bay Area, Sacramento and greater Los Angeles. These offices are tied together by the use of unbundled interoffice transport – also created by the 1996 Act. They originally deployed DSL that used one copper pair but have migrated to VDSL2 and other faster versions of DSL that use two copper pairs and deliver significant bandwidth. They still have almost 50,000 customers in the region using this technology.

What’s interesting is that Sonic did this starting in 2006 – a time by which much of the rest of the industry had written off the use of telco copper. The UNE business plan got a sour reputation with many in the industry when the CLEC industry using UNEs spectacularly imploded in 2001-2002. This collapse of the CLEC industry was due to a perfect storm of economic events and had little to do with the benefits of using telco copper.

If anything, it’s easier to use telco copper today because today’s DSL technology is far better than the DSL in 2000. Sonic and other CLECs are able to provide fast and reliable broadband using ADSL2+ and VDSL2, bonded over multiple copper pairs. Most people in the industry are probably surprised to hear that Sonic can use bonded copper UNEs to provide speeds as fast as 400 Mbps to serve businesses. The usefulness of unbundled UNEs is far from dead.
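The arithmetic behind pair bonding is simply additive: per-pair DSL rates stack across the bonded pairs, minus some bonding overhead. The per-pair rate and the overhead figure below are illustrative assumptions, not Sonic's actual deployment numbers:

```python
# Rough arithmetic behind DSL pair bonding: per-pair rates add up across
# bonded copper pairs, minus some bonding overhead. The per-pair rate and
# overhead fraction are illustrative assumptions, not Sonic's real numbers.

def bonded_rate_mbps(pairs, per_pair_mbps, bonding_overhead=0.05):
    """Approximate aggregate rate for a bonded group of DSL pairs."""
    return pairs * per_pair_mbps * (1 - bonding_overhead)

# Eight hypothetical pairs at ~53 Mbps each lands near the 400 Mbps figure:
print(round(bonded_rate_mbps(8, 53)))  # about 403 Mbps
```

Per-pair VDSL2 rates fall off quickly with loop length, which is why speeds like these are realistic for businesses close to a central office but not for long rural loops.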

Sonic also reaches roughly 25,000 customers using resale. This allows them to sell the same DSL products sold by AT&T in locations where they don’t have collocations. All of the Sonic products offer a bundle with a voice product that includes all of the expected features plus unlimited calling to the US and to landlines in 66 other countries. They are still finding strong demand for the voice product – something that also might surprise many in the industry.

Five years ago the company decided to use the cash flow from the UNE business to build fiber. Their fiber network now covers roughly 1/3 of the City of San Francisco, plus Brentwood, Sebastopol, Albany, Kensington and Berkeley in the East Bay. They are eying other markets around the region, the state, and beyond. They are an aggressive competitor and their fiber product line starts with a symmetrical gigabit for $40 per month, bundled with the unlimited voice product. They won’t publicly disclose the number of fiber customers, but their goal is to soon have more customers on fiber than on DSL. In my opinion, this is the essence of the vision of the 1996 Act – a transition from UNEs to facility-based networks.

The company’s biggest worry right now is that the FCC recently got a petition from the large telcos asking to end the use of unbundled network elements (UNEs). The big telcos argue that the UNE business plan is obsolete and that there is sufficient competition in the marketplace without unbundling their copper – while also claiming that “In the residential marketplace, competition will not be materially affected by forbearance from Section 251(c)(3) because there is effectively no remaining UNE-based competition in that marketplace” and that “To the extent CLECs serve residential customers using ILEC facilities, they do so on commercial platforms.”

But Sonic and a number of other CLECs using UNEs show this to be untrue. Given that Sonic alone serves nearly 50,000 California households with UNEs, these claims are incorrect and misleading. Sonic is using the unbundled copper in exactly the manner envisioned by Congress when they wrote the 1996 Act – to allow competitors to place the best technology possible on the telco copper networks. Congress at the time reasoned that telephone ratepayers had paid for the copper networks and that the public ought to derive any benefits possible from the networks they had paid for.

The big telcos have always hated the idea of unbundling their networks. They have slowly chipped away at some of the products envisioned by the 1996 Act such as access to telco dark fiber. They would love to kick CLECs like Sonic off their networks – and in Sonic’s case that would deprive 50,000 customers of fast DSL and telephone service at prices they can afford.

Almost every major market in the country, and many smaller ones have CLECs that use unbundled network elements to provide DSL – usually the newer and faster DSL that the telcos won’t invest in. The telcos are slowly walking away from DSL which can be seen by the huge numbers of customers switching to the cable companies.

But CLECs like Sonic have used the copper to bring products that people want – and, unlike the telcos, they are pouring those profits back into building fiber to these same communities. That’s exactly what Congress had in mind in 1996 and it would be a shame to see the FCC choke off some of the companies who are offering a competitive alternative to the big cable companies.

Buying a Home with No Broadband

A few weeks ago I attended a public meeting at one of my clients and I met a guy there who recently purchased a house in the area that has no broadband. He was told by customer service at both the cable company and the local telco that broadband was available – but when he showed up they would not serve him.

It seems like everywhere I travel today I hear this or similar stories and it makes me realize the gigantic value difference between homes with and without broadband. This particular guy works from home and is now scratching his head looking for a solution. He’s not unique and most families with school kids and even most families without look at broadband today as a necessity. Buying a house without broadband is starting to feel a lot like buying a house without electricity or running water – it’s not a home that most people would willingly buy.

Unfortunately, people like this guy, who are not familiar with rural broadband, are often told there is broadband when there isn’t. People who move from urban areas often have no clue about the atrocious state of broadband in rural America. They can’t imagine a world where there isn’t even DSL and where folks have to somehow get by on cellular data or satellite data to have a connection to the outside world.

I’ve purchased several homes over the last few decades and I’ve always made proof of broadband a contingency in my purchase offer. I then contacted the ISPs and placed an order to be sure that the broadband was real. Sadly, like the guy in this story, one often gets the wrong answer from a call to customer service, so I’ve always gone a step further and placed an order. Even that is not always a great solution – when I moved to Florida I was in the house for over a month before Comcast finally connected my home – even though there was a Comcast pedestal at the end of my driveway!

I’ve spoken to a number of rural real estate agents over the last few years and they say almost universally that home broadband is now at or near the top of homebuyers’ wish lists these days. They are often surprised by homebuyers who don’t understand the lack of rural broadband. They all have stories about buyers who quickly abandon searches in all parts of a county that don’t have broadband.

There have been numerous studies showing that a home with broadband is worth more than one without. But I don’t buy the results of those studies any more. We are now at an overall 84% national penetration for broadband and a huge majority of people don’t want a home without broadband. Those studies show an increase of a few thousand dollars in value for a home with broadband – but what is the value of broadband if you are unable to find a buyer for a home that doesn’t have it? That’s the story that real estate agents tell me today – the inability to sell rural homes without broadband.

One of the interesting things about rural broadband is that the people in rural areas know exactly where the broadband line stops. They know the home closest to them with cable service, they know where DSL becomes too slow to be relevant, and they know where cell phones lose their bars for broadband connectivity. Many rural customers are irate because they live just past the broadband dividing line. I hear it all of the time: "The home two houses away has cable TV," "I'm within a quarter of a mile of good DSL," "The people on the other side of that hill have a good WISP," "I can walk to the fiber."

I remember when I was house-hunting here in Asheville. I live a mile from center city and I can look out my window and see homes with no broadband. My wife had assembled a list of homes to check out and I recall saying a lot, "This area has no broadband, turn the car around." It is often surprising how close you can be to a town and still have no broadband. I think this area is fairly typical of a rural county seat, where broadband extends only sporadically past the city limits. Folks who don't know how to read the wires on poles often don't realize that broadband often ends at, or just past, the city boundary.

This issue is going to get more severe over the next decade, and I predict that we'll start seeing people walk away from rural homes for lack of willing buyers. I keep expecting to see a lawsuit from a homebuyer who sues a realtor for not telling them the truth about the lack of broadband. Such a suit will inevitably bring another piece of paper into home disclosures – a broadband disclosure – which most people care more about than termites and the dozen other things we check before buying a home.

Working From Home

Governments are starting to catch on to the idea that one of the most dynamic parts of the new economy is people working from home. Governor Phil Scott of Vermont just signed legislation that provides an incentive for people who want to move to Vermont and work from their homes.

The program consists of grants of up to $5,000 per year, not to exceed $10,000 in total, to help cover the cost of relocating to the state. To qualify for a grant a worker must already be employed by an out-of-state company, work primarily from home, and move to the state after January 1, 2019.

The overall program isn't large – it's set at $125,000 for 2019, $250,000 for 2020, and back to $125,000 in 2022. If awards are made at the $5,000 level this would cover moving 100 new workers to the state.
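The budget math above is simple enough to sketch as a quick back-of-the-envelope calculation (the yearly budgets and the $5,000 maximum annual grant are the figures from the legislation as described; everything else is just arithmetic):

```python
# Vermont remote-worker grant program: rough capacity estimate.
# Yearly budget figures as described in the text.
yearly_budgets = {2019: 125_000, 2020: 250_000, 2022: 125_000}
grant_per_worker = 5_000  # maximum annual grant per worker

total_budget = sum(yearly_budgets.values())
workers_covered = total_budget // grant_per_worker

print(total_budget)     # 500000
print(workers_covered)  # 100
```

This is where the "100 new workers on a $500,000 subsidy" figure in the next paragraph comes from.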

In economic development terms, landing 100 new full-time families with a $500,000 tax subsidy is a bargain. Governments regularly provide tax incentives of this size to attract factories or other large employers. The impact on the economy from 100 new high-income families is gigantic, and over time the taxes and other local benefits from these new workers will greatly exceed the cost of the program.

Vermont, like many states, finds itself with an aging population while also seeing an outflow of young people seeking work in New York, Boston and other nearby cities. These grants create an opportunity for young families to move back to the state.

One key aspect of the work-at-home economy is good broadband. Many companies now insist that employees have an adequate broadband connection at home before agreeing to let them work remotely. I've talked to a few people who recently made the transition to home work, and they had to certify the speed and latency of their broadband connection.

One reason this program can work in Vermont is that there are areas of the state with fiber broadband. The City of Burlington built a citywide fiber network, and local telcos and other cities have built fiber in more rural parts of the state. But like most of America, Vermont still has many rural areas where broadband is poor or non-existent.

What surprises me is that many communities with fiber networks don’t take advantage of this same opportunity. It’s easy for a community with good broadband to not recognize that much of America today has lousy broadband. Communities with fiber networks should consider following Vermont’s example.

I know of one community that is doing something similar to the Vermont initiative. The City of Independence, Oregon has benefitted from a municipal fiber network since 2007, operating under the name of MINET and built jointly with the neighboring city of Monmouth. The city has a new economic development initiative that is touting their fiber network. Nearby Portland is now a hotbed for technology companies including a lot of agricultural technology research.

Independence has one major benefit over Portland and the other cities in the state – gigabit broadband. The new economic development initiative involves getting the word out directly to workers in the agricultural research sector and letting them know that those who can work at home can find a simpler and less expensive lifestyle by moving to a small town. They hope that young families will find lower housing prices and gigabit fiber to be an attractive package that will lure work-at-home families. Independence is still close enough to Portland to allow for convenient visits to the main office while offering faster broadband than can be purchased in the bigger city.

The Big Telco Problem

A few weeks ago I made the observation in a blog that we don't really have a rural broadband problem – we instead have a rural big telco problem. As I work around the country helping communities look for broadband solutions, it has struck me that the telcos in almost all of these areas are the big companies – AT&T, CenturyLink, Verizon, Frontier, Windstream, etc.

I don't see these same problems in areas served by smaller telephone companies. These smaller telcos have either upgraded networks to deliver faster broadband or have plans to do so over the next few years. I know of numerous rural telcos that are currently building fiber to rural areas, and those networks are going to serve those areas for many decades to come. There are undoubtedly a few small telcos that are not making the needed upgrades, but for the most part the smaller telcos are doing the right thing – they are reinvesting in rural areas and making the upgrades needed for the future.

The large telcos have done just the opposite. Most of them have been ignoring rural America for decades. They yanked customer service centers from smaller communities many years ago. They drastically cut back on rural technical staffs and it often takes weeks for customers to get repairs. They stopped investing in rural networks and have not upgraded electronics or networks for decades.

There is currently a burst of activity in the rural areas of the big telcos that accepted billions of dollars of CAF II funding. This funding requires them to upgrade rural broadband to a measly and inadequate speed of at least 10/1 Mbps. However, the rules in the CAF program are weak and there are no repercussions for not meeting the goals, and I've always expected that these telcos will spend the FCC's money until it's gone and then stop the upgrades. This means that while some rural customers will get speeds even a little faster than 10 Mbps, there are likely to be many customers who will see no upgrades. I don't expect the big telcos to spend a dime of their own money in rural America once the CAF II upgrades are finished.

While I call this a big telco problem I might just as easily have called it a regulator problem. The FCC and the various state commissions largely deregulated telephone service, and the FCC recently washed their hands of broadband regulation. The big telcos have been milking big profits out of the rural copper networks for decades and have not reinvested any of those profits back into the networks. That’s how big companies act if regulators don’t require them to spend some of their profits on service and upgrades.

By contrast, the smaller telcos were not required to upgrade networks, but they have done so anyway. The small companies got a big boost recently from the ACAM program – a different FCC plan that encourages building forward-looking broadband networks. Many of these companies had already upgraded to fiber before the FCC money was available. These smaller telcos are part of the rural community and feel an obligation to do the right thing – and the right thing is to find a way to bring the broadband that rural customers need.

Regulators have let us down by not forcing the big telcos to act responsibly. The big telcos now want to walk away from rural copper that they claim is obsolete and in bad shape. But that copper would be in much better shape had these telcos done routine maintenance for the last thirty years. We built a great nationwide copper network due to the simple regulatory principle of universal service. Regulators at both the state and federal level believed that the role of government was to ensure that everybody got access to the communications network that ties us together as a nation. They knew that universal service was good for people, but also good for the economy and for the country as a whole. It's something that very few other countries did, and it set America apart from the rest of the world.

I worked at Southwestern Bell pre-divestiture, and it was a source of company pride that we served every customer to the best of our ability. But along came competition, and any sense of obligation to the public went out the door as the big telcos instead concentrated on satisfying Wall Street's demand for ever-higher profits. There have been big benefits from competition that are hard to deny, but what was missed in the transition to a competitive telecom world was that competition was never going to benefit rural America in the way it benefits urban areas. We should have foreseen this and kept the universal service policy in place for rural America.

I get angry when I hear politicians and regulators say that municipalities shouldn’t be in the broadband business because the commercial sector will take care of our broadband needs. That is obviously not true and one only has to look at the big telco networks ten miles outside any urban area to see how the big telcos have abandoned customers in higher cost areas.

The big telcos are still milking big profits out of rural America and are still not reinvesting any of their own capital there. I don’t know if there is a way to put the genie back into the bottle and reintroduce regulation for rural America. If we don’t then we are only a few years away from having third-world telecom networks in rural America that will be a major drag on our society and economy.

How to Talk to Bankers

I spend a lot of time assisting clients in finding financing, and in doing so I've learned a lot about what bankers look for from any prospective borrower. Here are some of the key takeaways I've picked up over the years from talking to bankers:

Be Ready with a Worst Case Scenario. Borrowers invariably create a rosy best-case business plan to demonstrate how well they will perform with the borrowed money. But bankers have learned from hard experience that things often don't go as planned. While bankers certainly want to see the optimistic projection, they are more interested in your worst case scenario, so a smart borrower will prepare a worst case scenario along with the best case one.

The banker wants to hear about everything that might go wrong with your plan – project delays, slow sales, higher than expected construction costs – and then to understand how the borrower plans to cope with each potential snag. They want to be shown that the borrower will be able to repay the loan even if things go wrong. A banker is going to be far more impressed by a plan that considers the challenges and has a solution for every contingency. I've seen bank loan applications fail when the borrower was unable to answer simple questions about how their plan might fail.

Don't Talk in Acronyms. Telecom borrowers are invariably highly technical people who understand the nuances of building and operating complex networks. A banker can see this from your credentials, experience and references. Most bankers don't want to hear the detailed nuts and bolts of how the technology works; they care a lot more about the products to be sold on the network and the plan to make those sales.

I've sat in on meetings and calls between borrowers and bankers where the response to a simple technical question elicited a fifteen-minute spiel on the nuances of the technology. Bankers are not impressed by this, and in fact it can be worrisome if they perceive the borrower as somebody who can't explain their business in plain English – because the banker knows that's what customers are going to want to hear. My advice is to tone down the technology unless the banker specifically asks for the details.

Understand the Market. I've had numerous clients over the years with the philosophy of "build it and they will come," meaning they were so convinced of the superiority of their proposed network that they just assumed people would buy their products. The vast majority of the business failures I've seen over the years were due to this blindness to the market.

Bankers are going to want to see evidence that people are ready to buy from the new network. In larger markets that might mean a statistically valid survey. In smaller ventures that is going to mean pre-sales and a list of customers who are ready to buy service. Bankers also want to see a comparison of proposed prices against the prices of the competition – I am often surprised by proposed new ventures that haven't taken the time to look at actual customer bills in their proposed market. Do the homework and make the effort to understand the market before asking for funding.

Understand What Bankers are Looking For. Every lender is different, and early in the process you need to ask how they will judge your loan application. I recommend a two-stage process for getting a loan: use the first meeting to understand what the bank is looking for and have the banker describe the borrowing process, then use a second meeting to make a formal presentation of the business plan in a way that meets their requirements.

If the bank is primarily interested in collateral, then walk into the presentation ready to talk about that. If they are more focused on seeing a business plan that meets some set of financial metrics, like a debt service coverage ratio, then walk in ready to answer those questions.
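For readers unfamiliar with the term, debt service coverage ratio (DSCR) is simply projected net operating income divided by annual debt service, and lenders commonly want to see it comfortably above 1.0. The sketch below illustrates the calculation with purely hypothetical numbers (the 1.25 threshold is a common rule of thumb, not a figure from this article):

```python
# Debt service coverage ratio (DSCR): net operating income divided by
# annual debt service (principal plus interest). A ratio above 1.0 means
# the business generates more cash than it owes on the loan each year.
# All figures here are hypothetical, for illustration only.

def dscr(net_operating_income: float, annual_debt_service: float) -> float:
    return net_operating_income / annual_debt_service

noi = 600_000           # hypothetical projected annual net operating income
debt_service = 450_000  # hypothetical annual principal + interest payments

ratio = dscr(noi, debt_service)
print(round(ratio, 2))  # 1.33 - above a typical 1.25 lender threshold
```

Knowing whether your plan clears the bank's threshold, in both the best case and the worst case scenario, is exactly the kind of question to be ready for in the second meeting.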

Bankers talk in lingo just like our industry does, and it's vital to make sure that you understand what they want from you. I've seen many borrowers who don't understand a bank's requirements and who then never answer the basic questions the bank asks of them. It's not uncommon for a borrower to be intimidated by the banking process and afraid to admit that they don't understand the banking lingo. In the end, if you don't understand what your banker wants, it's likely you won't be making the right proposal, and your chances of getting a loan are greatly diminished.