Title II Regulation and Investment

As the FCC continues its effort to reverse Title II regulation, I’ve seen the carriers renewing their argument that Title II has reduced their willingness to invest in infrastructure. However, their numbers and other actions tell a different story.

The FCC put broadband under Title II regulation in February of 2015 and revised the net neutrality rules a few months later in April. So we’ve now had nearly three years to see the impact on the industry – and that impact is not what the carriers are saying it is.

First, we can look at annual infrastructure spending by the big ISPs. Comcast spent $7.6 billion upgrading its cable plant in 2016, its highest expenditure ever. Charter spent 15% more in 2016 than it and the cable companies it purchased had spent the year before. Even Verizon’s spending was up 3% in 2016 over 2015, even though the company had spun off large fiber properties in Florida, Texas, California and other states. AT&T spent virtually the same amount on capital in 2015 and 2016 as it had in 2013 and 2014.

I’ve seen a number of articles that focus on the overall drop in investment from the cellular industry in 2015. But that drop is nearly 100% attributable to Sprint, which pulled back on new capital spending due to a lack of cash. All of the big cellular companies are now crowing about how much they are going to spend in the next few years to roll out 5G.

It’s important to remember that what the big ISPs tell their investors is often quite different from what they say when lobbying. As publicly traded companies the ISPs are required by law to provide accurate financial data, including a requirement to warn stockholders about known risk factors that might impact stock prices. I’m one of those guys who actually reads financial statements, and I’ve not seen a single warning about the impact of Title II regulation in the financial reporting or investor press releases of any of the big ISPs.

But the lobbying side of these businesses is a different story. The big ISPs started complaining about the risks of Title II regulation as far back as 2013, when it was first suggested. The big companies and their trade associations have written blogs warning that Title II regulation would stifle innovation and force them to invest less. And they’ve paid to have ‘scholarly’ articles written that come to the same conclusion. But these lobbying efforts are aimed mostly at the FCC and at legislators, not at stockholders.

The fact that big corporations can get away with having different public stories has always amazed me. One would think that something published on the AT&T or Comcast blog would be under the same rules as documents formally given to investors – but it’s obviously not. AT&T in particular tells multiple stories because the company wears so many different hats. In the last year the company has taken one position as an owner of poles that is diametrically opposed to the position it takes as a cellular company that wants to get onto somebody else’s poles. Working in policy for the big ISPs has to be a somewhat schizophrenic situation.

It seems almost certain that this FCC is going to reverse Title II regulation. The latest rumor floating around is that it will be on their agenda on the day before Thanksgiving. That may lead you to ask why the ISPs are still bothering to crank out lobbying arguments against Title II if they have already won. I think they are still working hard to get a legislative solution through Congress to kill Title II regulation and net neutrality, even if the FCC kills it for now. They well understand that a future FCC under a different administration could easily reinstate Title II regulation – particularly now that it has passed muster through several court challenges. The ISPs understand that it will be a lot harder to get a future Congress to reverse course than it would be for a future FCC, with Democrats back in charge, to do so.

Until recently I always wondered why the ISPs are fighting so hard against Title II regulation. All of the big companies like Comcast, AT&T and Verizon have told stockholders that their initial concerns about Title II regulation did not materialize. And it’s obvious that Title II hasn’t changed the way they invest in their own companies.

But recently I saw an article, and wrote a blog, about an analyst who thinks that the ISPs are going to drastically increase broadband prices once Title II regulation is gone. Title II is the only tool the government can use to investigate and possibly act against the ISPs for rate increases and for other practices like data caps. If he’s right – and his arguments are good ones – then there is a huge motivation for the big ISPs to shed the only existing regulation of broadband.

Smart Cities and Fiber

I’ve noticed that a lot more cities are talking about becoming ‘smart cities.’ Only a few years ago this was something that only NFL cities talked about, but now I see it as a goal for much smaller cities. ‘Smart city’ is an interesting concept. If you listen to the various vendors pushing the idea, it means investing in massive numbers of sensors and the computing power to make sense of them. But there are also a lot of lower-tech ideas that fit under this same umbrella.

I’ve had discussions with folks at cities who think that they need fiber in order to have a smart city. Nobody is a bigger proponent of fiber than I am, but fiber is not necessarily needed for many of the concepts that are part of this high-tech vision.

Having smarter traffic flow is generally at the top of everybody’s list. It’s common sense that having vehicles needlessly waiting for lights wastes fuel and wastes time. Smarter traffic lights in cities would improve the quality of life and the economy. A decade ago a lot of cities built fiber networks just to provide a real-time connection to each traffic signal. Those fiber networks allowed the city to change signal timing in reaction to emergencies and similar events, but the whole effort is largely still manual.

But with AI starting to become a realistic technology it looks like truly smart traffic lights are a possibility in the near future. A smart traffic system could change lights on the fly in response to real-life traffic to reduce the average time that vehicles wait for a green light. But the question that must be asked is whether this really requires fiber. A decade ago it did; fiber was needed just to backhaul the traffic cameras that let somebody at traffic headquarters eyeball the situation at a given intersection.

But we are now seeing a revolution in sensing devices. We are not many years removed from the big push to do all heavy computing in the cloud; a decade ago the vision was that a smart traffic light system would rely on cloud computing power. But faster processors have now reversed that trend, and today it makes more sense to put smart computers at the edge of the network. In the case of traffic lights, smart computers at the edge reduce the need for bandwidth. Sensors at an intersection no longer need to broadcast non-stop and only need to relay information back to the central core when there is some reason to do so.

For example, one of the uses of a smart traffic system is to identify problem intersections. Sensors can be programmed to record every instance when somebody runs a red light or even a late yellow light and this can alert authorities to problems long before a tragic accident. But these sensors only need to send data when there is an actionable event, and even that doesn’t require a gigantic burst of data.

The same goes for smart traffic control. The brains in the device at an intersection can decide to allow a longer green for a turn lane if there are more cars than normal waiting to turn. That doesn’t need a big continuous bandwidth connection. The city will want to gather data from intersections to know what the devices are doing, but with smart edge devices a wireless connection provides adequate broadband and a lower-cost solution for data gathering.
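The bandwidth asymmetry described above can be sketched in a few lines of code. This is a toy illustration, not any vendor’s API; the event logic and message format are invented for the example:

```python
# Toy model of an edge traffic sensor: readings are processed locally,
# and only actionable events are sent back to the central office.

class IntersectionSensor:
    def __init__(self):
        self.violations = 0
        self.outbox = []  # messages queued for the central office

    def observe(self, light_state, vehicle_crossed):
        """Process one reading at the edge; transmit only when needed."""
        if light_state == "red" and vehicle_crossed:
            self.violations += 1
            # Only the actionable event crosses the network.
            self.outbox.append({"event": "red_light_violation",
                                "count": self.violations})
        # Routine traffic on a green light generates no upstream traffic.

sensor = IntersectionSensor()
sensor.observe("green", True)  # normal traffic: nothing transmitted
sensor.observe("red", True)    # violation: one small message queued
print(len(sensor.outbox))      # → 1
```

The point is the asymmetry: thousands of routine readings stay at the intersection, while only the rare event consumes bandwidth.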

This same trend is happening with other kinds of sensors. Sensors that listen for gunshots, smart grid sensors used to monitor water and electric networks, and smart sensors used to provide smarter lighting all can be done wirelessly and do not need a fiber connection.

The real purpose behind the concept of a smart city is to provide better government service to constituents. Many of the best ideas out there don’t involve much bandwidth at all. For example, I recently watched a demo of a system in a mid-western city that allows citizens to see, in real time, the location on a map of all of the snow plows and trash trucks operating in the city – much like watching a Lyft ride come to pick you up. This will drastically cut down on calls during snowstorms since citizens can see a plow making its way towards their street. (And watching the plow inch towards you on a snowy day is good entertainment!)

Cities are undertaking all sorts of other initiatives to improve quality of life. I see cities working on computer systems that put all government forms and processes online, making it easier to get a permit or to report a problem to the city. Cities are reducing pollution by passing ordinances that promote roof-top gardens, require new high-rises to be energy self-sufficient and promote safe bicycling.

There are still big corporations out pitching the expensive smart city vision. But there are now smaller boutique smart city vendors working on more affordable sensors that can be spread around a city.

Like anyone who lives in a city I would love to see my city implement smart city ideas that improve the quality of life. But as much as I am a fiber-proponent, I am finding it hard to make a case that a lot of urban fiber is needed to implement the best smart-city ideas.

The Battle Over Small Cell Deployment

Governor Jerry Brown of California recently vetoed a bill, SB 649, that would have given wireless carriers cheap and easy access to poles. He said the bill was too much in the favor of the wireless companies and that a more balanced solution is needed.

This bill highlights the legislative efforts of the cellular industry and the big telcos, which want cheap and fast access to poles as they work to deploy 5G networks. There were similar pushes in many state legislatures this past year, including in Texas, Florida and Washington. I think we can expect this to appear in many more state legislatures next year. This is obviously a big priority for the carriers, which reportedly spent tens of millions of dollars lobbying for this in the recent legislative sessions.

It’s not hard to understand why the carriers want a legislative solution, because the alternative is the regulatory path. This is a complicated issue, and the carriers know that if they try to get this through state regulatory commissions it will take a long time and that regulators are likely to craft the kind of balanced solution the carriers don’t want.

There is one regulatory push on the issue, at the FCC. The FCC voted in May to begin an investigation into the issues involved. One of the things it is examining is the regulatory impediments at the state and local levels. But the carriers know that the FCC path is a slow one. First, any FCC decision is likely to be challenged in court, a tactic the carriers themselves often use to slow down the regulatory process. There is also a big jurisdictional question, because today the states have the authority to override FCC rules concerning pole issues.

The issue is important because it’s at the heart of the hottest area of telecom growth: the deployment of small cell sites and the upcoming deployment of the various kinds of 5G. Not only do the carriers need to deploy millions of such connections to implement the networks they are promising to stockholders, but they will also have to build a lot of new fiber to support the new wireless deployments.

It’s easy to sympathize with the carriers. I’ve heard the horror stories of it taking two years to get a wireless attachment approved in some cities, which is an obvious impediment to any sensible deployment plan. But as is typical with these carriers, rather than asking for sensible rule changes that everybody can agree on, they are promoting plans that are heavily lopsided in their favor. They want to deploy wireless devices using a method they call one-touch – which they interpret to mean installing devices on poles and telling the pole owner after it’s done. They also want these connections dirt cheap. And they don’t want to have to be concerned with the safety issues involved in adding boxes and live electric connections into the mix of wires on existing poles.

The issue is interesting from the perspective of small CLECs and fiber overbuilders because small carriers have been yelling for years about the problems associated with getting access to poles – and nobody has been listening. In fact, one of the big proponents of the legislative process is AT&T, which is still fighting Google and others about getting access to AT&T poles. It’s not surprising to see that the proposed new laws favor wireless deployments without necessarily making it any easier for fiber overbuilders.

Since the carriers are throwing a lot of money at this it certainly seems likely that they will win this issue in some states. There are a number of states where the lobbying money of the big carriers has always gotten the carriers what they wanted. But there are plenty of states where this won’t pass, and so we are likely going to end up with a hodgepodge of rules, state by state, on the issue.

I’m not even sure where I stand on the issue. As a consumer I want to see advanced wireless technologies deployed. But as a homeowner I don’t necessarily want to see an ugly proliferation of big boxes on poles everywhere. And I certainly don’t want to see 120-foot poles deployed in my neighborhood and the trees decimated to accommodate line-of-sight wireless connections to homes. And as somebody who mostly works for smaller carriers I’m naturally biased against anything that benefits the big carriers over everybody else. I don’t know a better indication of how complicated this is than the fact that somebody with my knowledge has mixed feelings about the issue.

Cellular WiFi Handoffs

If you use anybody except Verizon you may have noticed that your cellphone has become adept at handing off your cellular connection to a local WiFi network. Like most people I keep my smartphone connected to WiFi when I’m at home to avoid exhausting my cellular data cap. I have AT&T cellular service and I’ve noticed over the last year that when I’m out of the house my phone often logs onto other WiFi networks. I can understand AT&T sending me to its own AT&T hotspots, but often I’m logged onto networks I can’t identify.

When I lived in Florida I was a Comcast customer, and so when I was out of the house my phone logged onto Comcast hotspots. Even today my phone still does this, even though I’m no longer a Comcast customer; I assume there is a cookie on the phone that still identifies me as one. I understand these logins, because the first time I logged onto a Comcast hotspot my phone assumed that any other Comcast hotspot was an acceptable network. This is something I voluntarily signed up for.

But today I find my phone automatically logged onto a number of hotspots in airports and hotels which I definitely have not authorized. I contrast this with using my laptop in an airport or hotel. With the laptop I always have to go through some sort of greeting screen, and even if it’s a free connection I usually have to sign on to some terms of service. But my phone just automatically grabs WiFi in many airports, even those I haven’t visited in many years. I have to assume that AT&T has some sort of arrangement with these WiFi networks.

I usually notice that I’m on WiFi when my phone gets so sluggish it barely works. WiFi is still notoriously slow in crowded public places. Once I realize I’m on a WiFi network I didn’t authorize, I turn the WiFi off on my phone and revert to cellular data. Every security article I’ve ever read says to be cautious when using public WiFi, and so I’d prefer not to use these connections unless I have no other option.

There was a major effort made a few years back to create a seamless WiFi network for just this purpose. The WiFi Alliance created a protocol called Hotspot 2.0 that is being marketed under the name of Passpoint. The purpose of this effort was to allow cellular users to automatically connect and roam between a wide variety of hotspots without having to ever log in. Their ultimate goal was to enable WiFi calling that could hand off between hotspots in the same way that cellular phones hand-off between cell sites.

It’s obvious that AT&T and other cellular carriers have implemented at least some aspects of Hotspot 2.0. In the original vision of Hotspot 2.0 customers were to be given the option of authorizing their participation in the Passpoint network. But AT&T has never asked my permission to log me onto WiFi hotspots (unless it was buried in my terms of service). AT&T has clearly decided that they want to use these WiFi handoffs in a busy environment like an airport to protect their cellular networks from being swamped.

It’s interesting that Verizon is not doing this. I think one reason is that they don’t want to give up control of their customers. Verizon foresees a huge future revenue stream from mining customer data, and I’m guessing they don’t want their customers shuttled to a WiFi network controlled by somebody else, where Verizon can’t track their behavior. Verizon is instead pushing forward with the implementation of LTE-U, where it can direct some data traffic into the WiFi bands, but all under its own control. While LTE-U uses WiFi frequencies, it is not a hotspot technology and is as hard to intercept or hack as any other cellular traffic.

Most new cellphones now come with the Passpoint technology baked into the chipset. I think we can expect that more and more of our cellular data connections will be shuttled to hotspots without notifying us. Most people are not going to be bothered by this because it will reduce usage on their cellular data plans. I’m just not nuts about being handed off to networks without some sort of notification so that I can change my settings if I don’t want to use the selected network. I guess this is just another example of how cellular companies do what they want and don’t generally ask for customer permission.

Death of the Smartphone?

The smartphone has possibly been the most transformative technology of the past hundred years. It’s unleashed the power of the computer in a portable always-with-us way that has changed the way that most of us interface with the world. But as unlikely as it might seem, it also might be one of the shortest-lived major technologies in history.

When looking forward it seems inevitable that smartphones will largely be replaced by voicebot technology. Voicebots are already intertwining into our lives in major ways. Apple’s Siri, Amazon’s Echo and Google Assistant are already replacing a lot of other technologies.

Voicebots have already entered my life in several key ways. As a music lover I’ve gone through every technology upgrade since vinyl. I had a huge CD collection and burned tons of custom CDs of my favorite songs. I used an iPod heavily for a few years. I downloaded music and built custom playlists of my music. And I used streaming radio services. But this has all now been replaced by my Amazon Echo. It’s integrated into Amazon music, Sirius XM Radio, and Pandora, and I can just ask aloud to hear the music I want.

I also now use voicebots for simple web searches and I no longer have to use my phone or PC to find out when a local store or restaurant is open. I use my Echo to take notes to remember later, something that is important to me since I wake with ideas at 2:00 in the morning!  In the past I would scramble for something to write on, which inevitably woke me up – but no longer.

Voicebots are also supplanting a lot of apps I used to use. It’s a lot easier to just ask about the weather rather than look it up. I can ask for sports scores before my feet hit the floor out of bed. Voicebots are starting to displace other smartphone functions. I can now make and receive texts by voice – this isn’t quite fully integrated into Echo, but I expect it soon will be. Voicebots integrated into the car give us driving directions and can lead us to the nearest gas station, all directed by voice.

Voicebots are growing steadily better at voice recognition. I’ve had the Amazon Echo for about 18 months and it gets a little better month by month. Voicebots are also getting better at responding to requests. All of the major voicebots are using primitive artificial intelligence to learn from their mistakes and to get better at responding to user requests. Questions that puzzled my Echo months ago are now sailing through.

Some voicebot functions are still nearly unusable. I have Microsoft’s Cortana on my PC and it’s not really helpful in the way I would like it to be. Ideally it could replace most of my keyboard functions. But it’s not hard to forecast that within a few years voice commands will finally make it easier to use a PC.

If voicebots are going to grow to the next level it’s going to take improvements in AI. But everything is pointing in that direction. Just a few weeks ago a new AI from Google learned the game of Go from scratch in just three days with nothing more than being given the rules of the game. The new AI won 100 games straight against the older Google AI that beat the best human player earlier this year.

As AI gets better the voicebots are going to get better. There will come a time soon where it’s easier to use a voicebot for most of the apps on a smartphone, and that’s when voicebots will start to eat away at smartphone penetration rates.

I for one would love to ditch my smartphone. Even after all of these years I’ve never been comfortable having to remember to carry it and I walk away and leave it all of the time. And somehow we’ve gotten roped into spending $600 or more every two years for a new device. I would be much happier wearing tiny earbuds that let me talk to a voicebot that has been able to learn my habits.

Most of the developers in the AI world think that voicebots will enable real digital assistants that step in many times a day to make our lives easier. This trend seems inevitable, and one has to wonder how the emergence of voicebots will affect the huge push for 5G wireless. Most of the things that voicebots do for us are low bandwidth and could easily be done using a fully implemented LTE network. It’s hard to picture where this all might lead, but one thing seems certain – the death of the smartphone will probably be just as disruptive as its birth.

Why the Big Programming Cost Increases?


I recently talked to several clients who are expecting an increase in cable TV programming costs of between 8.5% and 9% for next year. They are able to forecast this because most of the contracts for programming cover at least three years of baked-in rate increases.
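Those baked-in escalators compound. Here is a quick sketch of the arithmetic, using a purely illustrative $20 per-subscriber starting cost:

```python
# Compounding effect of baked-in annual programming increases.
# The $20 starting cost is purely illustrative, not a real contract figure.
monthly_cost = 20.00      # assumed programming cost per subscriber
escalator = 0.09          # 9% annual contractual increase
for year in (1, 2, 3):    # a typical three-year contract
    monthly_cost *= 1 + escalator
    print(f"Year {year}: ${monthly_cost:.2f}")
# Three years of 9% increases raise the cost by almost 30% in total.
```

That cumulative 30% lands on operators regardless of whether their subscriber counts, or their retail rates, can keep pace.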

Every one of these clients is bleeding cable customers. We hear about how the big cable companies are feeling the impact of cord cutting. Last year the big companies altogether lost about 1.7 million customers, a little less than 2% of their customer base, and I’ve seen recent estimates that they will lose around 1.9 million more this year. Cord cutting is obviously a real phenomenon. But while the big companies are losing customers at a steady pace, my smaller clients are seeing a much bigger impact.

I think there are a number of reasons that small cable providers are suffering more.

  • Most of my small clients don’t play the same billing games as the big cable companies. The big companies have created a number of ‘fees’ such as a local programming fee or a sports fee to disguise the real cost of cable. Many customers think these fees are taxes of some sort and believe that the base price of cable shown on their bill is the actual price they are paying. That lower number is the one they use when comparing to other alternatives.
  • The big companies are also far more aggressive with their bundling. They work hard to force customers into bundles and they penalize customers for leaving a bundle. Customers often don’t know what they pay for any specific product in a bundle and when they try to drop one product the full bundle savings are applied to that product. Even when small companies have bundles they don’t create a huge financial disincentive to leave the bundle.
  • Big companies are willing to give ‘special’ pricing to keep customers. They extend the pricing discounts aimed at new customers to anybody willing to wade through the customer service minefield to ask for them. Since smaller companies don’t generally advertise ‘special’ prices, they are far less likely to even be asked to reduce rates.
  • My smaller clients are generally more rural than the big companies, and as such they face far stiffer competition from the satellite companies. Both of the satellite providers now have a ‘skinny’ bundle that a lot of customers are finding attractive.

Why are the programmers raising rates so aggressively when it’s clear that the price of cable service is the number one driver of cord cutting? I have several ideas why they might be doing this:

  • These are all publicly traded companies and to some degree they don’t have a choice. Over 90% of cable channels are bleeding customers much faster than the rate of cord cutting. This shows that many customers are cord shaving and downgrading to smaller, less expensive packages. The programmers are compelled to increase profits, and with declining sales they can only compensate by raising programming rates. That sounds insane because it sounds like the beginning of a classic death spiral. But you must remember that any large publicly traded company that performs poorly is subject to being purchased by somebody else who will then force profits back up again. Our dreadful quarterly-profit-driven economy is forcing the programmers down a path that is not in anybody’s best interest.
  • They are all chasing hit shows. There are now a lot more companies like Netflix and Amazon creating unique programming, which adds to the pressure on the programmers. The financial rewards from producing even one hit show are gigantic, so they all keep spending money trying to find the next big hit, and raising rates to cover the cost of producing content.
  • Another theory is that the current rate increases are their last hurrah. They can see where the industry is headed. I saw an interview with the head of programming for FOX and he said that he expects the company will ultimately have to collapse most of its many channels as they keep losing customers. And so perhaps these rate increases are a chance to make big profits for a few more years before the wheels come off. It seems that end is coming anyway, so maybe raising rates now is a way to milk every last penny out of a fading industry.
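The arithmetic behind the first bullet is simple: to hold revenue flat while subscribers decline, a programmer has to raise rates by roughly the inverse of the loss. A sketch with invented numbers:

```python
# Rate increase needed to keep a channel's revenue flat while it loses
# subscribers. All figures are illustrative assumptions.
subscribers = 90_000_000
rate = 0.50                          # assumed monthly per-subscriber fee
revenue = subscribers * rate

churn = 0.05                         # assume 5% of subscribers drop the channel
remaining = subscribers * (1 - churn)
required_rate = revenue / remaining  # rate that keeps revenue unchanged
increase_pct = (required_rate / rate - 1) * 100
print(f"Required rate increase: {increase_pct:.1f}%")  # → 5.3%
```

And the spiral compounds: this year’s rate hike pushes more customers to shave the cord, which requires a bigger hike next year.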

Programming content is certainly never going to go away. But companies like Netflix and Amazon are showing that there are reasonable alternatives to the huge TV bundles. I just wish I knew what to tell my clients. The most common question I seem to be getting these days is, “Should I even be in the cable business any longer?” I’m starting to think that the answer for many of these businesses is no – or it will be no within a few short years.

When a Consultant Says ‘No’

Doug Dawson, 2017

One of my competitors recently held a webinar where they told a group of municipalities that they should never accept ‘no’ from a consultant who is evaluating fiber business plans. This is about the worst advice I think I have ever heard for many reasons. I think perhaps this consultant meant that one shouldn’t be afraid to be creative and to look at alternative ideas if your first ideas don’t pan out. But that’s not what they said.

Building and operating a fiber network is like any other new business venture and sometimes a new business venture is just not a good idea. This is why anybody launching a new business of any type does their homework and kicks the tires on their ideas to quantify the opportunity. A feasibility study means going through the process of gathering as many facts as possible in order to make an informed decision about a new opportunity.

The advice in this webinar was given to municipalities. Somebody giving this same advice to for-profit ISPs would be laughed out of the room. Established commercial ISPs all understand that they have natural limitations. They are limited in the amount of money they can borrow. They understand that there are natural limits on how far they can stretch existing staff without harming their business. They understand that if they expand into a new market and fail that they might jeopardize their existing company. My experience in building business plans for existing ISPs is that they are as skeptical of a good answer as a bad one and they dig and dig until they understand the nuances of a business plan before ever giving it any real consideration.

But municipalities build fiber networks for different reasons than for-profit ISPs. Existing ISPs want to make money. They also undertake expansion to gain economy of scale, because in the ISP world being larger generally means better margins. But cities have a whole other list of motivations for building fiber. They might want to solve the digital divide. They might want to lower prices in their market and foster competition. They might want to promote economic development by opening their communities to the opportunities created by good broadband.

These are all great goals, but I have rarely talked with a municipality that also doesn’t want a broadband business to at least break even. I say rarely, because there are small communities with zero broadband that are willing to spend tax dollars to subsidize getting broadband. But most communities only want a fiber business if the revenues from the venture will cover the cost of operations.

Sometimes a strong ‘no’ is the best and only answer to give to a client. Clients often come to me determined to make one specific business plan idea work. For example, many communities don’t just want a fiber network, but they want a fiber network operating under a specific business model like open access. That’s a business model where multiple ISPs use the network to compete for customers. Open access is an extremely hard business plan to make work. I’ve often had to show municipalities that this specific idea won’t work for them.

Or a commercial ISP might want to enter a new market and want to make it work without having to hire new employees. My advice to them might be that such an expectation is unrealistic and that over time they will have to hire the extra people.

My advice to clients is that they should be just as leery of a ‘yes’ answer as a ‘no’ answer. For example, every one of the big open access networks has an original business plan on the shelf that shows that they were going to make a lot of money – and those business plans were obviously flawed. If they had challenged some of the flawed assumptions in those business plans they probably would not have entered the business in the way they did. It’s a shame their original consultant didn’t say ‘no’.

I’ve always said that ‘dollars speak’ and any new business has to make financial sense before you can think about meeting other goals. Every business plan contains hundreds of assumptions, and it’s always possible to ‘cook’ the assumptions to find a scenario that looks positive. I have created business plans many times for commercial and municipal clients where an honest look showed that the numbers just don’t add up. I’ve had a few clients ask me to create a rosier forecast, and I’ve always refused to do so.
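The danger of ‘cooked’ assumptions can be made concrete with a toy model. The sketch below is entirely hypothetical: the function name, the dollar figures, and the take rates are invented for illustration, and the model deliberately ignores financing costs, churn, and drop costs.

```python
# Hypothetical illustration of how one optimistic assumption can flip a
# fiber business case. All numbers are invented for this sketch.

def simple_payback(homes_passed, take_rate, cost_per_home_passed,
                   revenue_per_sub_month, opex_per_sub_month):
    """Years to recover the build cost from subscriber margin (ignores
    financing, churn, and drop costs to keep the example small)."""
    subscribers = homes_passed * take_rate
    capex = homes_passed * cost_per_home_passed
    annual_margin = subscribers * (revenue_per_sub_month - opex_per_sub_month) * 12
    return capex / annual_margin

# Same network, two take-rate assumptions:
print(simple_payback(10_000, 0.45, 1_500, 70, 35))  # ~7.9-year payback
print(simple_payback(10_000, 0.65, 1_500, 70, 35))  # ~5.5 years with a 'cooked' take rate
```

Changing a single assumption, the take rate, shaves years off the apparent payback, which is exactly why every assumption needs to be challenged before money is committed.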

I personally would be leery of a consultant who doesn’t think that ‘no’ can be the right answer for doing something as expensive as launching a fiber venture. Sometimes ‘no’ is the right answer, and if somebody tells you ‘no’ you ought to listen hard to them. It makes sense to kick the tires on all of the assumptions when you hear ‘no’ and to get a second opinion if needed. But it’s important to kick the tires just as hard when you get ‘yes’ for an answer.

California Lowers the Definition of Broadband

California Governor Jerry Brown just signed a bill into law that lowers the official definition of broadband in the state while also providing state funding to upgrade rural broadband. The bill, AB 1665, goes into effect immediately. It lowers the definition of broadband in the state to 10 Mbps down and 1 Mbps up. But it goes even further and lowers the definition of an unserved customer to somebody who can’t get speeds of at least 6 Mbps down and 1 Mbps up.

The bill reinstates a telecom tax that will provide a $300 million fund intended to be used to improve rural broadband. The California press believes that the fund will largely go to AT&T and Frontier, which both lobbied hard for the bill. My reading of the bill is that the incumbent carriers get first shot at the funding, and anybody else gets it only if the incumbents don’t take it. In practical terms, assuming those two companies take the funding, almost none of this money would be made available to anybody who wants to build something faster in unserved areas.

We know that state funding done the right way can be a tremendous boon to broadband expansion. Consider, for example, the Minnesota DEED grants that have coaxed dozens of telecom providers to expand fiber networks deep into unserved and underserved areas of the state. It’s commonly understood that it can be hard to justify bringing fiber to rural areas, but some grant funding can be an effective tool to attract private money to fund the rest.

We also understand today that there are huge economic benefits for areas that have good broadband. The farmers in Minnesota that benefit from the grant program there are going to have a competitive advantage over farmers elsewhere who have little or no broadband. I’ve been looking at the IoT and other fiber-based technologies on the horizon for farming that are going to vastly increase productivity.

We also know that good broadband benefits the small communities in rural America. These communities have been experiencing brain drain and economic flight as people are forced to move to metropolitan areas to find work. But broadband opens up work-at-home opportunities that ought to make it possible for families to thrive in rural America.

This move by California is a poor decision on many levels. First, it funnels money to the incumbent providers to make tiny tweaks to existing networks so that existing broadband is just a little better. The new 10/1 Mbps standard is also nothing more than a legislative definition of broadband and has no relevance in the real world. Many homes already need more broadband than that, and as household broadband demand grows, a 10/1 Mbps connection will become inadequate for every home.

Another reason this is a bad idea is that the incumbents there are already making improvements to increase broadband to the 10/1 Mbps level. AT&T took $361.4 million of FCC CAF II funding that is to be used to upgrade broadband to 141,500 homes in California. That works out to $2,554 per home passed. Frontier took another $36.6 million, or about $2,859 per home passed, to improve broadband to 12,800 homes. That federal money requires that speeds increase to the 10/1 Mbps level. This state funding comes on top of the large federal amounts these two companies have already received from the government.
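As a quick sanity check, the per-home subsidy figures follow directly from the award totals and home counts; this is just arithmetic on the numbers cited above, not data from any other source.

```python
# Per-home CAF II subsidy, computed from the totals cited above.
att_per_home = 361_400_000 / 141_500      # AT&T: total award / homes passed
frontier_per_home = 36_600_000 / 12_800   # Frontier: total award / homes passed

print(round(att_per_home))       # 2554
print(round(frontier_per_home))  # 2859
```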

AT&T has also already said that it plans to meet its CAF II obligations by upgrading rural cellular speeds. Frontier is mostly going to improve DSL on ancient copper and also is now looking at using point-to-point wireless technology to meet the CAF II obligations.

I don’t know how much it will cost these companies to upgrade their rural customers to 10/1 Mbps, but the federal funding might be enough to pay for all of it. Adding the state funding means these two companies will likely make an immediate profit from upgrading rural customers to barely adequate broadband speeds. As we’ve seen many times in the past, this bill is good evidence that the big companies get value out of their lobbying efforts. The losers in all of this are the homes that won’t get anything faster than CAF II broadband. This $300 million could have been used as matching grants to bring much faster broadband to many of these homes.


Will Banks Invest in Infrastructure Again?

Six local banks in Kentucky banded together to create a $150 million investment fund to support public-private partnerships (PPPs). The fund is called the Commonwealth Infrastructure Fund and is intended to provide debt financing to state and local PPP initiatives in the state.

You might not think this is newsworthy, but it is for several reasons. It’s one of only a handful of examples of bank debt being clearly earmarked for infrastructure investing. In this country virtually all debt for projects that involve the government is financed with municipal bonds. But this wasn’t always the case. While municipal bonds, or their equivalent, have been around for centuries, as recently as fifty years ago banks also played a big role in lending to municipal projects.

But for various reasons banks backed out of infrastructure investing. For one, banks have grown reluctant over the years to lend into long-term projects. Municipal projects are often of long duration, and it’s not unusual to see infrastructure financed over 20 to 30 years. That’s a long time for a bank to tie up money, and it carries the risk of being locked into below-market returns if interest rates rise.

There have also been some spectacular municipal bond defaults in places like New York City and Detroit. While lending to commercial businesses is still far riskier, those defaults have added perceived risk to lending into municipal-based projects. To offset this, however, the collateral on municipal loans can be extremely safe, particularly when a loan is backed by tax revenues.

It’s important to note that this particular fund is looking specifically at public-private partnerships, meaning ventures that benefit the government but are backed to some degree by private capital. PPPs come in many flavors. At one end of the spectrum are projects funded entirely with private money, such as recent projects where a commercial company built new schools and then leased them back to the government. At the other end of the spectrum are projects where the government provides most of the financing but a private firm largely operates the venture. A good example of this is the fiber network in Huntsville, AL, where the city built the network and Google Fiber operates the business.

This fund is something that the country really needs. I’ve seen estimates that the country needs somewhere between $4 trillion and $6 trillion of infrastructure improvements: deteriorating roads, crumbling overpasses and bridges, old government buildings, outdated schools, and aging dams and water projects. But there is already over $3.7 trillion in outstanding municipal bond debt, and cities and states can’t begin to take on all of the additional debt needed to bring our infrastructure up to snuff. So we need private money to enter the picture and help pay for projects where that makes sense.

Anybody lending into PPPs understands the relatively low returns from infrastructure investing. Municipal bonds today generally pay interest rates of 2% to 5%. A lot of private money has been chasing the higher returns of technology investing, but there are still plenty of sources of money like pension funds that are happy with long-term stable and predictable returns. All of the financiers I know say that they are seeing a renewed interest in long-term safe returns.

This Kentucky fund would be a perfect place to look for help with fiber projects. Kentucky is one of the states that still has huge swaths of geography with poor or nonexistent broadband. I would be surprised if the telcos in the state don’t show interest in the fund, assuming the fund is interested in them.

Raising $150 million for infrastructure lending is only a drop in the bucket when looking at the big picture. But it’s a start and hopefully this will lure other banks and sources of debt and equity to give more consideration to infrastructure funding.

Bad Telecom Deals

FierceWireless recently published a short article listing the 10 worst telecom business moves of the last 10 years. And there are some clunkers on the list like Google’s purchase of Motorola, AT&T’s effort to buy T-Mobile and Time Warner Cable’s agreement to pay over $8 billion for the rights to broadcast the LA Dodgers.

One of the bad moves listed was Fairpoint’s purchase of Verizon’s customers and networks in Maine, New Hampshire and Vermont. Everything imaginable went wrong with that purchase, which closed in 2008. The transition to Fairpoint was dreadful. There were numerous network outages as the cords were cut to the Verizon network. Customers lost email access. They couldn’t place long distance calls out of state, and many couldn’t even reach customer service. Customers abandoned the company in droves, Fairpoint declared bankruptcy in 2009, and the company was recently sold to Consolidated Communications.

There are other similar stories about companies that have bought large numbers of customers from the large telcos. Earlier this year there were reports of widespread customer dissatisfaction after Frontier bought a large swath of Verizon lines.

There are a number of lessons to be learned from the Fairpoint and similar transactions. First, it is exceedingly difficult to buy customers from the large telcos. The processes at the big companies are mind-numbingly complicated. I remember talking to a guy at AT&T years ago about the process of provisioning a new T1 for a customer. As we walked through the internal processes, I realized that nearly a dozen different departments scattered across the country were involved in selling and connecting a single T1. It’s impossible for a new buyer to step into the middle of such complication – no matter which employees come along with the purchase of a property, there will be numerous functions the acquired staff don’t know how to perform.

I recall helping a client buy a few exchanges from Verizon back in the 1990s. The buyer got literally zero records of the services that business customers were using and had to visit every business customer in the hopes of getting copies of bills, which were often undecipherable. Even years later there were business customers with working data circuits that the buyer didn’t entirely understand – the circuits worked, and the philosophy was to simply never touch them.

The point of all of this is that the transition of a property from a big company always has major problems. No matter how long the transition process runs before everything is conveyed to the buyer, on the day the switch is thrown there are big holes, and that quickly leads to customer dissatisfaction.

The other issue highlighted by these transitions is that a buyer rarely has enough human resources ready to deal with the onslaught of problems that start immediately with the cutover. It can be massively time consuming to help even a single customer if you don’t have good enough records to know what services they have. Multiplying that times many customers spells disaster.

Not all sales of big telco properties come in massive chunks, and I’ve helped clients over the years purchase smaller numbers of exchanges from the big telcos. I have several clients looking at potential purchases today, which highlights the other big problem with buying telco properties.

Today, any small buyer of a copper network probably only does so with a plan to convert the acquisition to fiber-to-the-home. The condition of acquired copper plant is generally scarily bad. Verizon let it be known for at least fifteen years that the whole state of West Virginia was for sale before Frontier finally bought it. Industry folks all knew that during that whole time Verizon had largely walked away from making any investments in the state, doing little beyond putting band-aids on maintenance problems. Frontier ended up with a network that barely limped along.

So a buyer has to ask how much value there really is in a dilapidated copper network. If a buyer spends ‘market’ rates to buy a telco property and then spends again to upgrade the acquisition they are effectively paying for the property twice. I’ve crunched the numbers and I’ve never been able to find a way to justify this.
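A back-of-the-envelope comparison shows why buying and then upgrading amounts to paying twice. Every number below is hypothetical, chosen only to illustrate the shape of the math.

```python
# Hypothetical 5,000-home exchange. All dollar amounts are invented.
homes = 5_000
purchase_price = 6_000_000        # 'market' price for the copper network
fiber_cost_per_home = 1_500       # cost to overbuild each home with fiber

# Option 1: buy the copper, then build fiber anyway - paying twice.
buy_and_upgrade = purchase_price + homes * fiber_cost_per_home

# Option 2: skip the purchase, overbuild, and win customers competitively.
acquisition_cost_per_customer = 300   # marketing/install cost per customer won
customers_won = 3_000
overbuild = homes * fiber_cost_per_home + customers_won * acquisition_cost_per_customer

print(buy_and_upgrade)  # 13,500,000
print(overbuild)        # 8,400,000
```

Under these invented numbers, overbuilding and winning 60% of the customers costs millions less than buying the copper plant first, even before accounting for the ongoing maintenance cost of the old network.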

I think we may have reached the point where existing copper networks have almost zero market value. Even with paying customers, the revenues generated from older copper networks are not high enough to support buying the exchange and then spending again to upgrade it. This is something that prospective buyers often don’t want to hear. But as I always advise, numbers don’t lie, and it’s become obvious to me that it’s not a good economic deal to invest in old copper networks. It usually makes more sense to instead overbuild the property and take the customers.