Selling Transport to Small Cell Sites

A lot of my clients make money by selling transport to the big traditional cell sites. For all but the few that operate middle-mile networks, cell site transport adds a relatively high-margin product to the last-mile network.

Companies are now wondering how hard they should pursue small cell sites. They keep reading about the real-estate grab in the major cities where a number of companies are competing to build small cell enclosures, hoping to attract multiple carriers. They want to understand the size of the potential market for small cells outside of the major metropolitan areas. It’s not an easy question to answer.

The cellular carriers are building small cell sites in larger markets because they have exhausted the capabilities of the traditional large cell sites. The cellular companies pushed bigger data plans and convinced customers that it's okay to watch video on cellphones, and now they find that they are running out of bandwidth capacity. The only two immediate fixes are to build additional cell sites (thus, the small cells) or to add more spectrum. They will eventually layer on full 5G capability that will stretch spectrum a lot farther.

There are varying estimates for the eventual market for small cell sites. For example, the CTIA, the lobbying group for the wireless industry, estimates that small cells will grow from 86,000 in 2018 to 800,000 by 2026. The Wall Street analyst firm Cowen estimates 275,000 small cells by the end of 2023.

The big companies that are in the cellular backhaul business are asking the same questions as my clients. Crown Castle is embracing the small cell opportunity and sees it as a big area of future growth. Its competitor American Tower is more cautious and chases only small cell opportunities with high margins; it cautions that the profit opportunity for small cells is a lot smaller than at big towers. Other companies like Zayo and CenturyLink are pursuing small cells where it makes sense, but neither has yet made this a major part of its growth strategy – they are instead hoping to monetize the opportunity by adding small cells where they already own fiber.

The question that most of my clients want to understand is whether the small cell building craze that has hit major metropolitan areas will ever make it out to smaller cities. In general, the big cellular carriers report that the amount of data used on their cell sites is doubling every two years. That's a huge growth rate that can't be sustained for very long on any network. But it's likely that this rate of growth is not the same everywhere, and there are likely many smaller markets where cell sites are still underutilized.
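That doubling rate compounds quickly. A rough sketch shows how fast a site's load grows under it (the 300 Mbps starting load is an assumed figure for illustration, not from any carrier):

```python
# Traffic doubling every two years is equivalent to roughly 41%
# compound growth per year (the square root of 2, minus 1).
annual_growth = 2 ** 0.5 - 1  # ~0.414

# Hypothetical site carrying 300 Mbps at peak today (assumed figure).
# Project the peak load over six years of compounding growth.
load_mbps = 300.0
for year in range(1, 7):
    load_mbps *= 1 + annual_growth
    print(f"Year {year}: {load_mbps:,.0f} Mbps")
```

Three doublings in six years takes the hypothetical site from 300 Mbps to roughly 2.4 Gbps, which is why carriers in the busiest markets are out of runway on big towers.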

Metropolitan cell sites were already using a lot of broadband even before customers started using more data. We know this because the cellular carriers have been buying and using robust data backhaul to urban sites of a gigabit or more in capacity. One good way to judge the potential for small cell sites is to look at the broadband used on existing tall tower sites. If a big tower site is using only a few hundred Mbps of bandwidth, then the cell site is not overloaded and still has room to accommodate broadband growth.

Everybody also wants to understand the revenue potential. The analyst firm Cowen estimates that the revenue opportunity per small cell site will average between $500 and $1,000 per site per month. That seems like a high price for anywhere outside of metropolitan areas, where fiber is really expensive. I've already seen the big cellular carriers pushing for much lower transport rates at big cell sites, and in smaller markets carriers want to pay less than $1,000 per big tower. It probably takes 5 – 7 small cells to fully replace a large tower, and it's hard to envision the cellular carriers greatly expanding their backhaul bill unless they have no option.
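A quick sketch shows why carriers would balk at those rates. Using the Cowen per-site range quoted above and the 5 – 7 cells-per-tower estimate:

```python
# Compare monthly backhaul for one large tower vs. the cluster of
# small cells needed to replace it, using the figures cited above.
tower_monthly = 1_000            # upper bound carriers want to pay per big tower
cells_per_tower = (5, 7)         # small cells needed to fully replace a tower
cell_monthly = (500, 1_000)      # Cowen's per-site monthly revenue range

low = cells_per_tower[0] * cell_monthly[0]    # 5 cells at $500
high = cells_per_tower[1] * cell_monthly[1]   # 7 cells at $1,000
print(f"Small cell cluster: ${low:,} - ${high:,} per month "
      f"vs. ${tower_monthly:,} for the tower")
```

Replacing one tower's backhaul bill with a cluster costing $2,500 – $7,000 per month is a 2.5x to 7x increase, which explains the carriers' reluctance.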

It's also becoming clear that both Verizon and AT&T have a strategy of building their own fiber anyplace where the backhaul costs are too high. We've already seen each carrier build some fiber in smaller markets in the last few years to avoid high transport costs. If both companies continue to be willing to overbuild to avoid transport costs, they have great leverage for negotiating reasonable, lower transport rates.

As usual, I put pen to paper. If the CTIA is right and there will be 800,000 small cell sites within six years, that would mean a new backhaul cost of almost $5 billion annually for the cellular companies at a cost of $500 per site per month. While this is a profitable industry, the carriers are not going to absorb that kind of cost increase unless they have no option. If the 800,000 figure is a good estimate, I predict that within that same six-year period the cellular carriers will build fiber to a significant percentage of the new sites.
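The pen-to-paper arithmetic is easy to verify: 800,000 sites at $500 per site per month works out as follows:

```python
# Verify the ~$5 billion figure: 800,000 small cells at $500/month.
sites = 800_000
monthly_rate = 500                       # dollars per site per month
annual_cost = sites * monthly_rate * 12  # dollars per year
print(f"${annual_cost / 1e9:.1f} billion per year")  # prints "$4.8 billion per year"
```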

Perhaps the most important factor about the small cell business is that it’s local. I have one client in a town of 7,000 that recently saw several small cell sites added. I have clients in much larger communities where the carriers are not currently looking at small cell sites.

The bottom line for me is that anybody who owns fiber should probably provide backhaul for small cells on existing fiber routes. I'd be a lot more cautious about building new fiber for small cell sites. If that new fiber doesn't drive other good revenue opportunities, then it's probably a much riskier investment than building fiber to the big tower cell sites. It's also worth understanding the kind of small cell site being constructed. Many small cell sites will continue to be used strictly for cellular service, while others might also support 5G local loops. Every last-mile fiber provider should be leery about providing access to a broadband competitor.

How’s CAF II Doing in Your County?

The CAF II program was tasked with bringing broadband of at least 10/1 Mbps to large parts of the country. I’ve been talking to folks in rural counties all over the country who don’t think that their area has seen much improvement from the CAF II plan.

The good news is that there is a way to monitor what the big telcos are reporting to the FCC in terms of areas that have seen the CAF II upgrades. This web site provides a map that reports progress on several different FCC broadband plans. The map covers reported progress for the following programs:

  • CAF II – This was the $11 billion subsidy to big telcos to improve rural broadband to at least 10/1 Mbps.
  • CAF II BLS – This was Broadband Loop Support that was made available to small telcos. I'm not entirely sure why the FCC is tracking this using a map.
  • ACAM – This is a subsidy given to smaller telcos to improve broadband to at least 25/3 Mbps, but which many are using to build gigabit fiber.
  • The Alaska Plan – This is the Alaska version of ACAM. Alaska is extremely high cost and has a separate broadband subsidy plan.
  • RBE – These are the Experimental Broadband Grants from 2015.

Participants in each of these programs must report GIS data for locations that have been upgraded, and those upgraded sites are then shown on the map at this site. There is, of course, some delay between completing upgrades and getting the information onto the map. We are now 4.5 years into the six-year CAF II plan, and the carriers have told the FCC that many of the required upgrades are completed. All CAF II upgrades must be finished by the end of 2020 – and likely most will be completed next summer, during the construction season that dictates schedules in much of the country.

The map is easy to use. For example, if you change the ‘Fund’ box at the upper right of the map to CAF II, then all of the areas that were supposed to get CAF II upgrades are shown in light purple. In these areas, the big telcos were supposed to upgrade every residence and business to be able to receive 10/1 Mbps or better broadband.

The map allows you to drill down into more specific detail. For example, if you want to see how CenturyLink performed on CAF II, then choose CenturyLink in the ‘Company Name’ box. This will place a pin on the map for all of the locations that CenturyLink has reported as complete. As you zoom in on the map the upgraded locations will show as dark purple dots. You can zoom in on the map to the point of seeing many local road names.

The map has an additional feature that many will want to use. At the bottom left of the map, under 'Boundaries', you can turn on political boundaries such as county borders.

Most counties are really interested in the information shown on the map. The map shows the areas that were supposed to see upgrades along with the areas that have been upgraded to date. This information is vital to counties for a number of reasons. New federal grants and most state grant programs rely on this data to determine if an area is eligible for additional funding. For example, the current $600 million ReConnect grants can't be used for areas where more than 10% of homes already have 10/1 Mbps broadband. Any areas on this map that have the purple dots will probably have a hard time qualifying for these grants. The big telcos will likely try to disqualify any grant requests that propose building where they say they have upgraded.

Probably the most important use of the map is as a starting point for counties to gather accurate data about broadband. For example, you might want to talk to folks who live in the upgraded areas to see if they can really now buy 10/1 Mbps DSL. My guess is that many of the areas shown on these maps as having CAF II upgrades are still going to have speeds of less than 10/1 Mbps. If you find that to be the case, I recommend documenting your findings, because areas that didn't get a full upgrade should be eligible for future grant funding.

It’s common knowledge that rural copper has been ignored for decades, often with no routine maintenance. It’s not surprising to anybody who has worked in a DSL environment that many rural lines are incapable of carrying faster DSL. It’s not easy for a big telco to bring 10/1 Mbps broadband over bad copper lines, but unfortunately, it’s easy for them to tell the FCC that the upgrades have been done, even if the speed is not really there.

This map is just one more piece of the puzzle and one more tool for rural counties to use to understand their current broadband situation. It's definitely a plus if the big telcos really upgraded DSL in these areas to at least 10/1 Mbps – many of these areas had no DSL or incredibly slow DSL before. On the flip side, if the big telcos are exaggerating about these upgrades and the speeds aren't there, they are likely going to block your region from getting future grant money to upgrade to real broadband. The big telcos have every incentive to lie to protect their DSL and telephone revenues in these remote areas. What's not tolerable is for the big telcos to use incorrect mapping data to deny homes better broadband.

Big ISP Customer Service Still at the Bottom

This time each year we get a peek at how customers view the telecom industry, and for many years running it’s not been a pretty story. The annual American Customer Satisfaction Index (ACSI) was recently published and shows ISPs still ranked at the bottom of all industries in terms of customer satisfaction.

The survey behind the ACSI rankings is huge, covering over 300,000 households and considering 400 companies in 46 different industries across 10 economic sectors, with a focus on the services that households use the most.

Customers really hate the big cable TV companies and big ISPs. The ACSI index ranks companies on a scale of 1 to 100 and the two lowest ranking industries are Subscription TV Services (62) and Internet Service Providers (62) – both with the same composite ranking as last year. All other industries have rankings in the 70s and 80s, with industries like breweries (85), TV manufacturers (83), soft drinks (82), food companies (82), and automobiles (82) at the top.

The industries ranked just above ISPs still score significantly higher and include the US Postal Service (70), Fixed-Line Telephone Service (71), and Social Media Companies (72).

Among the big subscription TV providers, rankings range from a low for Altice (55) to a high for AT&T U-verse (69). The only other providers that rank higher than the industry average of 62 are Verizon FiOS (68), Dish Network (67), and DirecTV (66). The biggest cable companies fare poorly – Charter (59) and Comcast (57).

Internet Service Providers didn't fare any better than cable companies, with the overall industry rating at the same 62. The only three ISPs with rankings above the average are Verizon FiOS (70), AT&T Internet (69), and Altice (63). At the bottom of the rankings are Frontier (55), Mediacom (56), and Windstream (57). The big cable companies don't fare well as ISPs either – Charter (59) and Comcast (61).

This continues to be good news for competitive overbuilders that provide good customer service. It’s been obvious over the years that customers hate calling the big cable companies and ISPs because the process of navigating through live customer service is time-consuming and painful.

But these rankings go far deeper than that. At CCG we conduct surveys for our clients, who are usually looking at entering a new market. We also interview a lot of telecom customers during the course of a year. The poor opinion of the big providers in our industry runs deep. I see customers who really dislike the process many of these companies force upon them, having to negotiate for lower rates every year or two. People don't like to find out that they are paying a lot more than their neighbors for the same services. People also dislike service outages, which happen far more often than they should. In the last year we had several headline-grabbing major outages, but more aggravating to customers are the small daily outages that can hit without notice. Households have come to rely on broadband as much as they rely on other household necessities like electricity and water, so outages are becoming intolerable.

Competitive ISPs are not automatically better at customer service than the big companies. Some competitive providers also offer too many product options and are willing to negotiate rates with customers. Small ISPs can also fall into the trap of turning every phone call to the company into a sales pitch. Good ISPs are learning to deal with customers in ways tailored to each customer. I know I personally would be thrilled to have my entire ISP relationship be handled by email or text, as long as by doing so I could be assured that I’m getting a good price. Most ISPs still have a long way to go – although I doubt that any ISP is ever going to be liked more than beer!

Talk to Me – Voice Computing

Technologists predict that one of the most consequential changes in our daily lives will soon come from being able to converse with computers. We are starting to see the early stages of this today as many of us now have personal assistants in our homes such as Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana or Google’s Personal Assistant. In the foreseeable future, we’ll be able to talk to computers in the same way we talk to each other, and that will usher in perhaps the most important change ever of the way that humans interact with technology.

In the book Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think the author James Vlahos looks at the history of voice computing and also predicts how voice computing will change our lives in the future. This is a well-written book that explains the underlying technologies in an understandable way. I found this to be a great introduction to the technology behind computer speech, an area I knew little about.

One of the first things the book makes clear is the difficulty of the technical challenge of conversing with computers. There are four distinct technologies involved. First is Automatic Speech Recognition (ASR), where human speech is converted into digitized 'words'. Natural Language Understanding (NLU) is the process a computer uses to interpret the meaning of the digitized words. Natural Language Generation (NLG) is how the computer formulates the way it will respond to a human request. Finally, Speech Synthesis is how the computer converts its answer into audible words.
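The four stages form a pipeline, and a minimal sketch makes the division of labor concrete. Every function body here is a trivial hypothetical stand-in (in practice each stage is a large machine-learned model, and this reflects no real assistant's API):

```python
# Sketch of the four-stage voice computing pipeline described above.
# All bodies are illustrative stand-ins, not real implementations.

def asr(audio: bytes) -> str:
    """Automatic Speech Recognition: audio in, digitized words out."""
    return "what time is it"          # stand-in transcription

def nlu(text: str) -> dict:
    """Natural Language Understanding: words in, interpreted intent out."""
    return {"intent": "get_time"} if "time" in text else {"intent": "unknown"}

def nlg(intent: dict) -> str:
    """Natural Language Generation: intent in, response text out."""
    return "It is 3 o'clock." if intent["intent"] == "get_time" else "Sorry?"

def tts(text: str) -> bytes:
    """Speech Synthesis: response text in, audible words out."""
    return text.encode("utf-8")       # stand-in for synthesized audio

def assistant(audio: bytes) -> bytes:
    """Chain the four stages: speech in, speech out."""
    return tts(nlg(nlu(asr(audio))))

print(assistant(b"..."))  # prints b"It is 3 o'clock."
```

The hard part, as the book describes, is that each of these four boxes is an unsolved research problem in its own right.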

There is much progress being made in each of these areas. For example, ASR developers are training computers on how humans talk using machine learning and huge libraries of actual human speech and human interactions from social media sites. They are seeing progress as computers learn the many nuances of the ways that humans communicate. Science fiction has usually portrayed future computers that talk woodenly, like HAL from 2001: A Space Odyssey. It looks like our future instead will be personal assistants that speak to each of us using our own slang, idioms, and speaking style, in realistic-sounding voices of our choosing. The goal for the industry is to make computer speech indistinguishable from human speech.

The book also includes some interesting history of the various voice assistants. One of the most interesting anecdotes is about how Apple blew its early lead in computer speech. Steve Jobs was deeply interested in the development of Siri and told the development team that Apple was going to give the product a high priority. However, Jobs died the day after Siri was announced to the public, and Apple management put the product on the back burner for a long time.

The book dives into some technologies related to computer speech and does so in an understandable way. For instance, the book looks at the current status of Artificial Intelligence and at how computers ‘learn’ and how that research might lead to better voice recognition and synthesis. The book looks at the fascinating attempts to create computer neural networks that mimic the functioning of the human brain.

Probably the most interesting part of the book is the last few chapters that talk about the likely impact of computer speech. When we can converse with computers as if they are people, we’ll no longer need a keyboard or mouse to interface with a computer. At that point, the computer is likely to disappear from our lives and computing will be everywhere in the background. The computing power needed to enable computer speech is going to have to be in the cloud, meaning that we just speak when we want to interface with the cloud.

Changing to voice interface with the cloud also drastically changes our interface with the web. Today most of us use Google or some other search engine when searching for information. While most of us select one of the choices offered on the first or second page of the search results, in the future the company that is providing our voice interface will be making that choice for us. That puts a huge amount of power into the hands of the company providing the voice interface – they essentially get to choreograph our entire web experience. Today the leading candidates to be that voice interface are Google and Amazon, but somebody else might grab the lead. There are ethical issues associated with a choreographed web – the company doing our voice searches is deciding the ‘right’ answer to questions we ask. It will be incredibly challenging for any company to do this without bias, and more likely they will do it to drive profits. Picture Amazon driving all buying decisions to its platform.

The transition to voice computing also drastically changes the business plans of a lot of existing technology companies. Makers of PCs and laptops are likely to lose most of their market. Search engines become obsolete. Social media will change drastically. Web advertising will take a huge hit when we don’t see ads – it’s hard to think users will tolerate listening to many ads as part of the web interface experience.

The book makes it clear that this is not science fiction but is a technology that will be maturing during the next decade. I recently saw a video of teens trying to figure out how to use a rotary dial phone, but it might not be long before kids will grow up without ever having seen a mouse or a QWERTY keyboard. I will admit that a transition to voice is intimidating, and they might have to pry my keyboard from my cold, dead hands.

Should Rural Fiber be a Utility?

I’ve heard or read half a dozen people in the last month say that the way we get rural fiber everywhere is to make fiber a regulated utility. This idea certainly has appeal for the many rural places that don’t have fiber today. On the surface this sounds like a way to possibly get fiber everywhere, and it’s hard to see a downside to that.

However, I can think of a number of hurdles and roadblocks to this concept that might be hard to overcome. This blog is too short to properly explore most of these ideas, and it would take a 40-page whitepaper to do the topic justice. With that caveat, here are some of the big issues to be solved if we wanted to create rural fiber utilities.

What About Existing Fiber? What would we do about those who have already built rural fiber? There are small telcos, cooperatives, and rural communities that have already acted and found a way to fund a rural fiber network. Would we force someone who has already taken the commercial risk to convert an existing fiber property into a utility? Most small companies that built rural fiber took on a huge debt burden to do so. Rural communities that built fiber likely put tax revenues on the line. It seems unfair to force those who had the vision to tackle this first to transform into a regulated utility.

What About Choice? One of the most important goals of almost every community I have worked with is to have broadband choice. One of the key aspects of a fiber utility is that it will almost certainly be a monopoly. Are we going to kick out WISPs in favor of a fiber utility? Would a fiber monopoly be able to block satellite broadband?

The Definition of Rural. What areas are eligible to be part of a regulated fiber utility? If the definition is defined by customer density, then we could end up with farms with fiber and county seats without fiber. There’s also the more global consideration that most urban areas don’t have fiber today. Do we ask cities that don’t have fiber to help subsidize rural broadband? It’s impractical to think that you could force city networks to become a utility because that would financially confiscate networks from the big cable companies.

Who Pays for It? Building fiber in rural America would probably require low-interest loans from the government for the initial construction – we did this before when we built rural electric grids, so this can be made to work. But what about keeping fiber utilities solvent for the long run? The rural telephone network functioned so well because revenues from urban customers were used to subsidize service in rural places. When the big telcos were deregulated the first thing they did was to stop the internal subsidies. Who would pay to keep fiber networks running in rural America? Would urban ISPs have to help pay for rural broadband? Alternatively, might this require a tax on urban broadband customers to subsidize rural broadband customers?

Who Operates It?  This might be the stickiest question of all. Do we hand utility authority to local government, even those who are reluctant to take on the responsibility? Would people favor a fiber utility if the government handed over the operations to AT&T, Verizon, CenturyLink or Frontier? What do we do about cooperatives where the customers want to own their fiber network? Do we force existing fiber owners to somehow sell or give their networks to a new utility?

What About Carrier of Last Resort? One of the premises of being a utility is the idea that everybody in the monopoly service area can get service. Would we force fiber utilities to serve everybody? What about a customer who is so remote that it takes hundreds of thousands of dollars of construction to reach them? Who gets to decide who gets service? Does a fiber utility have to build to reach every new home?

What About Innovation? Technology never sits still. How do we force fiber utilities to upgrade over time to stay current and relevant? Upgrading infrastructure is an expensive problem for existing utilities – as I found out recently when a water problem uncovered the fact that my local water utility still has some of the original main feeder pipes built out of wood. The common wisdom is that fiber will last a long time – but who pays to replace it eventually, like we are now doing with the wooden water pipes? And what about electronics upgrades, which happen far more often?

Government's Role. None of this can be done without strong rules set and enforced by the government. For example, the long-term funding mechanisms can only be created by the government. This almost certainly would require a new telecom act from Congress. Considering how lobbyists can sideline almost any legislative effort, is it even possible to create a fiber utility that would work? Fiber utilities would also require a strong FCC that is willing to take back broadband regulation and enforce it strongly.

Summary. I’ve only described a partial list of the hurdles faced in creating rural fiber utilities. There is no issue on this list that can’t be solved – but collectively they create huge hurdles. My biggest fear is that politics and lobbying would intervene, and we’d do it poorly. I suspect that similar hurdles faced those who created the rural electric and telephone companies – and they found a way to get it done. But done poorly, fiber utilities could be a disaster.

Summary Conclusions for Designing an FCC Broadband Grant

The earlier series of blogs looked at a number of ideas on how the FCC could create the most effective federal grant program for the upcoming $20.4 billion of announced grants. Following is a summary of the most important conclusions of those blogs:

Have a Clearly Defined Goal. If a federal grant program's goal is something soft, like 'improve rural broadband', then the program is doomed to failure and will fund solutions that only incrementally improve broadband. The grant program should have a bold goal, such as bringing a permanent broadband solution to a significant number of households. For example, done well, this grant could bring fiber to 4 – 5 million homes rather than make incremental broadband improvements everywhere.

Match the Grant Process with the Grant Goals. Past federal grants have often had grant application rules that didn’t match the goals. Since the results of grants are governed by the application rules, those are all that matter. Stated goals for a grant are just rhetoric if those goals are not realized in the grant application requirements. As an example, if a grant goal is to favor the fastest broadband possible, then all grant application rules should be weighted towards that goal.

Match Speed Requirements with the Grant Construction Period. The discussion of the proposed $20.4 billion grant contemplates a minimum speed goal of 25/3 Mbps. That's a DSL speed and is already becoming obsolete today. A goal of 25/3 Mbps will be badly outdated by the time any grant-funded networks are built. The FCC should not repeat its worst decision ever – giving out $11 billion in CAF II funding to build 10/1 Mbps networks, a speed that was obsolete even before the grants were awarded. The FCC should require future-looking speeds.

Make the Grants Available to Everybody. FCC grant and loan programs often include a statement that they are available to every kind of entity. Yet the actual award process often discriminates against some kinds of applicants. For example, grants that include a loan component make it generally impossible for most municipal entities to accept the awards. Loan rules can also eliminate non-RUS borrowers. Grant rules that require recipients to become Eligible Telecommunications Carriers – a regulatory designation – discriminate against open access networks where the network owner and the ISP are separate entities. If not written carefully, grant rules can discriminate against broadband partnerships where the network owner is a different entity than the operating ISP.

Reverse Auction is not a Good Fit. Reverse auctions are a good technique to use when taking bids for some specific asset. Reverse auctions won’t work well when the awarded area is the whole US. Since reverse auctions favor those who will take the lowest amount of funding a reverse auction will, by definition, favor lower-cost technologies. A reverse auction will also favor parts of the country with lower costs and will discriminate against the high-cost places that need broadband help the most, like Appalachia. A reverse auction also favors upgrades over new construction and would favor upgrading DSL over building faster new technologies. From a political perspective, a reverse auction won’t spread the awards geographically and could favor one region, one technology or even only a few grant applicants. Once the auction is started the FCC would have zero input over who wins the funds – something that would not sit well with Congress.

Technology Matters. The grants should not be awarded to technologies that are temporary broadband band-aids. For example, if the grants are used to upgrade rural DSL or to provide fixed cellular broadband, then the areas receiving the grants will be back at the FCC in the future asking for something better. It’s hard to justify any reason for giving grants to satellite providers.

States Need to Step Up. The magnitude of the proposed federal grant program provides a huge opportunity for states. Those states that increase state grant funding should attract more federal grants to their state. State grants can also influence the federal awards by favoring faster speeds or faster technologies.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Is it Time for ‘Do Not Track’?

Josh Hawley, the freshman Republican Senator from Missouri, has introduced legislation that would allow consumers to opt out of being tracked on the web. He envisions this as the first major step toward Internet privacy legislation.

You may recall a voluntary version of do-not-track rules in the US about a decade ago. Browsers offered a Do Not Track setting, and some big companies like Twitter honored the consumer request not to be tracked. But over time, since there were no penalties for tracking, most web companies began ignoring the request, and Do Not Track faded from view. A few companies like Mozilla still offer some protection through their browser. A lot of people now use ad blockers, which can cut down on some tracking cookies but don't really stop much of the tracking.

It's not surprising that the voluntary methods got nowhere, since there are gigantic dollars involved in web advertising. IAB and PwC recently announced that the digital advertising market hit $107.5 billion in 2018, up from $88.3 billion in 2017. Web and cellphone advertising is becoming the preferred way to reach younger consumers. The online advertising market is heavily reliant on targeted advertising aimed directly at the most likely consumers for any particular ad – and that requires tracking customers to build profiles of each one of us.

Sen. Hawley proposes the establishment of a ‘Do Not Track’ list that would be the equivalent of the FCC’s ‘Do Not Call’ list. There would be substantial fines for companies that violate the list. The bill suggests fines that create a major incentive to comply – it suggests the minimum fine should be $100,000 with the maximum fine being as much as $1,000 per day per person who is improperly tracked.

There are currently no rules governing how Internet companies track us. Without rules, the big web companies basically track everything they can about us. There are a number of practices that Sen. Hawley points out as being particularly troublesome:
• Google has admitted that Android phones track users even when customers turn off location tracking.
• Facebook tracks people who aren’t part of its platform and creates ‘shadow profiles’ of non-Facebook consumers.
• It’s a widespread industry practice to place cookies and other tracking tools on the devices of web site visitors as a way to track customer web browsing. Such data is widely traded on the open data market.

The European Union has struggled with the same problem and introduced an updated version of similar rules that went into effect in May 2018. That legislation gives consumers the chance to opt out of cookies and tracking on every web site visited. Companies that violate the EU rules can be fined the greater of 4% of worldwide revenues from the previous year or 20 million euros.
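The EU penalty formula is simple enough to state directly. A quick sketch, with amounts in euros and revenue figures purely hypothetical:

```python
def eu_max_penalty(prior_year_worldwide_revenue: float) -> float:
    """Maximum EU fine: the greater of 4% of the prior year's worldwide
    revenue or 20 million euros."""
    return max(0.04 * prior_year_worldwide_revenue, 20_000_000)

# A mid-sized firm hits the 20M-euro floor; a giant faces the 4% share.
print(eu_max_penalty(300_000_000))     # 20000000.0 (floor applies)
print(eu_max_penalty(50_000_000_000))  # 2000000000.0 (4% applies)
```

The “greater of” structure means the floor bites for smaller companies while the percentage scales without limit for the web giants.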

Interestingly, a lot of consumers won’t opt out of the tracking. I have a friend who employs a staff of programmers in their early 20s, and they like the convenience of being tracked. They enjoy having specialized ads aimed directly at them and think that enhances their web experience. This younger generation grew up with the web and buys into the idea that the online world is not private. There are certainly older consumers who also like getting relevant web advertising. The purpose of the proposed legislation is not to end the tracking of customers, but rather to allow people to opt out. Perhaps over time, most people will value the benefits of being tracked more than their privacy. Web sites are certainly going to offer inducements to customers who agree to be tracked.

This legislation is only the first step towards a comprehensive set of privacy rules. For example, there are numerous other ways that information is gathered about us, such as through the IoT devices in our homes. Many merchants and credit card companies are selling information about our buying habits. Limiting privacy rules to opting out of web sites is a start, but data gathering from numerous sources is becoming an industry unto itself.

States’ Role in Broadband Grants

This is another blog in the series looking at the upcoming FCC $20.4 billion broadband grant program. Today I discuss how states might best take advantage of that large federal grant program.

A lot of states either have or are considering establishing state broadband grant programs. The programs vary widely in the amount of annual funding, eligibility to receive the grants, and other rules. Most state programs award at least some grant dollars to help pay to bring broadband to unserved and underserved places in a state.

Unless the FCC chooses a different mechanism, it looks like the federal grants might be awarded using a reverse auction. That means the ISPs willing to accept the least amount of federal grant funding will receive the grants. It follows that an ISP entering the federal auction with state grant money already in hand will have an advantage over other ISPs with a similar business plan. This means that states can attract a greater amount of federal grant dollars by coordinating state grants with the federal grant cycle.
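To see why a state grant in hand matters in a reverse auction, here is a toy model – the ISP names and dollar figures are entirely hypothetical. Each ISP bids the federal subsidy it needs to close its project funding gap, and the lowest ask wins:

```python
def federal_ask(funding_gap: int, state_grant: int = 0) -> int:
    """The minimum federal subsidy an ISP can bid: its project funding gap
    less any state grant already committed (never below zero)."""
    return max(funding_gap - state_grant, 0)

# Two hypothetical ISPs with the same $3M funding gap for the same area.
bids = {
    "ISP A (no state grant)": federal_ask(3_000_000),
    "ISP B ($1M state grant)": federal_ask(3_000_000, state_grant=1_000_000),
}
winner = min(bids, key=bids.get)  # lowest subsidy request wins
print(winner, bids[winner])       # ISP B wins, asking only $2M federal
```

The state grant lets ISP B underbid an otherwise identical competitor, which is the whole argument for coordinating state awards with the federal cycle.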

The obvious way to do this is to award state grants to go along with projects that get the FCC grants. That sounds easy, but my guess is that figuring out the mechanics and the timing to do this right is going to be complicated. Getting the timing right is vital because awarding the state grants before the federal grants will improve an ISP’s chances of winning a federal award. Timing is also vital because, in many cases, an ISP that wants to build fiber is going to need both state and federal grants to make an economic business case. An ISP that gets only one of the two grants is likely to return the grant money rather than proceed. We’ve seen funding returned to both state and federal grant programs, and that process is messy and benefits nobody.

The circumstances of these grants are different than normal government grants. There is routine coordination between state and federal grants for a wide array of purposes such as building bridges or roads. In the normal process, the entity receiving the grants applies to both the state and federal grant programs and only proceeds when it has received both – timing is not normally an issue, and it doesn’t matter whether the state or federal grant is approved first. If the FCC uses a reverse auction, there is no guarantee that an applicant will receive the federal funding – it’s vital that the state grant is in place first in order to increase the chance of winning the federal grant.

An ISP needs to know before the federal reverse auction that it can rely on state grants. Most state grant programs award funding on some sort of merit scale, with a scoring system. That process might make it difficult to award state grants to those most likely to win federal grant funding. This likely means somehow changing the mechanism of the state grant program to favor projects that can win federal funding. Following are a few ideas of how states might do this – I’m sure there are other ways.
• States could have a formula that guarantees a fixed amount of state grant for anybody that wins a federal broadband grant. The formula might be something like, ‘the state will match the lesser of 50% of the amount of the federal grant award or $1 million’. With that kind of formula, an ISP could count on state money during the federal reverse auction. The downside to this idea is that if a state attracts a lot of federal grants, it might owe more in matching funds than its grant budget allows.
• It might be as simple as awarding state grants before the federal reverse auction and requiring that anybody receiving a state grant must also apply for a federal grant. The downside to this is that there will likely be returned state grants for those that don’t win the federal grants.
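The matching formula in the first bullet is easy to pin down in code. A sketch of the hypothetical ‘lesser of 50% or $1 million’ rule, with the share and cap as parameters a state could tune:

```python
def state_match(federal_grant: int, share: float = 0.5,
                cap: int = 1_000_000) -> int:
    """Hypothetical state matching formula: the lesser of a 50% share
    of the federal award or a $1 million cap."""
    return int(min(share * federal_grant, cap))

print(state_match(1_500_000))  # 750000  - the 50% share applies
print(state_match(4_000_000))  # 1000000 - the cap applies
```

The cap is what keeps a successful state from owing more in matches than it budgeted, at the cost of weakening the incentive on the largest projects.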

The states also have an opportunity to influence the technology solutions they want to see. For example, a state might only offer matching grants for fiber and refuse to match grants for technologies like DSL, satellite or fixed cellular. Or states could promote fast speeds by awarding grants only to projects that provide 100 Mbps or faster.

Finally, the best way for a state to take advantage of the $20.4 billion in federal grants is to step up its game. Many state grant programs today have annual budgets of $5 million to $20 million per year. That is a paltry amount when the FCC is awarding $2 billion annually in grants. States that want to attract a bigger share of the federal grant money can do so by increasing the annual size of their state grant awards. A state that is serious about finding broadband solutions should want to attract as much of the federal grant money as possible, and the federal grant program will likely reward states that are bold enough to get serious about funding broadband themselves. I know state legislators like to pat themselves on the back for having created a state broadband grant program. However, a state program that awards only $20 million annually might need fifty years or longer to actually close its rural broadband gap. States can do better, and the federal grant program makes it easier to justify larger state grants.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.

Squirrels and Fiber

Most of us don’t realize the damage done every year to fiber and to other wired networks by animals.

Squirrels. These cute rodents are the number one culprit for animal damage to aerial fiber. To a lesser degree, fiber owners report similar damage by rats and mice. Squirrels mainly chew on cables as a way to sharpen their teeth. Squirrel teeth grow up to 8 inches per year and if squirrels aren’t wearing their teeth down from their diet, they look for other things to chew. There has been speculation that squirrels prefer fiber to other cables due to some oil or compound used in the fiber manufacturing process that attracts them.

Before Level 3 was part of CenturyLink, they reported that 17% of their aerial fiber outages were caused by squirrels. A Google search turns up numerous network outages caused by squirrels.

Companies use a wide variety of techniques to try to protect against squirrel damage – but anybody who has ever put out a bird feeder knows how persistent squirrels can be. One deterrent is to use hardened cables that are a challenge for squirrels to chew through. However, there have been reported cases where squirrels partially chewed through such cables, causing enough damage to let in water and lead to failures later.

A more common solution is some sort of add-on barrier to keep squirrels away from the cable. There are barrier devices that can be mounted on the pole to block squirrels from climbing higher, and others mounted where cables meet a pole to keep squirrels away from the fiber. Some companies have tried more exotic solutions like deploying ultrasonic blasters to drive squirrels away from fiber. In other countries, fiber providers sometimes deploy poison or noxious chemicals to keep squirrels away, but these techniques are frowned upon or illegal in the US.

Gophers. For buried fiber, the main animal culprits in parts of the US are pocket gophers. There are thirteen species of pocket gophers in the US, ranging from 5 to 13 inches in length. The two regions of the country with pocket gophers are the Midwest plains and the Southwest. Gophers live on plants and either eat roots or pull plants down through the soil.

Pocket gophers can cause considerable damage to buried fiber. These rodents will chew almost anything and there have been reported outages from gophers that chewed through gas, water, and buried electric lines. Gophers typically live between 6 and 12 inches below the surface and are a particular threat to buried drops.

There are several ways to protect against gophers. The best protection is to bury fiber deep enough to be out of gopher range, but that can add a lot of cost to buried drops. I have a few clients that bore drops rather than trench or vibrate them for this reason. Another protection is to enclose the fiber in a sheath that is over 3 inches in diameter – anything that large and tubular is generally too big for a gopher to bite. Again, this is an expensive solution for buried drops. A third option is to surround the buried fiber with 6 to 8 inches of gravel of at least 1-inch size – anything smaller gets pushed aside by the gophers.

A recent blog by the fiber material vendor PPC highlights even more exotic animal damage to fiber. The most interesting example (and one that is easy to picture) is when farmers cut fiber while burying dead livestock. They typically bury dead animals where they find them, and if that’s in a right-of-way they can easily cut buried fiber.

PPC also reports that birds can damage aerial fiber. Large birds with sharp talons can create small cuts in the sheath and introduce water. Flocks of birds sitting on a fiber can cause it to sag and stretch. I remember, when living in Florida, seeing flocks of birds sitting shoulder-to-shoulder on cables, and that has to add a lot of weight over a 200-foot span between poles.

Technology and FCC Grants

This is the next in the series of blogs looking at the upcoming $20.4 billion FCC grant program. I ask the question of how the FCC should consider technology in the upcoming grant program.

Should Satellite Companies be Eligible? I think a more fundamental question is whether the current generation of high-orbit satellites really delivers broadband. Over the last few years I’ve talked to hundreds of rural people about their broadband situation, and I have never met anybody who liked satellite broadband – not one person. Most people I’ve talked to have tried it once and abandoned it as unworkable.

This goes back to the basic definition of broadband. The FCC defines broadband as speeds of at least 25/3 Mbps. In its original 2015 order the FCC discussed latency, but unfortunately never made latency part of the broadband definition. As a reminder, latency is a measure of the time it takes for a data packet to travel from its point of origin to its point of destination.

A few years ago, the FCC did a study of the various last-mile technologies and measured the following ranges of last-mile latency, in milliseconds: fiber (10-20 ms), coaxial cable (15-40 ms), and DSL (30-65 ms). Cellular latencies vary widely depending upon the exact generation of equipment at any given cell site, but 4G latency can be as high as 100 ms. In the same FCC test, satellite broadband was almost off the chart, with latencies measured as high as 650 ms.

Latency makes a big difference in the perceived customer experience. Customers will rate a 25 Mbps connection on fiber as being much faster than a 25 Mbps connection on DSL due to the difference in latency. The question that should be asked for federal grants is whether satellite broadband should be disqualified due to poor latency.
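A back-of-the-envelope model shows why latency dominates the perceived experience. Loading a web page involves many sequential round trips (DNS lookups, TLS handshakes, dozens of small requests), and each round trip pays the full latency penalty. The numbers below are illustrative, not measurements:

```python
def page_load_ms(payload_mb: float, mbps: float, latency_ms: float,
                 round_trips: int = 30) -> float:
    """Crude page-load model: raw transfer time plus one latency hit per
    round trip. Real TCP/HTTP behavior is far more complex; this sketch
    only illustrates how latency compounds."""
    transfer_ms = payload_mb * 8 / mbps * 1000  # MB -> megabits -> ms
    return transfer_ms + round_trips * latency_ms

# Same 25 Mbps connection, same 2 MB page: fiber vs. high-orbit satellite.
print(page_load_ms(2, 25, 15))   # 1090.0 - about a second on fiber
print(page_load_ms(2, 25, 650))  # 20140.0 - about 20 seconds on satellite
```

The raw transfer takes 640 ms either way; everything beyond that is latency. That is why two connections with an identical 25 Mbps speed rating can feel wildly different.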

I was unhappy to see so much money given to the satellite providers in the recent CAF II reverse auction. Even ignoring the latency issue, I question whether the satellite companies deserve broadband subsidies. There is no place in rural America where folks don’t already know that satellite broadband is an option – most people have rejected the technology as an acceptable broadband connection. It was particularly troubling to see satellite providers getting money in a reverse auction. Once a satellite is in orbit its costs are fixed, which means the satellite providers will be happy to take any amount of federal subsidy – they can bid lower than any other grant applicant in a reverse auction. I have to question the wisdom of providing federal subsidies to companies that are already failing at marketing.

I don’t have enough information to know how to feel about the low-orbit satellites that are just now being tested and launched. Because of their lower orbits, they will have lower latency. However, the satellite companies still have a huge advantage in a reverse auction since they can bid lower than anybody else – a satellite company would be happy with only a few dollars per potential customer and has no bottom limit on the amount of grant it is willing to accept. If the new satellite companies can bid in the same manner as everybody else, we could end up with a situation where these companies claim 100% of the new grant funds.

What About DSL? My nightmare scenario is that the FCC hands most or all of the $20.4 billion to the big telcos to upgrade rural DSL from 10/1 Mbps to 25/3 Mbps. This is certainly within the realm of possibility. Remember that the first CAF II program was originally going to be open to everybody but at the last minute was all given to the big telcos.

I find it troublesome that the big telcos have been quiet about the announced plans for this grant. The money will be spent in the big telcos’ service areas, and you’d think they’d be screaming about plans for federal money to overbuild them. Recall that the big telcos recently were able to derail the Re-Connect grants by inserting the rule that only 10% of the grant money could be used for customers who already receive at least 10/1 Mbps broadband. This FCC clearly favors the big telcos over other ISPs and could easily hand all of this money to the big telcos and call it CAF III.

Even if they don’t do that, the question is whether any federal grant money should be used to upgrade rural DSL. Rural copper is in dreadful condition due to the willful neglect of the big telcos, who stopped doing maintenance on their networks decades ago. It’s frankly a wonder that the rural copper networks even function. It would be a travesty to reward the telcos by giving them billions of dollars to make upgrades that they should have routinely made by reinvesting customer revenues.

I think when the dust clears on CAF II we’re going to find out that the big telcos largely cheated with that money. We’re going to find that they only upgraded the low-hanging fruit and that many households in the coverage areas got no upgrades, or minor upgrades that won’t achieve the 10/1 Mbps goals. I think we’ll also find that in many cases the telcos didn’t spend much of the CAF II funds, instead pocketing the money as free revenue. I beg the FCC not to repeat the CAF II travesty – when the truth comes out about how the telcos used the funding, the CAF II program is going to grab headlines as a scandal. Please don’t provide any money to upgrade DSL.

This blog is part of a series on Designing the Ideal Federal Broadband Grant Program.