Is it Time to Sell?

A lot of ISPs hope to someday cash in on their sweat equity by selling the business. There have been some surprisingly high valuations recently in parts of the industry, which raises the question: is this a good time to sell an ISP?

Anybody that has considered selling in the last decade knows that valuation multiples have been stagnant and somewhat low by historic standards. A lot of properties have changed hands during that time at multiples in the range of 4.5 to 6.5 times EBITDA (earnings before interest, taxes, depreciation, and amortization). Some ISP properties have sold outside of that range based upon the unique factors of a given sale.

In November, Jeff Johnston of CoBank posted a long blog talking about how valuations might be on the rise – particularly for companies with a lot of fiber or with other upsides. He pointed to three transactions that had valuations higher than historic multiples for the sector.

  • Zayo sold their network of 130,000 route miles of fiber transport for a multiple of 11.1 times EBITDA.
  • Bluebird Network in Missouri and nearby states sold a 6,500-mile fiber transport network for a multiple of 10.4 times EBITDA.
  • Fidelity Communications of Missouri sold an ISP with nearly 135,000 customers for a multiple of 11.7 times EBITDA.

Johnston doesn’t say that these high multiples are the new standard for other ISPs. However, he does surmise that the high multiples probably indicate an uptick in valuation for the whole sector. That’s something that can only be proven over time by seeing higher valuations in multiple smaller transactions – but the cited transactions raise the possibility that we’re seeing an increase in valuation for fiber-based businesses.

It’s important to ask why any buyer would pay 10 or 11 times EBITDA. A buyer paying that much will take a decade to recoup their investment if the purchased business continues to perform at historic levels. Nobody would pay that much for a business unless they expect the margins of the acquired business to improve after acquisition – that’s the key to higher valuations. The buyers of these three businesses are likely expecting significant upsides from the purchased properties.
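The payback arithmetic is easy to sketch. This is a rough illustration with hypothetical figures – it ignores financing costs and taxes – but it shows why a buyer paying 11 times EBITDA is implicitly betting on growth:

```python
# Payback period for an acquisition priced at a multiple of EBITDA.
# All figures are hypothetical and ignore financing costs and taxes.

def payback_years(multiple: float, annual_growth: float) -> int:
    """Years of cumulative EBITDA needed to recoup a purchase price of
    `multiple` times current EBITDA, with EBITDA growing at `annual_growth`."""
    cumulative, ebitda, years = 0.0, 1.0, 0
    while cumulative < multiple:
        ebitda *= 1 + annual_growth
        cumulative += ebitda
        years += 1
    return years

print(payback_years(11.0, 0.00))  # flat performance: 11 years to recoup
print(payback_years(11.0, 0.10))  # 10% annual growth shortens the payback
```

At flat performance the buyer waits 11 years to get their money back; at 10% annual growth the wait drops to about 8 years, which is why demonstrated growth commands the premium.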

Buyers often see a one-time bump in margin from the increased efficiency of folding an acquisition into their existing business. This is often referred to as an economy of scale improvement – overheads generally become more affordable as a business gets larger. However, buyers will rarely reward a seller for economy of scale improvements, so these are rarely built into valuation multiples.

A buyer is usually only willing to pay a high multiple if they foresee significant growth from the purchased entity. The purchased company needs to be operating in a footprint with upside potential, or else needs to demonstrate that it knows how to grow. A buyer must believe they can grow the acquired business enough to recoup the purchase price and still make a good return. For a fiber ISP to get a high valuation, it must convince a buyer that the business has huge upside potential – that it is already growing and that the growth can continue into the future.

One of the more interesting aspects of getting a high valuation multiple is that a buyer might expect the core management team to remain intact after a sale. That often means that part of the compensation from the sale might be incentive-based and paid in the future based upon post-sale performance.

To summarize, an ISP can get a higher valuation if they can convince a buyer that there is future upside to the business. ISPs that don’t have growth potential will not see the higher valuation multiples cited above – although many potential sellers will think these multiples apply to them. The bottom line is that if your ISP is growing and can keep growing, and you can paint that picture to a buyer, your business might be worth more than you expected.

Is Your Home Listening to You?

When I was a teenager, science fiction books envisioned a future where people talked to their home to take care of mundane tasks. For somebody willing to spend the money on new appliances and devices that future is here today.

Just consider the Amazon Alexa voice assistant, which is installed in the largest number of devices. GE has built Alexa into its new stoves, refrigerators, wall ovens, dishwashers, washers and dryers, and air conditioners. Samsung has built Alexa into refrigerators, washers, dryers, air conditioners, and vacuums. Alexa is built into smart light bulbs, smart wall plugs, televisions, thermostats, smart door locks, security cameras, speakers, and numerous other devices. The chips and/or software to add Alexa to devices are getting cheap and it shouldn’t be long until the app is built into most electronics you might buy.

The convenience of talking to home devices is not without a cost, and companies like Amazon, Apple, and Google are listening to you through the devices. Like other voice assistants, Alexa listens all of the time waiting for a ‘wake word’ that activates the app. There are major privacy and security concerns related to the constant listening. We have to trust the company controlling the device not to listen to us all of the time because there is nothing stopping them from doing so.

Amazon swears they don’t listen or record except for a short period of time after the wake word is spoken. They also swear that they only preserve those recordings in an effort to improve Alexa’s responses to questions. If you are going to use Alexa in your home, you are trusting that Amazon is telling the truth. Back in 2017 Samsung got a huge black eye when they were unable to make that promise concerning their smart TVs.

The other big concern is hacking. There is zero chance that all of the companies making devices that include a voice assistant have iron-clad security. While Amazon really might not be listening to you, a hacker will surely be willing to do so.

To make matters even more uncomfortable, a lot of lawyers and privacy experts believe that if a person knowingly installs a device that listens and transmits information to a third party, that person has waived their Fourth Amendment privacy rights and any rights granted by the Electronic Communications Privacy Act. The concept has not yet been challenged in a court, but if it’s true, then people have no recourse against Amazon or anybody else using the information gathered from a voice assistant device.

My house has four Amazon Echos that we bought when the devices first hit the market. They are convenient and I use them to listen to music, check the weather or news, check the hours at stores or restaurants, and to make the occasional reminder in the middle of the night. My family has gotten uncomfortable with being listened to all of the time and we now unplug the devices when we aren’t using them. This kills all of the spontaneous uses of the devices, but for now, that feels safer than being listened to.

I’m going to be leery about buying any new household appliance that can listen to me. If I can’t disable the listening function, I’m not going to buy the device. It’s impossible to feel secure with these devices right now, and impossible to take the word of a big company that such devices are safe. You only have to look at the current experiences with the hacking of Ring cameras to know that smart home devices are currently anything but safe.

Small ISPs have never worried much about the devices that people hang off their networks. ISPs provide the bandwidth pipe, and how people use data has not been a concern for the ISP. However, that is slowly changing. I have a lot of clients that are now offering smart thermostats, smart security systems, and other smart devices as a way to boost revenue. ISPs need to be careful of any claims they make to customers. Somebody advertising safety for a smart security system might have liability if that system is hacked and the customer exploited.

Maybe I’m being overly cautious, but the idea of somebody I don’t know being able to listen to everything said in my house makes me uncomfortable. As an industry person who has been following the history of IoT devices, I’m even more uncomfortable since it’s now obvious that most smart home devices have lousy security. If you don’t think Amazon is listening to you, I challenge you to activate Alexa and say something vile about Jeff Bezos, then see how much longer it takes to get your next Amazon shipment. Go ahead, I dare you!

The End of Free Conference Calling

Like many of you reading this blog, I have been using FreeConferenceCall.com for many years. I got an email from them last week warning that their service will likely go dark, and they wanted users of the service to call Congress to help keep them in business.

Their issue stems from an FCC order issued in September of last year that seeks to stop the practice of access arbitrage. The FCC summary of the order describes the situation well. Some small telcos have been making money by billing access on ‘free’ minutes generated by services like free conference calling. The process of making money from free calling services has been known in the industry as access arbitrage.

The FCC tried to stop access arbitrage in 2011. At that time, small rural telcos billed a rate of as much as a penny or two per minute to originate or terminate a long-distance call. Some telcos that were allowed to bill the high rates were making a lot of money by originating calls for free outgoing call center services or by terminating calls from 800 numbers, conference calling services, or free chat lines.

In the 2011 order, the FCC eliminated the access fees associated with terminating a call, migrating to what the FCC called ‘bill and keep’, and they hoped that eliminating the access revenues would kill the arbitrage practices. The FCC order was largely effective and chat lines and other free arbitrage services quickly disappeared.

However, the 2011 order didn’t kill all access charges, and over time the folks who make money with arbitrage found another way to make money with free calling. One of the few access charges left untouched in 2011 was transport, which compensates telcos for the use of fiber networks connecting telcos to the outside world. I’ve noticed that the caller ID for FreeConferenceCalling.com numbers is mostly from Iowa and South Dakota, and I have to assume those calls are being terminated at switches that are remote and that can still bill significant miles of transport.

The access fees billed to terminate calls are paid by the carrier that originates the call. This means that most remaining terminating access is paid today by long-distance carriers like AT&T, Sprint and CenturyLink, which together still sell the bulk of long-distance telephone services. The dollar magnitude of access arbitrage is much smaller than a decade ago. The FCC estimates arbitrage is currently a $40 – $60 million problem, whereas it was hundreds of millions before the FCC’s 2011 order. But those fees are being billed to the long-distance companies that get no benefit from the transaction (thus the term arbitrage – the companies are billing the fees because the rules allow a loophole to do so).
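To see how a tiny per-minute rate becomes real money, here is a hypothetical sketch of transport billing on a high-volume free calling service. The rate, mileage, and minute volumes below are invented for illustration and are not taken from any tariff:

```python
# Hypothetical transport billing behind access arbitrage: the long-distance
# carrier that originates the call pays per-minute, per-mile transport
# charges to the terminating rural telco. All numbers below are invented.
rate_per_mile_minute = 0.00002   # dollars per mile per minute
transport_miles = 150            # a deliberately remote terminating switch
minutes_per_month = 50_000_000   # traffic pumped in by a free calling service

monthly_billing = rate_per_mile_minute * transport_miles * minutes_per_month
print(f"${monthly_billing:,.0f} per month")  # $150,000 per month
```

The per-mile rate looks negligible, but multiplied across a long transport route and tens of millions of pumped minutes it produces a healthy monthly check – which is the whole arbitrage.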

FreeConferenceCalling.com is not the only company doing this, and it’s likely that many conference calling services rely wholly or partially on the arbitrage. It’s worth noting that conference call services that use the Internet to place calls will not be affected by this change – because those calls don’t invoke access charges. The carriers billing for the access on the conference calling may or may not be sharing the revenues with companies like FreeConferenceCalling.com – in either case those carriers no longer have any financial reason to continue the practice.

Companies like FreeConferenceCalling.com don’t automatically have to go out of business, but the FCC order means a drastic change to the way they do business. For instance, the company could start charging a monthly fee for conference calling – likely forcing this particular company to change its name. They might sell advertisements for those sitting waiting for a conference call. They could charge for services like recording calls.

It’s more likely that companies like FreeConferenceCalling.com will quietly die or fade away. I tried using the service yesterday and it already seems to be broken. This latest FCC order probably puts the final nail into the coffin of access arbitrage – although I’ve learned to never say never. As long as there are any fees for calling based upon regulatory orders, there is a chance that somebody will find a way to generate lots of calls that fit the circumstance and get enriched by the arbitrage.

The RDOF Grants – The Good and Bad News

The FCC recently approved a Notice of Proposed Rulemaking that proposes how they will administer the $16 billion in RDOF grants that are going to be awarded later this year. As you might imagine, there is both good news and bad news coming from the grant program.

It’s good news that this grant program ought to go a long way towards finally killing off large chunks of big telco rural copper. Almost every area covered by these grants is poorly served today by inadequate rural DSL.

The related bad news is that this grant award points out the huge failure of the FCC’s original CAF II program where the big telcos were given $11 billion to upgrade DSL to at least 10/1 speeds. The FCC is still funding this final year of construction of CAF II upgrades. The new grant money will cover much of the same geographic areas as the original CAF II deployment, meaning the FCC will spend over $27 billion to bring broadband to these rural areas. Even after the RDOF grants are built, many of these areas won’t have adequate broadband. Had the FCC administered both grant programs smartly, most of these areas could be getting fiber.

Perhaps the best good news is that a lot of rural households will get faster broadband. Ironically, since the grants cover rural areas, there will be cases where the RDOF grant brings faster broadband to farms than will be available in the county seat, where no grant money is available.

There is bad news on broadband speeds, since the new grant program only requires speeds of 25/3 Mbps. This means the FCC is repeating the same huge mistake they made with CAF II by allowing federal money to be spent on broadband that will be obsolete before it’s even built. This grant program will be paid out over ten years and requires deployment within six years – anybody paying attention to broadband understands that six years from now a 25/3 Mbps broadband connection will feel glacial. There is grant weighting to promote faster data speeds, but due to the vagaries of a reverse auction, there will be plenty of funding given to networks with speeds close to 25/3 Mbps.

There is further bad news since the FCC is basing the grants upon faulty broadband maps. Funding will only be made available to areas that don’t show 25/3 Mbps capability on the FCC maps. Everybody in the industry, including individual FCC Commissioners, agrees that the current maps based upon 477 data provided by ISPs are dreadful. In the last few months, I’ve worked with half a dozen counties where the FCC maps falsely show large swaths of 25/3 broadband coverage that isn’t there. It’s definitely bad news that the grant money won’t be made available in those areas where the maps overstate broadband coverage – folks in such areas will pay the penalty for inadequate broadband maps.

There is a glimmer of good news with mapping since the FCC will require the big ISPs to report broadband mapping data using polygons later this year. Theoretically, polygons will solve some of the mapping errors around the edges of towns served by cable TV companies. But there will only be time for one trial run of the new maps before the grants, and the big telcos have every incentive to exaggerate speeds in this first round of polygon mapping if it will keep this big pot of money from overbuilding their copper. I don’t expect the big telco mapping to be any better with the polygons.

Another area of good news is that there will be a lot of good done with these grants. There will be rural electric cooperatives, rural telcos, and fiber overbuilders that will use these grants as a down-payment to build rural fiber. These grants are not nearly large enough to pay for the full cost of rural fiber deployment, but these companies will borrow the rest with the faith that they can create a sustainable broadband business using fiber.

The bad news is that there will be plenty of grant money that will be used unwisely. Any money given to the traditional satellite providers might as well just be burned. Anybody living in an area where a satellite provider wins the grant funding won’t be getting better broadband or a new option. There is nothing to stop the big telcos from joining the auction and promising to upgrade to 25/3 Mbps on DSL – something they’ll promise but won’t deliver. There are likely to be a few grant recipients who will use the money to slap together a barely adequate network that won’t be fast and won’t be sustainable – there is a lot of lure in $16 billion of free federal money.

It’s dismaying that there should be so many potential downsides. A grant of this magnitude could be a huge boost to rural broadband. Many areas will be helped and there will be big success stories – but there is likely to be a lot of bad news about grant money spent unwisely.

Federal Subsidies for Satellite Broadband

In December, the FCC awarded $87 million from the CAF II Reverse auction held last summer for satellite broadband. The bulk of the satellite awards went to Viasat, which will supposedly use the money to bring broadband to 123,000 homes in seventeen states. The grant awards are meant to bring 25/3 Mbps broadband to areas that don’t have it today.

I have several problems with this award. First is that the satellite companies already cover these areas today and have been free to sell and market in these areas. The federal grant money doesn’t bring a new broadband alternative to anybody in rural America.

Second, the satellite companies aren’t required to connect any specific number of new customers as a result of the grant awards. They are largely free to just pocket the grants directly as profits. Even when they do connect a new customer, they don’t build any lasting broadband infrastructure, but only install an antenna at each new customer.

Third, rural residents don’t seem to want satellite broadband. In a large survey by the Census Bureau in 2017, 21% of people in the US described their neighborhood as rural (52% chose suburban and 27% said urban). In the quarter ending in June 2019, Viasat claimed 587,000 rural customers in the US – less than 0.5% of the 128 million households in the country. If those customers are all in rural America, then the company has only a little over a 2% market penetration.
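The penetration arithmetic, using the household and customer counts cited above, works out like this:

```python
# Sanity check on the Viasat penetration figures cited above.
total_households = 128_000_000
rural_share = 0.21               # Census survey: 21% describe their area as rural
viasat_customers = 587_000

rural_households = total_households * rural_share
share_of_all_households = viasat_customers / total_households
rural_penetration = viasat_customers / rural_households

print(f"{share_of_all_households:.2%} of all US households")   # 0.46%
print(f"{rural_penetration:.1%} of rural households")          # 2.2%
```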

CCG has been doing broadband surveys for twenty years and I don’t know that we’ve ever talked to a satellite customer who was happy with their broadband. In every survey, we seem to encounter more people who dropped satellite service than those that still have it. Customers complain that satellite costs too much – Viasat claimed in their most recent financial report that the average residential broadband bill is $84.26. Customers also hate the high latency, which can be 10 to 15 times higher than terrestrial broadband. The latency is due to the satellite which is parked almost 22,200 miles above earth – it takes a while for a round trip communication over that distance.
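The physics behind that latency is easy to check. This sketch assumes a request and its response each traverse the ground-to-satellite path twice, which is why the distance alone sets a hard floor on round-trip time:

```python
# Minimum round-trip latency to a geostationary satellite, from distance alone.
ALTITUDE_MILES = 22_200   # geostationary orbit, per the figure above
SPEED_OF_LIGHT = 186_282  # miles per second in a vacuum

# A request goes ground -> satellite -> ground, and the response does the
# same: four traversals of the altitude before the user sees an answer.
round_trip_ms = 4 * ALTITUDE_MILES / SPEED_OF_LIGHT * 1000
print(f"~{round_trip_ms:.0f} ms floor, before any network processing")  # ~477 ms
```

Real-world satellite latency is higher still, since this floor ignores routing, queuing, and processing delays on both ends.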

The biggest complaint about satellite broadband is the tiny monthly data caps. The company’s products that would satisfy the FCC grant speed requirements start with the Unlimited Silver 25 plan at $70, with speeds up to 25 Mbps and a monthly data cap of 60 gigabytes. The fastest plan is the Unlimited Platinum 100 plan at $150, with speeds up to 100 Mbps and a data cap of 150 gigabytes. Unlike cellular plans where a customer can buy more broadband, the Viasat plans throttle customers to speeds reported to be less than 1 Mbps once they reach the data cap. To put those plans into perspective, OpenVault announced recently that the average US home uses 274 gigabytes of data per month, and the average cord-cutting home uses 520 gigabytes per month. Satellite broadband is impractical for anybody with school students in the home or for anybody that does even a modest amount of video streaming.
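Measured against the OpenVault average, the caps don’t last long. A rough sketch, assuming usage is spread evenly over a 30-day month:

```python
# Days until a Viasat data cap is reached at the average usage cited above.
avg_monthly_gb = 274             # OpenVault average US household
daily_gb = avg_monthly_gb / 30   # assume usage is spread evenly over the month

days_to_cap = {
    "Unlimited Silver 25 (60 GB)": 60 / daily_gb,
    "Unlimited Platinum 100 (150 GB)": 150 / daily_gb,
}
for plan, days in days_to_cap.items():
    print(f"{plan}: cap reached in about {days:.0f} days")
```

An average household would hit the 60 GB cap in about a week and the 150 GB cap a little past the middle of the month, spending the rest of the month throttled.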

Viasat won the grant funding due to a loophole in the grant program. The program funding was available to anybody that offers broadband of at least 25 Mbps. The grant program intended to deliver a new broadband alternative to rural households – something that satellite broadband does not do. The funding was provided under a reverse auction, and the satellite companies likely placed bids for every eligible rural market – they would have been the default winner for any area that had no other bidder. Even where there was another bidder, a reverse auction goes to the lowest bidder and there is no amount that is too small for the satellite companies to accept. The satellite companies don’t have to make capital expenditures to satisfy the grants.
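The auction mechanics are simple to sketch: the lowest requested subsidy wins, and a bidder with no capital costs to cover can always underbid. The bidders and dollar figures below are invented for illustration:

```python
# Minimal reverse-auction sketch: the lowest requested subsidy wins the area.
# Bidders and amounts are invented for illustration.

def reverse_auction_winner(bids: dict[str, float]) -> str:
    """Return the bidder asking for the smallest subsidy."""
    return min(bids, key=bids.get)

area_bids = {
    "fiber overbuilder": 4_000_000,  # needs the subsidy to cover construction
    "satellite provider": 250_000,   # no new network to build, can bid low
}
print(reverse_auction_winner(area_bids))  # the satellite provider wins
```

A bidder who never has to build anything can profitably accept any positive subsidy, so in a pure lowest-bid auction they win whenever a facilities-based competitor shows up, and win by default when nobody else bids.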

Giving money to satellite providers makes no sense as broadband policy. They don’t bring new broadband to anybody since the satellite plans are already available, and the plans are expensive, with high latency and low monthly data caps.

The much larger RDOF grant program will award $16.4 billion in 2020 for rural broadband and the satellite companies must be ecstatic. If the FCC doesn’t find a way to keep the satellite companies out of this coming auction, the satellite companies could score a billion-dollar windfall. They can do so without offering any products that are not already available today.

To put these grants into perspective, the $87 million grant award is roughly the same size as the money that has been awarded over several years in the Minnesota Border-to-Border grant program. The Minnesota grants have helped fund dozens of projects, many of which built fiber in the state. There is no comparison between the benefits of the state grant program and the nearly total absence of benefit from handing federal money to the satellite companies.

US Has Poor Cellular Video

Opensignal recently published a report that looks at the quality of cellular video around the world. Video has become a key part of the cellular experience as people use cellphones for entertainment and as social media and advertising migrate to video.

The use of cellular video is exploding. Netflix reports that 25% of its total streaming worldwide is sent to mobile devices. The new Disney+ app got over 3 million downloads of its cellular app in just the first 24 hours. The Internet Advertising Bureau says that 62% of video advertisements are being seen on cellphones. Video-heavy social media sites like Instagram and TikTok are growing rapidly.

The pressure on cellular networks to deliver high-quality video is growing. Ericsson recently estimated that video will grow to almost 75% of all cellular traffic by 2024, up from 60% today. Look back five years, and video was a relatively small component of cellular traffic. To some extent, US carriers have contributed to the issue. T-Mobile includes Netflix in some of its plans; Sprint includes Hulu or Amazon Prime; Verizon just started bundling Disney+ with cellular plans; and AT&T offers premium movie services like HBO or Starz with premium plans.

US video quality was ranked 68th out of 100 countries, the equivalent of an F grade. That places our wireless video experience far behind other industrialized countries and puts the US in the same category as a lot of countries from Africa, and South and Central America. One of the most interesting statistics about US video watching is that 38% of users watch video at home using a cellular connection rather than their WiFi connection. This also says a lot about the poor quality of broadband connections in many US homes.

Interestingly, the ranking of video quality is not directly correlated with cellular data speeds. For example, South Korea has the fastest cellular networks but ranked 21st in video quality. Canada has the third-fastest cellular speeds and was ranked 22nd in video quality. The video quality rankings are instead based upon measurable metrics like picture quality, video loading times, and stall rates. These factors together define the quality of the video experience.

One of the reasons that US video quality was rated so low is that the US cellular carriers compress video heavily to save on network bandwidth. The Opensignal report speculates that the primary culprit for poor US video quality is the lack of cellular spectrum. US cellular carriers are now starting to implement new spectrum bands into phones and there are more auctions for mid-range spectrum coming next year. But it takes 3-4 years to fully integrate new spectrum since it takes time for the cellular carriers to upgrade cell sites and even longer for handsets using a new spectrum to widely penetrate the market.

Only six countries got an excellent rating for video quality – Norway, Czech Republic, Austria, Denmark, Hungary, and the Netherlands. Meanwhile, the US is bracketed on the list between Kyrgyzstan and Kazakhstan.

Interestingly, the early versions of 5G won’t necessarily improve video quality. The best example of this is South Korea, which already has millions of customers using what are touted as 5G phones. The country is still ranked 21st in terms of video quality. Cellular carriers treat video traffic differently than other data, and it’s often the video delivery platform that contributes to video problems.

The major fixes to the US cellular networks are at least a few years away for most of the country. The introduction of more small cells, the implementation of more spectrum, and the eventual introduction of features from the 5G specifications will contribute to a better US cellular video experience. However, with US cellular data volumes doubling every two years, chances are that the US video rating will drop further before improving significantly. The network engineers at the US cellular companies face an almost unsolvable problem of maintaining network quality while dealing with unprecedented growth.

Seattle Tackles MDU Broadband


Our industry takes it as general wisdom that urban areas have better broadband than rural areas, and as a general premise it’s true. But within urban areas, the segment of the community with the widest range of broadband coverage is apartment buildings. I think you can go to every big city and find some apartments with gigabit speeds while other apartment buildings have little or no broadband.

There are several reasons for the wide variance in broadband coverage. First, landlords have always had a say about which carriers they will allow in their buildings. I’ve seen numerous cases of landlords that include slow broadband in the rent and don’t let faster ISPs into their buildings. Some buildings don’t have broadband due to what can only be described as redlining, where ISPs avoided poor or troubled neighborhoods. Finally, some older apartment buildings are expensive to rewire due to the way they were originally constructed.

The City of Seattle is tackling the issue in an interesting way. Over half of the living units in the city are in MDUs (multi-dwelling units), meaning apartments, condominiums, and townhouses. In 2019 almost 81% of new units built in the city were in MDUs. The city views the ability of MDU tenants to have the same broadband quality and options as single-family homes as a digital equity issue.

The city has been gathering facts about MDU broadband for several years and came to understand the wide variance of broadband in different buildings. They found that MDUs routinely don’t have the same broadband options as nearby single-family homes. The city conducted a survey in 2017 that found that 95% of MDU tenants have access to broadband of at least 25/3 Mbps. However, the survey found that few tenants in the city have a competitive choice between multiple ISPs at speeds of 100 Mbps or faster. The tenants who have the choice of multiple fast ISPs were the most satisfied with their broadband. The city concluded from the survey that choice was just as important to MDU tenants as broadband speeds.

Probably the most important finding of Seattle’s research is that there is a wide variance among landlords in terms of understanding their broadband options. They found everything from landlords who know very little about broadband to landlords with sophisticated tech plans. The city found that many landlords have relied on advice from ISPs – which can clearly be self-serving and not to the benefit of landlords and tenants.

The city concluded that one way that they could help improve MDU broadband was by helping to educate landlords. The Seattle Office of Cable Communications launched an initiative they call B4B – Build for Broadband. Their goal is to create awareness of the importance of broadband for landlords and to provide educational content for the many landlords that can’t afford telecom planners and consultants.

The city has undertaken an initiative to provide information about broadband to landlords. They’ve started a series of webinars covering topics of interest to landlords. I should disclose that I helped the city with a webinar that compared wired and wireless technologies. The city is also gathering other information for landlords on their website.

This initiative makes a lot of sense in Seattle since it has one of the highest percentages of MDU residents in the country. However, any city that has MDUs could consider something similar. I’ve done broadband feasibility studies for cities that have between 20 and 50 MDU complexes, and inevitably they are as widely disparate as the ones in Seattle. There usually are a few that have little or no broadband and a few that have been wired with fiber and offer gigabit broadband.

Cities are often surprised by the wide variance in broadband availability and speeds at different MDUs. Cities are also often surprised to hear that even if they find a solution for improving broadband for single-family homes and businesses, that solution will not necessarily apply to MDUs. I know of many fiber overbuilders that skip past MDUs due to the cost of rewiring the buildings, the reluctance of landlords to let them in, or the marketing challenge of keeping up with tenant churn.

It’s not hard for smaller cities to take an inventory of the state of broadband in their MDU community. It’s normally as simple as visiting each apartment complex to find out what’s available to tenants. While smaller cities are not going to undertake an educational process with the scope of Seattle’s, cities can assist MDUs with poor broadband to find a better solution. Sometimes it’s as easy as helping to match competitive ISPs and landlords. It might mean getting landlords talking to each other. One thing is for sure – no solutions can be found until the problems are identified.

Taking Advantage of the $9B 5G Fund

The FCC will be moving forward with the $9 billion 5G Fund – a new use of the Universal Service Fund – that will provide money to expand cellular coverage to the many remote places in the US where 4G cell coverage is still spotty or nonexistent. There is a bit of urgency to this effort since the big cellular companies all want to shut down 3G within a year or two. This money will be made available to cellular carriers, but the funding still opens up possible benefits for other carriers and ISPs.

Some of this funding is likely to go towards extending fiber into rural places to reach cell towers, and that opens up the idea of fiber sharing. There are still a lot of places in the country that don’t have adequate fiber backhaul – the data pipes that bring traffic to and from the big hubs for the Internet. In the last six months alone I’ve worked with three different rural projects where lack of backhaul was a major issue. Nobody can consider building broadband networks in rural communities if the new networks can’t be connected to the web.

By definition, the 5G Fund is going to extend into rural places. If the FCC were maximizing the use of federal grant funds, it would demand that any fiber built with this new fund be available to others at reasonable rates. This was one of the major provisions of the middle mile networks built a decade ago with stimulus funding, and I know of many examples where those middle mile routes are providing backhaul today for rural fixed wireless and fiber networks. Unfortunately, I don’t see any such provisions being discussed for the 5G Fund – which is not surprising. I’m sure the big cellular companies have told the FCC that being made to share fiber with others would be an inconvenience, so this idea doesn’t seem to be part of the 5G Fund plan.

I think there is a window of opportunity to partner with wireless carriers to build new fiber jointly. The cellular carriers can get their portion of new fiber funded from the 5G Fund and a partner can pick up new fiber at a fraction of the cost of building the route alone. This could be the simplest form of partnership where each party owns some pairs in a joint fiber.
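As a back-of-the-envelope sketch of why this kind of partnership is attractive, here is the cost-sharing arithmetic. All of the figures – route length, construction cost per mile, the share funded by the 5G Fund, and the split between partners – are purely hypothetical assumptions for illustration, not numbers from any actual project:

```python
# Hypothetical illustration of joint-build cost sharing on a rural fiber route.
# Every figure below is an assumption for the sketch, not a real construction cost.

def solo_cost(miles, cost_per_mile):
    """Cost for one party building the whole route alone."""
    return miles * cost_per_mile

def joint_cost(miles, cost_per_mile, subsidy_share, partner_split):
    """Cost to the non-cellular partner when the carrier covers part of the
    route with 5G Fund money and the remainder is split between the parties."""
    unsubsidized = miles * cost_per_mile * (1 - subsidy_share)
    return unsubsidized * partner_split

miles = 40                # hypothetical route length
cost_per_mile = 30_000    # hypothetical rural construction cost ($/mile)

alone = solo_cost(miles, cost_per_mile)
shared = joint_cost(miles, cost_per_mile, subsidy_share=0.5, partner_split=0.5)

print(f"building alone: ${alone:,.0f}")   # → building alone: $1,200,000
print(f"joint build:    ${shared:,.0f}")  # → joint build:    $300,000
```

Under these made-up assumptions the partner pays a quarter of the stand-alone cost – the “fraction of the cost” the simplest pair-sharing arrangement can deliver.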

This is worth considering for anybody already thinking about building rural fiber. The new routes don’t have to be backhaul fiber and could instead be a rural route that is part of a county-wide build-out or fiber being built by an electric cooperative. If somebody is considering building fiber into an area that has poor cellular coverage, the chances are that there will be 5G Fund money coming to that same area.

It has always been challenging to create these kinds of partnerships with AT&T and Verizon, although I am aware of some. Both Sprint and T-Mobile have less rural coverage than the other big carriers and might be more amenable to considering partnerships – but they might be preoccupied with the possibility of their merger.

There are a lot of other cellular carriers. The CTIA, the trade association for the larger cellular carriers, has thirty members that are facility-based cellular providers. The Competitive Carriers Association (CCA) has over one hundred members.

Ideally, a deal can be made to share fiber before the reverse auction for the 5G Fund. Any carrier that has a partner for a given route will have a bidding advantage since cost-sharing with a partner will lower the cost of building new fiber. It might be possible to find partnerships after the auction, but there could be restrictions on the newly built assets as part of the grants – we don’t know yet.

My recommendation: if you are already planning to build rural fiber, look around to see if one of the cellular carriers might be interested in serving the same area. Both parties can benefit through a cost-sharing partnership – but the real winners are rural customers who gain access to better cellular service and better broadband.

Maximizing the Benefits of Fiber

I recently talked to Deb Socia who is now the CEO of the Enterprise Center in Chattanooga, Tennessee. Anybody who knows Deb knows that she’s worked for decades to find solutions for digital inclusion and was also the founder of Next Century Cities. It’s not a surprise to see her now as the head of the Enterprise Center, which is a non-profit that is working to leverage the city’s fiber network to benefit a number of sectors of the community.

The organization has three primary areas of focus. First, the Enterprise Center helped to establish, and is now at the forefront of promoting, Chattanooga’s Innovation District. This is a section of town focused on promoting new business start-ups and on creating collaboration between creative thinkers, entrepreneurs, the local university community, and technology gurus. This effort involves numerous initiatives such as a high-tech business incubator program, co-working space for new businesses, and a host of services to help new businesses succeed.

The second area of emphasis is the Smart City Collaborative. This currently involves a 1.5-mile section of the city that is fully outfitted with a wide range of smart sensors and devices. The area is a testbed for smart city applications and has attracted institutions like the Oak Ridge National Laboratory, the National Science Foundation, U.S. Ignite, and the University of Southern California, along with numerous vendors and entrepreneurs. The collaboration has a number of goals. One is to test new smart city ideas in a field environment that is wired with gigabit fiber. The collaboration also concentrates on smart city applications that don’t violate citizen privacy. The long-term goal of the Enterprise Center is to spread the best of the smart city applications to the rest of Chattanooga.

Finally, Deb returned to her roots and is promoting digital equity through various programs such as Tech Goes Home – something she worked on in her past. Chattanooga is known for having fiber available everywhere, but like most cities still has many households that can’t afford broadband. The digital equity effort works to provide the three necessary components of digital inclusion – connectivity, computer hardware, and the training needed to use broadband. Deb reports that the demand for computer training is far exceeding predictions.

All of this is made possible to some degree by the fact that Chattanooga has a municipal broadband network. Deb says the city-owned ISP is key to her success. The same can be seen in other communities like Wilson, North Carolina, where the city has leveraged broadband to make big strides in revitalizing downtown, attracting businesses, invigorating the arts, and helping to solve the digital divide. Eugene, Oregon has leveraged gigabit fiber to create an economic boom by enabling a sizable community of software developers.

I’ve always been mystified why more cities don’t follow the lead of cities like Chattanooga, Wilson, and Eugene. There are now a few hundred communities that have built municipal fiber networks, and many of them have not taken the next step past using the network to provide faster broadband. Faster and better broadband is important and can alone bring big benefits to a city, such as increased incomes from citizens working from home or from citizens leveraging broadband to start new businesses. But the Enterprise Center has made the bold statement that broadband alone is not nearly enough – a city has to expend effort to get the full benefits out of a broadband network.

Even more puzzling is that it’s rare to see this same effort in cities that have broadband networks provided by commercial ISPs. There are now many cities served by Google Fiber or other fiber overbuilders. I can’t think of anything that stops such cities from duplicating the efforts undertaken by the Enterprise Center and its many partners in Chattanooga.

I think cities that don’t tackle these issues are missing a huge opportunity. My guess is that twenty years from now the City of Chattanooga will be able to point to major employers that got their start through the business incubator effort. The city is likely to have benefitted hugely by keeping some of its brightest entrepreneurs at home rather than have them move to the handful of big tech centers around the country. It’s almost impossible to calculate the gigantic community benefit that can come from helping low-income households join the digital world. In a decade Chattanooga will start seeing young professionals and entrepreneurs that were aided by the efforts made today to solve the digital divide.