Gigabit LTE

Samsung just introduced Gigabit LTE into the newest Galaxy S8 phone. This is a technology that can significantly increase cellular speeds, and it makes me wonder if the cellular carriers will really be rushing to implement 5G for cellphones.

Gigabit LTE still operates under the 4G standards and is not an early version of 5G. There are three components of the technology:

  • Each phone has a 4x4 MIMO antenna, which is an array of four tiny antennas. Each antenna can make a separate connection to the cell tower.
  • The network must implement carrier aggregation. Both the phone and the cell tower must be able to combine the signals from the various antennas and frequency channels into one coherent data path.
  • Finally, the new technology utilizes 256 QAM (Quadrature Amplitude Modulation), a denser modulation scheme that crams more data into the cellular data path. (A rough sketch of the theoretical arithmetic follows this list.)
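
Here is a back-of-the-envelope sketch, in Python, of how those three components multiply together. The baseline single-stream rate and the overhead figure are illustrative assumptions of mine, not numbers from the carriers.

```python
# Rough theoretical Gigabit LTE arithmetic; the baseline rate and overhead
# below are illustrative assumptions, not measured values.

baseline_stream_mbps = 75       # assumed single-stream LTE rate (illustrative)
mimo_layers = 4                 # 4x4 MIMO: four parallel spatial streams
bits_per_symbol_64qam = 6       # 64-QAM carries 6 bits per symbol
bits_per_symbol_256qam = 8      # 256-QAM carries 8 bits per symbol
aggregation_overhead = 0.10     # assumed loss from merging the streams

qam_gain = bits_per_symbol_256qam / bits_per_symbol_64qam   # about 1.33x
theoretical_mbps = (baseline_stream_mbps * mimo_layers * qam_gain
                    * (1 - aggregation_overhead))

print(f"Rough theoretical peak: {theoretical_mbps:.0f} Mbps")
# About 360 Mbps under these assumptions, which is in line with the
# 'few hundred Mbps' guess from the wireless engineers quoted below.
```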

The data speed that can be delivered to a given cellphone with this technology is going to depend on a number of different factors:

  • The nearest cell site to a customer needs to be upgraded to the technology. I would speculate that this new technology will be phased in at the busiest urban cell sites first, then to busy suburban sites and then perhaps to less busy sites. It’s possible that a cellphone could make connections to multiple towers to make this work, but that’s a challenge with 4G technology and is one of the improvements promised with 5G.
  • The data speed that can be delivered is going to vary widely depending upon the frequencies being used by the cellular carrier. If this uses existing cellular data frequencies, then the speed increase will be a combination of the impact of adding four data streams together, plus whatever boost comes from using 256 QAM, less the new overheads introduced in the process of merging the data streams. There is no reason that this technology could not use the higher millimeter wave spectrum, but that spectrum will use different antennas than lower frequencies.
  • The traffic volume at a given cell site is always an issue. Cell sites that are already busy with single-antenna connections won’t have the spare connections available to give a cellphone more than one channel. Thus, a given connection could consist of one to four channels at any given time.
  • Until the technology gets polished, I’d have to bet that this will work a lot better with a stationary cellphone rather than one moving in a car. So expect this to work better in downtowns, convention centers, etc.
  • And as always, the strength of a connection to a given customer will vary according to how far a customer is from the cell site, the amount of local interference, the weather and all of those factors that affect radio transmissions.

I talked to a few wireless engineers and they guessed that this technology using existing cellular frequencies might create connections as fast as a few hundred Mbps in ideal conditions. But they could only speculate on the new overheads created by adding together multiple channels of cellular signal. There is no doubt that this will speed up cellular data for a customer in the right conditions, with the right phone near the right cell site. But adding four existing cellular signals together will not get close to a gigabit of speed.

It will be interesting to see how the cellular companies market this upgrade. They could call this gigabit LTE, although the speeds are likely to fall far short of a gigabit. They could also market this as 5G, and my bet is that at least a few of them will. I recall back at the introduction of 4G LTE that some carriers started marketing 3.5G as 4G, well before there were any actual 4G deployments. There has been so much buzz about 5G now for a year that the marketing departments at the cellular companies are going to want to tout that their networks are the fastest.

It’s an open question when we are going to start hearing about this. Cellular companies run a risk in touting a new technology if most bandwidth-hungry users can’t yet utilize it. One would think they will want to upgrade some critical mass of cell sites before really pushing this.

It’s also going to be interesting to see how faster cellphone speeds affect the way people use broadband. Today it’s miserable to surf the web on a cellphone. In a city environment most connections are faster than 10 Mbps, but they don’t feel that fast because of shortfalls in the cellphone operating systems. Unless those operating systems get faster, there might not be much noticeable difference with a faster connection.

Cellphones today are already capable of streaming a single video stream, although with more bandwidth the streaming will get more reliable and will work under more adverse conditions.

The main impediment to faster cellphones really changing user habits is the data plans of the cellular carriers. Most ‘unlimited’ plans have major restrictions on using a cellphone to tether data for other devices. It’s that tethering that could make cellular data a realistic substitute for a home landline connection. My guess is that until there are ubiquitous mini-cell sites spread everywhere, the cellular carriers are not going to let users treat cellular data the same as landline data. Until cellphones are allowed to utilize the broadband available to them, faster cellular data speeds might not have much impact on the way we use our cellphones.

Walking Away from Cable

Several different events in the last week got me thinking about an interesting trend in the cable industry. First, in my community there is a Redbox outlet in a neighborhood grocery store. My wife and I were discussing how busy they seem to be in renting out movie DVDs. All of the Blockbuster and other movie rental outlets have closed. Until I moved to this neighborhood recently I hadn’t noticed any video stores or related outlets in a long time. But this Redbox seemed to have a lot of business.

I also saw an article in FierceCable that noted that only 5% of US households have subscribed to a vMVPD – an online cable provider like Sling TV, DirecTV Now or Playstation Vue. My first thought is that a 5% market penetration seems pretty phenomenal for an industry that is barely two years old. But the article notes that while 5% of households are current subscribers to online programming, another 8% of the market has tried and dropped one of the services. Since only about 20% of the total households don’t have traditional cable service it makes you wonder what the real upper potential for this market might be – it might be a lot smaller than the vMVPDs are hoping for.

I also went to a Superbowl party. The half dozen families attending are from my neighborhood and it turns out all of the households are cord cutters and don’t subscribe to traditional cable service. I was the only one that used a vMVPD and I currently have a subscription to Playstation Vue. None of them had tried a vMVPD and they seemed to have no interest in doing so. (I only use Playstation Vue because it’s the cheapest way to get Big10 sports and Fox Sportsnet so I can watch Maryland sports teams – I rarely watch the other linear programming).

National broadband penetration rates are now at 84% of all households. I’ve seen many of the opponents of spending money to build rural broadband say that households just want broadband to watch video. Netflix has made a huge dent in the market and served nearly 55 million US homes at the end of 2017. Add to that some percentage of the 90 million homes that subscribe to Amazon Prime, and it seems like there might be some truth in that.

But if households are cutting the cord, why aren’t more of them buying one of the on-line cable alternatives? Those services have packages that carry only the most popular cable channels at half the price of buying traditional cable.

I think the answer is a combination of two factors. The first is price. Every family at the neighborhood party has kids and they dropped traditional cable because it was too expensive. That has to be the factor that explains why the Redbox outlet is doing so well. Most of the movies available from Redbox are also available online. But getting online also means having an Internet-enabled TV or else buying a Roku or other web interface. And even then, watching many of these newer movies means subscribing to yet another online service. I think there is a cost barrier, or perhaps a technology barrier, that is keeping households using a traditional DVD player and Redbox.

Two different households at the party told me that they were satisfied with just watching Netflix and the free programming available on YouTube. And that is the second important trend. Households are getting used to watching just a subset of the programming that is available to them. When somebody drops cable TV and doesn’t buy a vMVPD service it means they have walked away from all of the content that is available only in those two media.

Most of my neighbors still watch the major networks using rabbit ears (something I don’t do). So they are still watching whatever is available on local CBS, NBC, ABC and FOX. But the families on my street are learning to live without Game of Thrones or The Walking Dead. They are no longer watching ESPN, Discovery, Comedy Central, the Food Network and the hundreds of channels that make up traditional cable TV.

This means their kids are not growing up watching traditional cable networks, and thus are not developing any brand loyalty to those networks or their programming. If you don’t learn to love a network when you are a teenager, will you decide to watch it when you are older?

I don’t have any answers to these questions, and obviously I can’t define a trend just from talking with some of my neighbors. But I found it intriguing that they all had dropped traditional cable and had not replaced that programming with something online. This tells me that there must be a lot of people who are not enamored with linear programming whether it’s on cable or online. And a lot of people are convincing themselves that it’s okay to walk completely away from the big pile of programming that is offered by the cable TV networks.

This is potentially a watershed phenomenon, because somebody that walks away from traditional programming is probably not coming back. These folks are cord cutters who are literally walking away from most of the programming available on traditional cable. Those networks and their programming are going to become irrelevant to them. But interestingly they are still going to consume a lot of video content – just not that created by the traditional cable networks. In my mind these households are looking a lot like Generation Z in that they are foregoing traditional programming and watching something else.

The vMVPDs are banking that people will transition down to their smaller packages when they leave a cable TV provider. But will they? This is a phenomenon that you can’t determine from industry-wide statistics, other than perhaps by seeing the dropping number of paid subscriptions to the various cable networks. People like my neighbors are dropping cable due to the expense, but they are quickly learning to live without traditional cable programming and aren’t chasing the online alternatives.

Millennials and Media

I’ve read a lot recently in various trade articles talking about the percentage of Millennials that are watching (or not watching) traditional TV content. The various polls and studies show that Millennials are far less interested in watching linear TV than older generations. They are far less likely to buy a traditional cable TV subscription.

Millennials are starting to have a huge impact on our society. They now make up 32% of all adults in the US. They are more educated than earlier generations and 40% of Millennials between the ages of 25 and 29 have completed a bachelor’s degree compared to 32% for Generation X and smaller numbers for Baby Boomers and the Silent Generation.

But I have rarely read anything that describes what Millennials are doing in place of watching traditional TV. We now know from a study by Nielsen that part of the answer lies in the fact that Millennials read a lot more digital content than older generations. Digital content is content generated by online sites. Nielsen says digital content is now the primary source of news, sports, fashion trends and general knowledge for this generation – to a far greater extent than for older generations.

Nielsen has begun tracking digital content and has begun to rate it much like it does television viewing. Since advertising is shifting towards the web, this tracking is of great value to potential advertisers. Historically, websites have been ranked by the raw number of ‘hits’ they receive. But the Nielsen digital tracking goes much deeper and measures the time spent at each web site – which is what advertisers want to know. Advertisers have been able to get this kind of information from huge sites like Facebook, but never for everything else on the web.
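
As a simple illustration of the difference between counting hits and measuring engagement, here is a small Python sketch. The site names and session lengths are made up for the example.

```python
# Hypothetical session data: (site, minutes spent) for individual visits.
from collections import defaultdict

sessions = [
    ("site-a.example", 0.5), ("site-a.example", 0.4), ("site-a.example", 0.3),
    ("site-b.example", 12.0), ("site-b.example", 9.5),
]

hits = defaultdict(int)
minutes = defaultdict(float)
for site, duration in sessions:
    hits[site] += 1            # the old metric: raw visits
    minutes[site] += duration  # the metric advertisers actually want

for site in sorted(hits):
    print(f"{site}: {hits[site]} hits, {minutes[site]:.1f} minutes")
# site-a 'wins' on hits, but site-b captures far more viewer attention.
```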

Here are just a few of the things that Nielsen found about Millennials and digital media:

  • In terms of volume, the leading website used by Millennials is BuzzFeed. The site reaches 83% of US Millennials each month. The content on BuzzFeed is aimed at Millennials and the average BuzzFeed viewer sees an astronomical 38 videos on the site per month. Users don’t have to go to BuzzFeed to see the content, which is widely distributed through the various social media platforms. The platform carries news, videos, quizzes and the popular Millennial food brand Tasty.
  • Just behind BuzzFeed is Group Nine Media. This company has four web brands: NowThis, The Dodo, Seeker and Thrillist. The company’s content is aimed at younger audiences and now reaches 81% of Americans in their 20s. The platform has grown quickly to 1 million minutes per month of streamed content.

Another popular digital content site is MIC. This site offers news aimed at younger viewers and reaches over 25% of people between 21 and 34 years old each month. They are now attracting over 40 million unique viewers per month. Perhaps the most interesting thing about the site to advertisers is that 56% of their viewers are female.

Refinery29 is a site aimed at young women. It’s a mix of fashion, beauty, entertainment and money news. The platform is a mix of text articles and videos and reaches 62% of women between 18 and 34 each month, but a huge 88% of women between 21 and 24. In 2017 Adweek reported that the site reached 500 million viewers worldwide.

Another content brand that caters to Millennials has an interesting distribution model. Rather than maintain a web site, VIX distributes content on social media sites like Facebook, Instagram and YouTube. The site carries video content on lifestyle tips, entertainment, food and life hacks. 62% of VIX viewers are female and VIX reaches 40% of US women between 18 and 49 each month.

All of this is bad news for companies that advertise on TV. Statistics show that linear TV audiences are aging quickly as younger viewers abandon watching real-time TV and its associated ads. Anything that is bad for TV advertisers is ultimately bad for the TV product and anybody that sells it. But the reality is that younger generations are abandoning the programming made for and watched by older generations. This is almost inevitable and is a market reality that the whole industry needs to come to grips with.

A Managed WiFi Product

A number of my clients are now selling a managed WiFi product. But the product they are offering customers under that name varies widely, and so today I thought I’d discuss a few of the different products being sold under this name.

The simplest product is one that I would call a WiFi network. Historically, ISPs that provided WiFi placed a single WiFi router near where the broadband connection terminated in the home. And it was typical to embed the WiFi functionality directly into the DSL or cable modem router. This product has been around for a while and I got my first WiFi router when Verizon supplied an all-in-one router on my FiOS connection nearly 15 years ago.

But as homes have added numerous connected WiFi devices, a single WiFi router is often inadequate. With today’s greater demand for bandwidth, a single WiFi router often can’t reach all parts of the home or connect smoothly to numerous devices. Most of my clients tell me that WiFi problems are now the biggest cause of customer dissatisfaction and in many cases have surpassed cable TV issues. Many customers supply their own WiFi routers and ISPs get frustrated when a customer’s inadequate WiFi device or poor router placement ruins a good broadband delivery to the home.

Today there are numerous brands of WiFi network devices available. These systems deploy multiple WiFi routers around the home that are connected with each other to create one ubiquitous network. The routers can be connected wirelessly in a mesh or hard-wired to a broadband connection. These devices are widely available and many customers are now installing these networks – I’ve connected an eero network in my home that has vastly improved my WiFi quality.

I have a number of clients that sell these WiFi networks. They will place the WiFi units in the home in a manner that maximizes WiFi reception. The revenue play for this product is simple equipment rental and they charge each month for the devices. ISPs generally set up the routers so that they can peer into them for troubleshooting, since customers inevitably will unplug a router, move one to a less than ideal place or place some big object near one that blocks the WiFi signal. But that’s about all that comes with the product – expert placement of routers and simple troubleshooting or replacement if there are problems.

At the other end of the spectrum are a few clients who really manage the customer WiFi experience. For example, customers can call when they buy a new WiFi device and the NOC technicians will connect the device to the network and optimize its WiFi connection. They will assign devices to different frequencies and channels to maximize the WiFi experience. These ISPs have invested in software that tracks and keeps records of all of the devices connected to the WiFi network, meaning they can see a history of the performance of each customer device over time (a simple sketch of this kind of device history follows).
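
As a minimal sketch of what that kind of tracking software might store, here is a hypothetical per-device history in Python. The field names and thresholds are my own assumptions; real systems typically pull metrics like these from the routers through a management protocol such as TR-069 or SNMP.

```python
import time
from dataclasses import dataclass, field

@dataclass
class DeviceReading:
    timestamp: float
    band: str         # "2.4GHz" or "5GHz"
    channel: int
    rssi_dbm: int     # signal strength the router sees from the device

@dataclass
class CustomerDevice:
    mac: str
    name: str
    history: list = field(default_factory=list)

    def record(self, band: str, channel: int, rssi_dbm: int) -> None:
        self.history.append(DeviceReading(time.time(), band, channel, rssi_dbm))

    def weak_signal_events(self, threshold_dbm: int = -70) -> int:
        # The kind of trend a NOC technician would check before moving a
        # device to a better channel or band.
        return sum(1 for r in self.history if r.rssi_dbm < threshold_dbm)

tv = CustomerDevice(mac="aa:bb:cc:dd:ee:ff", name="living-room TV")
tv.record("5GHz", 36, -62)
tv.record("5GHz", 36, -75)
print(tv.weak_signal_events())   # 1
```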

The ISPs monitor the WiFi performance and are usually proactive when they see problems, in the same manner that many ISPs track the performance of fiber ONTs. The WiFi network moves the ISP deeper into the customer home and allows the ISP to make certain that customers are getting the bandwidth they are paying for.

Nobody knows what to charge for this yet and I see monthly rates for managed WiFi that range from $10 to almost $25 per month. I don’t have enough experience with this yet to suggest the right price. Like any new product, success is going to be due mostly to the marketing effort expended. I have a few clients who have already gotten penetration rates of 25% or more with prices in the $15 – $20 range.

But this product isn’t for everybody. For example, I have clients that don’t want to take on the product due to the extra truck rolls. But almost all of my clients worry about eventually becoming dumb pipe providers, and the managed WiFi product provides a tangible way to maintain contact with a customer and demonstrate the ISP’s value proposition. And like with any equipment rental play the revenue stream is good. Once the cost of the hardware and initial installation has been recovered the product is almost all margin.



The Rush to vMVPDs

To those of you not familiar with the industry lingo, a vMVPD is a virtual multichannel video programming distributor, or virtual cable company. The term describes OTT providers that offer a version of the same channels offered by cable companies. This sector includes Sling TV, DirecTV Now, Playstation Vue, Hulu Live, YouTube TV and a few others. These providers stream networks on the same linear schedule as is shown on cable TV. Providers of alternative programming like Netflix or Amazon Prime are not considered vMVPDs.

Industry analysts say that the vMVPDs as a group gained over 900,000 customers in the recently ended third quarter. That is a startling number and represents almost one percent of the whole traditional cable TV market, all captured in just one quarter. We’ll have to wait a bit to see how the whole cable market performed. But we already know that Comcast lost over 150,000 cable customers for the quarter. Since they had been hanging onto cable customers better than the other cable companies I think we can expect a bloodbath.

This kind of explosive growth is perhaps the clearest harbinger of the slow death of traditional cable TV. This new industry is still less than three years old, with Sling TV having launched in February 2015. The industry started slowly and had only a few hundred thousand customers at most by the end of 2015.

But it’s now obvious that a lot of people are deciding that they don’t want to pay the big monthly bill for the giant channel line-up. The analysis from Nielsen shows that most households only watch a handful of channels. While no vMVPD is probably going to give households exactly the channels they most want to watch, they are obviously providing enough channel choices to lure people away from the cable companies.

It’s an interesting transition to watch. To some degree the programmers are contributing to their own demise. When people leave a cable line-up of 200 channels to instead watch a vMVPD line-up of fewer than 50 channels, there are obviously a lot of networks that are no longer collecting customer fees. Practically every network is bleeding customers and this shift to OTT viewing is going to kill off a lot of cable networks. I read an interview a few months ago with the head of programming at Fox who believed that his company would shut down the majority of their cable networks within a few years.

Another thing I find interesting about this shift is that the vMVPDs are not particularly easy to use. I’ve now tried four of them – Sling TV, DirecTV Now, Playstation Vue and Fubo TV, and I will get around to trying them all eventually. None of them have the ease of use of a cable settop box. You can’t just surf through channels easily to see what’s on and you have to instead navigate through menus that take several steps compared to a simple ‘channel up’ command on a cable remote.

These four services also have channel guides of a sort, but they are also cumbersome to use. I’ve found that it can easily take three or four minutes to change between two shows, and that’s when you know what you want to watch. The guides on these services are not yet friendly for looking hours or days ahead to see what you might want to watch later. And at least one of the services, Playstation Vue, is so confusing that I often get lost in its menus.

And yet nearly a million people changed to one of these services in the last quarter. The biggest appeal of these services is price, along with the total ease of subscribing and unsubscribing. After years of dealing with big cable companies I was apprehensive the first time I tried to unsubscribe from Sling TV – but it took less than a minute to do online and was not a hassle. The services differ in features like the number of people who can watch different programming at the same time on an account, but they are all becoming more user-friendly over time.

At this point AT&T might be the only company that is getting this right. The company lost 385,000 customers in the third quarter between DirecTV satellite service and U-verse. But they gained 296,000 DirecTV Now customers to make up for a lot of those losses. At this point nobody is talking about the margins on vMVPD service, but it can’t be a whole lot worse than the shrinking margins on traditional cable TV.

I believe we are seeing the future of TV in the vMVPD product. We’ll probably look back five years from now and laugh at these hard-to-use first-generation services. I’m sure that over time they will get far easier to use, and I’m getting ready to experiment with using my Amazon Echo to navigate through Playstation Vue. When it becomes simple to use vMVPDs, traditional cable TV might have become passé.

A New Vision of Economic Development


Photo by Drew C. Wilson of the Wilson Times

I attended a forum in Wilson, North Carolina last week that talked about how fiber is transforming their city. They talked about how they are trying a new model for economic development.

The traditional economic development model concentrated on searching for big piles of jobs. Communities made efforts to attract major employers and worked hard to keep companies from leaving their town. But it’s pretty obvious when looking around rural America that this model stopped working somewhere along the line. I visit far too many communities that have lost big employers and are not finding anybody to replace them. This is partly due to the overall huge decrease in US manufacturing jobs. But it is also due in part to the general decline of businesses located in smaller communities.

Wilson is a community of around 50,000. Historically the city was known as the ‘world’s greatest tobacco market’ in the 19th century, and tobacco was huge in the area until a few decades ago. Wilson was also the birthplace of BB&T bank, which is still the largest employer in the city. But as happened with many US cities, Wilson also went through a decline. Some small manufacturers closed and the tobacco business died. In a scene that is familiar across the country, the downtown business district dried up as retail moved to other places.

Wilson started its fiber optic business in 2008 under the tradename of Greenlight. They were one of the first cities in the country to offer gigabit broadband to residents. And that fiber network was the linchpin for the city in developing their new vision of economic development.

The concept behind Wilson’s vision sounds simple. They figure that the best way to attract jobs to the community is by working to make their community a place where people want to live. They want visitors to the city to like it enough that some of them will want to move there. And they figure that when they reach that goal, businesses will naturally want to locate there. So they are looking to grow their economy by concentrating on and improving the assets they already have.

Of course, this is anything but simple. Many cities have tried this and only a few have found a way to rebound from the decaying downtowns we see all over the country. Wilson is making the turn by concentrating on the downtown area. They lured the Wilson Times, a local daily newspaper, to refurbish an old building and move back into downtown. They raised the money to renovate an old theater to create the Edna Boykin Cultural Center. There is a project to build new housing downtown next to the whirligig park (the picture accompanying this blog). They attracted Peak Demand to make a $2.6 M investment to manufacture electrical components in an old tobacco processing plant. And these investments are bringing back other businesses. There are new restaurants and two brew pubs that have opened in the downtown.

Wilson is using an approach that other cities should consider. They involve all of the stakeholders in the community in the effort to improve quality of life there. That includes working with Barton College, a 1,200-student liberal arts university and nursing school. They challenged the arts community to move and grow downtown and have a thriving art scene. They put an emphasis on buying local, which we all know has a tremendous local economic multiplier effect. The various constituencies in the city meet often to discuss ways to make the city better.

But they credit the fiber network for being the change that started everything. While big companies and big employers are important to every community, Wilson understood that the work-from-home entrepreneur movement is creating a lot of jobs and a lot of wealth. And so they foster innovation in a number of different ways and strive to make small and new businesses successful.

The big shame is that the North Carolina legislature passed a law to prohibit other communities in the state from following the Wilson model. Cities are no longer allowed to become retail ISPs in North Carolina. If they build fiber it has to be operated by somebody else – and we know that is a far harder model to make work. One only has to look at what’s happening in Wilson to understand that fiber is an important component these days for economic vitality. But fiber alone is not a guarantee for economic success. It takes a community-wide effort like the one in Wilson to take advantage of what fiber offers. Wilson still has a way to go, but you can feel the excitement in the community – and that is what makes any city a place where people want to live.

5G Networks and Neighborhoods

With all of the talk about the coming 5G technology revolution I thought it might be worth taking a little time to talk about what a 5G network means for the aesthetics of neighborhoods. Just what might a street getting 5G see in new construction that is not there today?

I live in Asheville, NC and our town is hilly and has a lot of trees. Trees are a major fixture in lots of towns in America, and people plant shade trees along streets and in yards even in states where there are not many trees outside of towns.

5G is being touted as a fiber replacement, capable of delivering speeds up to a gigabit to homes and businesses. This kind of 5G (which is different from 5G cellular) is going to use the millimeter wave spectrum bands. There are a few characteristics of that spectrum that define how a 5G network must be deployed. This spectrum has extremely short wavelengths, and that means two things. First, the signal isn’t going to travel very far before it dissipates and grows too weak to deliver fast data. Second, these short wavelengths don’t penetrate anything. They won’t go through leaves, walls, or even through a person walking past the transmitter – so these frequencies require a true unimpeded line-of-sight connection.
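
To put some rough numbers on how quickly these high frequencies fade, here is a small Python sketch of free-space path loss at a few frequencies. The 100-meter distance is arbitrary, and real-world losses from foliage and walls, which this formula does not model, are far worse.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8  # speed of light in m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

for freq_ghz in (0.7, 2.5, 28.0):   # low-band LTE, mid-band, millimeter wave
    loss = free_space_path_loss_db(100, freq_ghz * 1e9)
    print(f"{freq_ghz:>5.1f} GHz over 100 m: {loss:.0f} dB path loss")
# 28 GHz loses roughly 32 dB more than 700 MHz over the same 100 meters,
# and unlike the lower bands it also needs a completely clear line of sight.
```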

These requirements are going to be problematic on the typical residential street. Go outside your own house and see if there is a perfect line-of-sight path from any one pole to your home as well as to three or four of your neighbors. The required unimpeded path means there can be no tree, shrub or other impediment between the transmitter on a pole and each home getting this service. This may not be an issue in places with few trees like Phoenix, but it sure doesn’t look very feasible on my street, where the only way to make this work would be to impose a severe tree trimming regime – something that I know most people in Asheville would resist. I would never buy this service if it meant butchering my old ornamental crepe myrtle. And tree trimming would then have to be maintained into the future to keep new growth from blocking signal paths.

Even where this can work, this is going to mean putting up some kind of small dish at each customer location in a place that has line-of-sight to the pole transmitter. This dish can’t go just anywhere on a house in the way that satellite TV dishes can often be put in places that aren’t very noticeable. While these dishes will be small, they must go where the transmitter can always see them. That’s going to create all sorts of problems if that spot is not where the existing wiring enters the home. In my home the wiring comes into the basement in the back of the house while the best line-of-sight options are in the front – and that is going to mean some costly new wiring by an ISP, which might negate the cost advantage of the 5G.

The next consideration is back-haul – how to get the broadband signals into and out of the neighborhood. Ideally this would be done with fiber. But I can’t see somebody spending the money to string fiber in a town like Asheville, or in most residential neighborhoods just to support wireless. The high cost of stringing fiber is the primary impediment today for getting a newer network into cities.

One of the primary alternatives to stringing fiber is to feed neighborhood 5G nodes with point-to-point microwave radio shots. In a neighborhood like mine these won’t be any more practical than the 5G signal paths. The solution I see being used for this kind of back-haul is to erect tall poles of 100’ to 120’ to provide a signal path over the tops of trees. I don’t think many neighborhoods are going to want to see a network of tall poles built around them. And tall poles still suffer the same line-of-sight issues. They still have to somehow beam the signal down to the 5G transmitters – and that means a lot more tree trimming.
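
Here is a quick geometry check, with illustrative heights and distances of my own choosing, of whether a tall backhaul pole really clears the trees along a path. A real microwave link also needs Fresnel-zone clearance beyond the bare line of sight, which makes the problem worse than this sketch shows.

```python
def path_height_at(dist_from_pole_ft, total_dist_ft, pole_ft, receiver_ft):
    # Height of the straight-line signal path at a point along the route.
    return pole_ft + (receiver_ft - pole_ft) * dist_from_pole_ft / total_dist_ft

pole_ft, receiver_ft, total_ft = 120, 20, 1000   # illustrative heights/distance
for tree_at_ft in (200, 600, 900):
    h = path_height_at(tree_at_ft, total_ft, pole_ft, receiver_ft)
    verdict = "clears" if h > 60 else "is blocked by"
    print(f"{tree_at_ft} ft out: path is {h:.0f} ft high and {verdict} a 60 ft tree")
# The path drops to 60 ft only 600 ft from the pole, so trees near the
# receiving end still block it; tall poles alone don't solve line of sight.
```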

All of this sounds dreadful enough, but to top it off the network I’ve described would be needed for a single wireless provider. If more than one company wants to provide wireless broadband then the number of devices multiplies accordingly. The whole promise of 5G is that it will allow for multiple new competitors, and that implies a town filled with multiple wireless devices on poles.

And with all of these physical deployment issues there is still the cost issue. I haven’t seen any numbers for the cost of the needed neighborhood transmitters that makes a compelling business case for 5G.

I’m the first one to say that I’ll never declare that something can’t work, because over time engineers might find solutions for some of these issues. But where the technology sits today, it is not going to work on the typical residential street that is full of shade trees and relatively short poles. And that means that much of the talk about gigabit 5G is hype – nobody is going to be building a 5G network in my neighborhood, for the same sorts of reasons they aren’t building fiber here.

Smart Cities and Fiber

I’ve noticed that a lot more cities are talking about becoming ‘smart cities.’ Only a few years ago this was something that only NFL cities talked about, but now I see it as a goal for much smaller cities. ‘Smart city’ is an interesting concept. If you listen to the various vendors pushing the idea this means investing in massive amounts of sensors and the computing power to make sense of them. But there are also a lot of lower-tech ideas that fit under this same umbrella.

I’ve had discussions with folks at cities who think that they need fiber in order to have a smart city. Nobody is a bigger proponent of fiber than I am, but fiber is not necessarily needed for many of the concepts that are part of this high-tech vision.

Having smarter traffic flow is generally at the top of everybody’s list. It’s common sense that having vehicles needlessly waiting for lights wastes fuel and wastes time. Smarter traffic lights in cities would improve the quality of life and the economy. A decade ago a lot of cities built fiber networks just to provide a real-time connection to each traffic signal. Those fiber networks allowed the city to change signal timing in reaction to emergencies and similar events, but the whole effort is largely still manual.

But with AI starting to become a realistic technology it looks like truly smart traffic lights are a possibility in the near future. A smart traffic system could change lights on the fly in response to real-life traffic to reduce the average time that vehicles wait for a green light. But the question that must be asked is whether this really requires fiber. A decade ago it did. Fiber was needed just to backhaul the traffic cameras that let somebody at traffic headquarters eyeball the situation at a given intersection.

But we are now seeing a revolution in sensing devices. We are not too many years removed from the big push to do all heavy computing in the cloud. A decade ago the vision was that a smart traffic light system would rely on cloud computing power. But faster computers have now reversed that trend and today it makes more sense to put smart computers at the edge of the network. In the case of traffic lights, smart computers at the edge reduce the need for bandwidth. Sensors at an intersection no longer need to broadcast non-stop and only need to relay information back to the central core when there is some reason to do so.

For example, one of the uses of a smart traffic system is to identify problem intersections. Sensors can be programmed to record every instance when somebody runs a red light or even a late yellow light and this can alert authorities to problems long before a tragic accident. But these sensors only need to send data when there is an actionable event, and even that doesn’t require a gigantic burst of data.

The same goes for smart traffic control. The brains in the device at an intersection can decide to allow for a longer green for a turn lane if there are more cars than normal waiting to turn. That doesn’t need a big continuous bandwidth connection. The city will want to gather data from intersections to know what the devices are doing, but with smart edge devices a wireless connection provides adequate broadband and a lower cost solution for data gathering.
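
Here is a minimal sketch of that edge-computing idea, with hypothetical thresholds and message fields: the controller makes timing decisions locally and only sends a small message upstream when something actionable happens.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IntersectionController:
    intersection_id: str
    normal_turn_queue: int = 4      # assumed typical queue length
    base_green_seconds: int = 20

    def plan_turn_green(self, cars_waiting: int) -> int:
        # Local decision: stretch the green when the turn lane backs up.
        extra = max(0, cars_waiting - self.normal_turn_queue) * 2
        return self.base_green_seconds + min(extra, 20)

    def report_if_actionable(self, red_light_runners: int) -> Optional[dict]:
        # Only talk to the central system when there is something to report,
        # which is why a modest wireless link is enough for data gathering.
        if red_light_runners == 0:
            return None
        return {"intersection": self.intersection_id,
                "event": "red_light_violations",
                "count": red_light_runners}

ctrl = IntersectionController("5th_and_Main")
print(ctrl.plan_turn_green(cars_waiting=9))           # 30 seconds of green
print(ctrl.report_if_actionable(red_light_runners=2))
```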

This same trend is happening with other kinds of sensors. Sensors that listen for gunshots, smart grid sensors used to monitor water and electric networks, and smart sensors used to provide smarter lighting all can be done wirelessly and do not need a fiber connection.

The real purpose behind the concept of a smart city is to provide better government service to constituents. Many of the best ideas out there don’t involve much bandwidth at all. For example, I recently watched a demo of a system in a mid-western city that allows citizens to see, in real time, the location on a map all of the snow plows and trash trucks operating in the city – much like is done when you can see a Lyft ride coming to pick you up. This will drastically cut down on calls during snowstorms since citizens can see a plow making its way towards their street. (And watching the plow inch towards you on a snowy day is good entertainment!)

Cities are undertaking all sorts of other initiatives to improve quality of life. I see cities working on computer systems that put all government forms and processes online, making it easier to get a permit or to report a problem to the city. Cities are reducing pollution by passing ordinances that promote roof-top gardens, require new high-rises to be energy self-sufficient and promote safe bicycling.

There are still big corporations out pitching the expensive smart city vision. But there are now smaller boutique smart city vendors that are working toward more affordable sensors to spread around a city.

Like anyone who lives in a city I would love to see my city implement smart city ideas that improve the quality of life. But as much as I am a fiber-proponent, I am finding it hard to make a case that a lot of urban fiber is needed to implement the best smart-city ideas.

Cellular WiFi Handoffs

If you use anybody except Verizon you may have noticed that your cellphone has become adept at handing off your cellular connections to a local WiFi network. Like most people I keep my smartphone connected to WiFi when I’m at home to avoid exhausting my cellular data cap. I have AT&T cellular service and I’ve noticed over the last year that when I’m out of the house my phone often logs onto other WiFi networks. I can understand AT&T sending me to their own AT&T hotspots, but often I’m logged on to networks I can’t identify.

When I lived in Florida I was a Comcast customer, and so when I was out of the house my phone logged onto Comcast hotspots. Even today my phone still does this, even though I’m no longer a Comcast customer, and I assume there is a cookie on the phone that identifies me as a Comcast customer. I understand these logins, because after the first time I logged onto a Comcast hotspot my phone assumed that any other Comcast hotspot is an acceptable network. This is something I voluntarily signed up for.

But today I find my phone automatically logged onto a number of hotspots in airports and hotels which I definitely have not authorized. I contrast this with using my laptop in an airport or hotel. With the laptop I always have to go through some sort of greeting screen, and even if it’s a free connection I usually have to sign on to some terms of service. But my phone just automatically grabs WiFi in many airports, even those I haven’t visited in many years. I have to assume that AT&T has some sort of arrangement with these WiFi networks.

I usually notice that I’m on WiFi when my phone gets so sluggish it barely works. WiFi is still notoriously slow in crowded public places. Once I realize I’m on a WiFi network I didn’t authorize I turn the WiFi off on my phone and revert to cellular data. Every security article I’ve ever read says to be cautious when using public WiFi, and so I’d prefer not to use these connections unless I have no other option.

There was a major effort made a few years back to create a seamless WiFi network for just this purpose. The WiFi Alliance created a protocol called Hotspot 2.0 that is being marketed under the name Passpoint. The purpose of this effort was to allow cellular users to automatically connect and roam between a wide variety of hotspots without having to ever log in. Their ultimate goal was to enable WiFi calling that could hand off between hotspots in the same way that cellular phones hand off between cell sites.
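
Here is a conceptual Python sketch of that kind of automatic selection. The identifiers and credentials are made up; the point is that a Passpoint-capable phone compares what an access point advertises against credentials it already holds and joins without ever showing the user a greeting screen.

```python
# Hypothetical roaming consortium identifiers the phone holds credentials for.
stored_credentials = {"0x001122": "carrier-account", "0xaabbcc": "cable-wifi-account"}

# Hypothetical scan results: what nearby access points advertise.
nearby_aps = [
    {"ssid": "AirportFreeWiFi", "passpoint": False, "consortia": []},
    {"ssid": "CarrierHotspot",  "passpoint": True,  "consortia": ["0x001122"]},
]

def select_network(aps, credentials):
    for ap in aps:
        if not ap["passpoint"]:
            continue  # an ordinary hotspot still needs a manual login
        for oi in ap["consortia"]:
            if oi in credentials:
                return ap["ssid"], credentials[oi]
    return None

print(select_network(nearby_aps, stored_credentials))
# ('CarrierHotspot', 'carrier-account'), joined without asking the user,
# which is the behavior described above.
```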

It’s obvious that AT&T and other cellular carriers have implemented at least some aspects of Hotspot 2.0. In the original vision of Hotspot 2.0 customers were to be given the option of authorizing their participation in the Passpoint network. But AT&T has never asked my permission to log me onto WiFi hotspots (unless it was buried in my terms of service). AT&T has clearly decided that they want to use these WiFi handoffs in a busy environment like an airport to protect their cellular networks from being swamped.

It’s interesting that Verizon is not doing this. I think one reason is that they don’t want to give up control of their customers. Verizon foresees a huge future revenue stream from mining customer data and I’m guessing they don’t want their customers to be shuttled to a WiFi network controlled by somebody else, where they can’t track customer behavior. Verizon is instead pushing forward with the implementation of LTE-U, where they can direct some data traffic into the WiFi bands, but all under their own control. While LTE-U uses WiFi frequencies, it is not a hotspot technology and is as hard to intercept or hack as any other cellular traffic.

Most new cellphones now come with the Passpoint technology baked into the chipset. I think we can expect that more and more of our cellular data connections will be shuttled to hotspots without notifying us. Most people are not going to be bothered by this because it will reduce usage on their cellular data plans. I’m just not nuts about being handed off to networks without some sort of notification so that I can change my settings if I don’t want to use the selected network. I guess this is just another example of how cellular companies do what they want and don’t generally ask for customer permission.

Local Government Funding for Fiber

There is an interesting new trend where local governments act as the banker for rural broadband projects. It’s a new twist on public-private partnerships and a model that more communities should consider.

Consider these rural broadband projects in Minnesota.

  • First is RS Fiber. This is a new broadband cooperative that serves most of Sibley County and some of Renville County in Minnesota. Bonds were approved to fund 25% of a broadband project and those bonds are backed by the counties, some small cities and also by townships that are getting the fiber. The expectation is that the project will make the bond payments.
  • Next is in Swift County Minnesota. Federated Telephone Cooperative, an existing telephone company, was awarded $4.95 million to build fiber to rural homes in the county. The county approved general obligation bonds of $7.8 million to complete the project, or 60% of the funding.

Both projects are classic examples of a public private partnership. In these particular cases the company that will own and operate the network is a cooperative, but these same agreements could have been made with a for-profit telco or some other telecom provider as well.

These kinds of projects make sense for a number of reasons:

  • The process of approving bond financing is far faster than securing traditional funding for these kinds of projects.
  • Bonds for fiber can be financed over a long period of time – 20 to 30 years, while terms for commercial loans are usually shorter. Just like with a home mortgage, borrowing for a longer time period means lower annual debt payments, which is essential to make these projects financially feasible. (A quick payment comparison follows this list.)
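
To see why the term length matters so much, here is a quick amortization comparison in Python using the $7.8 million Swift County figure. The 4% interest rate and the 12-year commercial term are assumptions for illustration only.

```python
def annual_payment(principal: float, rate: float, years: int) -> float:
    """Level annual payment on a fully amortizing loan."""
    return principal * rate / (1 - (1 + rate) ** -years)

principal = 7_800_000    # bond amount from the Swift County example above
rate = 0.04              # assumed annual interest rate (illustrative)

for years in (12, 30):
    print(f"{years}-year term: ${annual_payment(principal, rate, years):,.0f} per year")
# Roughly $831,000 per year over 12 years vs. $451,000 per year over 30 years,
# a difference that can decide whether a rural project cash-flows.
```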

In both cases the Counties and other local government entities have taken on the role of banker. The local governments will have no operational role in running the fiber business (a role they did not want). The Counties expect the bond payments to be covered by the fiber project. And since these networks are being built in rural areas with few other broadband alternatives, the new fiber ventures should get high customer penetration rates. But if the ventures fail then the local governments are on the hook to cover any shortfalls in the bond payments.

These are both cases of local governments deciding that the need for rural broadband was great enough to risk taxpayer money to get this done. They also decided that the risk of not getting paid is low. The business cases show that even in the worst case the revenues from the projects should cover almost all costs, meaning that the downside risk to the Counties is minimal. In the case of RS Fiber, as a start-up new cooperative, they would not have been able to get any traditional funding without the seed money from the local governments.

This is a model that the rest of rural America should consider. Small ISPs like these cooperatives stand ready to serve a lot of rural America, but they often don’t have the financial wherewithal to do so. In these cases, a public private partnership with local government as the banker seemed to be the only way to make this happen.

Everywhere I travel in rural America homeowners and farmers want good broadband. They understand that it’s costly to build fiber to farms and small rural towns. But they also seem willing to help pay to make this work. I think if more rural counties would listen to their constituents they would take a harder look at this model.

Of course, a county needs to do their homework up front and make sure they know it’s a sound project and that the estimated cost of building the broadband network is accurate. But assuming there is a solid business plan, perhaps the most valuable role a county can tackle is that of being the banker to help new broadband builds get off the ground.