Thinking Exponentially

We are at an interesting point in human history where there is rapid growth in a number of different areas that are all having, or will soon have, a profound impact on society. And by rapid growth I mean exponential growth, which matters because most people assume that even fast growth is straight-line and linear.

Most things around us grow linearly, meaning they grow by a consistent amount in each period. Exponential growth instead comes from repeated multiplication, with each period’s growth building on the last. Linear growth produces a straight line, while exponential growth produces explosive growth.

An example of exponential growth is the old Chinese story about a man who did a favor for an emperor and asked to be paid in rice. He wanted one grain the first day, two grains the second day, with the amount doubling each day for a month. The emperor thought this sounded like a great idea until, a few weeks into the process, it became clear that he would soon be paying with all of the rice in China.
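To see how quickly the doubling runs away, here is a minimal sketch (my own illustration, with an assumed grain weight) that totals the rice over a 30-day month:

```python
# Rice doubling: 1 grain on day 1, 2 on day 2, doubling every day for 30 days.
grains_per_day = [2 ** (day - 1) for day in range(1, 31)]
total_grains = sum(grains_per_day)

print(f"Grains owed on day 30 alone: {grains_per_day[-1]:,}")   # 536,870,912
print(f"Total grains for the month:  {total_grains:,}")         # 1,073,741,823

# At an assumed 50,000 grains per kilogram, the month's total is roughly 21 metric tons,
# and each additional week of doubling would multiply the total by about 128.
print(f"Approximate metric tons: {total_grains / 50_000 / 1_000:.0f}")
```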

We’ve had a few examples of exponential growth in the US economy in the past. Consider the growth of televisions in households. TVs went from being in only a handful of homes in the late 1940s to practically every home in the country by the mid-1950s.

We have one clear example of exponential growth in the broadband industry: the amount of data downloaded by the average home has been doubling roughly every three years since the late 1980s. We’ve seen the result of this growth in how quickly any new broadband technology gets overwhelmed and becomes obsolete within a relatively short time after hitting the market. Consider DSL. When we all got our first 1 Mbps DSL connection it felt extravagantly fast. I remember talking about how wonderful it felt to have a T1 in my house. But that excitement faded quickly when, within a few short years, that DSL felt inadequate.

The human mind does not easily grasp the idea of exponential growth. I’ve seen this many times with network planning. Engineers will plot out expected network growth linearly and will increase the size of the data electronics on a network, only to find out, often within a very short time, that the new facilities are full and overloaded. Exponential growth almost always surprises us.
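As a simple illustration of why linear planning goes wrong, here is a sketch (the starting traffic and the linear plan are made-up numbers) comparing a linear forecast to demand that doubles every three years:

```python
# Compare a linear traffic forecast against demand that doubles every three years
# (about 26% compound growth per year).
start_gbps = 10               # hypothetical traffic on the network today
annual_linear_add = 2.6       # a generous linear plan: add 26% of today's load each year
annual_growth = 2 ** (1 / 3)  # multiplier that doubles demand every 3 years

for year in range(0, 13, 3):
    linear = start_gbps + annual_linear_add * year
    actual = start_gbps * annual_growth ** year
    print(f"Year {year:2d}: linear plan {linear:6.1f} Gbps, exponential demand {actual:6.1f} Gbps")
# By year 12 the linear plan calls for about 41 Gbps while demand has reached about 160 Gbps.
```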

We are now sitting at a time when there are a number of examples of exponential growth happening in different technology areas. Ray Kurzweil was one of the first to identify the impact of exponential growth in today’s world in his 2005 book The Singularity Is Near. In that book he discussed five paradigms in the computing world that had grown exponentially in the 20th century: electromechanical, relay, vacuum tubes, discrete transistors, and integrated circuits.

Kurzweil has made very good predictions about the last decade and has made the following predictions about the next few decades:

  • Within a decade from now solar power will generate the majority of the world’s electricity.
  • By the late 2010s, glasses will beam images directly onto the retina. Ten terabytes of computing power (roughly the same as the human brain) will cost about $1,000.
  • By the 2020s, most diseases will have been cured by nanobots in our bloodstream. Computers will easily pass the Turing test. Self-driving cars will be the norm and people won’t be allowed to drive on highways.
  • By the 2030s, virtual reality will begin to feel 100% real. We will be able to upload our mind/consciousness by the end of the decade.
  • By the 2040s, computers will be a billion times more capable than biological intelligence. Nanotech will enable us to make food out of thin air.
  • By 2045, people will be able to multiply their intelligence a billionfold by linking wirelessly from their brains to the cloud.

These predictions are all amazing and speak of a near-future world that is very different from today. But what they speak to even more is the power of exponential growth. For these predictions to be realized there needs to be continued exponential growth in the fields of computing, artificial intelligence, biological sciences, and more.

The Anti-Competitive US Marketplace

The US Council of Economic Advisers released a report on April 15th that cited examples of anticompetitive market power throughout many sectors of the US economy. The report says that in many sectors of the economy the fifty largest companies control more of the market today than they did in 1997.

The conclusion of the report is that industry concentration has an overall negative impact on the economy. The largest companies in each industry tend to erect barriers that squelch start-ups, thwart innovation and discourage real competition.

The president used this report to issue an executive order on April 30 that instructs all executive agencies and departments to submit a plan within 30 days on ways that they can promote competition. I don’t hold out a lot of hope of success for this action because we are now deep into the last year of this presidency. But it’s still refreshing to see the government acknowledge that competition is better for our economy than having an economy controlled by large companies with huge market power.

There are few industries that demonstrate the negative aspects of anti-competitive behavior better than telecom. The cellular part of the industry is probably the worst since four companies have the vast majority of customers in the country. There are not a lot of other cellular companies that own their own spectrum, and many of the competitive alternatives to the big four, like Cricket, actually ride the networks of the bigger companies and buy wholesale minutes from them.

But right behind the concentration of cellular companies are the ISPs. A handful of the largest cable companies and telcos have more than 90% of the broadband customers in the country. And even in markets where these providers overlap, the competition between these large companies can best be characterized as duopoly competition, where the companies charge roughly the same prices and don’t compete in any meaningful way.

The only broadband markets in the country that have real competition are those where some outside party has entered the market with a competing network. That might be a municipal provider or else one of the handful of commercial providers that are building competitive networks.

Earlier this week I wrote about how the largest ISPs all attack municipal competition and the reason for this is clear. They don’t want there to be success stories where it can be shown that a city was effectively able to take over a market – because such an idea could spread to a whole lot of other cities.

The report goes on to show that government has sometimes been able to curb some of the worst abuses of anti-competitive behavior. There were a few government actions touted in the report as positive steps the government has taken to promote competition. One big action in the telecom space was the blocking of the merger between AT&T and T-Mobile. Another was the FCC’s net neutrality order and its decision to place broadband under Title II regulation.

But the feds also sometimes get it wrong. Right now the FCC is using lack of competition as the motivation for ordering an opening of the set-top box market. But I talk to folks in the industry all of the time and nobody I talk to thinks that set-top boxes are much of a concern. If anything, the feeling is that new technology will naturally kill set-top boxes and eliminate the need for them.

I was relieved to see the merger between Comcast and Time Warner die, but we are still seeing consolidation in the cable industry and the largest companies are getting more powerful. There is very little positive that can be gained from the Time Warner and Charter merger. And it’s always been disturbing to see large ISPs that also own programming content. That alone gives Comcast a huge advantage over anybody that tries to compete against them.

I don’t know how anybody can undo the consolidation in this industry. The big companies have locked up the market and the cost to build new networks to compete against them is a major barrier to entry. But perhaps having new networks built by municipalities and other commercial providers will chip away at enough of the market over time to make a difference.


A New Cable Network Architecture

There seems to be constant press about the big benefits that will come when cable coaxial networks upgrade to DOCSIS 3.1. Assuming a network can meet all of the requirements for a DOCSIS 3.1 upgrade, the technology promises to allow gigabit download speeds on cable networks and to give cable companies a way to fight back against fiber networks. But the DOCSIS 3.1 upgrade is not the only technological path that can increase bandwidth on cable networks.

All of the techniques that can increase speeds have one thing in common – the network operator needs to have first freed up channels on the cable system. This is the primary reason that cable systems have converted to digital – so that they could create empty channel slots on the network that can be used for broadband instead of TV.

The newest technology that offers an alternative to DOCSIS 3.1 is being called Distributed Access Architecture (DAA). This solution moves some or all of the broadband electronics from the core headend into the field. In a traditional DOCSIS cable network the broadband paths to customers are generated by a device called a CMTS (cable modem termination system) at the core. This is basically a router that puts broadband onto the cable network and communicates with the cable modems.

In the most extreme versions of DAA the large CMTS in the headend would be replaced by numerous small neighborhood CMTS units dispersed throughout the network. In the less extreme version of DAA there would be a smaller number of CMTS units placed at existing neighborhood nodes. Both versions provide for improved broadband in the network. For example, in a traditional HFC network a large CMTS might be used to feed broadband to tens of thousands of customers. But dispersing smaller CMTS units throughout the network results in a network where fewer customers are sharing bandwidth. In fact, if the field CMTS units can be made small enough and cheap enough, a cable network could start to resemble a fiber PON network, which typically shares bandwidth among up to 32 customers.
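To see why smaller serving groups matter, here is a rough sketch; the downstream capacity figure and node sizes are my own assumptions for illustration, not numbers from CableLabs or any vendor:

```python
# Rough illustration: the same amount of downstream capacity shared by
# smaller and smaller serving groups as CMTS functions move into the field.
downstream_capacity_mbps = 5_000   # assumed usable downstream per serving group

for subscribers in (20_000, 2_000, 500, 32):
    per_sub = downstream_capacity_mbps / subscribers
    print(f"{subscribers:6,} subscribers sharing the node -> {per_sub:8.1f} Mbps each on average")
# At 32 subscribers per node the sharing starts to look like a fiber PON.
```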

There are several major advantages to the DAA approach. First, moving the CMTS into the field carries the digital signal much deeper into the network before it gets converted to analog. This reduces interference, which strengthens the signal and improves quality. Sending digital signals deeper into the network also allows support for higher orders of QAM, the modulation scheme used to squeeze more bits per hertz out of the network. Finally, the upgrade to DAA is the first step toward migrating to an all-digital network, something that is the end game for every large cable company.
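The QAM point is easy to quantify: each step up in QAM order carries more bits per symbol, so the same spectrum moves more data. A minimal sketch of that relationship, ignoring overhead and error correction:

```python
import math

# Bits carried per symbol for different QAM orders: log2(order).
for qam in (64, 256, 1024, 4096):
    bits_per_symbol = int(math.log2(qam))
    print(f"{qam:5d}-QAM -> {bits_per_symbol:2d} bits per symbol")
# Moving from 256-QAM (8 bits) to 4096-QAM (12 bits) is a 50% gain in raw
# capacity from the same spectrum, which is why a cleaner, more-digital
# plant is worth so much to a cable operator.
```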

There is going to be an interesting battle between fans of DOCSIS 3.1 and those that prefer the DAA architecture. DOCSIS 3.1 was created by CableLabs, and the large cable companies who jointly fund CableLabs tend to follow its advice on an upgrade path. Today DOCSIS 3.1 is still in first-generation deployment and is just starting to be field tested, and there is already a backlog on orders for DOCSIS 3.1 core routers. This opens the door for the half dozen vendors that have developed a DAA solution as an alternative.

While CableLabs didn’t invent DAA, they have blessed three different variations of network design for the technology. The technology has already been trialed in Europe and the Far East and is now becoming available in the US. It’s been rumored that at least one large US cable company is running a trial of the equipment, but there doesn’t seem to be any press on this.

Cable networks are interesting in that you can devise a number of different migration paths to get to an all-digital network. But in this industry the path that is chosen by the largest cable companies tends to become the de facto standard for everybody else. As the large companies buy a given solution the hardware costs drop and the bugs get worked out. As attractive as DAA is, I suspect that as Comcast and others choose the DOCSIS 3.1 path, it will become the path of choice for most cable companies.

Solving the Urban Digital Divide

I can remember talking about the digital divide twenty years ago. At that time the main issue was getting computers to low-income households so that they could buy DSL. There were some fairly successful programs around, mostly run by volunteers or with grant funding, that tried to make this work.

Now, twenty years later, most cities I visit are still trying to solve the digital divide. But today it’s a different divide, and the urban divide is now mostly one of affordability. There are isolated pockets in many cities that don’t have broadband, but the vast majority of people in urban areas have physical access to it. Yet numerous surveys show that somewhere between 10% and 20% of households in most cities say that they can’t afford broadband.

In the last twenty years broadband has gotten a lot more expensive. And I think we are headed for a time when it’s going to become even more expensive. The big telcos and cable companies are now looking for broadband to be their major source of revenue growth. The cable companies added over 3 million new broadband customers last year and are expected to do so again this year. But you don’t have to look very far into the future to foresee the time when growth will be slow for every ISP. They will be forced to raise broadband rates to meet Wall Street earnings expectations.

There are some cities that have built their own networks – cable HFC networks or fiber networks – but even these cities have not done a very good job of providing broadband to all of their low-income households. It’s expensive to build the last mile and particularly expensive to connect homes to a fiber system.

There are some solutions that can solve part of the problem:

  • There are a number of cities that have built broadband to, or purchase broadband for, public housing projects. But generally this only covers a small percentage of the households that need broadband.
  • There are some large ISPs that bring broadband to public housing. I recall seeing announcements recently where both Google and AT&T have brought broadband to public housing in one or two cities, and of course they crowed loudly about it. And while these gestures are nice, they solve a tiny slice of the problem.
  • There are cities that have tried to build ubiquitous outdoor WiFi. But these networks are expensive to build and the technology doesn’t seem to last for many years. I know of a number of these networks that have been discontinued in the past.
  • There are also cities experimenting with trying to beam WiFi into low income homes, but this is even more expensive than building outdoor WiFi.
  • Communities everywhere have put broadband into libraries, figuring that having a place for people to get access to broadband is better than nothing at all.

But I see that the digital divide topic is back in vogue and a lot of cities are again discussing how to bring broadband to where people need it. There was a time when broadband was something that was nice to have, but today it is becoming a necessity for most people, and not having affordable broadband puts people at a major disadvantage. There are a lot of people today who use their smartphone for Internet access. This works for a lot of purposes, but it can quickly get dreadfully expensive if you actually use the broadband much.

I don’t have a solution. I was just in a city last week that owns its own cable network, and I reminded them that using that network is by far the most cost-effective way to solve the digital divide. This city was extremely interested in the new federal lifeline program for data, and that might be enough of an incentive for them to develop a lifeline product that can be afforded by a lot of the households in their city.

When I look around at the number of households that want broadband and the numbers that will be eligible for the federal program I wonder if the USF Lifeline Fund is large enough to help everybody who needs it. I saw that Congress is already trying to cap this fund, but if we want to get broadband everywhere then the USF fund might be a powerful tool for getting broadband into a lot more homes.

Squelching Competition

I doubt that many people outside of the broadband industry understand how good the large incumbent cable companies and telcos are at squelching competition. They have a whole arsenal of strategies to make it hard for somebody small to compete against them.

There is no better example of this than the very successful war that these big companies have waged against municipal broadband. I’m not entirely sure why they have singled out municipalities other than that they are somewhat of an easy target. Certainly the actions of the incumbents against cities are far out of proportion to the actual threat. On a nationwide basis the amount of competition from municipalities is minuscule. Following are just a few of the strategies they have used against municipalities:

Creating Laws to Prohibit or Curtail Muni Competition. There are now 23 states that have some kind of prohibition or major restriction against municipal broadband competition. Many of these laws were written by the incumbents and have been passed due to heavy lobbying and political contributions from the incumbent providers.

Every year there are attempts to pass new restrictions, often using template language developed by ALEC (the American Legislative Exchange Council). This group drafts model statutes that benefit its corporate sponsors, and in the broadband area it seems to focus on municipal competition. The FCC recently overturned restrictive anti-muni state laws in Tennessee and North Carolina. The incumbents urged those states to appeal the FCC order, and even before the appeals have been decided in court those states are trying to pass new laws to replace the ones that were overturned.

Lawsuits or Threats of Lawsuits. I can’t recall a lawsuit where the incumbents have been successful in keeping a city out of the broadband business. But lawsuits are great delaying tactics and can cost a lot of money to defend. Lafayette, Louisiana, for example, was sued several times before it was able to float bonds to build its broadband network. Lexington, Kentucky was recently sued even though it hadn’t yet found a broadband partner. But lawsuits are a very effective threat, and I know of cities that have decided not to tackle broadband for fear of facing costly suits.

Constant Bad Press. The incumbents sponsor a never-ending barrage of whitepapers and policy research papers arguing that municipal broadband does not work and is a failure. I can think of half a dozen such papers that have attacked Lafayette since it has been in business.

The problem with these papers is that they are full of lies and inaccuracies. While there have been a few municipal failures with broadband networks, most of the operating muni networks are happy with their results and are covering their costs of operations while bringing true competition to their community. The main purpose of these papers is to persuade politicians that muni broadband is a failure – even when it is not.

The anti-competitive tactics aren’t just used against municipalities. Any CLEC that has to interface with a large telco network can recite a long string of stories about how difficult the big companies make it for them to operate. Something as simple as ordering a new circuit or connecting with incumbents can take forever and can cost far more than it’s worth. The big telcos began almost immediately after the Telecommunications Act of 1996 to make the prescribed processes as non-functional as possible. The intransigence of the large telcos contributed to driving some CLECs out of business.

Even very large competitors are not immune to the delays that the big monopolies can throw at them. The issues that Google is having in California in getting access to poles show that even big commercial companies are subject to the tactics the telcos and the cable companies use to keep newcomers out of their markets for as long as possible.

It’s hard and expensive to fight the incumbents. This short blog barely touches the many tactics they use to thwart competition. They have flocks of lobbyists in DC and at the state level. They make big political contributions and have many allies from statehouses down to city councils. And they don’t seem to hesitate to boldly lie if that might convince a politician to vote their way. There is no question that they are a formidable foe and have proven very good at protecting their monopoly and duopoly markets.

Google Looking at Wireless Drops

In an interview with Re/code, Craig Barrett, the CEO of Access for Alphabet, said that Google is looking at wireless last-mile technologies. Google is not the only one looking at this. The founder of Aereo has announced a new wireless initiative to launch this summer in Boston under the brand name Starry. And Facebook says it is also investigating the technology.

The concept is not new. I remember visiting an engineer in Leesburg, Virginia back in the 90s who had developed a wireless local loop technology. He had working prototypes that could beam a big data pipe for the time (I’m fuzzily remembering a hundred Mbps back when DSL was still delivering 1 Mbps). His technology was premature in that there wasn’t any good technology at the time for bringing fast broadband to the curb.

As usual there will be those that jump all over this news and declare that we no longer need to build fiber. But even if one of these companies develops and perfects the best imaginable wireless technology, there is still going to have to be a lot of fiber built. All of these new attempts to develop wireless last-mile technologies share a few common traits that are dictated by the nature of wireless spectrum.

First, to get the kind of big bandwidth that Google wants to deliver, the transmitter and the customer have to be fairly close together. Starry is talking about a quarter-mile delivery distance. One characteristic of any wireless signal is that it weakens with distance. And the higher the frequency of the spectrum used, the faster the signal deteriorates.

Second, unless there is some amazing breakthrough, a given transmitter will have a fixed and limited number of possible paths that can be established to customers. This characteristic makes it very difficult to connect to a lot of customers in a densely populated area and is one of the reasons that wireless today is more often used in less densely populated places.

Third, the connection for this kind of point-to-multipoint network must be line of sight. In an urban environment every building creates a radio ‘shadow’ and blocks access to customers sitting behind that building. This can be overcome to a small degree with technologies that bounce the signal from one customer to another, but such retransmission cuts both the strength of the signal and the associated bandwidth.
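The distance and frequency effects show up directly in the standard free-space path loss formula. This sketch ignores rain, foliage and building losses, and the frequency choices are just examples, but it shows why high-frequency signals fade so fast:

```python
import math

def free_space_path_loss_db(distance_km: float, frequency_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(frequency_mhz) + 32.44

# Compare a low cellular band against an assumed millimeter-wave band at two distances.
for freq_mhz in (700, 28_000):          # 700 MHz vs 28 GHz
    for dist_km in (0.4, 1.6):          # roughly a quarter mile and a mile
        loss = free_space_path_loss_db(dist_km, freq_mhz)
        print(f"{freq_mhz / 1000:5.1f} GHz at {dist_km:.1f} km -> {loss:6.1f} dB path loss")
# Every doubling of distance adds about 6 dB of loss, and the jump from 700 MHz
# to 28 GHz adds about 32 dB before real-world obstructions are even considered.
```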

However, Google has already recognized that there are a lot of people unwilling or unable to buy a gigabit of bandwidth from them on fiber. In Atlanta the company is not just selling a gigabit connection but is also hitting the street with a 100 Mbps connection for $50. A good wireless system with access to the right kind of spectrum could deliver that kind of bandwidth to a fairly reasonable number of customers around a given transmitter. But it would be technically challenging to do the same with gigabit bandwidth unless each transmitter served fewer customers (and was even closer to the customer). A gigabit wireless network would start looking a lot like the one I saw years ago in Virginia where there was a transmitter for just a few nearby customers – essentially fiber to the curb with gigabit wireless local loops.
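Here is the kind of back-of-the-envelope math involved; every number below is an assumption for illustration rather than a Starry or Google figure:

```python
# Hypothetical transmitter capacity and oversubscription, for illustration only.
transmitter_capacity_mbps = 4_000   # assumed total usable capacity per transmitter
oversubscription = 4                # assume 4:1 sharing, since not everyone peaks at once

for tier_mbps in (100, 1_000):
    customers = transmitter_capacity_mbps * oversubscription // tier_mbps
    print(f"{tier_mbps:5d} Mbps tier -> roughly {customers} customers per transmitter")
# With these assumptions a transmitter supports ~160 customers at 100 Mbps but only
# ~16 at a gigabit, which is why gigabit wireless starts to look like fiber to the
# curb with one radio per small cluster of homes.
```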

But if Starry can do what they are shooting for, the delivery of a few hundred Mbps of bandwidth at an affordable price will be very welcome today and would provide real competition to the cable companies that have monopolies in most urban neighborhoods. But, and here is where many might disagree with me, the time is going to come in a decade or two when 200 Mbps of bandwidth will be just as obsolete as first-generation DSL has become in the twenty years since it was developed.

Over the next twenty years we can expect the full development of virtual and augmented reality so that real telepresence is available – holographic images of people and places brought to the home. This kind of technology will require the kind of bandwidth that only fiber can deliver. I think we’ll start seeing this just a few years from now. I can already imagine a group of teenagers gathering at one home, each with their own headset to play virtual reality games with people somewhere else. That application will very easily require a gigabit pipe just a few years from now.

I welcome the idea of the wireless last mile if it serves to break the cable monopoly and bring some real price competition into broadband. It’s a lot less appealing if the wireless companies decide instead to charge the same high prices as the incumbents. It sounds like the connections that Starry is shooting for are going to be fast by today’s standards, but I’m betting that within a few decades the technology will fall by the wayside – like every technology that doesn’t bring a fast wire to the home.

IoT as a New Product Line

Last week Google and Nest announced that they were discontinuing the Revolv IoT hub for the home. The hub is the smart device that sits at the core of an IoT network and is generally the device that lets a user communicate with the other devices in the network. The Revolv hub will still work for anybody that owns one, but there will be no further development on the hub and no new devices designed to work with it.

And this got me thinking about small carriers offering IoT as a product. Big companies like Comcast are now offering a home automation package. Comcast has integrated nine different devices together, ranging from security sensors to smart locks, smart lights and smart thermostats. Comcast reports that they are surpassing their early goals and have a penetration rate of over 5% of total broadband customers.

But I would think that a company as large as Comcast has developed its own proprietary IoT hub to work seamlessly with all of the various devices. Finding a reliable hub vendor and getting any hub to work with a core set of devices can be a daunting task for smaller carriers. And since there are not yet any industry standards for IoT, devices don’t automatically integrate with different brands of hubs and in many cases won’t work at all.

The real fear for a small carrier is that you’d build a product line around some specific brand of hub and that hub would either be discontinued or the company that makes it might even disappear. If you can’t trust somebody as large as Google for an IoT hub, then who can you trust in an industry that doesn’t yet have any clear dominant IoT manufacturers?

There are other issues with the IoT business plan that have to be considered. Probably the most immediate and costly issue is that supporting residential IoT means a lot of truck rolls. I’ve looked at the cost of a truck roll for some of my clients and it’s not unusual to see costs of $50 to $75 per visit, so any business plan has to compensate for a product that is going to require multiple visits to customers over time.
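A quick way to see the problem is to net truck-roll costs against the product’s margin. This is only a sketch, and every number except the truck-roll range cited above is a placeholder a carrier would replace with its own:

```python
# Placeholder economics for a residential IoT product line.
monthly_price = 20.00         # assumed retail price for the IoT package
monthly_cost_of_goods = 8.00  # assumed hub/device amortization and licensing
truck_roll_cost = 60.00       # midpoint of the $50 to $75 range cited above
truck_rolls_per_year = 2      # assumed service visits per customer per year

annual_margin = (monthly_price - monthly_cost_of_goods) * 12
annual_truck_cost = truck_roll_cost * truck_rolls_per_year
print(f"Annual gross margin per customer: ${annual_margin:7.2f}")
print(f"Annual truck-roll cost:           ${annual_truck_cost:7.2f}")
print(f"Net contribution per customer:    ${annual_margin - annual_truck_cost:7.2f}")
# With these assumptions, two truck rolls a year consume most of the margin.
```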

Another issue to consider is customer expectations. There is now a huge variety of smart devices on the market and the vast majority of them are not going to work with whatever hub you choose. I would expect that once customers have some IoT devices from an ISP that they are going to buy other devices and will be disappointed when they won’t work with the hub that they are already paying for. And it’s virtually impossible for a small ISP to integrate incompatible devices with their hub of choice.

Yet another issue that is still of concern for the whole industry is security. Smart devices tend to have very rudimentary operating software and IT experts say that hacking IoT networks is relatively easy. I don’t think many of us are too worried about somebody hacking into our smart coffee pot, but when you put your thermostat, front door locks and watering systems onto a network together there is a lot of chance for damage from malicious hacking.

But a greater security concern is that an IoT network can be a gateway to your entire network and can let in malware and other problems that can create havoc with finances and personal data stored on your computers.

There are certainly customers that will buy these services, as Comcast has demonstrated. We might be decades away from a time when there are significant penetration rates like we see with triple-play products. But there probably is an opportunity today to get a small, but potentially profitable, product out into the market. Still, the risks and costs of offering residential IoT look to be out of the comfort zone of many small ISPs. Perhaps rather than trying to offer a full suite of products like Comcast does, a more workable strategy might be to concentrate on a small handful of functions like security and smart thermostats.

Libraries in the Digital Age

Today’s blog was inspired by reading Libraries: Broadband Leaders of the 21st Century by Craig Settles, a well-known broadband advocate. As someone who hasn’t been to a library in many years, I was surprised by the number of different ways that libraries are engaged in broadband today.

Probably the best-known role of libraries is as a source of broadband for those who don’t have access anywhere else. Libraries today offer broadband at computers as well as WiFi for patrons to use on their own devices. A recent FCC report noted that in most cities anywhere from 15% to 25% of citizens don’t have broadband at home, and for many of them the library is the place they can get access to the web. This access lets kids do homework, provides job training for those looking to change careers and gives access to government websites as applications for social services increasingly move online.

But many libraries go a lot farther. For instance, there are libraries today that are lending mobile hot spots to enable people to have internet access outside the library for a few hours at a time. Many libraries are at the center of efforts to improve digital literacy and they have programs to train people in computer skills and to help them accomplish needed tasks on the web. Many library systems also have training programs in advanced computer skills like coding.

Libraries everywhere want larger, faster broadband connections. In many communities the libraries get the same speeds of broadband that are available at homes. And while a 100 Mbps connection sounds fast, when that much bandwidth is divvied up among a hundred patrons it slows to a crawl. And sadly, there are still a lot of libraries across the country that are served by only T1s or slow DSL connections.

The White House announced a goal in 2013 in the ConnectED initiative to get at least 100 Mbps connection to schools and libraries within five years, with the ultimate goal being gigabit bandwidth. And there has been a lot of progress, but the most recent FCC Broadband Progress Report says that 41% of schools and libraries still don’t have 100 Mbps connections.
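The arithmetic behind those speed goals is stark. This sketch (the patron counts are assumptions) shows the per-person share of a library connection under simultaneous use:

```python
# Per-patron share of a library connection when many people use it at once.
for connection_mbps in (100, 1_000):
    for patrons in (25, 100):
        share = connection_mbps / patrons
        print(f"{connection_mbps:5,d} Mbps split among {patrons:3d} patrons -> {share:5.1f} Mbps each")
# A 100 Mbps connection drops to 1 Mbps per person with a hundred simultaneous
# users, while a gigabit connection still leaves a usable 10 Mbps each.
```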

Libraries can get assistance to build broadband facilities using the Schools and Libraries portion of the Universal Service Fund, which is generally referred to as the E-rate program. This fund can be used to subsidize the monthly broadband bills, but it can also be used for physical parts of the network like fiber to connect library branches or WiFi systems within a library.

Some communities have been able to really leverage E-rate funding by tying their schools and libraries together into an integrated network and by using libraries to meet educational goals of the schools. It’s generally easier to get funding for schools compared to libraries, but by networking them together you can bring some of that funding in to help improve the libraries and to make them an integral part of the education complex. This leverage can be expanded to be even stronger by linking networks to hospitals and leveraging funding available to improve broadband for healthcare.

Settles makes a case for allowing libraries to participate in the upcoming Lifeline program that will provide $10 monthly subsidies for broadband for qualifying low-income homes. Since libraries are the source of broadband for many low-income people an argument can be made that spending that subsidy at a library can benefit more people than spending it at one home. It’s an interesting concept and would take action by the FCC or USAC, the entity that administers the Universal Service Fund.

Since most cities are still far away from a time when there will be affordable broadband available to everybody, the libraries are likely to continue to be an important part of the broadband solution for most communities. It’s important for library administrators to understand the options available to them to maximize the funding they can get to provide public broadband. Papers like the one written by Settles are an important step in that process.

Verizon Bringing Fiber to Boston

Every once in a while something in the industry comes as a true surprise, and that happened last week when Verizon announced that it was going to invest $300 million to build FiOS in Boston. There hasn’t been any new FiOS constructed for many years and the company had announced at the end of 2011 that it was done with FiOS expansion. Then the company went on to sell a lot of customers to Frontier, including a big chunk of the FiOS fiber network, and it looked like Verizon was inching its way out of the residential landline business.

There was never any doubt that Verizon was interested in fiber to serve large businesses and to serve its own cellular towers. And this desire was emphasized a few months ago when the company announced the intention to buy XO Communications from Carl Icahn. That will provide a vast new fiber network throughout downtowns and business districts around the country.

Verizon says that there are a few reasons it wants to build in Boston. Probably first on the list is a shift in the cellular business to add smaller neighborhood cell sites. The whole industry has started the migration from relying mostly on big cell towers to smaller cell sites dispersed where there is demand. But these mini cell sites need fiber. Expanding the FiOS network in Boston will give the company fiber everywhere in the city and give it a competitive advantage over AT&T for providing cellular data.

Verizon also says that it wants to get into the ‘smart city’ business and it views Boston as an attractive market to pursue that goal. Verizon announced a smart city initiative last October and is working on plans to build things like smart traffic grids in cities. Again, this kind of big dollar business requires fiber throughout a city.

Verizon also says that it would like to tear down all of the copper in Boston. Of course, Boston’s copper is not older or in worse shape than the copper in other east coast cities, and this justification doesn’t seem like reason enough to invest $300 million in fiber. I’m betting that management took a new look at their existing FiOS business and saw how profitable it is now that broadband penetration rates keep climbing. Broadband is a very high margin business.

It’s also my guess that Verizon might be getting more realistic about the future of its cellular business. That business has thrived for a few decades due to astronomically high prices and margins compared to the cost of providing the service. And those margins are under attack throughout the industry as alternative cellular companies offer cheaper rates. Even the new discounted rates are high margin, but they have forced Verizon and AT&T to bring their prices down out of the stratosphere. So perhaps the company is quietly going to build up the landline data business as a way to ensure future profits.

One has to wonder what this means for other east coast cities. Verizon largely built FiOS in the suburbs and to a large extent ignored the downtowns of the major northeast cities. If there was any downtown fiber built it was spotty and only to neighborhoods where the construction costs were the lowest. There certainly would be a big sigh of relief if other cities could know that they were also going to finally get a fiber network to compete with Comcast.

One thing we’ve always known about big companies is that they can change strategies at will and something they say they will never do one day can end up as a major corporate initiative a few years later. Verizon gave every sign for the last few years that it was walking away from landline networks. One has to go many pages deep into their annual reports to even see that business mentioned.

But this Boston initiative is no small deal and requires a major investment. And the reasons why this benefits Verizon are just as true for many other cities. Verizon says that one reason they are willing to do this now is that city hall in Boston was receptive to making it easier to build fiber – something that has not been true in the past. Just like many cities are bending the old rules for Google, I imagine that there are discussions going on today in many east coast cities about what they might be able to do to get Verizon fiber too.

The Growth of 4K Video

It looks like 4K video is making it into the mainstream and is going to put a big strain on broadband networks serving residential customers. 4K video resolution is 3840 x 2160 pixels, or about 8 million pixels on a screen, roughly four times the resolution of an HD display. It takes a lot of bandwidth to stream that many pixels, and with current compression technologies 4K video requires 15 – 20 Mbps download speeds. Google and others are working on better compression techniques that might cut that in half, but even so that would mean video streams at 7 – 10 Mbps. That’s a whole new level of broadband demand that will increase the household need for faster data speeds.
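It’s worth translating those stream rates into data volumes, since the strain shows up in monthly usage as well as in speed. This sketch uses the 15 – 20 Mbps figure above plus an assumed data cap purely for illustration (actual caps vary by ISP):

```python
# How much data a single 4K stream consumes at the compression rates cited above.
for stream_mbps in (15, 20):
    gb_per_hour = stream_mbps / 8 * 3600 / 1000   # Mbps -> MB/s -> GB per hour
    print(f"{stream_mbps} Mbps 4K stream -> about {gb_per_hour:.1f} GB per hour")

# Against an assumed 300 GB monthly cap, a 20 Mbps 4K stream reaches the cap
# after roughly 33 hours of viewing in a month.
cap_gb = 300
hours_to_cap = cap_gb / (20 / 8 * 3600 / 1000)
print(f"Hours of 20 Mbps 4K video to reach {cap_gb} GB: {hours_to_cap:.0f}")
```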

Just a year ago it wasn’t easy to find 4K video on the web, but this year there is a lot of content being shot in the format. This includes:

  • Netflix is currently shooting most of its original content, like House of Cards, Jessica Jones and Daredevil, in 4K, and also offers remastered series like Breaking Bad in the format. It has a big array of documentaries in 4K as well as a number of classic movies being reformatted to 4K.
  • Amazon Prime is also filming new content like Alpha House, Transparent, Mozart in the Jungle and Man in the High Castle in 4K. They have a small library of movies in the format.
  • Sony probably has more mainstream movies in the 4K format than anybody. Rather than streaming, you download Sony movies, and a typical movie can take 40 GB of storage space. It doesn’t take too many movie downloads to blow through the data caps of AT&T or Comcast.
  • M-Go has developed a small but growing 4K library in conjunction with Samsung. They will also be adding titles from Fox.
  • Comcast offers a few movies in 4K online for customers in partnership with NBC Universal.
  • YouTube has a huge amount of user-generated 4K video of all different types. YouTube is also now producing original content sold under YouTube Red, which includes 4K content.
  • Ultraflix has a big library of 4K nature documentaries including some originally produced for IMAX. They also carry a lot of Hollywood movies.
  • Vudu, which is owned by Walmart, has a small but high-quality set of 4K content. They are the first to marry 4K video to Dolby surround sound.

If 4K follows the same migration path that standard-definition video took to HD, then within a few years 4K content is going to be everywhere. Where just a few years ago there was little video on the web, video now seems to be everywhere. There are video ads on all sorts of websites, and social media services like Facebook and Twitter spit out piles of video at a user these days.

One of the biggest problems with broadband regulation in this country is that it fails to recognize the ever-growing nature of broadband demand. Once households start using 4K video, the FCC’s newly minted definition of broadband at 25 Mbps download will already be getting stressed. The fact is that household needs for broadband are just going to keep growing year after year, and any regulatory definition of demand will be obsolete almost as soon as it is established.
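A small sketch of why any fixed definition ages quickly, assuming household need keeps doubling every three years; the starting household need is an assumption for illustration:

```python
# How quickly a household's bandwidth need outgrows a fixed regulatory definition,
# assuming the need doubles every three years.
definition_mbps = 25
household_need_mbps = 20   # assumed need today for a family running a couple of 4K streams
years = 0
while household_need_mbps <= definition_mbps:
    years += 3
    household_need_mbps *= 2
print(f"Need passes the {definition_mbps} Mbps definition within {years} years "
      f"(about {household_need_mbps} Mbps by then)")
# With these assumptions the 25 Mbps definition is stressed within a single
# three-year doubling cycle.
```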

Broadband demand has been growing steadily, doubling about every three years, and there is no reason to think that we are anywhere close to the time when that growth curve is going to slow. 4K video is not the last new technology that will stretch our need for broadband. When I read about where virtual and augmented reality are headed over the next five years it’s not too hard to see where the next big push for more broadband will come from.