Are You Ready for 5G?

In the last month I have seen several announcements of groups claiming they will be launching 5G cellular in the next few years. For example, both South Korea and Japan have announced plans to introduce 5G before they host the Olympics in 2018 and 2020. Three of the Chinese ministries have announced plans to jointly develop 5G. And the Isle of Man says they are going to have the first 5G network (and before you laugh, they had the second LTE network in the world).

I have written before about the inflation of claims in wireless technologies, so I have to ask what these groups are talking about. Nobody in the world today is delivering wireless that comes close to meeting the 4G specification. That spec calls for the ability to deliver 100 Mbps to a moving vehicle and 1 Gbps to a stationary customer. What is being sold as 4G everywhere is significantly slower than those speeds.

For example, OpenSignal studies wireless speeds all over the world. In February 2014 they reported that US 4G networks averaged only 6.5 Mbps in the second half of 2013, down from 9.65 Mbps the year before. US speeds have rebounded some in 2014, but even the fastest 4G networks, in Australia, average only 17 – 25 Mbps. That is a long way from 1 Gbps.

Moreover, there aren’t yet any specifications or standards for 5G, so these announcements mean very little since there is no 5G specification to shoot for. The process to create a worldwide 5G standard hasn’t even begun, and the expectation is that a standard might be in place by 2020.

I am not even sure how much demand there is for faster wireless networks. It’s not coming from cellular data for smartphones. That business in the US has been growing about 20% per year, which compounds to a doubling roughly every four years, and it’s expected to stay on that pace. New demand might come from the Internet of Things, from devices that want to use bandwidth from the cellular network. IoT usage of cellular networks is new and, for example, there are utilities now using cellular bandwidth to read meters. And while industry experts expect a huge increase in this machine-to-machine traffic by 2020, I’m not sure that it needs greater speeds.
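As a sanity check on that growth math, here is a minimal sketch of compound-growth doubling time (the 20% rate is the figure quoted above; everything else is just arithmetic):

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for traffic to double at a constant compound growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# 20% per year compounds to a doubling roughly every four years.
print(round(doubling_time(0.20), 1))   # 3.8 years
print(round(1.20 ** 5, 2))             # 2.49x after five years, well past double
```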

The other thing we always have to remember about cellular traffic is that it represents only a tiny fraction of the total data used in the country today. Reports from Sandvine have shown that cellular networks carry only about 1% of the total volume of data delivered to end users in the US, and landline data usage is still growing faster than cellular data. This is probably due to the expensive data plans that cellular companies sell, which have taught customers to be frugal with smartphone data. But it’s also a function of the much slower speeds on 4G compared to many landline connections.

Another limiting factor on 4G, or 5G or any G getting faster is the way we allocate spectrum. In the US we dole out spectrum in tiny channels that were not designed to handle large data connections. Additionally, any given cell site is limited in the number of data connections that can be made at once.

So I am completely skeptical about these announcements of upcoming 5G networks. I am still waiting for a cellular company to actually meet the 4G standard – what we are calling 4G today is really a souped-up version of 3G technology. It’s very hard to foresee any breakthroughs by 2020 that will let cell sites routinely deliver the 1 Gbps that is promised by 4G. My guess is that by the time somebody actually delivers 1 Gbps to a cellphone, the breakthrough will be marketed as 10G.

I don’t think that any of the groups that are promising 5G by 2020 are anticipating any major breakthroughs in cellphone technology. Instead the industry is constantly making tweaks and adjustments that boost cell speeds a little more each time. All of these technology boosts are significant and we all benefit as the cellular network gets faster. But the constant little tweaks are playing hell with handset makers and with cellular companies trying to keep the fastest technology at all of their cell sites.

We are not really going to get a handle on this until we have fully implemented software defined networking. That is going to happen when the large cell companies migrate all of the brains in their networks to a few hub cell sites that will service all of the cellular transmitters in their network. This means putting the brains of the cellphone network into the cloud so that making an update to the hub will update all of the cell sites in the network. AT&T and Verizon are both moving in that direction, but it might be a decade until we see a fully cloud-based cellular network.

Are Computers Changing Us?

I recently read The Fourth Revolution: How the Infosphere is Reshaping Human Reality by Luciano Floridi. He is a leading figure in modern philosophy, and in this book he looks at how our relationship with information technology is changing us. Floridi believes that mankind is in the midst of profound change due to our interactions with and immersion in computer technology.

He thinks that information technology is the fourth scientific breakthrough in our history that has fundamentally changed the way that we see ourselves in relation to the universe. The first transformational scientific breakthrough was when Copernicus shook mankind out of the belief that we were the center of the universe. The second was when Darwin showed mankind that it was not the center of the animal kingdom but had evolved alongside, and was related to, all other life on earth. The third revolution started when Freud showed us that we are not even transparent to ourselves and that we have an unconscious side that is not under our direct control. The fourth big change in our perception of our role in the universe has come through the development of computers and information technology. Our relationship with computers and data has shown mankind that people are not disconnected, individual agents; with the web and computer technology we have instead become an integral part of the global information environment.

He labels any technology that enables the transmission of information an ICT (Information and Communication Technology). The first ICT was writing, but we have now become inundated by ICTs such as the Internet of Things, Web 2.0, the semantic web, cloud computing, smartphone apps, augmented reality, artificial companions, driverless cars, wearable tech, virtual learning, social media and touch screens. ICTs are changing so rapidly that today’s versions will quickly become obsolete and will be replaced by many more that we can’t even imagine. Increasing computer power, smaller chip sizes and new ways to handle big data mean that mankind is headed for a time when technology is indispensable to our lives and fully integrated into them.

His most surprising conclusion is that this new technology and our interface with it is fundamentally changing us as people. I have recently read some literature about childhood development that corroborates this concept, in that kids who are immersed in advanced technology from birth develop differently than those before them. They literally develop different neural pathways and different brain characteristics than historical mankind. He thinks we are entering an age of not only new technology, but of a new mankind.

Floridi argues that the boundaries between life online and life offline are blurring and that our kids will always be online, even when not physically connected to a computer. We already see the beginning of this in that our roles in social networks and other online activities no longer rely on us always being actively there. As computers become more and more a part of our lives we clearly will always be connected. Floridi labels this new phenomenon ‘onlife’.

Our onlife now defines a lot of our daily activities – how we shop, learn, care for our health, get entertainment, relate to other people. It affects the way we interface with the realms of law, politics, religion and finance. It even has changed the way we wage war. Floridi says that what we are experiencing as a society is more than us just using newer technologies and that the real significance is how these technologies are changing us. He says that ICTs are transforming the way that we interface with the world.

I found this book fascinating. It brings a way to understand a lot of the things we see in modern life. For instance, it gives us a way to understand why young kids seem to think differently than we do. If Floridi is right then the world is at a crucial point in its history. We still have a tiny number of primitive people on the planet who are living in pre-history. But most of the people on the planet are living in history, that is, they have the mindset that we have had for thousands of years since the advent of writing and other forms of communication. But we also now have a generation of people who are moving into hyper-history and are becoming part of the infosphere. Children growing up in the infosphere, and particularly their children, will think differently than the rest of mankind. People of my generation are users of technology, but this next mankind is immersed in technology and is a part of that technology. It’s going to be interesting to see how the world deals with a generation that is fundamentally different than the rest of mankind.

Why Isn’t There a Cable Headend in the Cloud?

I saw an article earlier this year that said that some smaller triple-play providers have decided to get out of the cable business. Specifically the article mentioned Ringgold Telephone Company in Georgia and BTC Broadband in Oklahoma. The article said that small companies have abandoned over 53,000 cable customers over the last five years, with most of those losses coming recently.

I’m not surprised by this. I have a lot of small clients in the cable business and I don’t think any of them are making money with the cable product. There are a myriad of outlays involved, such as programming, capital, technical and customer service staff, and software like middleware and encryption. And all of these costs are climbing, with programming increasing much faster than inflation. And there is pressure to keep up with the never-ending new features that come along every year, like TV everywhere or massive DVRs. I have a hard time seeing how any cable company without many thousands of customers can cover these costs.

But small cable providers are often in a bind because they operate in rural areas and compete head-to-head with a larger cable company. They feel that if they don’t offer cable they might not survive. But it is getting harder and harder for a company that doesn’t face stiff competition to justify carrying a product line that doesn’t support itself.

I’ve written several blogs talking about how software defined networking is going to change the telecom industry. It is now possible to create one cable TV headend, one cellular headend or one voice switch that can serve millions of customers. This makes me ask the question: why isn’t somebody offering cable TV from the cloud?

There are big companies that are already doing headend consolidation for their own customers. For instance, it’s reported that AT&T supports all of its cable customers from two headends. A company like AT&T could use those headends to provide wholesale cable connections to any service provider that can find a data pipe to connect to AT&T – be that a rural telephone company, a college campus or the owner of a large apartment complex.

This wholesale business model would swap the cost of owning and operating a headend for transport. A company buying wholesale cable would not need a headend, which can still cost well over a million dollars, nor technical staff to run it. In place of headend investment and expense they would pay for the bandwidth to connect to the wholesale headend.

As the price of transport continues to drop this idea becomes more and more practical. Many of my clients are already buying gigabit data backbones for less than what they paid a few years ago for 100 Mbps connections. The only drawback for some service providers is that they are too far from the primary fiber networks to be able to buy cheap bandwidth, but the wholesale model could work for anybody else with access to reasonably priced bandwidth.

The wholesale concept could be taken even further. One of the more expensive costs of providing cable service these days is set-top boxes. A normal set-top box costs over $100, one with a big DVR can cost over $300, and the average house needs two or three boxes. The cost of cloud storage has gotten so cheap that it’s now time to move the DVR function into the cloud. Rather than put an expensive box into somebody’s house to record TV shows, it makes more sense to store video in the cloud, where a terabyte of storage now costs pennies. Putting cable in the cloud also offers interesting possibilities for customers. I’ve heard that in Europe some of the cable providers give customers the ability to look back a week for all programming and watch anything that has been previously broadcast. This means that they store a rolling week of content in memory and provide DVR service of a sort to all customers.
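To get a feel for why a rolling week of catch-up TV is affordable, here is a minimal back-of-envelope sketch. The channel count, encoding rate and storage price are illustrative assumptions, not figures from any particular provider:

```python
# Back-of-envelope: one shared rolling week of catch-up TV.
# All inputs below are illustrative assumptions.
CHANNELS = 200            # assumed size of the channel lineup
HOURS = 24 * 7            # one rolling week per channel
GB_PER_HOUR = 2.5         # assumed HD encoding rate (~5.5 Mbps)
COST_PER_TB_MONTH = 30.0  # assumed cloud storage price, $/TB/month

total_tb = CHANNELS * HOURS * GB_PER_HOUR / 1000
monthly_cost = total_tb * COST_PER_TB_MONTH

print(f"Storage needed: {total_tb:,.0f} TB")   # ~84 TB
print(f"Monthly cost:  ${monthly_cost:,.0f}")  # ~$2,520
```

Since one stored copy can serve every customer on the headend, even these assumed prices work out to pennies per subscriber per month.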

The ideal cloud-based cable headend would offer line-ups made up of any mix of the channels that it carries. It would offer built-in cloud DVR storage and the middleware to use it. I think that within a decade of hitting the market such a product would eliminate the need for small headends in the country. This would shift video to become a service rather than a facility-based product.

There would still be details to work out, as there are with any wholesale product. Which party would comply with regulations? Who would hold the programming contracts? But these are fairly mundane details that can be negotiated or offered in various options.

It is my hope that some company that already owns one of the big headends sees the wisdom in such a business plan. Over a decade, anybody who does this right could probably add millions of cable lines to their headend, improving their own profitability and spreading their costs over more customers. AT&T, are you listening?

 

Predictions Ten Years Later

I often report on how industry experts see the future of our industry. It’s an interesting thought experiment, if nothing else, to speculate where technology is moving. In 2004 the Pew Internet Project asked 1,286 industry experts to look ten years forward and to predict what the Internet would be like in 2014. I found it really interesting to see that a significant percentage of experts got many of the predictions wrong. Here are some of the specific predictions made in 2004:

66% of the experts thought that there would be at least one devastating cyberattack within the following ten years. While there have been some dramatic hacks against companies, mostly to steal credit card numbers and related information, there have been no cyberattacks that could be categorized as crippling. The experts at the time predicted that terrorists would be able to take over power plants or do other drastic things that have never materialized.

56% thought that the internet would lead to a widespread expansion of home-schooling and telecommuting. There certainly has been growth in telecommuting, but not nearly to the extent predicted by the experts. It’s the same with home schooling, and while it’s grown there is not yet a huge and obvious advantage of home schooling over traditional schooling. The experts predicted that the quality and ease of distance learning would make home schooling an easy choice for parents and that has not yet materialized.

50% of them thought that there would be free peer-to-peer music sharing networks. Instead the recording industry has been very successful in shutting down peer-to-peer sites and there are instead services like Spotify that offer a huge variety of free music legally, paid for by advertising.

Only 32% thought that people would use the Internet to support their political bias and filter out information they disagree with. Studies now show that this is one of the major consequences of social networking, in that people tend to congregate with others who share their world view. This finding is related to the finding that only 39% thought that social networks would be widespread by 2014. The experts en masse did not foresee the wild success that would be enjoyed by Facebook, Twitter and other social sites.

52% said that by 2014 90% of households would have broadband that was much faster than what was available in 2004. At the end of 2013 Leichtman Research reported that 83% of homes had some sort of broadband connection. That number was lower than predicted by the majority of experts, but the average speed that people actually purchase fell even further short. Akamai reports that the average connection speed in the US at the end of 2013 was 8.7 Mbps. But speeds were not distributed in the expected bell curve; that average consists of a small percentage of homes with very fast connections (largely driven by Verizon FiOS and other fiber providers) along with many homes with speeds that are not materially faster than what was available in 2004. For example, Time Warner just announced this past week that they are finally increasing the speed of their base product from 3 Mbps to 6 Mbps.

32% thought that online voting would be secure and widespread by 2014. There are now a number of states that allow on-line voter registration, but only a tiny handful of communities have experimented with on-line voting. It has become obvious that there is a real potential for hacking and fraud with on-line voting.

57% of them thought that virtual classes would become widespread in mainstream education. This has come true only in some cases. General K-12 education has not moved to virtual classes. Many schools have adopted distance learning to bring distant teachers into the classroom, but there has been no flood of K-12 students moving to virtual education. Virtual classes, however, have become routine for many advanced degrees. For example, there are hundreds of master’s degree curriculums that are almost entirely on-line and self-paced.

But the experts did get a few things right. 59% thought that there would be a significant increase in government and business surveillance. This has turned out to be true in spades. It seems everybody is now spying on us, and not just on the Internet, but with our smartphones, with our smart TVs, and even with our cars and with the IOT devices in our homes.

Pew continues to conduct similar surveys every few years and it will be interesting to see if the experts of today can do better than the experts of 2004. What those experts failed to recognize were things like the transformational nature of smartphones, the widespread phenomenon of social networking and the migration from desktops to smaller and more mobile devices. Those trends are what drove us to where we are today. In retrospect, if more experts had foreseen those few major trends then they probably would have also guessed more of the details correctly. Within the sample there were undoubtedly some experts who guessed really well, but the results were not published by individual expert and so we can’t see who had the best crystal ball.

New Tech – November 2014 Part I

I ran across so many cool new technologies recently that I will have to stretch talking about them over two blogs this month.

Twisted Lasers. Physicists at the University of Vienna have been able to transmit a twisted laser signal through the air. This is a fairly common practice in fiber optic cables, where multiple beams of light are sent through the fiber simultaneously and twist around each other as they bounce off the walls of the fiber.

But this is the first time that this has been accomplished through the air. The specific technology involved is called orbital angular momentum (OAM) and refers to the ability to bend light. In this case the scientists were able to make light travel in a corkscrew pattern, and they were able to intermingle two different colored beams through the air for over two miles. The technology is important because it would result in the ability to pump more data through a single open-air transmission path. It also increases security, since an eavesdropper would have to untangle the multiple light paths.

Terabit Fiber. A team of scientists from Eindhoven University of Technology in the Netherlands and the University of Central Florida have developed a technology that vastly increases the bandwidth on fiber. Today the fastest commercially available fibers can transmit at 100 Gbps, but this team has demonstrated a fiber that transmits 2,550 times faster than that, or 255 Terabits per second.

They accomplished this by combining several different existing technologies. First, they used multi-core fiber. Normal long-haul fibers are single-core, meaning that each fiber offers only one path for laser signals. But the fiber they used contained seven separate ‘cores’, or available laser paths. For now, this kind of fiber is expensive, but the cost would drop through mass production.

The team of scientists also used several data transmission techniques to boost the speed even further. They leveraged a technique called spatial multiplexing (SM), where data signals from multiple sources are transmitted in parallel, which can boost the speed up to 5.1 terabits per path. This is somewhat akin to the time division multiplexing used for T1s, which opens a slot for each data bit so that everything can be packed tightly together. The team also used wavelength division multiplexing (WDM), which separates and transmits different data streams using different wavelengths of light. Together these techniques allowed them to create 50 separate paths through the fiber. This kind of breakthrough is probably a decade or more away from commercial deployment, but it lets us foresee fiber paths that can handle vast amounts of data when that is really needed, such as in undersea fiber routes and inside supercomputers.
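The arithmetic behind the headline number is simple to check; here is a quick sketch using the figures quoted above:

```python
# How the reported total decomposes (figures as quoted above).
paths = 50            # 7 cores combined with WDM yield 50 paths
tbps_per_path = 5.1   # spatially multiplexed rate per path

total_tbps = paths * tbps_per_path
print(round(total_tbps))               # 255 Tbps in total
print(round(total_tbps * 1000 / 100))  # 2,550x a 100 Gbps fiber
```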

Frozen Light. Another team of researchers, this one from Princeton, reports that they have been able to freeze light into a crystal – that is, to stop light and gather it into a crystalline form. This is the first time that anybody has been able to stop photons this way.

This was accomplished by building a structure made of superconducting materials which acted like an artificial atom. They placed the artificial atom close to a superconducting wire containing photons. By the rules of quantum mechanics, the photons on the wire inherited some of the properties of the nearby atom and they began interacting with each other, a bit like particles. So far this has only been done to create very tiny crystals. But the hope is that the technology might be used to create larger crystals which would lead to a whole new category of exotic materials that will have weird properties. And who knows what that might lead to?

Personal RFID. On a more down-to-earth note, Robert Nelson decided to implant an NFC RFID chip into his hand. After it healed he programmed it to unlock his cell phone. All he has to do is hold the phone near his hand and it unlocks. He is investigating adding more chips and is working toward things like opening the garage door with a wave of the hand or unlocking and starting his car. He sees this technology as the ultimate in personal security since only your own chip would be able to control your devices.

Smartphone Spectrometer. Finally, I saw a device that turns your cellphone into a spectrometer. A company called Public Lab has introduced a product called the Homebrew Oil Testing Kit, and the first use for this device is to find out whether your drinking water contains contaminants from fracking. It consists of a refractor that fits over the camera lens on a cellphone. The device uses a Blu-ray laser to shine light into the water sample you want to test, which creates a spectrometer image that is captured by your phone.

Of course, unless you are a chemist you don’t know how to read spectrometer images, but there is an on-line database that can be quickly used to identify any contaminants in your water. Over time the device can be used to test a far wider range of pollutants and other substances, but for now the manufacturers seem to be concentrating on the fear that many people have about fracking.

Reinvesting in Rural America

Over the last few weeks C-Spire has begun rolling out gigabit fiber in Mississippi. Unlike Google, which is mostly concentrating on large and fast-growing cities, C-Spire is rolling fiber out to small and rural towns throughout Mississippi. The C-Spire story is an amazing success. C-Spire is part of a holding company that includes a large wireless company, a large CLEC and a significant fiber network throughout the region. The company got started by the Creekmore family. Wade and Jimmy Creekmore are two of the nicest people in the telephone industry, and they started out working at the family business, which consisted of Delta and Franklin Telephone Companies, two small rural ILECs.

When the FCC distributed some of the first cellular spectrum they did so through a lottery. The company won spectrum through this lottery and started Cellular South. It’s grown to become the sixth largest cellular company in the country and many people are surprised when they visit Mississippi and find that C-Spire is more dominant there than AT&T or Verizon. The company was rebranded a few years ago as C-Spire Wireless. There used to be a number of other sizable independent wireless carriers like Alltel that have been swallowed by the two big wireless companies.

A little over a year ago C-Spire announced that it was going to roll out gigabit fiber optics to towns in the region. They modeled this after Google: towns that signed up enough potential customers qualified to get fiber. There are a number of towns that have now qualified and many others striving to get onto the C-Spire list. In the last few weeks the company began turning up gigabit services in small towns like Starkville and Ridgeland.

The company is offering 1 Gbps data service for $70 a month, a combined Internet and home phone for $90 per month, Internet and HD digital TV for $130 per month and $150 a month for the full triple play. These are bundled prices and customers who do not have C-Spire wireless will pay an additional $10 a month for each package.

It is really refreshing to see somebody investing back into the communities that supported them for many years. It’s pretty easy to contrast this to the big telcos and cable companies which are not reinvesting. C-Spire is building needed infrastructure, creating jobs and bringing a vital service. I view this as a true American success story. This is a win for both the Creekmores and for the people of Mississippi.

This is not the only place in the country where telephone company owners are reinvesting back into their community. There are hundreds of independent telephone companies and cooperatives around the country who are quietly building fiber and bringing very fast internet to some of the most rural places in the country. For example, Vermont Telephone Company grabbed headlines when they announced gigabit fiber for $35 per month. There are wide swaths of places like the Dakotas where fiber has been built to tiny towns and even to farms.

What these companies are doing is great. They are doing what businesses are expected to do, which is to modernize and grow when the opportunity is there. This is especially what regulated utilities should be doing, since they have benefitted for decades from guaranteed profits. But unfortunately most of rural America is served by AT&T, Verizon, CenturyLink and other large telephone companies like Frontier, Windstream and Fairpoint. These companies have at least one thing in common: they are public companies.

It seems like public companies in this country are unable to pull the trigger on investing in infrastructure. The exception is Verizon, which has invested many billions in FiOS, but even Verizon has stopped building new fiber, and they are not investing in small towns like Starkville. Rather than investing in rural America, the large companies are doing what they can to hold down costs there. In fact, AT&T has told the FCC that they would like to retire all of their rural copper lines within a decade and replace them mostly with cellular.

The Creekmores aren’t building fiber just because it’s the right thing to do. They are doing this because they see a solid business plan in investing in fiber. They will make money with this venture, which is the way it is supposed to work. But public companies like AT&T only seem to invest in fiber when they face a big competitive threat, as AT&T does in Austin, Texas. I get a sense that CenturyLink would build fiber if they had the financial resources, but most of the big companies are doing the opposite of reinvesting in rural places.

Unfortunately, the big companies are driven by stock prices and dividends. They don’t want to take the negative hit from making large investments because it depresses profits for a few years while they are building. And that is the real shame, because in the long run these large companies would increase profits if they reinvested the billions that they instead pay out as dividends. They would end up with fresh new networks that would make profits for the next century.

It’s going to be interesting to see how gigabit fiber transforms the small pockets of rural America that are lucky enough to get it. The broadband map in the country is a real hodgepodge because right next to some of these areas that have fiber are areas that often have no broadband at all other than cellular or satellite.

It is also going to be interesting over twenty years to see how the two different types of areas fare economically. There is a company in Minnesota, Hiawatha Broadband, that has been building fiber to small towns for a decade and they claim that every town where they have built a network has been growing while every surrounding town has shrunk in population. They have been at this about as long as anybody, and so their evidence is some of the early proof that having fiber matters. Within another decade we are going to have evidence everywhere and we will be able to compare the economic performance of rural areas with and without fiber.

Forbearance and Net Neutrality

As the FCC crawls slowly towards a decision on net neutrality, I thought it would be useful to talk a bit about forbearance. Forbearance means refraining from doing something, and all of the proposals to protect net neutrality through Title II regulation require forbearance from some of the FCC’s rules.

The FCC is somewhat unique when it comes to regulation because Congress has given them the power of forbearance, meaning the FCC can selectively decide when to apply certain laws and regulations. Most federal agencies don’t have this power. But it makes sense for the FCC since they regulate such diverse companies as cable companies, telephone companies, cellular companies, fiber networks, microwave companies and a number of other niche technologies. It’s always been obvious that rules that make sense for one of these industries might not make any sense when applied to another.

If the FCC were to put broadband providers under Title II it would mean subjecting them to all of the rules that are still in place from the Communications Act of 1934 as well as many of the rules in the Telecom Act of 1996. It is the fear of having to comply with all of these rules that is causing the harsh reaction of ISPs to the idea of being regulated. (Well, that, or just the idea of being regulated at all.)

Let’s look at one example of the kinds of rules required by the 1934 Act: tariffs. That Act requires all telephone companies to issue tariffs. People tend to think of tariffs as a price list and a description of the products offered by a telephone company. But tariffs are much more than that. Tariffs include details of the way that a carrier must interact with its customers. Tariffs define things like how much notice you have to give a customer before you can disconnect them for non-payment. Tariffs require a carrier to give notice before changing rates, meaning that rates can’t be changed on the fly but must wait for a period of time before being implemented. Tariffs also require nondiscrimination between customers, and that might be the part of tariffs that most scares ISPs, who routinely offer different deals to different customers every day.

Additionally, every state has developed specific rules for what must be contained in tariffs filed in that state. This means that a nationwide ISP would have to file a different tariff in each state and follow different rules in each state. If forbearance is not applied to these parts of Title II then ISPs would not just be regulated by the FCC, but by each of the fifty states.

There are many parts of Title II that would also not make sense to apply to ISPs. For example, there are sections of the various Acts that look at things like protecting customers from obscene phone calls or the requirement to provide operator services that obviously don’t apply to data services.

But there are other requirements that have the ISPs running scared. For example, the Telecom Act of 1996 requires the large telephone companies to unbundle their networks and to give access of their networks to competitors. And this does not just apply to telephone lines but also to DSL. There is no reason why this could not be applied to cable companies to bring competition into the data market. And there are related rules that regulate things like collocation and that require interconnections between carriers that exchange voice and data traffic.

There are yet other portions of the Title II rules where it is not clear whether forbearance ought to be applied. For example, the FCC requires jurisdictional separation of revenues and costs to determine what is under the control of the FCC versus the control of the states. Would the FCC just declare broadband to be an interstate service to keep it all under their control? That is what was done with DSL, and yet the states are still involved in many aspects of regulating DSL.

It appears to me that forbearance in this case is going to be extremely complicated. There are repercussions to deciding to forbear or not to forbear each part of the existing telecom rules. It’s a huge puzzle to solve, and I am going to guess that every decision to forbear or not forbear will present a chance for legal challenge.

But the FCC forbears things all of the time. In fact, there is a legal process that allows for carriers to ask for forbearance from a specific rule, and if the FCC does not act within a year then the forbearance is assumed to be granted.

We already know that Verizon and AT&T are threatening to sue the FCC should they try to regulate broadband under Title II. Even should the FCC be able to win such a challenge, they would have to expect a decade where ISPs are constantly asking for additional forbearance from whatever regulation the FCC chooses to apply to broadband. If nothing else, this sounds like a full employment act for telecom lawyers.

Is it Time for New Telecom Law?

A number of articles published since the recent election claim that both the House and Senate are ready to tackle telecom reform. There have been subcommittee meetings and discussions in both chambers about the topic for several years, but the idea hasn’t yet gone much past talk.

We certainly need telecom reform. After all, the Telecommunications Act of 1996 was passed in a very different time. That was the year when AOL was still the largest ISP in the US and the year new modems doubled dial-up speeds from 28 kbps to 56 kbps. 1996 was also the year that kicked off a decade of major investment by cable companies in HFC networks. There were 10 million cable modem customers by 2002, but in 1996 there were almost none. And DSL was just hitting the market in 1996 and didn’t get serious traction until a few years later.

Probably the biggest fault with the 1996 Act is that it treated telcos and cable companies very differently. The Act imposed significant unbundling requirements on the telcos while leaving the cable companies largely untouched. Perplexingly, this was done at a time when everybody in the industry knew that cable companies and HFC networks would soon become major players in both the telephone and high-speed data markets. In 1996 engineers and technologists understood that cable modems were going to soon be faster than DSL due to inherent advantages of coaxial cable over telephone cable. At the time everybody I knew just assumed that the cable companies had good lobbyists.

But the Act did impose new rules on cable companies concerning programming. And it is clear that those rules are quickly growing obsolete with the migration of video to the web. A revised Act certainly needs to address OTT video and the entire gamut of on-line video issues.

In essence the Act of 1996 created a whole lot of rules in silos – separate rules for the cable and telco industries. In doing so the Act almost completely ignored the burgeoning cellular industry. Now that people use data on every kind of network it’s time for rules that are updated to recognize the new reality.

One of the stickiest points of implementing a new telecom law is going to be net neutrality. If the FCC does not soon put this issue to bed, then net neutrality will be one of the biggest issues of a rewritten Telecom Act. Both chambers of Congress have set a goal of enacting a new Act by the end of 2016. That means that Republican lawmakers are still going to need agreement from a Democratic president to get it signed, and that means the two sides have got to somehow come together on net neutrality.

Some of the problems we have today with net neutrality come directly from the 1996 Act. In that Act Congress very clearly decided not to impose Title II regulations on the budding high-speed cable modem market. They decided to let cable companies and telcos follow different paths, with the telco path much more regulated than the cable path. We ended up with a decade that required DSL unbundling and DSL resale, but with no attempts to make cable modems open to competition.

There are some parts of the 1996 Act that have been very effective and I hope they remain intact. For example, the Act set forth the concept that any company with the technical knowledge and financial wherewithal ought to be allowed to compete in the telco market. That has given rise to thousands of CLECs and there are numerous markets with vigorous telephone competition. One can hope that a new Act will make it clear that there ought to be competition in every telecom market, and with a lot more products than just telephone.

One can also hope that a new Act will get rid of the silos. It is time to stop having different rules for different parts of the industry. For example, the distinction between traditional voice services and VoIP needs to be eliminated, since in practical terms customers can’t tell the difference. There should not be different rules concerning the provision of cable TV service by different types of providers.

Certainly one of the biggest challenges Congress will face is what to do with the web and video. Hopefully a new Act is forward looking. There is convergence everywhere in the industry and people want to partake of telecom services from any platform using a plethora of devices. The Congress needs to make it as okay to watch a movie on your cellphone as it is to watch it on your TV. The next Act needs to look at programmers as hard as it looks at service providers.

Rewriting the Telecom Act is a massive undertaking because every lobbyist from the telecom industry is going to have ideas that they think must be included. And there will be plenty of naysayers and voices from outside the industry arguing on the side of the public. The problem is that there is no one right answer. I think if you sent twenty panels of industry experts off to list what should be in a new Act, you’d get twenty different answers.

The FCC to Unbundle Fiber?

Chairman Wheeler at the FCC announced last week that he would be bringing two proposals associated with the IP Transition to the FCC meeting on November 21. The first involves some rules that will ensure that 911 continues to function on an IP network, and there ought to be no controversy with that. But his second idea is going to be very controversial: giving competitors access to RBOC fiber networks in the same manner that they have access today to the copper network.

The Chairman says that he doesn’t want customers, particularly business voice customers, to lose competitive options – and he believes that the unbundled network elements that are in place for copper today have brought competition to that market.

Let me step back and look at this idea at the big-picture level. What the Chairman is proposing is a form of arbitrage. In general, telecom arbitrage arises when regulators force an artificial price on a product or service. In this specific case, the arbitrage would come from having the FCC or state commissions define the price, terms and conditions under which a competitor gains access to a fiber network. Arbitrage is not necessarily good or bad, but if the price is set too low then there is a larger demand for the product than ought to be expected.

The industry does not have a very good history over the last two decades of dealing with arbitrage and the last mile network. There have been three times when FCC-administered arbitrage turned out badly for a lot of the industry and the public. First came the unbundled network elements on copper that the Chairman is now acknowledging – the primary one being the unbundled T1. This was incredibly popular in the late 90s and dozens of huge CLECs were funded to compete in this business. I had an office then in a business high-rise near the DC beltway, and I remember a dozen different CLECs knocking on my door trying to sell a bundled T1/data connection.

After that came UNE-P, which was a virtual unbundling of the network. With UNE-P a competitor didn’t have to collocate to get access to the RBOC copper. Instead they just bought all of the UNE elements and reconstructed a network. Finally, there was resale, which forced the RBOCs to give set discounts on many retail products. Both UNE-P and resale were mostly used to compete for residential customers, and some giant companies grew in the space. I remember Talk America, for example, which had well over a million residential customers on resale.

But for the most part all of the companies that leaped into these arbitrage situations failed. I remember well over a dozen UNE CLECs that went public with a few dozen more hoping to do so. Heck, the telecom industry was so juiced in the late 90s that there were even several telecom consulting firms that worked for the large CLECs who tried to go public. But in the end, the arbitrage opportunity became less attractive, as always happens, and all of these companies crashed and burned. The same thing happened with UNE-P and resale and all of the companies that tried to make a business using these arbitrage opportunities ultimately failed.

Arbitrage is rarely permanent, and this makes it almost impossible to build a business plan that takes long-term advantage of an arbitrage opportunity. The main reason for this is that the RBOCs are really good at resisting and challenging arbitrage. They file lawsuits and lobby, and within 5-7 years after the start of an arbitrage situation they largely get it killed, or at least weakened to the point of being useless for a competitor.

Now we are looking at a new arbitrage opportunity of allowing competitors to get access to fiber networks. I have dozens of questions about how this might work, because it’s not obvious on a physical basis how one unbundles a fiber network in the way that has been done for copper. With copper, in essence, the copper line from a customer is physically redirected to a CLEC. But that is not going to easily work for a fiber connection, particularly on a PON network where a number of customers share the same physical fiber and there is no dedicated strand to hand over.

How big this opportunity might be depends upon how the FCC implements it. For example, if they only allow fiber interconnection in places where there had once been a copper UNE connection then this is going to be very limited in scope. But it’s hard to see how they can stop there. After all, CLECs that compete using RBOC copper have always been allowed to grow, and if a competitor can’t ever add a new customer then this form of competition will be nearly worthless.

But if all of the fiber in the RBOC network comes available to competitors, then we are looking at the possibility of a whole new major push of competition. Competitors have largely been kept from the RBOC fiber network and this opens up huge market possibilities.

My advice to my clients is going to be to be cautious about leaping into this kind of opportunity. History has shown us that AT&T and Verizon will be working to kill this kind of arbitrage from the minute that it’s proposed – and so it’s likely that this will only remain lucrative for a few years before those companies squeeze off the ability to use unbundled fiber.

Don’t get me wrong. As a consultant this opens up all sorts of new work for me. But having lived through the last arbitrage trials in the industry, my alarm bells are already going off and I am going to be advising caution. If the FCC tilts the arbitrage opportunity enough in favor of the competitor then there is going to be money to be made, but I will be reminding everybody that whatever the FCC giveth they can also someday take away.

How Valuable are Your Worst Customers?

Everybody who sells cable is used to the fact that some of your customers have a hard time paying their bills. Such customers will get disconnected a few times a year, but since they really want cable they usually come back when they have enough money to pay the disconnect and reconnect fees.

There is another small percentage of your customers who are deal shoppers. They will bounce between you and your competitor and take the latest and best deal they can find. These are the customers who will call and insist on a new deal the day after their special pricing deal ends.

I’ve always wondered about how valuable such customers really are to a cable company. To some degree cable companies spend most of their advertising budget chasing these customers. You have to wonder how valuable that advertising is in a mature market where new installs come mostly from people moving back and forth between competitors.

I have clients who have decided to stop spending big money on advertising, and almost universally they lost customers over time. But that is not automatically a bad thing if what a company loses are the customers who churn, those who don’t pay their bills or customers who chase specials.

There are a few recent examples of bigger companies that have tightened credit policies and stopped chasing the marginal customer. Let’s look at how that affected them.

The first example is Cablevision. During the most recent quarter they implemented policies aimed at getting rid of marginal customers. They significantly tightened credit limits, stopped offering lucrative win-back incentives and eliminated any promotional pricing for customers who pay late. Just during this last quarter these policy changes cost Cablevision 56,000 video customers, 33,000 voice customers and 23,000 high-speed Internet customers.

These losses are significant for Cablevision. They lost 2% of cable customers in just the third quarter, while losses for all of 2013 were 2.7%. They lost 1% of both voice and data customers when those customer bases had been growing for the past several years. So clearly the new policies are chasing away customers at a much faster pace than historically, and in fact have turned growth in voice and data customers into losses.

The other big company that recently tightened credit policies is DirecTV. They report that they lost 28,000 customers in the third quarter of this year, and they attribute most of that to the change in credit policies. To put this into perspective, DirecTV had 20.25 million customers at the end of 2013, and in the fourth quarter of 2013 the company added 93,000 customers, in contrast to the loss in this recent quarter.

So it’s clear from these two companies and from the anecdotal evidence that I’ve gotten from my clients that having tight credit policies will shrink growth or even cause a customer loss. But does that mean it’s not worth it?

If most of the advertising budget is spent going after these marginal customers, and if promotional pricing is given to the same small percentage of customers over and over again, does it make sense to spend as much on advertising or to be aggressive with win-back programs? Additionally, my clients tell me that marginal and bad debt customers cause most of the activity and effort in their customer service groups.

Every company is different and there is no right answer for everybody. Cablevision has obviously done the math, and their rationale for the change is that they are shedding the customers with the lowest margins who also require the biggest effort to maintain. They think that in the long run they will end up a little smaller but with a more stable and profitable customer base. That’s a bet they can probably afford to make with over 2 million customers, but it’s a lot riskier for a smaller company to contemplate taking the same position.

But this is food for thought. Every company has customers that cause more effort than they are worth, and perhaps every company would be better off with tougher policies. It’s worth asking yourself if you should have win-back programs with huge discounts that try hard to never lose a customer. It’s worth asking if you really are better off keeping customers who don’t pay you three or four times a year.

These are even more important questions for fiber providers to ask. In networks where it costs $1,000 to $2,000 in sunk costs to add a new customer, a fiber provider can’t afford too many mistakes. Fiber providers ought to consider having credit checks, required deposits, install fees, term contracts or any other tool that helps to ensure that a customer stays long enough to pay for the cost of adding them. Perhaps Cablevision is right and what matters is not that you get every customer possible, but that you get the right customers.
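As a rough illustration of that payback math, here is a minimal sketch. The install cost range comes from the paragraph above; the monthly margin is an assumed figure for the example:

```python
# Rough payback math for adding a fiber customer.
def months_to_payback(install_cost: float, monthly_margin: float) -> float:
    """Months of margin needed to recover the sunk install cost."""
    return install_cost / monthly_margin

# Assuming a customer contributes $40/month after direct costs:
for install_cost in (1_000, 2_000):
    print(f"${install_cost:,} install -> "
          f"{months_to_payback(install_cost, 40.0):.0f} months to break even")
# $1,000 -> 25 months; $2,000 -> 50 months. A customer who churns or
# stops paying inside that window never covers their own cost.
```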