Wireless is Not a Substitute for Wireline

Any time there is talk about government funding for broadband, arguments arise that wireless broadband is just as good as wireline broadband. But it is not the same and is not a substitute. I love wireless broadband and it is a great complement to having a home or business broadband connection, but there are numerous reasons why wireless broadband ought not to be funded by government broadband programs.

The most recent argument for wireless broadband comes from the Minnesota House, which is currently in session. In last year’s legislative session, Minnesota approved a $20 million grant program to help expand broadband in rural areas of the state. That grant was distributed to a number of broadband projects, all wireline, each of which required a significant matching fund from the entity building the wireline facilities. The 2014 funding, which mostly went to independent telephone companies, is being used to bring broadband to thousands of rural residents as well as 150 rural businesses and 83 rural schools and libraries.

But the chairman of the House Job Growth and Energy Affordability Committee in Minnesota killed an additional state grant; it’s been left out of this year’s House budget. Rep. Pat Garofalo, R-Farmington, said that wired broadband is too costly in sparsely populated areas and believes that wireless and satellite technologies are more cost-effective.

In another case, Verizon recently got the New Jersey State Board of Public Utilities to agree that it could use LTE data plans as substitutes for homes that are losing their copper or DSL services.

Another place where this same argument is being made concerns the upcoming funding from the Connect America Fund, which is part of the federal Universal Service Fund, and that is being directed towards expanding rural broadband. As written several years ago, the Fund is allowed to consider investing in wireless as well as wireline broadband networks.

There have been numerous parties lobbying to try to get these billions directed towards landline networks and not towards wireless networks. The NTCA, which is now called the Rural Broadband Association, sponsored a report from Vantage Point Solutions that compares wireless and wireline technologies and argues that government funding should only be used to fund wireline networks. This whitepaper makes many of the same arguments I have been making for years about the topic, and includes a few I had not considered. Here are some of the major arguments made by the whitepaper:

  • Even without considering the cost of spectrum, it costs far more to build a wireless network when comparing construction cost per megabit that can be delivered to end users. Modern fiber networks rarely cost more than $10 per Mbps of capacity created, and often far less than that, while it costs several hundred dollars per effective megabit to construct a wireless network using any of the common technologies like LTE. (A back-of-the-envelope sketch after this list puts rough numbers on this bullet and the two that follow.)
  • From a physics perspective, the amount of spectrum allocated in the US is not large enough to deliver large symmetrical bandwidth, which is the goal of the National Broadband Plan. This limitation is a matter of physics and not of technology. That limitation will still be there with 5G or later wireless technology unless the FCC massively reworks the way it allows frequency to be used.
  • At least in today’s world, the prices charged to customers are drastically different for wireless and wireline data. Already today, 25% of residences are downloading more than 100 gigabytes per month in total data. That can be affordable on wireline, but almost every current wireless provider has monthly data caps that range upward from just a few gigabytes per month. A customer on a capped data plan who uses 100 gigabytes in a month would face an astronomical monthly bill.
  • The report also made the economic argument that the shelf life of wireless equipment and networks is relatively short, in the range of seven years, while fiber networks can have an incredibly long economic life. The report argues that the Connect America Fund should not be investing in technology that will be obsolete and potentially unusable just a few years after it’s built. There certainly is no guarantee that the large wireless carriers will make needed future investments once they stop getting a federal subsidy.
  • The report also made all of the normal comparisons between the two technologies in terms of operating characteristics such as available bandwidth, latency, and reliability, all of which tilt in favor of landline.
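
To put rough numbers on the first three bullets, here is a minimal back-of-the-envelope sketch in Python. The $10-per-Mbps fiber figure and the "several hundred dollars" wireless figure come from the whitepaper as summarized above; the channel width, SNR, users per sector, and the cap and overage prices are my own illustrative assumptions, not figures from the report.

```python
import math

# Cost per Mbps of delivered capacity, per the whitepaper's summary above.
fiber_cost_per_mbps = 10   # dollars; "rarely more than $10"
lte_cost_per_mbps = 300    # dollars; "several hundred" (assumed midpoint)
print(f"LTE costs ~{lte_cost_per_mbps // fiber_cost_per_mbps}x more per Mbps than fiber")

# The physics limit: Shannon capacity C = B * log2(1 + SNR) caps what any
# radio technology can deliver. Channel width and SNR are my assumptions.
bandwidth_hz = 20e6        # one 20 MHz LTE channel
snr = 100                  # roughly 20 dB
capacity_mbps = bandwidth_hz * math.log2(1 + snr) / 1e6
users_per_sector = 50      # assumed subscribers sharing one cell sector
print(f"Channel ceiling: {capacity_mbps:.0f} Mbps shared, "
      f"or {capacity_mbps / users_per_sector:.1f} Mbps per user")

# Overage math from the data-cap bullet; a 5 GB cap at $15/GB overage
# was typical of 2015 wireless plans (both figures assumed).
cap_gb, usage_gb, overage_per_gb = 5, 100, 15
print(f"Monthly overage for a 100 GB user: ${(usage_gb - cap_gb) * overage_per_gb:,}")
```

On those assumptions a 20 MHz channel tops out around 133 Mbps shared across an entire cell sector, and a 100 GB user on a capped plan would owe over $1,400 in a month, which is why caps make wireless a poor substitute for a home connection.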

I agree with this report wholeheartedly. I know that when I first read the language in the Connect America Fund my initial reaction was that the money would all go to cellular companies who would use it to build rural cell towers. But fiber technology has gotten far more efficient in just the few years since that order. Also, the wireless businesses of Verizon and AT&T are the two most profitable entities in telecom, by far, and it makes no sense to flow billions of federal dollars to them to build what they will probably build anyway with their own money.

Certainly, expanding rural LTE would get some broadband to more people, but in the long run we would be better off directing that same money to bring a permanent solution to some rural areas rather than a poor solution for all of it.

Economic Development and Fiber

One of the main reasons smaller communities give for wanting fiber networks is economic development. They believe that fiber will help them attract new jobs or keep existing jobs. There are examples where fiber networks have led to these two things directly, but it’s not always so clear cut.

I know one rural town that can attribute over 700 new jobs directly to fiber. The call centers and defense firms that came to the town said that fiber was the main reason they chose that community. And I know of another town that built fiber and was able to convince the major employer in town not to relocate elsewhere.

But economic development is a funny thing and fiber projects often don’t lead to these kinds of direct home runs — where fiber is the major reason for economic improvement. I saw an announcement this morning that shows there’s often a more tenuous line between cause and effect. The city of Gaylord, Minnesota just announced that a medical school is going to be built there. Gaylord is a small city in a rural county in Minnesota that is known more for growing Del Monte corn and producing ethanol than for attracting things like a medical school.

But Gaylord is in Sibley County which has been actively pursuing fiber for over five years. They are within months of completing financing and starting construction of a county-wide fiber network that is going to pass homes and businesses in 10 towns and the surrounding farms. Gaylord learned about the potential for attracting the medical school as part of their investigation into building fiber, and without the fiber initiative they would never have known about or pursued the medical school.

You can’t draw a direct line between the medical school and fiber because there are certainly reasons other than bandwidth why a medical school would locate in a rural location. But by the same token, the fiber is a very important part of why the area was chosen. The desire for fiber clearly shows how progressive the area is in terms of being friendly to technology. And it certainly is not going to hurt in attracting staff and students to a medical school if they know that they can get gigabit fiber at home.

Clearly, something like a medical school will be great for a small community. It will attract high-paying jobs, which is going to boost real estate and rentals for students. It’s likely to attract a new hotel, and one can imagine that there are numerous businesses in town that will do better simply because of the presence of the medical school – it’s going to be good for local banks, car dealers, restaurants, grocery stores – you name it. I would certainly put this into the category of success due to fiber because it’s clear that without the fiber initiative this probably would never have happened. But the impact of fiber on rural communities is often more subtle than this.

One thing that you don’t hear mentioned in public discussions of economic development is the phrase brain drain. But every rural town and county in the country is extremely aware of this phenomenon. Rural families almost universally lament that there are not enough jobs for their kids, and it’s routine for high school or college graduates from rural areas to relocate elsewhere to find work. Fiber can’t, by itself, solve this problem for a rural community. But it helps.

A fiber network generally creates a few new technical jobs. But more importantly, it gives people the ability to work from home. I’ve noted numerous times in this blog how poor the bandwidth is in rural America. There are many jobs today that people can do at home. My consulting company went virtual a few years ago and we all live in small towns and work from home. Bandwidth allows writers, architects, engineers, salespeople, and all sorts of employees to create a home base in a place they want to live while working from home.

So, for every success of a town getting something amazing like a new medical school, there are hundreds of small successes in the form of people using bandwidth as the tool needed to live where they choose rather than have to move to a big city to find work. The income and tax base that such people bring to the rural communities that have bandwidth probably has an overall larger impact than the occasional home run. It’s just very hard to measure because it happens under the radar.

. . . In Which the Blogger Gets to Play Lawyer

Now that numerous lawsuits are being filed against the net neutrality order, I thought I would give my own take on some of the issues being raised by the various lawsuits. I’m certainly no lawyer, but I have been following the FCC closely since 1980 and have seen how numerous challenges to their regulations have gone over the years. So following is my take on the major arguments being made against the order.

Procedural Issues: There are two procedural problems that could be troublesome for the FCC. First, the final order bears almost no resemblance to the original proposal that the Commission floated when seeking public comments. Even though it took a few hundred pages to explain it, the final net neutrality order boils down to a handful of actual new principles defining the way net neutrality is going to work.

The problem is that those few provisions include new ideas and terminology like prohibiting paid prioritization and not allowing throttling of web traffic. Those concepts were not included explicitly in the original notice. The FCC and other similar agencies have a long history of adhering to specific procedures for making new rules, and that process involves presenting the proposed ideas to the public and then soliciting comments.

I can see a valid argument that there was not adequate opportunity to discuss what was actually ordered. Of course, if that’s the only thing that the courts find wrong about the order, it’s a pretty easy fix, and at worst the FCC would have to go through another public notice and comment period.

More troubling is that at the last minute the FCC tossed in the idea of regulating Internet interconnection agreements and peering arrangements. That was a surprise to most of the industry and even made a few proponents come out against the final order. That is a major change in FCC authority and I can’t find any prior notice of the FCC’s intent to do this to the extent contained in the order.

Reversing Major Precedents: Generally, agencies like the FCC must rely on the intent of Congress and existing legislation in creating their overall framework of rules. In this order the FCC reversed a lot of prior work it did on the topic of regulating the Internet. Obviously federal agencies are allowed to change their interpretation of the law, but in this case there is a mountain of prior legal opinions from FCC lawyers saying that Congress did not intend to treat the Internet as a public utility under the 1996 Telecommunications Act.

That makes it hard, under external scrutiny from courts, for the FCC to now reverse itself and say that the Internet is a utility and should be regulated as such. The net neutrality order spends a whole lot of words trying to defend its change in direction. And rightfully so, because I would guess that this is going to be at the crux of any court review of the FCC’s authority to make this new set of rules.

Use of Forbearance to Modernize Regulatory Law: I thought the way that the FCC chose to implement net neutrality was clever, by bringing it under Title II rules but then forbearing from the rules it chooses not to apply. But that cleverness is a point of legal attack. Obviously choosing which parts of Title II to forbear from was somewhat arbitrary. But worse, if forbearance is an acceptable method of regulating the Internet, there would be nothing to stop the FCC in the future from changing the list of things they are forbearing from.

Regulating the Internet is a major new undertaking, and one would normally expect an agency to publish a new coherent set of rules laying forth how they will accomplish such a new undertaking. They would publish the gist of the new rules and ask for comments. In this case, by choosing forbearance, there was no public discussion of which Title II rules should or should not apply, nor is there any particular reason to think that the forbearance choices made are permanent and cast in stone.

Applying the Rules to Small ISPs: One of the lawsuits attacks the FCC for applying the network neutrality rules to all ISPs, including small ones that don’t have any market power. Certainly small ISPs, on their own, cannot undertake the large deals that would give priority to some content over others. But this doesn’t mean that small ISPs can’t be bad actors. It certainly is within the ability of small ISPs to block access to content that they find unacceptable, even if such content is lawful. Further, small ISPs often work as part of larger consortiums that might have the market power to undertake arrangements that would violate net neutrality. For example, there are several large ISP clearinghouses in the country that provide the servers, software, and ISP functions for millions of end users.

The courts have an interesting challenge with this order. To some degree, in the FCC’s defense, they undertook regulating the Internet in a manner that the last court review suggested they should consider. But the way they went about it is unorthodox, and in regulatory law that always creates a challenge.

Latest on Security Breaches

In one of the more interesting reads this year, Verizon recently released its 2015 Data Breach Investigative Report, which can be downloaded at this link. Verizon works with seventy security firms from around the world to compile and document major security breaches. This report is fascinating and provides both the big picture of how the bad guys are attacking us, as well as interesting statistics about the details of the attacks. I highly recommend the report if you have a spare hour.

The report looks at nearly 80,000 security incidents including 2,122 confirmed security breaches in the last year, many of which hit the news. One thing that Verizon saw was that almost all of those breaches (96%) were the result of nine different types of attacks used by hackers. Those nine types of attacks are: point-of-sale intrusions, payment card skimmers, crimeware, web app attacks, denial-of-service attacks, physical theft, insider misuse, cyber-espionage, and miscellaneous errors.

The most common external cause of major breaches last year was attacks on web applications (via things like phishing and malware), which caused 458 breaches. This was followed by attacks on point-of-sale systems in stores, which caused 419 breaches, and attacks by state-sponsored espionage units, which accounted for 290 breaches.

Some of the statistics in the report are really interesting:

  • A little more than 20% of breaches come from inside an organization where an employee or trusted contractor steals credit card numbers or corporate secrets. This percentage has remained consistent since 2010.
  • In 2010, over 95% of attacks came from compromised credentials (somebody stealing login information from employees and using it to gain entry to systems) or spyware of some sort. By 2014, the threat from direct spyware had largely disappeared as a corporate threat and companies had gotten good at combatting common malware from the web. But the bad guys have changed tactics and the two new major malware threats are RAM scraping and phishing. (RAM scraping uses malware to steal unencrypted credit card data in the few milliseconds between the time a credit card is swiped at a retail location and the time the data is encrypted.)
  • Verizon’s study shows that 23% of recipient employees in businesses open phishing messages and 11% click on the infected attachments. Nearly 50% of phishing emails are opened within the first hour of receipt. The three big groups within companies that fall prey to phishing are communications, legal, and customer service. Unfortunately it often takes only one successful phish to infect a network (the arithmetic sketch after this list shows why).
  • Hackers are really good at what they do and in 60% of the breaches they were inside company systems within minutes of the onset of the attack.
  • Sadly, almost all of the exploited vulnerabilities happen after the industry as a whole has found a way to block or patch against the threat, with many of these breaches coming a year or more after a patch was created. There is obviously a big gap between the fixes being developed by security experts and the time it takes to get these fixes into business systems.
  • The shelf life of the vast majority of malware is about a month. Within that time a way to block the malware is developed and distributed to the companies that scrub web traffic on the Internet before it gets to end users. But there are always tons of new malware, and it’s a constant battle between hackers and security companies.
  • There are still very few effective hacks against cell phones. Verizon estimates that only 0.03% of cellphones are infected with truly malicious software.
  • There is a big difference in the amount of malware aimed at different industries. For instance, the average financial institution sees 350 malware attempts per day, the average retail location sees 801 and the average education location sees 2,332. A lot of malware is very specific to an industry or even to a specific location.
  • The industry has touted the cost to a business for a compromised record at $0.58 per record. This was calculated by looking at insurance claims and is conservative since very large companies often self-insure. Verizon estimates that the true cost to a business is between $52 and $87 per compromised record. The bigger the breach, the larger the cost per compromised record.
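
To see why it takes only one successful phish, here is a minimal sketch built on the report's 11% click rate. Treating employees as independent is my simplifying assumption, not something the report claims.

```python
# Chance that at least one of n targeted employees clicks a phishing
# attachment, using the DBIR's 11% click rate and assuming (simplistically)
# that employees behave independently.
click_rate = 0.11

for n in (10, 25, 50, 100):
    p_entry = 1 - (1 - click_rate) ** n
    print(f"{n:>3} targeted employees -> {p_entry:.0%} chance someone clicks")
```

Even a modest campaign against a few dozen inboxes is close to a sure thing for the attacker, which fits the report's finding that hackers are often inside systems within minutes.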

The main thing I get from this report is a reminder each year of how many bad guys there are in the world trying to steal credit card numbers, corporate data, and other valuable information. It’s also interesting to see over time how the methods of attacking networks change in the never-ending cat and mouse game between hackers and security systems. It’s also interesting to look through the list of the companies who participate in this report since they are the Who’s Who of Internet security around the world.

Broadband and Real Estate

By now many of you have probably seen the articles about a guy, Seth, who bought a home in Kitsap County, Washington only to find out that it didn’t have broadband. Seth works from home and needs broadband access. He did his homework first and was told by employees at both the Comcast and Xfinity phone numbers that the address had service previously and that he would be able to get broadband there. Here is Seth’s blog, and as someone who works at home I can certainly feel his pain.

If you work at home then having broadband is no small matter – it’s your lifeline. This is what I find so dreadful about the thousands of rural communities with little or no broadband. The people in those places do not have the same opportunities as the rest of us. It would be an inconvenience to not be able to watch streaming video, but it would be economically devastating if you couldn’t take a good-paying job because you don’t have broadband.

In this case I hope Seth knows a good lawyer, because Comcast directly caused him great financial harm. Multiple Comcast employees told him that the house had service in the past and that he could get broadband there, which turned out to be untrue. Instead, the home had never been served by Comcast and they were going to have to build cable to serve it. As anybody who has ever tried to get Comcast to build cable strand knows, it’s like trying to get water to run uphill.

I have my own similar Comcast story with a happier ending. When I moved to my house in Florida I knew Comcast was all over the neighborhood and my new house even had a Comcast pedestal in the driveway. But it took what felt like 40 calls to Comcast to get them to come out and give me a 40-foot drop wire. We started out with them not knowing if they served my neighborhood, until finally they decided to charge me $150 to verify that I could get service. Even with that it took me over a month from the first call until I had working broadband – and a lot of people are not willing to suffer through that ordeal. I know it soured me on Comcast, and no matter what good they ever do for me I will always have in the back of my mind how I practically had to threaten them to get service.

Over a decade ago when I moved to the Virgin Islands, the first thing on my ‘must have’ list was broadband. Every real estate agent there lied to me and told me that the house I wanted could get DSL from the local telephone company. But luckily I understood that for a home that was 10 miles from the nearest town they were probably wrong. I found through knocking on my potential neighbors’ doors that the only broadband there was wireless, but that it was good enough for my needs (in those days about 2 Mbps download). If I had relied on what the real estate agents all told me, and if there had not been wireless, then I would have been in the same situation as Seth. It turns out that the copper lines at that house were so bad that they couldn’t even support a telephone call let alone broadband.

Seth’s troubles were further multiplied when he found out that he also couldn’t get DSL from CenturyLink. While they served his neighborhood, they had a ‘network exhaust’ situation, meaning that all of the wires in the telco cables were being used. I have lived in such neighborhoods, where you have to get on a waiting list to buy a second line or add a burglar alarm. Sadly, there are numerous older neighborhoods where the copper network is totally full. Over the years some pairs of copper go bad, and so the inventory of potential working lines slowly drops as the network ages.

The final insult to Seth is that the FCC would have told him he had options there. According to the National Broadband Map, that part of Kitsap County shows 10 options for broadband. That is a phenomenally large number of choices and even includes fiber from the local electric company. Yet none of these options were actually available to Seth.

What Comcast did was negligent: they told him there was broadband available when there wasn’t. But we are now at a time when a house’s value can be drastically affected by lack of access to broadband. I hope this guy sues Comcast and wins, but I also hope that people without broadband keep screaming and make themselves heard. Because for a lot of rural America, Seth’s story is just another day of normal life.

ALU Sells to Nokia

It was just announced that Nokia will be buying Alcatel/Lucent. It seems that this was done so that Nokia can pick up the latest 4G technology from ALU. As one who has been in the industry for a while I have a long memory of the history of Lucent.

Before the Lucent name, the business was a part of AT&T and was the combination of Western Electric and Bell Labs. Bell Labs was always a wonderment for techies like me because they employed some of the smartest minds in the world. The labs trace their lineage back to Alexander Graham Bell, and over the years they developed such things as the transistor, the laser, information theory, the UNIX operating system, and the C and C++ programming languages. There were eight Nobel Prize winners from Bell Labs. I worked in the Bell System for a few years pre-divestiture and it was a point of pride to work for the same company that operated Bell Labs.

There was a time when Western Electric was the sole manufacturer of telephones and telecommunications devices. I recall that when I was a kid the only option for a home phone was the ponderously heavy, black Western Electric phone. These were hard-wired, didn’t have long cords, and when you talked you had to stand close to the phone. Over the years, Western Electric introduced smaller phones like the Princess phone and introduced longer cords that provided a little more freedom when using the phone. But all of the Western Electric phones were solid and they rarely had problems or broke. They were solid American technology made in America.

The first big change I remember for Western Electric was when AT&T started licensing other companies to make some handsets. I remember when the Mickey Mouse phone, the Sculptura phone, and other colorful phones hit the market. Within a few years, the FCC began to widely license handsets made by numerous companies as long as they passed Bell Labs certification, and Western Electric lost their monopoly on handsets.

Western Electric also made the bulk of the electronics used by AT&T. These included voice switches, line repeaters, and various kinds of carriers used to carry more than one call at a time across a piece of copper. But Western Electric never had a total monopoly and companies like Nortel often sold equipment to non-AT&T telcos.

The big change for the companies came during the divestiture of AT&T in 1984. During the divestiture both Western Electric and Bell Labs were placed into the AT&T Technologies subsidiary. The companies went on, largely unchanged, until they were spun off from AT&T as Lucent, a standalone corporation, in 1996. Most of Lucent’s business was still with the various Bell companies, but they were branching out into numerous fields of telephony technology. At its peak, Lucent was the most widely held stock in the US, with a stock price of $84 and a market capitalization of $258 billion.

Lucent fell onto hard times at the end of 2000 and was one of the first companies to be hurt by the telephony and dot com crash. The industry as a whole had heavily pursued the new competitive telephone companies (CLECs) that had been authorized by Congress and the FCC in 1996. Unfortunately, the large companies like Lucent and Nortel provided significant vendor financing to the fledgling CLEC industry, and when those companies started folding all of the large manufacturers were thrown into financial trouble.

Lucent never fully recovered from that crash (like many other tech companies that disappeared at that time). Their stock lost significant capitalization from the crash, but then really got slammed when it was revealed that the company had been using dubious accounting methods for recognizing sales and revenues. By May of 2001, the company’s stock had fallen to $9. I remember at the time that everybody in the industry could quote the Lucent stock price and we all watched in wonder as the company crashed and burned.

Over the next few years Lucent tried to gain some value by spinning off business units. It spun off its business systems into Avaya and its microelectronics unit into Agere Systems. By 2003 the Lucent stock price was down to just over $2 per share and the company had shed over 130,000 employees. Lucent merged with Alcatel in 2006 and became Alcatel Lucent (ALU). That company did well for a while but then had a long string of losses until recently announcing positive profits.

And now the business has been absorbed by Nokia, mostly to pick up the division that makes 4G wireless equipment. There is not much of the old company left. Bell Labs is still around and one has to wonder if Nokia will continue to operate it. The Lucent history is not unusual for high tech companies. Western Electric had a near-monopoly for decades, but over time everything made by them changed drastically and newer companies ate away at the old giant. Today we have new giant companies like Apple and Samsung, and if history is any indicator they will someday be supplanted by somebody new as well.

Connecticut’s Call for a Fiber PPP

A large number of municipalities in Connecticut have banded together and let it be known that they would love gigabit fiber. They’ve recently made numerous announcements about the initiative as well as assembled a list of the things that each municipality is prepared to contribute to somebody who will build fiber for them. This list includes the typical sort of things that cities offer to potential commercial partners – expedited permitting, real estate and building access, access to existing conduits or other sorts of existing telecom infrastructure, etc.

What these cities are doing is not unusual, but it’s by far the biggest such initiative I have seen. I have no doubt that these cities would love gigabit fiber, and they clearly understand the benefits and implications of getting it. But unfortunately, I see very little chance of their approach working. The Connecticut group is calling this a PPP (public private partnership). I love the idea of PPPs and I have worked with some successful ones. In addition to being a telecom consultant I also am on the Board of Directors for a PPP financial consultancy firm that works across a wide range of infrastructure projects.

Numerous cities have taken this same approach to try to find a private partner. The first one I recall was the city of Seattle nearly a decade ago, a project that I helped with. The city made the same sort of appeal to investors to build fiber in their city and they had no takers. Since then I have regularly seen similar appeals made by cities all over the country. And except maybe for a few small cities that might have found a small local ISP or telco to partner with, I am not aware of any of these attempts ever being successful in attracting a partner. There are a number of reasons for this:

  • I saw several quotes from Connecticut officials that said that they are hoping that some venture capitalist will see this offer and become interested. That is not the way that venture capital works. Venture capitalists don’t invest in ideas for good projects; they invest in management teams that they trust. And they expect those management teams to have a strong track record and to bring them shovel-ready projects. Venture capitalists never go and seek opportunities, but instead wait for fully fleshed-out opportunities to be brought to them. For the Connecticut opportunity to be shovel-ready someone will have to expend significant millions up front to perform the engineering to determine the cost of the build, undertake market research to understand the potential for customer interest, and then create financial business plans to demonstrate that the project can make the kind of returns that venture capitalists or other financial sources will find acceptable. For just one decent sized city, that development work can easily cost half a million dollars or more, and for the 40+ towns in this consortium this would be a huge outlay. There are probably not many companies around willing to take a multi-million dollar risk on doing this upfront development work without knowing for sure that they can get it financed.
  • I have done several recent financial analyses of large fiber PPPs. These studies show that the kind of contributions that these cities (and most cities) are willing to make to a fiber project are not worth very much to a potential builder. One of the few benefits that might get somebody’s attention would be if a city already has a significant network of empty conduit through which fiber could be pulled. The other things these cities are offering don’t change the potential IRR (internal rate of return) of a project by much at all. Sadly, cities are overvaluing the benefits they can bring to a commercial fiber partner as part of a PPP. If a city really wants to attract a private builder they ought to be thinking about providing an economic development bond to pay for some significant portion of the fiber – a bond that they would repay out of tax dollars. That would be real skin in the game that would create a real partnership opportunity which might attract investors.
  • There are not many companies building fiber that would be large enough and have the ability to respond to the Connecticut group. Building to the fifty-plus communities in this group would be a billion-dollar venture, and there are not many companies that have the wherewithal to raise that kind of money to build fiber. Only a company with deep pockets and a proven track record of building and operating telecom networks would have any chance of raising this kind of money. Even for companies with deep pockets this kind of construction is going to require at least 30% to 40% equity, and there are not many firms sitting on the cash and equity needed to pay for this.
  • The various announcements said that since fiber is profitable they ought to be able to attract the needed money. But is it that profitable? Venture capital investors normally seek risk-adjusted returns of 30% or better, and since fiber is a capital-intensive undertaking it’s hard to achieve returns that high. It’s not impossible, but if building fiber was really that lucrative there would be fiber projects everywhere. (The toy IRR calculation after this list illustrates the math.)
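
To illustrate that last bullet, here is a toy IRR calculation. Every input below (build cost, subscriber count, annual margin, project life) is an illustrative assumption of mine, not a figure from any study mentioned above.

```python
# A toy fiber-build cash-flow model, solved for IRR by bisection.
build_cost = 30_000_000            # up-front construction (assumed)
customers = 10_000                 # subscribers at mature penetration (assumed)
annual_margin_per_customer = 600   # yearly revenue minus operating cost (assumed)
years = 20                         # project life (assumed)

cash_flows = [-build_cost] + [customers * annual_margin_per_customer] * years

def npv(rate, flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=0.0, hi=1.0, tol=1e-6):
    # NPV falls as the discount rate rises for this cash-flow shape,
    # so simple bisection finds the rate where NPV crosses zero.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(f"Project IRR: {irr(cash_flows):.1%}")   # roughly 19% on these inputs
```

Even with fairly generous assumptions the return lands well short of a 30% venture hurdle, which is exactly the gap these cities are running into.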

I hope these communities prove me wrong, because I think what they want is terrific. I really don’t want to be throwing a wet blanket on this because I am a huge believer in the benefits of fiber networks. But this sounds to me like wishful thinking on the part of economic development people who do not understand the market reality of how large amounts of money are raised in this country today. These cities are basically wanting somebody else to bring the money to build fiber in their communities, and you can count the firms who are capable of doing this on one hand – and most of those are the giant telcos who are no longer building fiber at all.

Web TV Not Hitting the Mark

I am sure that the day will come when there will be OTT web programming packages that will be legitimate competitors to cable. But that day is not here yet. We are starting to see the beginning of web TV, but nothing out there is yet a game changer.

And that is not surprising. We still live in a world where content is under the very tight grasp of the programmers and they are not about to release products that cannibalize the cash cow they have from the cable providers. The early web products are being touted as attempts to lure in the cord cutters and cord nevers who no longer buy traditional cable.

Here is what we’ve seen so far:

  • Sling TV is certainly priced right, starting at only $20 per month. That price includes ESPN as well as a few other popular channels like the Food Network and the Travel Channel. They have a growing list of add-on bundles priced at $5 each. And they are just now launching HBO. But there are problems with the service. As I covered in a blog a few weeks ago, watching some NCAA first round basketball games on Sling TV was the most painful sports watching experience I’ve ever had. And it’s been widely reported that they botched the NCAA finals. But there are drawbacks other than the quality. For example, you can only watch it on one device at a time, making it family unfriendly.
  • Sony Vue has two major limitations. First, right now it is only available through a Sony Playstation which costs between $200 and $400. And it’s not cheap. They have three packages set at $49.99, $59.99, and $69.99. Without even considering cable bundle discounts, these can cost as much or more than normal cable.
  • Apple’s TV product is not even on the market yet. Their biggest limiting factor is that it’s going to require the use of a $99 Apple TV box. That unit has been far less popular than the Roku. Apple says they will have ‘skinny’ pricing similar to Sling TV.

There are several major factors that will work against web TV for the foreseeable future:

  • Incumbent Bundle Discounts. All of the major incumbent providers sell bundles of products and they charge a premium price to drop the bundle and go to standalone broadband. That is, if they will sell naked broadband at all. For instance, Comcast has no option for standalone broadband faster than 25 Mbps. When people do the math for canceling traditional cable many of them are going to see very little net savings from the change.
  • Issues with Live Streaming. People have become used to a certain quality level of web viewing due to Netflix and Amazon Prime. But those services cache their product to viewers, meaning that when you first start watching they send a burst of data and then stay about five minutes ahead of where you are viewing. This eliminates problems due to variance in the Internet connection, making the viewing experience smooth and predictable. But there is a far different challenge when streaming live content, meaning shows that are broadcast at set times. Such shows are largely not cached, and thus are vulnerable to every little hiccup in a viewer’s local network (of which there are many, as becomes apparent when watching live sports on the web). The small simulation after this list illustrates the difference.
  • Programmer Bundles. Programmers make a ton of money by bundling their content to the ISPs. Comcast, Verizon, and everybody else are not able to pick and choose the content they want. There are seven major program owners that control a big majority of cable channels, and when you want any of their content they generally insist that you take almost all of it. This lets the programmers force ISPs to take programs that they would likely never otherwise buy. Web TV is trying to differentiate itself by offering smaller bundles. But I am sure that programmers are making the web providers pay a premium price for choosing to take only a subset of their channels.
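
A minimal simulation of the live-streaming point above: a player that can buffer five minutes ahead rides out connection hiccups that repeatedly freeze a live stream limited to a couple seconds of buffer. The connection profile (occasional ten-second dips to one-fifth of real-time speed) is an assumption of mine for illustration.

```python
import random

random.seed(1)

def stalled_seconds(buffer_cap_s, minutes=30):
    """Seconds of frozen playback, given how far ahead the player may buffer."""
    buffered = buffer_cap_s      # seconds of video in hand; start full
    stalled = 0
    dip_left = 0
    for _ in range(minutes * 60):                 # one step per second
        if dip_left == 0 and random.random() < 0.01:
            dip_left = 10                         # a 10-second congestion dip
        rate = 0.2 if dip_left > 0 else 1.5       # delivery speed vs real time
        dip_left = max(0, dip_left - 1)
        buffered += rate - 1                      # inflow minus playback drain
        buffered = min(buffered, buffer_cap_s)    # can't buffer past the cap
        if buffered < 0:                          # buffer ran dry: picture freezes
            stalled += 1
            buffered = 0
    return stalled

# A cached service can run ~5 minutes ahead; live content cannot exist
# ahead of the broadcast, so the live buffer is tiny.
print("Cached stream (300s buffer):", stalled_seconds(300), "seconds stalled")
print("Live stream   (2s buffer):  ", stalled_seconds(2), "seconds stalled")
```

The cached stream absorbs every dip without a single stall, while the live stream freezes for a stretch during nearly every dip, which is exactly the difference between watching Netflix and watching live sports on the web.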

The FCC is currently looking at the issue of web TV and they might make it easier for web companies to obtain content. If they do so, one would hope that they also make it easier for wireline cable providers to do the same. Nielsen released statistics late last year showing that the average household watches only around eleven channels out of the hundreds that are sent to them. Consumers and cable providers would all benefit greatly if the programming that is being forced upon us better matched what we actually want to buy.

The web TV companies are trying to do just that and put together packages of just the most popular content. But I laugh every time I see them talking about going after the cord cutters, who at this point are largely younger households, because the content they are choosing for the web so far is popular with people fifty and older (sometimes much older). I can’t see too many younger households being attracted to these first web TV packages. If the rules can be changed so that different providers can try different packages, then we might someday soon see a few killer web packages that can give traditional cable a run for its money. And perhaps what we are already seeing will be the wave of the future. Perhaps there will be numerous web TV offerings, each attracting its own group of followers, meaning no one killer package but dozens of small packages, each with its loyal fans.

FCC Looking at Backup Power for CPE

The FCC is currently deliberating whether they should require battery or other backup power for all voice providers. They asked this question late last year in a Notice of Proposed Rulemaking (NPRM) in Docket 14-185, and recently numerous comments have been filed, mostly against the idea.

This would affect both cable TV companies and fiber providers since those technologies don’t provide power to telephone sets during a power outage. Customers still on copper have their phones powered from the copper (assuming they have a handset that can work that way), but the FCC sees the trend towards phasing out copper and so they ask the question: should all voice providers be required to provide up to eight hours of backup so that customers can call 911 or call for repairs?

The FCC also asks dozens of other questions. For instance, they ask if there should be an option for customers to replace batteries or other back-up power. They ask if something universal like 9 volt batteries might be made the default backup standard.

One can tell by the questions asked in the NPRM that the FCC really likes the idea of requiring battery backup. I put this idea into the category of ‘regulators love to regulate’ and one can see the FCC wanting to take a bow for providing a ‘needed’ service to millions of people.

But one has to ask: how valuable would this really be for the general public? As you might expect, both cable companies and fiber providers responded negatively to the idea. They made several major valid points against it:

  • Most Handsets Don’t Use Network Power. We all remember the days of the wonderful Bell telephones that were all powered from the copper network. If you had a problem with your phone, one of the first things you always tried was to carry your phone outside and plug it into the NID to see if your problem was inside or outside of the house. I remember once when I had an inside wiring issue that I spent several days squatting on my carport steps to carry on with my work. And those phones were indestructible; my mother still has her original black Bell telephone and it works great. But today you have to go out of your way to buy a plain phone that is network powered. If you get a phone with a portable handset or with any built-in features it’s going to need home power to work. So the question becomes: how many homes actually have phones that would work even if there was some sort of backup during an outage?
  • Cell Phone Usage. Landline penetration has fallen significantly in the country. At peak it was at 98% yet today the nationwide penetration is under 60%, with the penetration rate in some major cities far below that. But as landlines have dropped, cellphone usage has exploded and there are now more cellphones in the US than there are adults. As many filers pointed out, when power is out to a home people will make emergency calls from their cellphones. And for the 40% or so of homes that only use cellphones, it’s their only way to make such calls anyway.
  • High Cost of Maintaining Batteries. I have clients that operate FTTP networks and who originally supplied batteries for all of their customers. This turned into a very expensive maintenance nightmare. In a FTTP system these batteries were inside the ONT (the electronics box on the side of the home). This means that the ONT had to be opened by a company technician to replace the batteries, meaning a truck roll, and meaning that a customer can’t replace their own batteries. When batteries go bad they must be replaced or they leak and damage the electronics, and these companies found themselves committing major resources to replacing batteries while they also realized that due to the above issues most of their customers didn’t care about having the backup.
  • What Do You Back-up? There are numerous different ways these days to provision broadband to people (and consequently voice). Some of these options don’t have a practical battery backup available. For example, a cable modem costs a lot more if it includes a power backup, particularly one that is supposed to last for 8 hours. I can’t imagine that there is any practical way to provide backup power other than to supply an off-the-shelf UPS for ISPs who deliver broadband with unlicensed wireless networks. And today, even the FTTP business is changing and ONTs are becoming tiny devices that are plugged into an inside-the-house outlet. Also, who is responsible for providing the backup when a customer buys third party voice from somebody like Vonage that is provisioned over their broadband product?
  • This Adds to the Cost of Deploying Fiber. Building new fiber to premises is already expensive and such a requirement would probably add another $100 per household to the cost of deploying fiber, without even considering the ongoing maintenance costs.
  • Today Most of the Alternatives Proposed by the FCC Don’t Exist. Nobody has ever bothered to create standard battery backup units for a number of network components in coaxial networks. Cable companies have been delivering voice for many years and have had very few requests or demand for backup. There certainly are not any backup products that would rely on something standard like 9 volt batteries. And in many networks, such a product would not be able to provide 8 hours of backup. For example, a cable modem would drain even a commercial UPS in a few hours (I know, I have mine set up that way). Some rough runtime math follows this list.
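
Here is that rough runtime math. The wattages below are my own assumptions for typical devices in each class, and the 5 Wh figure for a 9 volt alkaline battery is likewise an approximation:

```python
# How much battery does 8 hours of backup actually require?
# All power draws below are assumed, typical-looking figures.
devices = {
    "FTTP ONT": 8,                   # watts
    "cable modem + voice eMTA": 12,  # watts
    "cordless phone base": 3,        # watts
}

hours_required = 8
nine_volt_wh = 5    # approximate energy stored in one 9V alkaline battery

for name, watts in devices.items():
    wh = watts * hours_required
    print(f"{name}: ~{wh} Wh for 8 hours, i.e. roughly "
          f"{wh / nine_volt_wh:.0f} nine-volt batteries")
```

On those assumptions, meeting an 8-hour requirement with 9 volt batteries would take a dozen or more per device, which is a big part of why commenters say no such standard product exists.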

I am certainly hopeful that the FCC heeds the many negative comments about the idea and doesn’t create a new requirement for which I think there is very little public demand. Sometimes the best regulation is doing nothing, and this is clearly one such case.

How Vulnerable is Our Web?

We all live under the assumption that the web is unbreakable. After all, it has thousands of different nodes and is so decentralized that there aren’t even a handful of places that control the Internet. But does that mean that something couldn’t do enough harm to cripple it or bring it down?

Before I look at disaster scenarios, which certainly exist, there is one other thing to consider. The big global Internet as we think about it has probably already died. The Internet security firm Kaspersky reports that by the end of 2014 there were dozens of countries that had effectively walled themselves off from the global Internet. A few examples like China are well known, but numerous other countries, including some in Europe, have walled off their Internet to some degree in response to spying being done by the NSA and other governments.

So the question that is probably more germane to ask is whether anything could bring down the US Internet for any substantial amount of time. In the US there are a handful of major hubs in places like Atlanta, Dallas, San Francisco, Northern Virginia, and Chicago. A large percentage of Internet traffic passes through these major portals. But there are also secondary hubs in almost every major city that act as regional Internet switching hubs, and so even if a major hub is disrupted somehow, these regional hubs can pick up a lot of the slack. Additionally, there is a lot of direct peering between Internet companies, and companies like Google and Netflix have direct connections to numerous ISPs using routes that often don’t go through major hubs.

But still, it certainly could be disastrous for our economy if more than one of the major hubs went down at the same time. Many people do not appreciate the extent that we have moved a large chunk of our economy to the Internet as part of the migration to the cloud. A large portion of the daily work of most companies would come to a screeching halt without the Internet and many employees would be unable to function during an outage.

There have been numerous security and networking experts who have looked at threats to the Internet and they have identified a few:

  • Electromagnetic Pulse. A large EMP could knock out Internet hubs in ways that make them difficult to restart immediately. While it’s probably unlikely that we have to be too worried about nuclear war (and if we do, the Internet is one of my smaller worries), there is always the possibility of a huge and prolonged solar flare. We have been tracking solar flares for less than a century and we don’t really know that the sun doesn’t occasionally pump out flares much larger than the ones that we expect.
  • Introducing Noise. It is possible for saboteurs to introduce noise into the Internet that accumulates until communication becomes difficult. This could be done by putting black boxes into numerous remote fiber switching points that would inject enough noise into the system to garble the signals. If enough of these were initiated at the same time the Internet wouldn’t stop, but most of what is being sent would have enough errors to make it unusable.
  • Border Gateway Hijacking. The Border Gateway Protocol is the system on the Internet that tells packets where to go. If the BGP routers at major Internet hubs could be infected or hacked at the same time, the Internet could lose the ability to route traffic (the sketch after this list shows the routing rule a hijacker exploits).
  • Denial of Service Attacks. DDoS attacks have become common and for the most part these are more of a nuisance than a threat. But network experts say that prolonged DDoS attacks from numerous locations directed against the Internet hubs might be able to largely halt other web traffic. Certainly nothing of that magnitude has ever been undertaken.
  • Cyberwarfare. Perhaps the biggest worry in coming years will be cyberattacks that are aimed at taking down the US Internet. Certainly we have enough enemies in the world who might try such a thing. While the US government has recently beefed up funding and emphasis on defending against cyberattacks, many experts don’t think this effort will make much improvement in our security.
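
A minimal sketch of the routing rule that makes BGP hijacking work: routers prefer the most specific (longest) matching prefix, so whoever announces a more specific route captures the traffic. The addresses and next-hop labels below are illustrative only.

```python
import ipaddress

# A toy routing table. BGP-speaking routers choose the longest
# (most specific) prefix that matches a destination.
routes = [
    (ipaddress.ip_network("203.0.113.0/24"), "legitimate ISP"),
]

def next_hop(dest, table):
    matches = [(net, hop) for net, hop in table if dest in net]
    return max(matches, key=lambda m: m[0].prefixlen)[1]  # longest prefix wins

dest = ipaddress.ip_address("203.0.113.57")
print("Before hijack:", next_hop(dest, routes))

# A hijacker announces a more specific /25 covering the same addresses,
# and every router that hears it now prefers the bogus route.
routes.append((ipaddress.ip_network("203.0.113.0/25"), "hijacker"))
print("After hijack: ", next_hop(dest, routes))
```

Nothing in the protocol itself checks whether the announcer actually owns the prefix, which is why a coordinated set of bogus announcements at major hubs is treated as a serious threat.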

Perhaps one of the biggest issues we have in protecting against these various kinds of attacks is that there is no ‘Internet’ infrastructure that is under the control of any one company or entity. There are numerous firms that own Internet electronics and the fibers that feed the Internet; most of these companies don’t seem to be making cybersecurity a high priority. I’m not even sure that most of them know what they ought to do. How do you really defend against an all-out cyberattack when you can’t know ahead of time what it might look like?

This isn’t the kind of thing that should keep us up all night worrying, but the threats are there and there are people in the world who would love to see the US economy take a huge hit. It certainly will not be surprising to see a few such attempts over the coming decades – let’s just hope we are ready for it.