FCC Approves New Spectrum for Public Use

On April 17 the FCC approved the use of a 150 MHz continuous band of spectrum that will be available for public use within the 3.5 GHz band, which they designated as Citizens Broadband Radio Service. This is not necessarily a replacement for traditional CB radio, but that is one of the possible uses of the spectrum.

The spectrum already has some existing users, mostly federal government use of radar and a few users who transmit to and from satellites. These existing users sit in the bands between 3550 and 3650 MHz. Additionally, this new offering will add a 50 MHz band up to 3700 MHz. There are also existing commercial users, including a handful of rural wireless ISPs using the higher end of the spectrum.

The FCC will be implementing a new way to share this spectrum among potential users. They want to implement a two-tiered approach to reduce interference with existing spectrum users. First, a radio that wants to use the spectrum must check with an FCC database to see if there is an existing user in their geographic neighborhood. Second, the radio must use what the FCC is calling ‘sensing technology’ that would first listen to be sure no one else is using the spectrum before transmitting. These two requirements differentiate this spectrum from other public bands used for WiFi where unlimited numbers of users are allowed to transmit simultaneously, and where interference is accepted.
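The two-step access check described above can be sketched in code. This is purely illustrative: the `has_incumbent` database lookup and the sensing callback are hypothetical stand-ins, since the FCC had not yet defined actual interfaces for either step.

```python
# Illustrative sketch of the FCC's proposed two-tiered access check.
# Both the database interface and the sensing function are assumptions,
# not real APIs.

def may_transmit(radio_location, channel, database, sense_channel):
    """Return True only if both access checks pass."""
    # Step 1: consult the geographic database of existing users.
    if database.has_incumbent(radio_location, channel):
        return False
    # Step 2: "sensing technology" -- listen before transmitting.
    if sense_channel(channel):  # True means another transmission detected
        return False
    return True
```

The key contrast with WiFi bands is that both checks must pass before a radio is allowed to transmit at all, rather than transmitting and tolerating interference.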

The FCC hopes that this spectrum can support a wide variety of uses such as small cell deployment, fixed wireless broadband, and a category the FCC is calling general consumer use. The spectrum could be used to support CB-like radios, leading to the chosen name of the spectrum block. It’s anticipated that the spectrum could be used by wireless providers to extend LTE. There are already a few users that have been allowed to use the spectrum in rural markets to provide point-to-point data services.

The existing radar is mostly at naval bases near the coast. The FCC is not particularly worried about these bases being affected by the new users since they broadcast high-powered, strong signals. It’s likely the radar sites would overwhelm any attempted commercial use of the spectrum.

This announcement is part of the FCC’s response to widespread request for more public spectrum, and it furthers one of the goals set in the National Broadband Plan to have 500 MHz of spectrum available for wireless data. In many areas, the current public spectrum bands, such as those used for WiFi, are getting congested. At a time when there is a major proliferation of wireless devices and applications in the marketplace, the pressure is going to stay on the FCC to continue to find new slices of spectrum for public use.

There are several steps needed now that this order has been issued. First, somebody must be chosen to administer the geographic database of existing users; Verizon has already volunteered to take that role. Next, the FCC will be issuing a Further Notice of Proposed Rulemaking that will define the specific rules for using the spectrum. There is still a bit of a tug of war going on, and the CTIA doesn’t want the spectrum to be available to everybody, but the FCC seems somewhat determined on that point. Finally, vendors will need to get radios certified to meet the new requirements. Fortunately, many of them say that they already have radios that meet the expected final requirements.

This is a very interesting spectrum to consider for rural broadband deployment. The operating characteristics of the spectrum provide for long distance transmission and the deployment of significant bandwidth 5–8 miles from a transmitter. Further, it’s unlikely that in rural places there will be other users of the spectrum, particularly if using it for point-to-point connections to customers. Rule-compliant radios for the spectrum are expected to be affordable and this could be used to provide rural broadband links from 20–50 Mbps download. That is pretty good broadband for places that have no broadband alternatives today.

Congress and Net Neutrality

Net neutrality is going to be in limbo for the next few years as the myriad lawsuits make their way through the courts. I’ve written other blogs looking at this issue and, at least in my opinion, it’s unlikely that the FCC ruling will make it through the courts unscathed. Not only is there a question about whether the FCC has the authority to order what it did (I happen to think it does), but the whole process included irregularities compared to the normal FCC process. This was not one of those dockets where the FCC issued an idea, got comments, and then made a decision. There were numerous twists and turns during the process and some changes in thinking at the FCC. The irregularities of the process make the ruling vulnerable to court challenges.

If we had a functional Congress this could all be fixed with a very simple new bill. At the end of the day the FCC’s net neutrality decision boils down to three basic tenets that can all be captured in just over a page of text. It would be easy for Congress to pass a bill that laid forth those same principles and then concluded by giving the FCC the authority to enforce them.

Such a decision would cut through all of the red tape, and a Congressional order doing this would establish the FCC’s clear authority to keep the Internet open. It would avoid the whole mess of Title II and forbearance from old rules that don’t really have anything to do with the Internet, and it could sidestep the whole issue of treating ISPs and broadband as a utility.

There were several laws floating around Congress last year that did half of what was needed. They basically said that the Internet needs to be open and that nobody should be able to do things that endanger that openness. But every one of these draft bills had the fatal flaw of not giving the FCC the authority to enforce the net neutrality concept.

The bills that I’ve seen are just window dressing. They would let Congress go on the record as being in favor of net neutrality, without actually having done anything to make net neutrality the law of the land. The net neutrality bills I saw didn’t have any more practical application than the laws that Congress is always passing to celebrate things on a given day. We need net neutrality to be more important than National Friendship Day (August 2).

I don’t normally get too political in the blog, but this is not really a partisan issue. Congress as a whole, both parties, has increasingly gotten in bed with corporations, and those corporations that fund the incredibly expensive process of getting elected and staying in office now have most of the influence on what laws get passed.

And so Congress is loath to pass a net neutrality bill that is not favored by the large cable companies and telcos that contribute to them. The carriers don’t want to be regulated in any manner (as is probably true of all large companies), and so it’s incredibly unlikely that our Congress, in this current environment, is ever going to establish telecom laws with the teeth needed to make them effective.

There has been talk over the last few years that Congress is working on a new telecom reform law. It certainly is time for one. The last Act was passed in 1996. While that doesn’t sound like very long ago, the telecom world has changed in drastic ways since then. 1996 was the height of AOL’s reign as the predominant online provider in the country. The broadband technologies of DSL and cable modems were just hitting the markets and there were virtually no broadband customers using them at the time. Businesses thought that a T1 was blazingly fast access to the burgeoning Internet.

So the rules created then could use a fresh look, because many of them are already obsolete. But it seems doubtful that a similar kind of law can be passed today. The Telecommunications Act of 1996 opened up the large telephone companies to competition and they absolutely hated everything about the bill. The 1996 Act also changed a number of rules for cable companies that they did not like.

Today there really isn’t much practical difference between large cable and telephone companies, but we treat them very differently. And now that wireless data is growing, wireless providers should be considered in any new rules. One has to wonder if Congress has the gumption to rein in all of these companies under one set of sensible regulations. My fear is that any new telecom law would do just the opposite and that they would mostly relieve all large companies from being regulated – because corporations seem to be far more in favor right now than people.

The Skinny Cable Line-up

It’s going to take some time to see if Sling TV can fix their technical issues and be successful as an on-line TV product. But perhaps they have started something that will be the wave of the future. The most interesting thing about Sling is that they have a skinny base programming package to which a customer can then add small packages of optional channels. Sling has put some of the most popular channels in the base tier, so there are likely going to be some homes that will find just the skinny line-up attractive. While this is not a la carte TV where a customer can pick what they want, it brings more options than what we are used to having.

Verizon just announced that they are going to offer a skinny option on their FiOS TV. The Verizon offering is not as skinny as the one on Sling TV, but the Verizon package includes all of the major networks, PBS, and other local programming within the base package. The base Verizon package has 36 channels and it looks like a customer must then choose two add-on bundles that have from 10 to 17 channels. The add-on bundles are by type of programming such as a kid’s bundle and a sports bundle.

Verizon is proposing to sell the base plus two bundles for $54.99 per month on a standalone basis, but it gets much cheaper when bundled with broadband and/or telephone. For example, a bundle of symmetrical 25 Mbps broadband and the TV is only $64.99, or $10 more than the TV alone. A customer can add telephone for another $10 per month. Verizon says they will let people switch the add-on bundles, so people aren’t going to be stuck with only two.
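The bundle arithmetic above works out as follows. The prices are those quoted in the article; the simple additive structure is an assumption for illustration, since Verizon’s actual discounting may differ:

```python
# Bundle math using the prices cited in the article. The additive
# structure below is an illustrative assumption, not Verizon's rate card.
TV_STANDALONE = 54.99       # base tier plus two add-on channel bundles
TV_PLUS_BROADBAND = 64.99   # TV bundled with symmetrical 25 Mbps broadband
PHONE_ADDON = 10.00         # telephone added to the bundle

broadband_increment = TV_PLUS_BROADBAND - TV_STANDALONE  # broadband adds $10
triple_play = TV_PLUS_BROADBAND + PHONE_ADDON            # TV + data + voice
```

In other words, a customer taking the full triple play would pay roughly $74.99 under these assumptions, with broadband effectively costing only $10 on top of the TV package.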

Of course, we have a way to go until service providers can easily offer the skinny packages. Just yesterday ESPN announced that it is suing Verizon to stop it from including ESPN channels in the skinny bundle. A few other networks are also unhappy with the proposed Verizon line-up, and we will have to wait to see if and how the product actually makes it to market. The skinny line-up might need to wait until the FCC comes out with some rules for Web TV, since the programmers are obviously going to be very careful not to allow products that produce real competition with their bread-and-butter traditional cable TV subscription base.

Like Sling, Verizon is not a la carte TV, but it is a start in that direction. Nielsen just released a report that shows that people barely watch the content they get in the big cable bundles. The average household today receives 189 channels but only watches an average of 17.5 of them. Interestingly, the number of channels foisted onto people has grown dramatically; in 2008 the average home received 129 channels but watched the nearly identical 17.3 of them.
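A quick back-of-the-envelope check of the Nielsen figures above shows how the share of received channels that households actually watch has shrunk as line-ups have grown:

```python
# Viewing share implied by the Nielsen numbers cited in the article.
received_2008, watched_2008 = 129, 17.3
received_2014, watched_2014 = 189, 17.5

share_2008 = watched_2008 / received_2008  # about 13.4% of channels watched
share_2014 = watched_2014 / received_2014  # about 9.3% of channels watched
```

The number of channels watched barely moved while the number delivered grew by almost half, so the fraction of the bundle a household actually uses keeps falling.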

Cable is expensive because it has millions of people paying for channels they never watch. I know the last time I had cable I programmed my remote to only surf the channels I watched and I skipped the rest. The Verizon package, while not exactly skinny, pulls down the number of channels received by a household to a more manageable 60 or so.

The most interesting thing about this change is that it’s possible that Verizon will make more money on this skinny line-up than they do selling the giant one. They must pay for all of those 189 channels that people get whether people watch them or not. One would think their programming cost for these smaller packages is going to be significantly reduced.

Verizon and other cable providers are also in the process of quietly dropping channels to reduce their cost. For example, Verizon recently dropped the Weather Channel since they believe that most people check the weather on their smartphones today and do not watch TV to see the latest weather. Going to the skinnier base line-up is going to give Verizon more ammunition to slice other channels (or at least not pay for them for a lot of their customers).

While this trend toward skinnier line-ups is just starting, if it is successful it is going to have a dramatic effect on programmers. Cable providers like Verizon are going to put as many of the most popular channels as they can into these smaller packages, and that is going to leave a lot of less popular networks out in the cold and facing reduced revenues. There are over a hundred cable networks that thrive today because they can count on getting a small payment of a dime or less per month from a hundred million customers. But if the movement to smaller packages is popular, a lot of these networks are going to see vastly reduced earnings, and over time many of them will fade away.

To the extent that more web TV providers can come up with packages that people want to watch, one can imagine cable companies copying the most popular ideas in order to keep the customers on their networks. As you can see by the bundle packaging above, the cable providers have a huge advantage over any online provider since the cost of a customer’s broadband rises significantly if they drop cable altogether.

Wireless is Not a Substitute for Wireline

Any time there is talk about government funding for broadband, arguments arise that wireless broadband is just as good as wireline broadband. But it is not the same and is not a substitute. I love wireless broadband and it is a great complement to having a home or business broadband connection, but there are numerous reasons why wireless broadband ought not to be funded by government broadband programs.

The most recent argument for wireless broadband comes from the Minnesota House, which is currently in session. In last year’s legislative session, Minnesota approved a $20 million grant program to help expand broadband in rural areas of the state. That grant was distributed to a number of broadband projects, all wireline, which required a significant matching fund from an entity building the wireline facilities. The 2014 funding, which mostly went to independent telephone companies, is being used to bring broadband to thousands of rural residents as well as 150 rural businesses and 83 rural schools and libraries.

But the chairman of the House Job Growth and Energy Affordability Committee in Minnesota killed an additional state grant; it’s been left out of this year’s House budget. Rep. Pat Garofalo, R-Farmington, said that wired broadband is too costly in sparsely populated areas and believes that wireless and satellite technologies are more financially effective.

In another case, Verizon recently got the New Jersey State Board of Public Utilities to agree that it could use LTE data plans as substitutes for homes that are losing their copper or DSL services.

Another place where this same argument is being made concerns the upcoming funding from the Connect America Fund, which is part of the federal Universal Service Fund, and that is being directed towards expanding rural broadband. As written several years ago, the Fund is allowed to consider investing in wireless as well as wireline broadband networks.

There have been numerous parties lobbying to get these billions directed towards landline networks and not towards wireless networks. The NTCA, which is now called the Rural Broadband Association, sponsored a report from Vantage Point Solutions that compares wireless and wireline technologies, and which argues that government funding should only be used to fund wireline networks. This whitepaper makes many of the same arguments I have been making for years about the topic, and includes a few I had not considered. Here are some of the major arguments made by the whitepaper:

  • Even without considering the cost of spectrum, it costs far more to build a wireless network when comparing construction cost per megabit that can be delivered to end users. Modern fiber networks rarely cost more than $10 per Mbps capacity created, and often far less than that, while it costs several hundred dollars per effective megabit to construct a wireless network using any of the common technologies like LTE.
  • From a physics perspective, the amount of frequency available through US allocated spectrum is not large enough to deliver large symmetrical bandwidth, which is the goal of the National Broadband Plan. This limitation is a matter of physics and not of technology. That limitation is still going to be there with 5G or later wireless technology unless the FCC massively reworks the way it allows frequency to be used.
  • At least in today’s world, the prices charged to customers are drastically different for wireless and wireline data. Already today, 25% of residences are downloading more than 100 gigabits per month in total data. That can be affordable on wireline, but almost every current wireless provider has monthly data caps that range upward from just a few gigabits per month. A customer on a capped data plan who uses 100 gigabits in a month would face an astronomical monthly bill.
  • The report also made the economic argument that the shelf-life for wireless equipment and networks is relatively short, in the range of seven years, while fiber networks can have an incredibly long economic life. The report argues that the Connect America Fund should not be investing in technology that will be obsolete and potentially unusable just a few years after it’s built. There certainly is no guarantee that the large wireless carriers will make needed future investments once they stop getting a federal subsidy.
  • The report also made all of the normal comparisons between the two technologies in terms of operating characteristics such as available bandwidth, latency times, and high reliability, all of which tilt in favor of landline.
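The cost argument in the first bullet can be made concrete with a toy calculation. The construction costs and network capacities below are invented for illustration; only the resulting per-Mbps figures (roughly $10 for fiber versus several hundred dollars for wireless) echo the whitepaper’s claim:

```python
# Toy comparison of construction cost per deliverable Mbps.
# The dollar amounts and capacities are made-up illustrative inputs,
# chosen only to reproduce the per-Mbps range the whitepaper cites.

def cost_per_mbps(construction_cost, deliverable_mbps):
    """Dollars of construction cost per Mbps the network can deliver."""
    return construction_cost / deliverable_mbps

# Same capital outlay, very different deliverable capacity:
fiber = cost_per_mbps(construction_cost=10_000_000, deliverable_mbps=1_000_000)
wireless = cost_per_mbps(construction_cost=10_000_000, deliverable_mbps=40_000)
# fiber works out to $10 per Mbps; wireless to $250 per Mbps
```

The point of the exercise is that the gap comes from deliverable capacity, not from the size of the check written for construction.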

I agree with this report wholeheartedly. I know that when I first read the language in the Connect America Fund my initial reaction was that the money would all go to cellular companies who would use the money to build rural cell towers. But fiber technology has gotten far more efficient in just the few years since that order. Also, the wireless businesses of Verizon and AT&T are the two most profitable entities in telecom, by far, and it makes no sense to flow billions of federal dollars to them to build what they will probably build anyway with their own money.

Certainly, expanding rural LTE would get some broadband to more people, but in the long run we would be better off directing that same money to bring a permanent solution to some rural areas rather than a poor solution for all of it.

Economic Development and Fiber

One of the main reasons smaller communities give for wanting fiber networks is economic development. They believe that fiber will help them attract new jobs or keep existing jobs. There are examples where fiber networks have led directly to both of these things, but it’s not always so clear cut.

I know one rural town that can attribute over 700 new jobs directly to fiber. The call centers and defense firms that came to the town said that fiber was the main reason they chose that community. And I know of another town that built fiber and was able to convince the major employer in town not to relocate elsewhere.

But economic development is a funny thing and fiber projects often don’t lead to these kinds of direct home runs — where fiber is the major reason for economic improvement. I saw an announcement this morning that shows there’s often a more tenuous line between cause and effect. The city of Gaylord, Minnesota just announced that a medical school is going to be built there. Gaylord is a small city in a rural county in Minnesota that is known more for growing Del Monte corn and producing ethanol than for attracting things like a medical school.

But Gaylord is in Sibley County which has been actively pursuing fiber for over five years. They are within months of completing financing and starting construction of a county-wide fiber network that is going to pass homes and businesses in 10 towns and the surrounding farms. Gaylord learned about the potential for attracting the medical school as part of their investigation into building fiber, and without the fiber initiative they would never have known about or pursued the medical school.

You can’t draw a direct line between the medical school and fiber because there are certainly reasons other than bandwidth why a medical school would locate in a rural location. But by the same token, the fiber is a very important part of why the area was chosen. The desire for fiber clearly shows how progressive the area is in terms of being friendly to technology. And it certainly is not going to hurt in attracting staff and students to a medical school if they know that they can get gigabit fiber at home.

Clearly, something like a medical school will be great for a small community. It will attract high-paying jobs, which is going to boost real estate and rentals for students. It’s likely to attract a new hotel, and one can imagine that there are numerous businesses in town that will do better simply because of the presence of the medical school – it’s going to be good for local banks, car dealers, restaurants, grocery stores – you name it. I would certainly put this into the category of success due to fiber because it’s clear that without the fiber initiative this probably would never have happened. But the impact of fiber on rural communities is often more subtle than this.

One thing that you don’t hear mentioned in public discussions of economic development is the phrase brain drain. But every rural town and county in the country is extremely aware of this phenomenon. Rural families almost universally lament that there are not enough jobs for their kids, and it’s routine for high school or college graduates from rural areas to relocate elsewhere to find work. Fiber can’t, by itself, solve this problem for a rural community. But it helps.

A fiber network generally creates a few new technical jobs. But more importantly, it gives people the ability to work from home. I’ve noted numerous times in this blog how poor the bandwidth is in rural America. There are numerous jobs today that people can do at home. My consulting company went virtual a few years ago and we all live in small towns and work from home. Bandwidth allows writers, architects, engineers, salespeople, and all sorts of employees to create a home base in a place they want to live while working from home.

So, for every success of a town getting something amazing like a new medical school, there are hundreds of small successes in the form of people using bandwidth as the tool needed to live where they choose rather than have to move to a big city to find work. The income and tax base that such people bring to the rural communities that have bandwidth probably has an overall larger impact than the occasional home run. It’s just very hard to measure because it happens under the radar.

. . . In Which the Blogger Gets to Play Lawyer

Now that numerous lawsuits are being filed against the net neutrality order, I thought I would give my own take on some of the issues being raised by the various lawsuits. I’m certainly no lawyer, but I have been following the FCC closely since 1980 and have seen how numerous challenges to their regulations have gone over the years. So following is my take on the major arguments being made against them.

Procedural Issues

There are two procedural problems that could be troublesome for the FCC. First, the final order bears almost no resemblance to the original proposal that the Commission floated when seeking public comments. Even though it took a few hundred pages to explain it, the final net neutrality order boils down to a handful of actual new principles defining the way net neutrality is going to work.

The problem is that those few provisions include new ideas and terminology like prohibiting paid prioritization and not allowing throttling of web traffic. Those concepts were not included explicitly in the original notice. The FCC and other similar agencies have a long history of adhering to specific procedures for making new rules, and that process involves presenting the proposed ideas to the public and then soliciting comments.

I can see a valid argument that there was not adequate opportunity to discuss what was actually ordered. Of course, if that’s the only thing that the courts find wrong about the order, it’s a pretty easy fix, and at worst the FCC would have to go through another public notice and comment period.

More troubling is that at the last minute the FCC tossed in the idea of regulating Internet interconnection agreements and peering arrangements. That was a surprise to most of the industry and even made a few proponents come out against the final order. That is a major change in FCC authority and I can’t find any prior notice of the FCC’s intent to do this to the extent contained in the order.

Reversing Major Precedents

Generally, agencies like the FCC must rely on the intent of Congress and existing legislation in creating their overall framework of rules. In this order the FCC reversed a lot of prior work it did on the topic of regulating the Internet. Obviously federal agencies are allowed to change their interpretation of the law, but in this case there is a mountain of prior legal opinions from FCC lawyers saying that Congress did not intend to treat the Internet as a public utility under the 1996 Telecommunications Act.

That makes it hard, under external scrutiny from courts, for the FCC to now reverse itself and say that the Internet is a utility and should be regulated as such. The net neutrality order spends a whole lot of words trying to defend its change in direction. And rightfully so, because I would guess that this is going to be at the crux of any court review of the FCC’s authority to make this new set of rules.

Use of Forbearance to Modernize Regulatory Law

I thought the way that the FCC chose to implement net neutrality was clever: bringing it under Title II rules but then forbearing from the rules there that it doesn’t choose to apply. But that cleverness is a point of legal attack. Obviously, choosing which parts of Title II to forbear from was somewhat arbitrary. But worse, if forbearance is an acceptable method of regulating the Internet, there would be nothing to stop the FCC in the future from changing the list of things it is forbearing from.

Regulating the Internet is a major new undertaking, and one would normally expect an agency to publish a new coherent set of rules laying forth how they will accomplish such a new undertaking. They would publish the gist of the new rules and ask for comments. In this case, by choosing forbearance, there was no public discussion of which Title II rules should or should not apply, nor is there any particular reason to think that the forbearance choices made are permanent and cast in stone.

Applying the Rules to Small ISPs

One of the lawsuits attacks the FCC for applying the network neutrality rules to all ISPs, including small ones that don’t have any market power. Certainly small ISPs, on their own, cannot undertake the large deals that would give priority to some content over others. But this doesn’t mean that small ISPs can’t be bad actors. It certainly is within the ability of small ISPs to block access to content that they find unacceptable, even if such content is lawful. Further, small ISPs often work as part of larger consortiums that might have the market power to undertake arrangements that would violate net neutrality. For example, there are several large ISP clearinghouses in the country that provide the servers, software, and ISP functions for millions of end users.

The courts have an interesting challenge with this order. To some degree, in the FCC’s defense, they undertook regulating the Internet in a manner that the last court review suggested they should consider. But the way they went about it is unorthodox, and in regulatory law that always creates a challenge.

Latest on Security Breaches

In one of the more interesting reads this year, Verizon recently released its 2015 Data Breach Investigative Report, which can be downloaded at this link. Verizon works with seventy security firms from around the world to compile and document major security breaches. This report is fascinating and provides both the big picture of how the bad guys are attacking us, as well as interesting statistics about the details of the attacks. I highly recommend the report if you have a spare hour.

The report looks at nearly 80,000 security incidents including 2,122 confirmed security breaches in the last year, many of which hit the news. One thing that Verizon saw was that almost all of those breaches (96%) were the result of nine different types of attacks used by hackers. Those nine types of attacks are: point-of-sale intrusions, payment card skimmers, crimeware, web app attacks, denial-of-service attacks, physical theft, insider misuse, cyber-espionage, and miscellaneous errors.

The most common external cause of major breaches last year was from attacks by web applications (things like phishing and malware), which caused 458 breaches. This was followed by attacks on point-of-sale systems in stores which caused 419 breaches and attacks by state-sponsored espionage units which accounted for 290 breaches.

Some of the statistics in the report are really interesting:

  • A little more than 20% of breaches come from inside an organization where an employee or trusted contractor steals credit card numbers or corporate secrets. This percentage has remained consistent since 2010.
  • In 2010, over 95% of attacks came from compromised credentials (somebody stealing login information from employees and using it to gain entry to systems) or spyware of some sort. In 2014, the threat from direct spyware has largely disappeared as a corporate threat and companies are getting good at combatting common malware from the web. But the bad guys have changed tactics and the two new major malware threats are from RAM scraping and phishing. (RAM scraping is using malware to steal unencrypted credit card data in the few milliseconds between the time that a credit card is swiped at a retail location and the data is encrypted).
  • Verizon’s study shows that 23% of recipient employees in businesses open phishing messages and 11% click on the infected attachments. Nearly 50% of phishing emails are opened within the first hour of receipt. The three big groups within companies that fall prey to phishing are communications, legal, and customer service. Unfortunately, it often takes only one phishing breach to infect a network.
  • Hackers are really good at what they do and in 60% of the breaches they were inside company systems within minutes of the onset of the attack.
  • Sadly, almost all of the exploited vulnerabilities happen after the industry as a whole has found a way to block or patch against the threat, with many of these breaches coming a year or more after a patch was created. There is obviously a big gap between the fixes being developed by security experts and the time it takes to get these fixes into business systems.
  • The shelf life of the vast majority of malware is about a month. Within that time a way to block the malware is developed and distributed to the companies that scrub web traffic on the Internet before it gets to end users. But there are always tons of new malware, and it’s a constant battle between hackers and security companies.
  • There are still very few effective hacks against cell phones. Verizon estimates that only 0.03% of cellphones are infected with truly malicious software.
  • There is a big difference in the amount of malware aimed at different industries. For instance, the average financial institution sees 350 malware attempts per day, the average retail location sees 801, and the average educational institution sees 2,332. A lot of malware is very specific to an industry or even to a specific location.
  • The industry has touted the cost to a business of a compromised record at $0.58 per record. This figure was calculated by looking at insurance claims and is conservative, since very large companies often self-insure. Verizon estimates that the true cost to a business is between $52 and $87 per compromised record. The bigger the breach, the larger the total cost.
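The gap between those two cost estimates is easier to appreciate with a little arithmetic. Here is a minimal sketch in Python – the breach sizes are hypothetical, chosen only to show the scale of the difference:

```python
# Compare the industry's insurance-claim-based cost estimate with
# Verizon's estimated range, for hypothetical breach sizes.

INSURANCE_COST = 0.58               # dollars per compromised record (industry figure)
VERIZON_LOW, VERIZON_HIGH = 52, 87  # dollars per record (Verizon's estimate)

def breach_cost_range(records: int):
    """Return (insurance-based, Verizon low, Verizon high) total costs."""
    return (records * INSURANCE_COST,
            records * VERIZON_LOW,
            records * VERIZON_HIGH)

for records in (10_000, 1_000_000):
    ins, low, high = breach_cost_range(records)
    print(f"{records:>9,} records: ${ins:>12,.0f} vs ${low:,.0f}-${high:,.0f}")
```

At a million records, the difference is between a $580,000 incident and one costing $52–$87 million – which is why the per-record figure a business plans around matters so much.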

The main thing I get from this report is a reminder each year of how many bad guys there are in the world trying to steal credit card numbers, corporate data, and other valuable information. It’s also interesting to see over time how the methods of attacking networks change in the never-ending cat and mouse game between hackers and security systems. Finally, it’s worth looking through the list of companies who participate in this report, since they are the Who’s Who of Internet security around the world.

Broadband and Real Estate

By now many of you have probably seen the articles about a guy, Seth, who bought a home in Kitsap County, Washington only to find out that it didn’t have broadband. Seth works from home and needs broadband access. He did his homework first and was told by employees at both the Comcast and Xfinity phone numbers that the address had service previously and that he would be able to get broadband there. Here is Seth’s blog, and as someone who works at home I can certainly feel his pain.

If you work at home then having broadband is no small matter – it’s your lifeline. This is what I find so dreadful about the thousands of rural communities with little or no broadband. The people in those places do not have the same opportunities as the rest of us. It would be an inconvenience to not be able to watch streaming video, but it would be economically devastating if you couldn’t take a good-paying job because you don’t have broadband.

In this case I hope Seth knows a good lawyer, because Comcast directly caused him great financial harm. Multiple Comcast employees told him that the house had service in the past and that he could get broadband there, which turned out to be untrue. Instead, the home had never been served by Comcast and they were going to have to build cable to serve it. As anybody who has ever tried to get Comcast to build new cable strand knows, it’s like trying to get water to run uphill.

I have my own similar Comcast story with a happier ending. When I moved to my house in Florida I knew Comcast was all over the neighborhood, and my new house even had a Comcast pedestal in the driveway. But it took what felt like 40 calls to Comcast to get them to come out and run a 40-foot drop wire. We started out with them not knowing if they served my neighborhood, until finally they decided to charge me $150 to verify that I could get service. Even with that it took me over a month from the first call until I had working broadband – and a lot of people are not willing to suffer through that ordeal. I know it soured me on Comcast, and no matter what good they ever do for me I will always have in the back of my mind how I practically had to threaten them to get service.

Over a decade ago when I moved to the Virgin Islands, the first thing on my ‘must have’ list was broadband. Every real estate agent there lied to me and told me that the house I wanted could get DSL from the local telephone company. But luckily I understood that for a home that was 10 miles from the nearest town they were probably wrong. I found through knocking on my potential neighbors’ doors that the only broadband there was wireless, but that it was good enough for my needs (in those days about 2 Mbps download). If I had relied on what the real estate agents all told me, and if there had not been wireless, then I would have been in the same situation as Seth. It turns out that the copper lines at that house were so bad that they couldn’t even support a telephone call let alone broadband.

Seth’s troubles were further multiplied when he found out that he also couldn’t get DSL from CenturyLink. While they served his neighborhood, they had a ‘network exhaust’ situation, meaning that all of the wires in the telco cables were being used. I have lived in such neighborhoods, where you have to get on a waiting list to buy a second line or add a burglar alarm. Sadly, there are numerous older neighborhoods where the copper network is totally full. Over the years some pairs of copper go bad, and so the inventory of potential working lines slowly drops as the network ages.

The final insult to Seth is that the FCC would have told him he had options there. According to the National Broadband Map, that part of Kitsap County shows 10 options for broadband. That is a phenomenally large number of choices and even includes fiber from the local electric company. Yet none of these options were actually available to Seth.

Comcast was negligent in telling him there was broadband available when there wasn’t. And we are now at a time when a house’s value can be drastically affected by lack of access to broadband. I hope this guy sues Comcast and wins, but I also hope that people without broadband keep screaming and make themselves heard. Because for much of rural America, Seth’s story is just another day of normal life.

ALU Sells to Nokia

It was just announced that Nokia will be buying Alcatel-Lucent. It seems that this was done so that Nokia can pick up the latest 4G technology from ALU. As one who has been in the industry for a while, I have a long memory of the history of Lucent.

Before the Lucent name, the business was a part of AT&T and was the combination of Western Electric and Bell Labs. Bell Labs was always a wonderment for techies like me because they employed some of the smartest minds in the world. The lab, named for Alexander Graham Bell, over the years developed such things as the transistor, the laser, information theory, the UNIX operating system, and the C and C++ programming languages. There were eight Nobel Prize winners from Bell Labs. I worked in the Bell System for a few years pre-divestiture, and it was a point of pride to work for the same company that operated Bell Labs.

There was a time when Western Electric was the sole manufacturer of telephones and telecommunications devices. I recall that when I was a kid the only option for a home phone was the ponderously heavy, black Western Electric phone. These were hard-wired and didn’t have long cords, and when you talked you had to stand close to the phone. Over the years, Western Electric introduced smaller phones like the Princess phone and introduced longer cords that provided a little more freedom when using the phone. But all of the Western Electric phones were solid and they rarely had problems or broke. They were solid American technology, made in America.

The first big change I remember for Western Electric was when AT&T started licensing other companies to make some handsets. I remember when the Mickey Mouse phone, the Sculptura phone (pictured here) and other colorful phones hit the market. Within a few years, the FCC began to widely license handsets made by numerous companies as long as they passed Bell Labs certification, and Western Electric lost their monopoly on handsets.

Western Electric also made the bulk of the electronics used by AT&T. These included voice switches, line repeaters, and various kinds of carriers used to carry more than one call at a time across a piece of copper. But Western Electric never had a total monopoly and companies like Nortel often sold equipment to non-AT&T telcos.

The big change for the companies came with the divestiture of AT&T in 1984. During the divestiture both Western Electric and Bell Labs were placed into the AT&T Technologies subsidiary. The companies went on, largely unchanged, until they were spun off from AT&T as Lucent, a standalone corporation, in 1996. Most of Lucent’s business was still with the various Bell companies, but they were branching out into numerous fields of telephony technology. At its peak Lucent was the most widely held stock in the US and had a stock price of $84 and a market capitalization of $258 billion.

Lucent fell onto hard times at the end of 2000 and was one of the first companies to be hurt by the telephony and dot com crash. The industry as a whole had heavily pursued the new competitive telephone companies (CLECs) that had been authorized by Congress and the FCC in 1996. Unfortunately, the large companies like Lucent and Nortel provided significant vendor financing to the fledgling CLEC industry, and when those companies started folding all of the large manufacturers were thrown into financial trouble.

Lucent never fully recovered from that crash (like many other tech companies that disappeared at that time). The stock lost much of its value in the crash, and then really got slammed when it was revealed that the company had been using dubious accounting methods for recognizing sales and revenues. By May of 2001, the company’s stock had fallen to $9. I remember at the time that everybody in the industry could quote the Lucent stock price, and we all watched in wonder as the company crashed and burned.

Over the next few years Lucent tried to regain some value by spinning off business units. It spun off its business systems into Avaya and its microelectronics unit into Agere Systems. By 2003 the Lucent stock price was down to just over $2 per share and the company had shed over 130,000 employees. Lucent merged with Alcatel in 2006 and became Alcatel-Lucent (ALU). That company did well for a while but then had a long string of losses until recently returning to profitability.

And now the business has been absorbed by Nokia, mostly to pick up the division that makes 4G wireless equipment. There is not much of the old company left. Bell Labs is still around and one has to wonder if Nokia will continue to operate it. The Lucent history is not unusual for high tech companies. Western Electric had a near-monopoly for decades, but over time everything made by them changed drastically and newer companies ate away at the old giant. Today we have new giant companies like Apple and Samsung, and if history is any indicator they will someday be supplanted by somebody new as well.

Connecticut’s Call for a Fiber PPP

A large number of municipalities in Connecticut have banded together and let it be known that they would love gigabit fiber. They’ve recently made numerous announcements about the initiative as well as assembled a list of the things that each municipality is prepared to contribute to somebody who will build fiber for them. This list includes the typical sort of things that cities offer to potential commercial partners – expedited permitting, real estate and building access, access to existing conduits or other sorts of existing telecom infrastructure, etc.

What these cities are doing is not unusual, but it’s by far the biggest such initiative I have seen. I have no doubt that these cities would love gigabit fiber, and they clearly understand the benefits and implications of getting it. But unfortunately, I see very little chance of their approach working. The Connecticut group is calling this a PPP (public private partnership). I love the idea of PPPs and I have worked with some successful ones. In addition to being a telecom consultant, I also serve on the Board of Directors of a PPP financial consultancy firm that works across a wide range of infrastructure projects.

Numerous cities have taken this same approach to try to find a private partner. The first one I recall was the city of Seattle nearly a decade ago, a project that I helped with. The city made the same sort of appeal to investors to build fiber in their city and they had no takers. Since then I have regularly seen similar appeals made by cities all over the country. And except maybe for a few small cities that might have found a small local ISP or telco to partner with, I am not aware of any of these attempts ever succeeding in attracting a partner. There are a number of reasons for this:

  • I saw several quotes from Connecticut officials that said that they are hoping that some venture capitalist will see this offer and become interested. That is not the way that venture capital works. Venture capitalists don’t invest in ideas for good projects; they invest in management teams that they trust. And they expect those management teams to have a strong track record and to bring them shovel-ready projects. Venture capitalists never go and seek opportunities, but instead wait for fully fleshed-out opportunities to be brought to them. For the Connecticut opportunity to be shovel-ready, someone will have to spend millions up front to perform the engineering to determine the cost of the build, undertake market research to understand the potential for customer interest, and then create financial business plans to demonstrate that the project can make the kind of returns that venture capitalists or other financial sources will find acceptable. For just one decent-sized city, that development work can easily cost half a million dollars or more, and for the 40+ towns in this consortium this would be a huge outlay. There are probably not many companies willing to take a multi-million-dollar risk on this upfront development work without knowing for sure that they can get the project financed.
  • I have done several recent financial analyses of large fiber PPPs. These studies show that the kind of contributions that these cities (and most cities) are willing to make to a fiber project are not worth very much to a potential builder. One of the few benefits that might get somebody’s attention would be if a city already has a significant network of empty conduit through which fiber could be pulled. The other things these cities are offering don’t change the potential IRR (internal rate of return) of a project by much at all. Sadly, cities are overvaluing the benefits they can bring to a commercial fiber partner as part of a PPP. If a city really wants to attract a private builder they ought to be thinking about providing an economic development bond to pay for some significant portion of the fiber – a bond that they would repay out of tax dollars. That would be real skin in the game that would create a real partnership opportunity which might attract investors.
  • There are not many companies building fiber that would be large enough and have the ability to respond to the Connecticut group. Building to the fifty-plus communities in this group would be a billion-dollar venture, and there are not many companies that have the wherewithal to raise that kind of money to build fiber. Only a company with deep pockets and a proven track record of building and operating telecom networks would have any chance of raising this kind of money. Even for companies with deep pockets, this kind of construction is going to require at least 30% to 40% equity, and there are not many firms sitting on the cash and equity needed to pay for this.
  • The various announcements said that since fiber is profitable they ought to be able to attract the needed money. But is it that profitable? Venture capital investors normally seek risk-adjusted returns of 30% or better, and since fiber is a capital intensive undertaking it’s hard to achieve returns that high. It’s not impossible, but if building fiber was really that lucrative there would be fiber projects everywhere.

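The return math in that last bullet is easy to sketch. Here is a minimal IRR calculation in Python – all of the numbers (construction cost, annual cash flow, time horizon) are hypothetical, chosen only to illustrate why a capital-intensive fiber build has trouble reaching venture-style returns:

```python
def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-7):
    """Internal rate of return, found by bisection on the NPV function.

    cash_flows[0] is the (negative) up-front investment; later entries
    are annual net cash flows.
    """
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    while hi - lo > tol:
        mid = (lo + hi) / 2
        # NPV falls as the discount rate rises, so step toward the zero crossing.
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical small-city build: $50M of construction up front, then
# 15 years of $8M in net annual cash flow once the network is running.
flows = [-50_000_000] + [8_000_000] * 15
print(f"IRR: {irr(flows):.1%}")  # well short of a 30% venture-style return
```

Bisection is used here instead of a closed-form answer because NPV has no algebraic inverse in the rate; any root-finder works, since NPV crosses zero exactly once when a single up-front outlay is followed by positive cash flows.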
I hope these communities prove me wrong, because I think what they want is terrific. I really don’t want to throw a wet blanket on this, because I am a huge believer in the benefits of fiber networks. But this sounds to me like wishful thinking on the part of economic development people who do not understand the market reality of how large amounts of money are raised in this country today. These cities basically want somebody else to bring the money to build fiber in their communities, and you can count on one hand the firms capable of doing this – and most of those are the giant telcos who are no longer building fiber at all.