CAF II Technology Options

There has been a lot of speculation about which technologies the big telcos are going to use to meet their CAF II obligations. They have a tall task in front of them in trying to bring at least 10 Mbps broadband to large swaths of rural America.

I know a lot of the areas they are being asked to serve. The typical rural county has some broadband in the county seat – often from both a cable company and from the telco. Businesses in county seats can usually get as much broadband as they want if they can afford the high prices offered in these communities for real broadband.

But the cable TV networks’ service areas usually stop near the city boundaries. And DSL that originates within the county seat doesn’t carry very far into the rural areas. To make matters worse, much of rural America still has older DSL technologies that can deliver only 6 Mbps or 12 Mbps for short distances. It’s not unusual to have a few other pockets of broadband in the typical rural county – there will often be a few subdivisions or other small towns that have DSL and perhaps even cable TV.

However, the vast majority of the physical area in most rural counties is served only by long copper telephone lines, which are usually too far from a DSL hub to get any meaningful DSL. Other than those few subdivisions that have DSL hubs, there is probably little if any fiber running to rural areas. There might be long-haul fiber running through the county, but this fiber was not built to serve local customers.

The CAF II companies are facing the goal of bringing broadband to large copper-only areas that have no existing fiber. The options for technologies that can affordably bring broadband to such areas are limited.

One solution is to build a lot of DSL hubs in the rural areas to bring DSL closer to homes. One advantage of a DSL upgrade is that it uses the existing copper wires to deliver the bandwidth. But DSL won't carry the 10 Mbps speeds required by CAF II over long copper loops, particularly on the older, smaller-gauge copper found in rural networks. So the DSL option requires building a lot of fiber and a whole lot of DSL cabinets. That is expensive, particularly since in many rural areas there might be only a few potential subscribers within reach of a given DSL cabinet.
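The distance problem can be illustrated with a rough lookup. The speed figures below are illustrative assumptions about ADSL2+ behavior, not measurements from any particular network:

```python
# Very rough ADSL2+ downstream speed by copper loop length.
# These figures are illustrative assumptions; real performance
# varies with wire gauge, bridged taps, and line noise.
SPEED_BY_LOOP_KFT = [  # (max loop length in kilofeet, approx Mbps)
    (5, 20.0),
    (8, 12.0),
    (10, 8.0),
    (12, 5.0),
    (15, 2.0),
    (18, 1.0),
]

def approx_dsl_mbps(loop_kft: float) -> float:
    """Estimate downstream DSL speed for a given copper loop length."""
    for max_kft, mbps in SPEED_BY_LOOP_KFT:
        if loop_kft <= max_kft:
            return mbps
    return 0.0  # beyond roughly 18,000 feet there is no meaningful DSL
```

On this hypothetical curve, a farm 20,000 feet from the nearest hub gets nothing, which is why the DSL option means pushing new cabinets, and the fiber to feed them, deep into the countryside.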

The DSL solution also assumes that the telco has maintained the copper network, and we know from experience that there are many rural areas where maintenance has been neglected for decades. Making DSL work on a degraded and compromised network can be a major challenge. We also know from experience that when you cram too many DSL signals into small-gauge copper cables you get crosstalk between pairs that degrades the speeds.

One alternative to building fiber to DSL hubs would be to deliver the bandwidth using point-to-point microwave radios. Microwave radios have been around a long time and are reliable. But the technology requires towers of some sort – something the telcos don't own today and that are not very common in rural areas. Still, there are certainly many places where a microwave shot is going to be cheaper than building new fiber, even counting the cost of building some towers.

I have talked to a number of engineers on the topic and they think that the telcos are going to have to introduce some point-to-multipoint wireless radios into the network to reach the most remote customers. I’ve looked at maps of many of the CAF II areas and in most of these areas there are numerous pockets of the network where there might only be a half dozen farms or homes in a large service area – and there is no cheap wireline option to upgrade such sparsely populated areas.

There is one other option that I know of – the telcos might just ignore the most remote customers. Once the networks have been built and the CAF II money spent, I’m not sure what recourse the FCC has to make the telcos finish the job. We certainly have a long history of telcos that have skirted regulatory requirements or that have reneged on promises made to regulators. So I suspect that if the telcos reach some ‘reasonable’ percentage of the people that are supposed to get the CAF upgrade that the FCC will put on its blinders and call it a job well done.

Industry Shorts – July 2016

Here are a few topics I’ve been following but which don’t merit a full blog.

Mediacom Announces Upgrade Plans. Mediacom has announced plans to invest over $1 billion to upgrade its networks. The main thrust of the upgrades would be to increase speeds up to a gigabit in the 1,500 communities it serves in 22 states.

It will be interesting to see how they do this. There are many markets where they don’t have to do much more than upgrade to DOCSIS 3.1 and introduce new cable modems for high-bandwidth customers. But many of their rural markets will require forklift upgrades: new headends as well as a revamp of the coaxial plant. In the worst cases they’d have to replace the coaxial cable itself; in others, the power taps and line amplifiers.

The company also announced it would open public WiFi hotspots in many of its markets. However, their current WiFi program is pretty weak by industry standards and only gives existing broadband subscribers access to 30 free WiFi minutes per month.

Dish Cuts Back on Ad-Skipping. Dish Network has agreed to largely disable the feature in its DVRs that lets customers skip ads automatically. This had become such a sticking point in content negotiations that Dish finally agreed to rein in the very popular feature. Dish reached agreements with Disney and CBS to disable the feature in order to get new programming for Dish’s Sling TV OTT offering.

Google Launches Undersea Cable. Google and a group of Japanese telecoms have built a new undersea cable joining Portland, Seattle, Los Angeles, and San Francisco to two POPs in Japan. The cable can carry 60 terabits of data per second, making it the fastest undersea fiber to date. Google is also planning to complete a fiber between Florida and Brazil by the end of the year. Facebook and Microsoft are working together on an undersea connection between Virginia Beach and Bilbao, Spain. With the explosive growth of Internet traffic worldwide, this is probably just the beginning of the effort to create the needed connectivity between continents.

It’s interesting to see that some of the big traffic generators on the web are willing to spend money on fiber, and one has to suppose this will save them money in the long term by avoiding transport charges on other fiber routes. It’s probably also not a bad time to own a fiber-laying ship.

UN Declares Broadband Access a Universal Human Right. The United Nations recently passed a series of resolutions that makes online access to the Internet a basic human right. Among the key extracts in the resolutions are:

  • That people have the same rights online as offline, “in particular, freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice.”
  • That human rights violations committed against people for expressing their views online are “condemned unequivocally,” and states are to be held accountable for any such violations.
  • Any measures to “intentionally prevent or disrupt access” to the internet are also “condemned unequivocally,” and all states should “refrain from and cease such measures.”

While it’s easy to argue that much of what the UN does has no teeth, it has been the forum since its creation for recognizing human rights.

Netflix Users Would Hate Ads. A survey by All Flicks makes clear that Netflix users have strong feelings about introducing advertising into the popular ad-free service: 75% of Netflix users said they would dump the service if it started carrying ads.

In a somewhat contradictory finding, the poll indicated that most Netflix users would pay a premium price to avoid ads if given the option. Nearly 60% of Netflix users said they would pay $1 more per month to avoid ads, with many others saying they would pay even more.

Do We Really Want to be Watched?

I noticed an article mentioning that the free Google WiFi hotspots in New York City are equipped with cameras that can be used for surveillance. Google says this function has not been activated, but it got me thinking about public video surveillance in general.

There has been a surge in recent years in the installation of public surveillance cameras, fostered in part by the fiber networks that are being built everywhere. The sale of outdoor surveillance equipment is growing at about 7% per year. And the quality of that equipment is rapidly improving. New surveillance cameras no longer produce the grainy pictures we all think of as synonymous with security footage but are now using high definition and even 4K video technologies to drastically improve the quality of the images. Fiber bandwidth is allowing for higher frame rates and fewer gaps in the sequence of images.
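The bandwidth implications are easy to sketch. The per-camera bitrates below are rough assumptions for compressed H.264 video, not vendor specifications:

```python
# Rough per-camera bitrates in Mbps for H.264 video at 30 fps.
# These are assumptions for illustration; real bitrates depend on
# the codec, frame rate, and scene complexity.
BITRATE_MBPS = {
    "720p": 2.0,
    "1080p": 4.0,
    "4k": 15.0,
}

def backhaul_needed_mbps(camera_counts: dict) -> float:
    """Total the aggregate backhaul needed for a mix of cameras."""
    return sum(BITRATE_MBPS[res] * count
               for res, count in camera_counts.items())

# Even a modest downtown deployment quickly exceeds what copper
# last-mile connections can carry, which is why these networks
# ride on fiber.
demand = backhaul_needed_mbps({"1080p": 100, "4k": 50})
```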

The city of London led the way over a decade ago, saturating certain downtown neighborhoods with cameras. After years of operation, the statistics show that the cameras haven’t really changed the crime rate in the watched neighborhoods. While they changed the nature of the crimes in those areas somewhat, the overall crime rate is close to what it was before the cameras.

Probably the biggest public fear about surveillance is that public cameras will be used to track where we go in public. I know I’m not nuts about the idea of a store knowing who I am as soon as I walk through the door, and I’m even more skeptical of having the government track me as I walk down city streets.

That aspect of surveillance is going to require better facial recognition technology. Currently, Facebook’s facial recognition is said to identify people 98% of the time. Facebook gets such good results by limiting its search to friends and friends-of-friends of the person who posts a picture. Facebook also benefits from having pictures of people from different angles and in different lighting, which lets it build better profiles. The FBI’s software is said to be 85% accurate if a search can be limited to a pool of no more than 50 people.

There is no facial recognition software yet that is very good at identifying random people on a public street. However, everybody expects that software to be here in less than a decade through assistance from Artificial Intelligence.

Public surveillance cameras open up a number of ethical issues. The first is that it’s too tempting for law enforcement insiders to misuse surveillance information. Back in 1997 a high-ranking police officer in DC was convicted of using surveillance cameras near a gay bar for identifying patrons through license plates and then blackmailing them. The Detroit Free Press reported on cases of policemen using surveillance systems to help friends, stalk estranged spouses, or harass those with whom they had a traffic altercation.

Terrorism experts say that public surveillance cameras not only don’t deter terrorist attacks, but might instead invite them, since the cameras guarantee widely seen images of an attack.

There are also arguments that video surveillance constitutes a Fourth Amendment violation through unreasonable searches. The concern comes not just from having government cameras identify you on the street, but from using that data over time to create a profile of where and when you go out, who you see, and what you do in public.

I know that a lot of US cities are considering many more surveillance cameras as part of smart city initiatives. Tampa, near me, has already begun installing an extensive outdoor camera network. I’m sure the city officials who do this have what they think are good reasons for watching their citizens, but our short history with the technology shows that such systems get used for purposes other than those intended. I, for one, am not a fan of the whole concept, and I suspect most people don’t really want to be watched.

Can Big ISPs Resist Data Caps?

I think we can expect data caps to continue to be in the news. Comcast was getting a lot of negative press on data caps at the beginning of the year and had generated tens of thousands of complaints at the FCC over its 300 GB (gigabyte) monthly data cap. It relieved that pressure by unilaterally raising the caps to 1 TB (terabyte) per month, and has since been quietly implementing the terabyte cap across the country, recently activating it in the Chicago region.

In May of this year, AT&T U-verse revised a few of its data caps upward, but at the same time began seriously enforcing them for the first time. Until recently, most AT&T data customers who exceeded the caps paid no extra fees. The AT&T U-verse data caps are much smaller than the new Comcast cap. For traditional copper DSL customers the cap is 150 GB per month. For U-verse speeds up to 6 Mbps the cap is now 300 GB per month. For speeds between 12 Mbps and 75 Mbps the cap is 600 GB, while customers at 100 Mbps or faster now have the same 1 TB monthly cap as Comcast. AT&T has a kicker, though: any customer can buy unlimited usage for an additional $30 per month.
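The tier structure described above reduces to a small lookup. In this sketch the tier boundaries and the per-overage rate are assumptions for illustration; the text only gives the caps themselves and the $30 unlimited add-on:

```python
def uverse_monthly_cap_gb(speed_mbps: float) -> int:
    """Approximate AT&T U-verse monthly data cap in GB by speed tier,
    using the figures quoted above. (Traditional copper DSL, not
    modeled here, is capped at 150 GB.)"""
    if speed_mbps <= 6:
        return 300
    elif speed_mbps <= 75:
        return 600
    else:
        return 1000  # 1 TB, matching the new Comcast cap

def cheaper_than_unlimited(usage_gb: float, speed_mbps: float,
                           fee_per_50gb: float = 10.0) -> bool:
    """True if paying per-block overage fees beats the $30 unlimited
    add-on for a month's usage. The $10-per-50-GB overage rate is a
    hypothetical figure, not from the text."""
    excess = max(0.0, usage_gb - uverse_monthly_cap_gb(speed_mbps))
    blocks = int(-(-excess // 50))  # ceiling division into 50 GB blocks
    return blocks * fee_per_50gb < 30.0
```

Under these assumed numbers, a 100 Mbps customer running slightly over the cap is better off paying overages, while a heavy user should take the unlimited add-on.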

The large ISPs, in general, are under a lot of pressure to maintain earnings. They have all profited greatly by almost two decades of continuous rapid growth in broadband customers. But that growth is largely coming to an end. A few of the cable companies are still seeing significant broadband growth, but this is coming mostly from capturing the remaining customers from big telco DSL.

At the beginning of this year, the Leichtman Research Group reported that 81% of American homes now have a broadband connection. When you add up the rural homes that can’t get broadband and those elsewhere that can’t afford full-price broadband, there isn’t room for much more growth. Even if a lot of low-income households get broadband through Lifeline Fund subsidies, those customers will pay low rates and won’t add much to the bottom line at the big ISPs.

Meanwhile, the large ISPs are seeing an erosion of cable revenues. While cord cutting is small, it is real, and the cable industry as a whole is now slowly losing customers. Probably more significant to their profits is cord shaving: customers cutting back on their cable packages to save money (and because they have alternatives to the big cable packages). Even where cable isn’t yet bleeding customers, margins continue to shrink due to huge increases in programming costs. Even high-margin revenue streams like set-top boxes are under fire at the FCC.

When I look out five years from now it’s obvious that the ISPs will somehow have to milk more profit out of broadband. There are only two ways to do that – increase rates or find backdoor ways like data caps to get more money from broadband customers.

It’s not hard to understand why the large ISPs fought net neutrality so hard. By putting broadband under Title II regulation the ruling has already started to impact their bottom line. I think Comcast raised their data cap to stop the FCC from investigating data caps. The proposed FCC rules on privacy will largely strip the ISPs of the ever-growing revenues from advertising and big data sales. And it’s certainly possible in the future that the FCC could use the Title II rules to hold down residential data rates if they climb too high.

It’s got to be a bit hard to be a big ISP right now. They look with envy at the big revenues that others are making. The cellular companies are making a killing with their stingy data caps. Companies like Google and Facebook are making huge amounts of money by using customer data for personalized advertising. Meanwhile, the ISPs live in a world where, if they aren’t careful, they will eventually become nothing more than big dumb pipe providers – the one future they fear the most.

Comcast, and perhaps the new Charter, are large enough to find other sources of revenue. Comcast is now pursuing a cellular product and has done fairly well selling security and smart home products. Comcast also makes a lot of money as a content provider, boosted now by buying DreamWorks. But any ISP smaller than these two companies is going to have a nearly impossible time if they want to continue to match the growth in bottom line they have enjoyed for the last decade.

Presidential Candidates’ Broadband Platforms

The political season is upon us, and I noticed that Hillary Clinton issued an infrastructure plan that includes making significant investments in broadband. As in most industries, those of us who track this one are always interested in what potential presidents might have in store for us. Here are the key points of her broadband platform:

  • “By 2020, 100% of households in America will have the option of affordable broadband that delivers speeds sufficient to meet families’ need.”
  • Create a $25 billion Infrastructure Bank that will offer grants to communities for broadband and for other purposes to jumpstart community-initiated broadband projects.
  • Reduce regulatory barriers to the private provision of broadband.
  • Promote policies like ‘dig once’ that will provide more broadband infrastructure.
  • Develop public-private partnerships for broadband.
  • Connect more anchor institutions to high-speed Internet.
  • Deploy 5G wireless and next generation wireless systems.
  • Reallocate and repurpose spectrum for broadband.
  • Promote a civic Internet of Things to foster broadband deployment.

That’s quite a wish list and it’s the most detailed list of broadband goals that I recall ever seeing from a candidate before. Both Barack Obama and George W. Bush had platforms that included expanding access to broadband, but they were far less specific than this. Here are my reactions to the platform:

  • The goal of getting broadband to everybody by 2020 is a silly political goal. Even if the programs to pay for this were already in place today, it would take a lot longer than that to deploy adequate broadband to everybody. And this raises the issue of what ‘affordable broadband that delivers speeds sufficient to meet families’ needs’ means. I would guess that at the federal level they believe that is what CAF II is doing, although most of us in the industry understand it to be a boondoggle that will deliver speeds that are obsolete before they are installed. And even after CAF II is built there will be a lot of places in rural America (and urban America) that won’t meet this goal.
  • Putting federal dollars into programs that can seed broadband expansion is something that we know can work. Just look at how the $65 million in DEED grants in Minnesota since the start of that program has seeded numerous rural broadband projects. I see many broadband projects that can’t find funding, and many of those could get a great jump-start with some seed money from a federal source. One always has to worry that the strings attached to federal money make it too expensive to take – for example, a huge amount of money was wasted by the BTOP rule that all projects had to pay urban labor rates for construction. And since this infrastructure bank would be doing more than just broadband, we’d have to see how much money would actually be available. But any federal money is going to help.
  • It would be interesting to see how regulations could be improved for broadband. The FCC is now in the process of regulating broadband for the first time, but the regulations that make it hard to build fiber are mostly at the state and local level.
  • I’ve commented in the past how most ‘dig once’ plans are often a waste of time. Conduit that is built without the necessary handholes and other access points is nearly worthless for serving neighborhoods with fiber.
  • I sit here and wonder what the federal government could do to promote public-private partnerships and I come up empty. I’m a big fan of PPPs, but I also know the challenges of putting together a good partnership and it doesn’t seem like the sort of things that federal rules could make easier or better.
  • I worry about including 5G as a broadband plan. I guess anybody who reads my blog knows I think fiber is always going to be the ultimate technology. Even if we migrate to wireless drops there will need to be fiber deep in neighborhoods and close to homes.

Overall this is not a bad list. It’s certainly more ambitious than anything we’ve seen before. The most promising thing on the list is the grants to promote broadband. Those might do more good than everything else on the list to get a lot more fiber projects under shovel.

Donald Trump has not put out any specific goals for broadband. But he said in a few speeches that he’s in favor of putting as much as a trillion dollars into infrastructure. If more details become available I will try to compare both plans.

Of course, as history has shown us, having something in a presidential platform is no guarantee that it will ever come to pass – but it is a set of goals. First the candidate has to get elected, and after that a lot of politics involving Congress and the FCC stands between most of these goals and reality.

Cord Cutting is Getting Harder

One thing that cable operators might have going for them is that the OTT market is changing in ways that make it a little less attractive to cord cutters. It turns out that it’s getting harder to be a cord cutter, and certainly more expensive.

For a while it looked like Netflix and Hulu would offer a real alternative to cable TV, and for many people they still do. But the things we used to like about those two services are changing rapidly. Hulu is a great example. Around 2011 they made a deal with over 20 sources of content like NBC, ABC, USA, Syfy, Fox, and many others. But those were 5-year deals that are now coming to an end and Hulu is about to lose a lot of the content that attracted people.

Hulu is being hit from all sides. NBC is pulling most of its content in favor of its own OTT plans, and CBS has launched its own OTT product hinging on the popularity of the new Star Trek series to draw customers. BBC has pulled the very popular Doctor Who and other programming in favor of its own OTT product. Even CW is pulling shows like The Flash, Arrow, The Vampire Diaries, and Jane the Virgin and has launched its own OTT service.

The same has happened to Netflix. Long-time subscribers complain that Netflix has only half the content it had a few years ago, and it’s true. Content providers have slowly been withdrawing content, making it harder for Netflix to obtain both TV shows and movies. Netflix makes up for this with original content, but that content isn’t for everybody.

With the plethora of OTT options, it’s getting expensive to be a cord cutter. A cord cutter probably can’t pick only Netflix and/or Hulu and be happy – or at least not as happy as they were a few years ago. To get a wide variety of OTT programming, a cord cutter is going to have to subscribe to multiple OTT products, and at the end of that process might easily be spending as much for programming as they did with the cable company.

Consider Sling TV as an example. Sling launched with a very simple set of options. They launched with a basic package of 15 channels for $20 per month – these were channels that people miss when they cut the cord – ESPN, the Travel Channel, the Food Network, TBS, and the Disney Channel. They had an add-on package for $5 to add more sports channels.

But Sling TV has morphed to become a lot more complicated and a lot more expensive. It now has two basic packages, each priced at $25 per month. One, called Sling Blue, is sports-oriented and includes Fox Sports and NBC (for the Olympics). The original package has been renamed Sling Orange and has also been bumped to $25. Both together are $40, and there are now several $5 add-on packages, such as news and sports. Sling is looking at adding more content, but in doing so it is now at or above the price for which the same content can be had from satellite – and with satellite you get a lot more channels than with Sling. The new Sling TV prices are starting to feel like a programming alternative, not a cord-cutting savings option.

Hulu also has more expensive options now. It will soon offer a package that includes live network programming starting at $35 per month; for $50 per month, customers can also store up to 20 hours of Hulu content in a cloud DVR.

Cord cutters now face much harder choices than even six months ago. If they cobble together a half dozen OTT sources they can easily pay more than they did with cable. If they limit themselves to one or two sources of content like Netflix or Hulu, they will see their content choices shrinking and their monthly fees increasing. This all has to be good news for the cable companies – a lot of homes are going to like the OTT options less than they did a year ago.
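The arithmetic behind that conclusion is simple to sketch. The lineup and prices below are hypothetical mid-2016 figures chosen for illustration, not a recommendation or exact pricing:

```python
# Hypothetical monthly OTT lineup for a cord cutter.
# Prices are illustrative assumptions; actual pricing varies
# and changes often.
ott_lineup = {
    "Netflix": 9.99,
    "Hulu (no ads)": 11.99,
    "Sling Orange + Blue": 40.00,
    "Sling news add-on": 5.00,
    "HBO Now": 14.99,
}

ott_total = round(sum(ott_lineup.values()), 2)

# Compare against an assumed expanded-basic cable bill.
cable_bill = 85.00
monthly_savings = round(cable_bill - ott_total, 2)
```

With these assumed numbers the OTT bundle comes to $81.97 per month – a savings of only a few dollars over the cable bill, and still with far fewer channels.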

A Better Customer Interface

Today a lot of the time and money for what we think of as programming is really spent connecting APIs (Application Programming Interfaces). An API is a software component that defines the operations, inputs, outputs, and underlying functionality of a specific program, providing a reliable interface so that programmers can query that program or link different programs together.

The software world is full of APIs, which are the basis for building and operating most complex software systems. An API can be simple, such as one that looks up a customer’s address when you query a database with their name. Or an API can be more complex, such as a routine that calculates the outstanding balance for a given customer and then ages the accounts receivable. The process of connecting a new software package to all of the needed APIs is time-consuming and expensive, and is one of the reasons it sometimes takes months to implement a large new software package.
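To make those two examples concrete, here is a minimal sketch. The function names, the in-memory "database," and the aging buckets are all hypothetical, for illustration only:

```python
from datetime import date

# Hypothetical customer records standing in for a billing database.
CUSTOMERS = {
    "C1001": {
        "name": "Jane Smith",
        "address": "12 Elm St, Anytown",
        "invoices": [  # (due date, unpaid balance)
            (date(2016, 5, 1), 49.99),
            (date(2016, 7, 1), 49.99),
        ],
    },
}

def get_address(customer_id: str) -> str:
    """The simple API: look up a customer's address by ID."""
    return CUSTOMERS[customer_id]["address"]

def age_receivables(customer_id: str, as_of: date) -> dict:
    """The more complex API: total the outstanding balance and age
    it into current / 30 / 60 / 90+ day buckets."""
    buckets = {"current": 0.0, "30": 0.0, "60": 0.0, "90+": 0.0}
    for due, amount in CUSTOMERS[customer_id]["invoices"]:
        days_past_due = (as_of - due).days
        if days_past_due <= 30:
            buckets["current"] += amount
        elif days_past_due <= 60:
            buckets["30"] += amount
        elif days_past_due <= 90:
            buckets["60"] += amount
        else:
            buckets["90+"] += amount
    return buckets
```

The point is that every outside system wanting an address or an aged balance must call these exact functions with these exact inputs – and when the billing vendor changes them, every connected system has to follow.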

APIs have been integrated into almost all parts of the software world, and there are plenty of them in telecom. For example, an integrated billing system uses a lot of APIs. APIs allow OSS/BSS software to gather calling details from a voice switch or to add and delete features. APIs connect to the software in a cable headend to define which channels a given customer should receive, or to capture billing information from a pay-per-view event.

But there is a downside to using API-based software that anybody paying for software is all too familiar with. Almost every complex software package you buy these days in a telecom environment requires signing up for software maintenance – and it’s not cheap. Software maintenance is often set at an annual fee of 10% – 12% of the cost of the original software package. A large percentage of that money is to pay for keeping abreast of the changes in APIs. Every time the software changes somewhere in a telecom system, such as in your voice switch, those changes then ripple through the rest of your software systems.

Even when a large telco employs its own programmers, a lot of programming time goes to working with and updating APIs. Reliance on APIs goes far beyond the telco world. Establishing the right API connections is the number one hassle of writing a smartphone application, and it’s the difference between the iOS and Android APIs that requires app makers to create and maintain two versions of their software.

There was an attempt a decade ago to make it easy for software packages to communicate with each other under the label of the Semantic Web. That initiative ran into a wall because of the massive effort required to make APIs work easily. However, it looks like maybe we are on the verge of doing away with the need for a lot of those APIs, at least in the manner we use them today. It seems likely that we are headed towards a time when bots are going to take over a lot of the tasks that require API interfaces today.

I use the Amazon Echo with the Alexa bot. Already today I can ask Alexa a question in English such as, “Which Baltimore Oriole has the most home runs?” and Alexa will search the web and bring back the answer Cal Ripken. Alexa and other personal bots are improving at breakneck speeds. We are getting close to a time when we are going to have bot-to-bot communications, which over time will replace a lot of the software in place today.

Soon we are going to be able to use our personal bots to interface with other software systems. I should be able to ask Alexa to go to my bank and get a copy of my June bank statement. Alexa will then interface with my bank’s bot and get the needed information. The plan is for bot-to-bot communication to be in English, so if I don’t get what I want I can look at the transaction between the two bots to see what went wrong.

The beauty of bot-to-bot communication is that each customer is going to be able to find out what they want. Today, the owners of a web site for something like a bank have to pre-determine what they think customers are most interested in and then set up menus to supply those answers. But with bot-to-bot communication the bank doesn’t need to guess what customers want and doesn’t have to arrange the data in a format needed to support the APIs. The bots will figure this out for each customer inquiry. Bot-to-bot communication means doing away with a lot of clunky customer service interfaces and that means cheaper software costs for the bank. And for customers it means getting what you want by talking to your own bot in plain English. That should cut down on a huge percentage of customer service calls.

APIs won’t die, of course, but even interfaces with APIs can be automated using bots so that when something changes in a hardware or software system the bots in connected systems can figure out what this means on their own. That’s bad news for computer programmers, because today a lot of their work is connecting to or updating APIs. But it’s good news for consumers and it should be good news for any company spending too much money maintaining software that uses a lot of APIs.

Regulatory Shorts – July 2016

There are some interesting things happening in courts lately that will be of concern to ISPs.

ISPs Might be Liable for Customer Piracy. In two decisions, courts have said that ISPs can be held responsible for piracy committed by their customers. In the Alexandria, VA district court a jury found Cox Communications liable for copyright infringement in a lawsuit brought by BMG, the music publisher. BMG had argued that Cox should have disconnected customers who violate copyrights. There was a similar ruling in a Manhattan district court against RCN, also brought by BMG. Both companies are vigorously fighting the rulings. This kind of ruling could have a chilling impact on ISPs. Net neutrality rules make it hard, and maybe illegal, to block sites like BitTorrent – and yet ISPs might somehow be liable for what customers do on piracy sites.

Internet Firms Not Necessarily Liable for False Information. On May 16 the Supreme Court handed down a narrow victory to Spokeo.com. The company had been sued by a Virginia resident who said that the site contained errors about his age, education, employment, and marital status. The court said that the plaintiff could not sue without having shown any real damage from the bad information.

The case was watched closely by Facebook, Google, and other internet firms that are worried about a negative impact from having inaccurate data. The court ruling seems to make it unlikely that class action suits could be brought against internet companies, but it did open the door to individual suits when real damage could be claimed.

Fourth Amendment Does Not Protect Home Computers. The federal district court in Virginia ruled that a criminal defendant had no ‘reasonable expectation of privacy’ for information stored on his home computer. The case came out of an FBI sting of Playpen – a TOR site on the dark web used to host child pornography. It’s a complicated and unprecedented case in which the FBI seized the server, continued to operate the site, and eventually arrested numerous users.

But the ruling is a bit troublesome because it implies that police have the power to remotely access the files on somebody’s computer without a warrant. That runs contrary to recent rulings about the security of information on a cell phone. Police have previously searched the computers of people charged with crimes, but the ability to search the computers of people who have not been accused of any crime, without a warrant, is scary. I expect this to be appealed.

FBI says Location of Surveillance Cameras Must be Kept Secret. The FBI was successful in getting a judge to block Seattle City Light from divulging the location of FBI security cameras. City Light is part of the city government and would normally be required to respond to requests for information like this from the public.

One thing the court process revealed is that the majority of police surveillance cameras are installed without a warrant, which raises Fourth Amendment concerns. The judge in this case did say that he thought the FBI needed warrants to install cameras.

Europe Proposes Requiring an Online ID. Officials in the European Commission have suggested that European citizens be required to use a government-issued ID when online. The stated purpose is to provide a trustworthy online environment in which merchants and individuals know who they are dealing with.

The White House proposed a similar voluntary system a few years ago in response to cyberbullying and other online issues. The idea was that if people adopted a verified and trustworthy identity online, they could be safer by dealing only with others who did the same. A few states are still considering trials of the idea. But that voluntary proposal was a far cry from the mandatory requirement suggested in Europe.

The Best Explanation of Network Neutrality Yet. And finally, Stephen Colbert discusses net neutrality while on a roller coaster.

Fiber for Everyone?

Just a few days ago I wrote about the two cities that are considering having citizens pay for their fiber networks through utility fees and pledges to support the fiber financing. After writing about Ammon, Idaho I heard back from several people in the industry pointing out that the proposed Ammon utility fee was a pledge intended to support bonds. The fees, which are supposed to be about $16.50 per month for about twenty years, would total nearly $4,000 over that twenty-year period and would be used to secure, and then pay for, the bonds needed to build the system.

That raises an issue that I have raised before: how important is it that everybody in a community get access to broadband? Every community that thinks about finding a fiber solution faces this issue. They can look for an approach that will get fiber to every household or they can settle for something less. This choice is sometimes a philosophical decision, but it often comes down to the difference in cost between the two choices.

Ammon has clearly chosen a solution that will benefit homeowners who are able and willing to pledge a lien on their homes. To make the pledge a resident must own a home that can be pledged, which eliminates renters. Interestingly, it might also be a challenge for anybody who doesn’t expect to stay in their home for long. According to the US Census, the average time that families stay in an owned home is 13 years, and fewer than 40% of homeowners stay even 10 years. So anybody who thinks they are going to move out of their home in Ammon in the next few years probably ought not to pledge, since they are likely to have to cover the remaining amount of the lien when they sell.
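To make the pledge math concrete, here’s a quick sketch. The $16.50 monthly fee and twenty-year term come from the reports about Ammon; the assumption that a seller must pay off the remaining months of the lien at closing is my illustration of how such a pledge would typically work, not a statement of Ammon’s actual terms:

```python
# Sketch of the Ammon-style pledge arithmetic.
# The $16.50/month fee and 20-year term are from the article; the
# early-sale payoff rule below is an illustrative assumption.

MONTHLY_FEE = 16.50
TERM_YEARS = 20

total_pledge = MONTHLY_FEE * 12 * TERM_YEARS
print(f"Total pledge over {TERM_YEARS} years: ${total_pledge:,.2f}")  # $3,960.00

def remaining_lien(years_in_home):
    """Amount left on the lien if the owner sells after years_in_home years."""
    months_left = max(0, (TERM_YEARS - years_in_home) * 12)
    return months_left * MONTHLY_FEE

# A family that moves after the average 13-year tenure would still
# owe roughly the last 7 years of fees at closing.
print(f"Owed after 13 years: ${remaining_lien(13):,.2f}")  # $1,386.00
```

That residual balance is why the pledge is a harder sell for anyone expecting to move within a few years.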

I don’t want to sound like I’m coming down negative on Ammon, because they have come up with a creative solution to get fast broadband to at least part of their city – which is exactly what a whole lot of other cities have done. Ammon is unique because of its creative financing, but plenty of other cities have likewise settled for broadband that reaches less than everybody.

For instance, almost every city getting Google Fiber is going to end up with fiber built to only parts of the city. Only cities willing to step up with a lot of city dollars, like Huntsville, Alabama, are going to get fiber everywhere. And the vast majority of cities that got Verizon FiOS years ago now think they made a mistake, since they have fiber in some neighborhoods and not others. They are now seeing a big difference between neighborhoods with fiber and those without, and this difference is likely to grow since both Verizon and AT&T have made noises about tearing down copper in older city neighborhoods. We might end up with more urban households without affordable landline broadband than we have today in rural areas.

Fifteen years ago I worked for several cities that wanted to get Verizon’s attention to get onto the FiOS list. At that time these cities were so ecstatic to get some fiber that they didn’t insist that Verizon eventually build out their whole city. But it probably would not have mattered if they had – there are cities that got that agreement from Verizon but still don’t have fiber everywhere.

I don’t want to make Google and Verizon sound like bad actors, because almost every large fiber overbuilder is doing the same thing and building only to the most profitable parts of cities. The returns from building only to the best neighborhoods are dramatically better than from building everywhere – I’ve created dozens of business plans that quantify the difference. This is also the approach being taken by CenturyLink, Aspire, and half a dozen other fiber overbuilders – they are simply making the best financial decision for their companies.
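The economics behind that cherry-picking are easy to sketch. Every number below is hypothetical – invented for illustration, not drawn from any actual business plan (real plans also account for debt service, churn, and operating costs) – but it shows why a dense, high-take-rate neighborhood pays back construction cost far faster than a city-wide build:

```python
# Hypothetical illustration of why overbuilders favor the best neighborhoods.
# All figures below are invented for illustration only.

def simple_payback_years(cost_per_passing, take_rate, monthly_revenue,
                         gross_margin=0.6):
    """Years to recover construction cost from margin on subscribed homes."""
    annual_margin_per_passing = take_rate * monthly_revenue * 12 * gross_margin
    return cost_per_passing / annual_margin_per_passing

# Dense neighborhood: cheap to pass, strong demand.
best = simple_payback_years(cost_per_passing=800, take_rate=0.55,
                            monthly_revenue=70)
# City-wide average: pricier construction, lower overall take rate.
citywide = simple_payback_years(cost_per_passing=1500, take_rate=0.35,
                                monthly_revenue=70)

print(f"Best neighborhoods: ~{best:.1f} years payback")
print(f"City-wide build:    ~{citywide:.1f} years payback")
```

Under these made-up inputs the best neighborhoods pay back in roughly a third of the time of a full build – the kind of gap that drives the decisions described above.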

This is a tough philosophical issue for a city. Do they take the high ground and hold out for a solution that gets fiber everywhere or do they take the practical approach and get some fiber built? The risk of holding out for a whole-city solution might mean that nobody gets fiber. But the flip side of this is that building to only parts of a city probably means there will be neighborhoods that will be cut off from fiber for decades to come – talk to any city that has FiOS if you don’t believe that.

It’s almost impossible to build a reasonable business plan today to fill in fiber where Verizon didn’t build – because they built where the construction costs were the lowest. So Ammon is not at all unique; in fact they are joining the majority of US cities that have elected a solution resulting in something less than 100% fiber coverage. My primary reaction to this issue is a personal one – I know how I’d feel if I were in one of the neighborhoods that didn’t get fiber. I think that any city that elects to build less than 100% fiber ought to expect to hear an outcry from the rest of the city for many years to come.

Metropolitan ISPs

I spend most of my time working with rural ISPs. Even my clients that work in larger cities tend to provide service to residential customers and to small and medium businesses. But there is a very competitive market for larger businesses and for businesses that operate in multiple markets.

My clients often run into this when they realize that they are unable to sell broadband to a local chain restaurant, convenience store, bank, or other large nationwide or regional business. I recently poked around to see who the carriers are that sell to metropolitan or nationwide businesses. Some of those on the list will surprise you with their success, and there are carriers on the list that you’ve probably never heard of.

Since most of the ISPs in this category don’t report business revenues separately from other revenues, it’s difficult to rank these companies by revenue. Further, some of the companies on this list are almost entirely retail ISPs while others offer wholesale connections to other carriers. Probably the easiest way to compare these carriers is by looking at the number of buildings they claim to have lit with fiber. Of course, even that is not a very reliable way to compare them since there is no standard definition of what constitutes a lit building. But generally these counts are supposed to represent locations with either one very large customer, like a hospital, or else buildings with multiple business tenants. Some of these companies also count locations like data centers or large metropolitan cell towers.

Here are the carriers that claim to provide fiber to more than 5,000 business buildings:

  • Time Warner – 75,000
  • Level3 – 30,000
  • Cox – 28,000
  • AT&T – 20,000
  • Zayo – 16,700
  • Charter – 13,800
  • Fibertech – 10,400
  • Verizon – 10,000
  • Lightower – 8,500
  • Sunesys – 7,200
  • Cablevision – 7,000
  • Frontier – 6,300

Missing from this list is Comcast; I can’t find any reference to the number of lit buildings they are in. They are a major provider of business broadband and reported just over 1 million business customers along with $1.3 billion in revenue for their Business Services division. Also missing is CenturyLink, which doesn’t seem to report lit buildings anywhere that I could find, although the company claimed to be selling to more than 100,000 businesses on fiber at the end of 2015.

The list does include both Charter and Time Warner Cable, which just merged – putting the combined company in 88,800 buildings. Fibertech and Lightower merged in 2015, giving them a combined total of 18,900 lit buildings.

Level3 and Zayo provide both retail and wholesale fiber products, meaning that they will sell connections into the lit buildings directly to businesses or else to other carriers, and they derive a large portion of their revenues from wholesale sales.

The first thing that surprised me about this list is that the cable companies appear to be in a lot more buildings than AT&T and Verizon. There are two possible explanations for this. One is that each group of companies is counting lit buildings in a different way. For example, the cable companies might be counting buildings like schools while the telcos might only be counting larger multi-tenant buildings. But it’s also possible that the telcos have a strategy of only building fiber to the largest buildings in each market while the cable companies will routinely build to smaller buildings. It does raise the question of whether this is a reasonable side-by-side comparison.

I would also note that some of these companies are growing rapidly and that most of these counts came from 2014 or 2015. Vertical Systems Group (a research and consulting firm that tracks the metro Ethernet market) says that the percentage of metropolitan businesses connected to fiber grew from 42% in 2014 to 46% in 2015.