Broadband Shorts – March 2017

Today I’m writing about a few interesting topics that are not long enough to justify a standalone blog post:

Google Scanning Non-user Emails. There is an ongoing class action lawsuit against Google over its scanning of emails from people who are not Google customers. Google has been open for years about the fact that it scans email that originates from a Gmail account. The company scans Gmail for references to items that might be of interest to advertisers and then sells that condensed data to others. This explains how you can start seeing ads for new cars after emailing that you are looking for a new car.

There are no specific numbers available for how much Google makes from scanning Gmail, but it is part of the company’s overall advertising revenues, which were $79.4 billion for 2016, up 18% over 2015. The class action suit deals with emails that are sent to Gmail users from non-Gmail domains. It turns out that Google scans these emails as well, although non-Gmail users have never agreed to the terms of service that apply to Gmail users. This lawsuit will be an important test of customer privacy rights, particularly if Google loses and appeals to a higher court. This is a germane topic right now since the big ISPs are all expected to do similar scanning of customer data now that the FCC and Congress have weakened consumer privacy rights for broadband.

Verizon FiOS and New York City. This relationship is back in the news since the City is suing Verizon for not meeting the promise it made in 2008 to bring broadband to everybody in the city. Verizon has made FiOS available to 2.2 million of the 3.3 million homes and businesses in the city.

The argument comes down to the definition of a passing. Verizon says that it has met its obligation and that the gap is due to landlords who won’t allow Verizon into their buildings. But the city claims that Verizon hasn’t built fiber on every street in the city, and also that the company has often elected not to enter older buildings due to the cost of distributing fiber inside them. A number of landlords claim that they asked Verizon into their buildings but that the company either declined to enter or insisted on an exclusive arrangement for broadband services as a condition of entering.

New Applications for Satellite Broadband. The FCC has received 5 new applications for launching satellite broadband networks, bringing the total number of requests up to 17. Now SpaceX, OneWeb, Telesat, O3b Networks and Theia Holdings are also asking permission to launch satellite networks that would provide broadband using the V Band of spectrum, from 37 GHz to 50 GHz. Boeing also expanded its earlier November request to add the 50.4 GHz to 52.4 GHz band. I’m not sure how the FCC picks winners from this big pile – and if it doesn’t, we are going to see busy skies.

Anonymous Kills 20% of Dark Web. Last month the hackers who work under the name ‘Anonymous’ knocked down about 20% of the websites on the dark web. The hackers were targeting cyber criminals who profit from child pornography. Of particular interest was a group known as Freedom Hosting, which Anonymous claims dedicates over 50% of its servers to child pornography.

This was the first known major case of hackers trying to regulate the dark web. This part of the Internet is full of pornography and other kinds of criminal content. The Anonymous hackers also alerted law enforcement about the content they uncovered.

The Fight Over Wireless Pole Attachments

All around the country there are fights going on between pole owners, governments, and wireless carriers over pole attachments and related issues for small cell deployment. Small cells are the first new technology that mostly needs non-traditional attachments, but they will soon be followed by a proliferation of companies wanting to hang millimeter-wave radios and wireless local loop devices. The fights cover a wide range of issues:

Safety. Most current pole rules were created to keep poles safe for technicians to work on, particularly during bad weather. Some of the devices that carriers now want to hang on poles are not small. Some are the size of dorm refrigerators or even a bit larger. And these devices are connected to live electric wires. Adding such devices to poles can make it significantly harder for a technician trying to restore power during a rain or snow storm. Just maneuvering around such devices can be a major safety concern even in good weather.

New Poles / Taller Poles. There are reports of wireless carriers asking to install new poles as tall as 120 feet in city rights-of-way. For network deployments that include wireless backhaul it’s vital that each small cell or other device has a clear line-of-sight to other devices in the network – and being higher in the air can create the needed wireless network.

In most towns the poles are no taller than 60 feet and often shorter. Taller poles create a whole new set of problems. They might mean a whole new level of tree trimming or even eliminating taller trees – and many communities take great pride in their trees. And these new poles will need power, meaning stringing more wires in the air, which can detract from the aesthetics of a residential neighborhood as well as creating more issues with downed power lines and more trees to keep trimmed.

This also raises the issue of the long-term impact of such new poles. Many cities have moved other utilities underground or have multi-year programs to migrate existing utilities underground. These new wireless-only poles also require a power feed, and at least some of them require a fiber feed. Can a carrier require a wireless pole/tower in a neighborhood where everything else is already underground? Can they insist that their poles be left standing during future conversions of neighborhoods to underground utilities?

There is also the issue of sharing such new poles. Cities fear that they will be swamped with requests for new poles from companies wanting to deploy wireless technologies. It’s not hard to picture an NFL city that might have a dozen different companies wanting to deploy wireless devices – and it’s not hard to picture this resulting in chaos and a proliferation of multiple new poles on the same streets as well as numerous new electric lines to connect all of the new devices.

Right to Say No. Cities largely want the right to decide what goes in their rights-of-way. This often has manifested with requirements that anybody that wants access to rights-of-way get some sort of a franchise. It also has meant the development of local ordinances that define the whole process of using rights-of-way from the permitting process through installation techniques. But the carriers are currently lobbying at the state level and at the FCC to make uniform rules to apply everywhere. If the FCC or a state passes blanket rules there are many cities likely to challenge such rules in court.

Fees for Attachments. The carriers are also lobbying heavily to define the fee structure for attachments of these sorts of new connections. Compensation has always been an issue and my guess is that at some point the FCC will step in here in the same manner they did in the past with other pole attachments.

General Irony. I find it ironic that AT&T is leading the battle to get good terms for attaching wireless devices. AT&T has been the primary entity that has been fighting hard against Google to keep them off AT&T poles. And now AT&T wants the right to force their way onto poles owned by others. But in the regulatory world if we have ever learned any lesson it’s that big companies don’t seem to have a problem with arguing both sides of the same argument when it suits their purposes.

The Challenges of Fixed Gigabit Wireless

We got a preview this week of what fixed wireless service might look like in urban environments. Google announced it is aggressively expanding the footprint of Webpass, the wireless ISP that Google purchased last year. The company has been operating in six cities and will now be expanding to nine more markets. These will all be downtown urban deployments.

The deployment uses high-capacity microwave links to serve high-rise buildings. Webpass already has 20,000 residential customers in the six markets, all of whom live in downtown high-rises. The company focuses even more on serving business customers. This business plan has been around for years, and I actually helped launch a business with the same plan years ago, one that died in the 2000 telecom crash.

The network consists of microwave shots to each building on the network. The first hurdle in getting this to work is to get enough quality radio sites to see buildings. As I noted in a blog last week, access to this kind of real estate is at a premium in urban areas, as cellphone providers have found when trying to deploy small cell sites.

The radios required to make the links are not gigantic, but you need one full radio and a dish at both ends of every link. This means that from any one hub building there is a limited number of links that can be made to other buildings, just due to space limitations. If you imagine half a dozen companies trying to do this same thing (this will be the same basic deployment method for urban 5G), then you can picture a proliferation of companies fighting over available radio space on roofs.

Webpass in the past has limited their deployment to buildings that are either already wired with category 5 cable or fiber. They face the same issue that any broadband provider faces in bringing broadband into older buildings – only they are starting on the roof rather than from a basement wiring closet like other ISPs. There are very few ISPs yet willing to tackle the rewiring effort needed in large older buildings that serve residences. As you will see from the pricing below, Webpass and other ISPs are a lot more willing to tackle business buildings and absorb some rewiring costs.

The primary thing for the public to understand about this new roll-out is that it’s very limited. This won’t go to single family homes. It will go to downtown residential high-rises, but only to those that are pre-wired or easy to wire. And even in those buildings Webpass won’t go unless they get at least 10 customers. However, they will contract with landlords to serve whole buildings.

The Webpass pricing is interesting. For residential customers the price is $60 per month regardless of the speed achieved. Webpass says it delivers speeds between 100 Mbps and 500 Mbps, but numerous reviews complain that speeds can get slower at peak evening times in some buildings (as one would expect when a lot of customers share one radio link).
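The peak-hour slowdown is what you would expect from a shared medium: every active customer in a building divides the same microwave link. A minimal sketch of that math, where the 1 Gbps link capacity is an illustrative assumption rather than a Webpass figure:

```python
# Rough model of per-customer speed on a shared fixed-wireless link.
# The 1 Gbps link capacity is an illustrative assumption, not a Webpass spec.

def per_customer_mbps(link_capacity_mbps: float, active_users: int) -> float:
    """Evenly split the radio link's capacity among simultaneously active users."""
    if active_users <= 0:
        return link_capacity_mbps
    return link_capacity_mbps / active_users

# Off-peak, a handful of active users still see hundreds of Mbps each;
# at peak evening hours the same link spread across 50 users drops sharply.
for users in (2, 10, 50):
    print(f"{users} active users -> {per_customer_mbps(1000, users):.0f} Mbps each")
```

Real networks schedule traffic more cleverly than an even split, but the basic capacity arithmetic is the same.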

Webpass’ pricing for businesses varies according to the number of other customers they get in a building. For example, if there are 10 or more business customers in a building they will sell a 100 – 200 Mbps connection for $250 per month with a 10 TB monthly data cap. But prices are much higher for customers in buildings with fewer than 10 customers:

Speed        Cost       Data Cap    Price with No Cap
10 Mbps      $125       1 TB        $375
20 Mbps      $250       2 TB        $750
50 Mbps      $500       5 TB        $1,500
100 Mbps     $1,000     10 TB       $2,000
250 Mbps     n/a        n/a         $2,500
500 Mbps     n/a        n/a         $4,000
1 Gbps       n/a        n/a         $5,500
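The sub-10-customer price list is easy to encode as a lookup. A quick sketch, with the prices transcribed from the table above and the data structure my own:

```python
# Webpass business pricing for buildings with fewer than 10 customers,
# transcribed from the table above. The fastest tiers list no capped price.
PRICES = {  # speed in Mbps: (price with data cap, cap in TB, price with no cap)
    10:   (125, 1, 375),
    20:   (250, 2, 750),
    50:   (500, 5, 1500),
    100:  (1000, 10, 2000),
    250:  (None, None, 2500),
    500:  (None, None, 4000),
    1000: (None, None, 5500),
}

def monthly_price(speed_mbps: int, unlimited: bool) -> int:
    """Monthly price in dollars; falls back to the uncapped price when no capped tier exists."""
    capped, _cap_tb, uncapped = PRICES[speed_mbps]
    if unlimited or capped is None:
        return uncapped
    return capped

print(monthly_price(100, unlimited=False))  # → 1000
print(monthly_price(250, unlimited=False))  # only sold uncapped → 2500
```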

From a technical perspective Webpass is deploying in line with the way the technology works. The radios are too expensive to deploy to smaller customers or to smaller buildings. A building also needs to be within a mile of the base transmitter (and ideally closer) to get good speeds. That largely means downtown deployments.
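The one-mile limit follows from basic radio physics: free-space path loss grows with both distance and frequency, and a microwave link needs a healthy signal margin to sustain gigabit speeds. A sketch of the standard formula, where the 60 GHz frequency is my illustrative assumption since the post doesn’t say which band Webpass uses:

```python
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz * 1000) + 32.44

# Each doubling of distance adds roughly 6 dB of loss, eating into the
# margin that would otherwise support faster (higher-order) modulation.
for km in (0.4, 0.8, 1.6):
    print(f"{km} km at 60 GHz -> {fspl_db(km, 60):.1f} dB")
```

Rain fade and obstructions only make the real-world budget tighter, which is why short, clear line-of-sight shots are essential.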

We know there are a number of other companies considering a similar plan. Starry announced almost two years ago that it was deploying something similar in Boston, but it has yet to launch. We know AT&T and Verizon are both exploring something similar to this Google product using 5G radios. But all of these companies are going to be fighting over the same limited markets.

The cellular companies keep hinting in their press releases that they will be able to use 5G to bring gigabit speeds. When they say that, this is the kind of deployment they are talking about. The only way they are going to be able to bring gigabit wireless speeds to single family homes and to suburbs is if they can develop some sort of mini transmitters to go onto utility poles. That technology is going to require building fiber close to each house and the radios are going to replace fiber drops. The above deployment by Webpass is not hype – they already have customers in six markets. But this technology is not the panacea for fast broadband for everyone that you might believe from reading the press releases.

Industry Shorts – July 2016

Here are a few topics I’ve been following but which don’t merit a full blog.

Mediacom Announces Upgrade Plans. Mediacom has announced plans to invest over $1 billion to upgrade its networks. The main thrust of the upgrades would be to increase speeds up to a gigabit in the 1,500 communities it serves in 22 states.

It will be interesting to see how they do this. There are many markets where they don’t have to do a lot more than upgrade to DOCSIS 3.1 and introduce new cable modems for high-bandwidth customers. But a lot of their rural markets will require forklift upgrades involving headend upgrades as well as revamping the coaxial cable plant. In the worst cases they’d have to replace coaxial cables, but in others would have to replace power taps and line amplifiers.

The company also announced it would open public WiFi hotspots in many of its markets. However, their current WiFi program is pretty weak by industry standards and only gives existing broadband subscribers access to 30 free WiFi minutes per month.

Dish Cuts Back on Ad-Skipping. Dish Networks has agreed to largely disable the feature in its DVRs that lets customers skip ads automatically. This has become such a sticking point in negotiations for content that Dish finally agreed to cut back on the very popular feature. Dish reached agreements with Disney and CBS to disable the feature in order to get new programming for Dish’s Sling TV OTT offering.

Google Launches Undersea Cable. Google and Japanese telecoms have built a new undersea cable joining Portland, Seattle, Los Angeles and San Francisco to two POPs in Japan. The cable can carry 60 terabits of data per second and is now the fastest undersea fiber. Google is also planning to complete a fiber between Florida and Brazil by the end of the year. Facebook and Microsoft are working together on an undersea connection between Virginia Beach and Bilbao, Spain. With the explosive growth of Internet traffic worldwide this is probably just the beginning of the effort to create the needed connectivity between continents.

It’s interesting to see that some of the big traffic generators on the web are willing to spend money on fiber, and one has to suppose this will save them money in the long term by avoiding transport charges on other fiber routes. It’s probably also not a bad time to own a fiber-laying ship.

UN Declares Broadband Access a Universal Human Right. The United Nations recently passed a series of resolutions that makes online access to the Internet a basic human right. Among the key extracts in the resolutions are:

  • That people have the same rights online as offline, “in particular, freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice.”
  • That human rights violations enacted against people due to making their views known online are “condemned unequivocally,” and states are held accountable for any such violations.
  • Any measures to “intentionally prevent or disrupt access” to the internet are also “condemned unequivocally,” and all states should “refrain from and cease such measures.”

While it’s easy to argue that much of what the UN does has no teeth, it has been the forum since its creation for recognizing human rights.

Netflix Users Would Hate Ads. A recent survey makes it clear that Netflix users have strong feelings about introducing advertising into the popular ad-free service. In the survey, given by All Flicks, 75% of Netflix users said they would dump the service if it started carrying ads.

In a somewhat contradictory finding, the poll indicated that most Netflix users would pay a premium price to avoid ads if there were options. Nearly 60% of Netflix users said they would pay $1 per month to avoid ads, with many others saying they would pay even more.

The Changing Face of Advertising

There has been talk for a number of years of advertising dollars shifting from television to the Internet, and it looks like maybe this is finally starting to happen. Consider the recent advertising revenues from Viacom and Facebook.

Viacom is one of a handful of the big programmers and owns such channels as Comedy Central, MTV, and Nickelodeon (along with Paramount Pictures). This has always made Viacom one of the powerhouses in attracting advertisers along with other large programmers like Disney, Fox, Comcast, Warner Brothers, and a few others. Viacom’s ad revenues in the first quarter of this year were $1.123 B, down slightly from $1.172 B a year ago.

But Facebook’s ad revenues were $5.201 B for the first quarter of this year, up from $3.317 B a year ago. It’s pretty obvious that the big web companies are starting to win the advertising battle. For all of 2015 total television advertising was $80.4 B, down slightly from $82.0 B in 2014. But in 2015 the combined advertising revenues of just Facebook and Google had grown to $84.5 B and were still growing rapidly.

This is not particularly surprising since ratings for television as a whole are plummeting. People are watching traditional television less and are watching more and more video on the web. It seems like the battle between television advertising and web advertising has passed a milestone and that web advertising is now dominant for the first time. I have no idea how fast (or by how much) television advertising will fall, but it looks inevitable that it will.

What does this trend mean to small cable providers? I think it matters a lot because advertising revenue is a major source of revenue for programmers. To the extent that advertising revenues drop for them there is going to be more pressure for them to raise programming rates to cable companies even faster to make up for the revenue difference.

But that could lead to a classic death spiral. Rapidly rising cable TV rates are one of the major factors driving people toward alternate programming. Many cord cutters and cord shavers cite the cost of traditional cable as a big reason they are looking for alternatives. The more the programmers raise rates, the more eyeballs they are going to lose, and one assumes the more revenue they will lose.

Programmers are also starting to get some pushback from small cable operators. A handful of smaller cable systems, with fewer than a million customers in total, have dropped Viacom completely in the last year due to the unreasonable rate increases the company is demanding. I have a number of small cable clients who, when they do the math, realize that they are either losing money on cable or are getting close to the time when they will. Once a company gets to that point, dropping programming is a natural response. It’s better to cut costs and lose customers when you are losing money rather than to keep shoveling money out the door to the programmers.

The programmers are also facing an FCC that is leaning more and more toward giving customers more choices in programming. You can see this in the recent NPRM for set-top box reform, where the FCC wants cable companies to include ‘channel slots’ for alternate programming like Netflix. The FCC has yet to act on the open docket looking at the rights of companies to put content onto the Internet – but it’s clear that the FCC favors consumer choice.

And all of the big cable companies are now implementing or looking to implement skinny bundles. These are smaller packages of just the channels that people want to watch, at a much lower cost to consumers than the big traditional packages. The cable companies want to get off the treadmill of paying huge amounts for programming, and skinny bundles reduce and reset the bar. The cable companies also want to offer an alternative to people to stop them from totally dropping the cable company.

It’s a tough time to be a cable company because margins on the cable product keep tumbling. But it’s starting to also be a rough time for the programmers. Probably the best thing that can happen to the programmers is for Wall Street to lower their stock price to reset the expectations for earnings performance. At that point maybe the whole industry can take a pause and see if they can salvage what is looking like a slowly sinking ship.

Why No Redundancy?

I usually post a blog every morning between 7:00 and 8:00 eastern. But today my Internet was down. I first noticed it when I woke up around 2:30. Don’t even ask why I was up then, but that is not unusual for me. My Internet outage was also not that unusual. I have Comcast as my ISP and it seems to go out a few times per month. I’ve always given them the benefit of the doubt and assumed that a few of the late night outages are due to routine network maintenance.

So I grab my cell phone to turn on my mobile hot spot. Most of the outages here last an hour or two and that is the easy way to get through them. But bam! – AT&T is out too. I have no bars on my LTE network. So my first thought is a cable cut. The only realistic way that both carriers go out in this area is if the whole area is isolated by a downed fiber.

I check back and hit a few web sites and I find at about 3:00 that I have a very slow Facebook connection, but that it’s working. I can get Facebook updates and I can post to Facebook, but none of the links outside of Facebook work. And nothing else seems to be working. This tells me that Facebook has a peering arrangement of some kind with Comcast and must come into the area by a different fiber than the one that was cut.

So I start looking around. The first thing I find is that Netflix is working normally, just as fast as ever. So now I have a slow Facebook feed and fast Netflix and still nothing else. After a while Google starts working. It wasn’t working earlier, but it seems that I can search Google, although none of the links work. This tells me that Comcast peers with Google but that the Google links use the open Internet. I force a few links back through the Google URL just to see if that will work and I find that I can read links through Google. No other search engines seem to be working.

The only other thing I found that worked was NFL highlight films, and I was able to see the walk-off blocked punt in last night’s Ravens – Browns game. It’s highly unlikely that the NFL has a peering relationship with anybody, so they must have a deal with Google.

So now I know a bit about the Comcast Network. They peer with Netflix, Google and Facebook – and since these are three of the largest traffic producers on the web that is not unusual. And at least in my area the peering comes into the area on a different fiber path than the normal Internet backbone that has knocked out both Comcast and AT&T.
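The kind of manual poking described above is easy to automate: try a TCP connection to a few well-known hosts and note which ones answer. A rough sketch, where the host list and 3-second timeout are arbitrary choices:

```python
import socket

def reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failures, refused connections, and timeouts
        return False

# During the outage described above, this would have shown Google, Facebook
# and Netflix up (via peering) while most of the open Internet was down.
for host in ("www.google.com", "www.facebook.com", "www.netflix.com", "www.example.com"):
    print(host, reachable(host))
```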

But I also now know that in my area Comcast has no redundancy in the network. I find this interesting because most of my small clients insist on having redundancy in their networks. Of course, most of them operate in rural areas that are used to being isolated when cables get cut – it happened for many years with telephone lines and now happens with the Internet.

But I can see that Comcast hasn’t bothered creating a redundant network. This particular outage went for 7 or 8 hours, which is a bit long, so this must be a major fiber cut. But I look at a map of Florida and it is a natural candidate for rings. Everybody lives on one of the two coasts and there are several major east-west connector roads. This makes for natural rings. And if our backbone was on a ring we wouldn’t even know there was an outage. But with all of their billions of dollars of profits, neither Comcast nor AT&T cares enough about redundancy to have put our area’s backbone on a ring.

And I also don’t understand why they don’t have automatic alternate routing to bypass a fiber cut. If Netflix, Facebook and Google were connected everything else could have been routed along those same other fibers. That is something else my clients would have done to minimize outages for customers.

This is honestly unconscionable, and perhaps it’s time we start clamoring for the FCC to require the big companies to plow some of their profits back into a better network. The same sort of outages happened a few times to the power grid a decade ago, and the federal response was that the electric companies had to come up with a better network that could stop rolling outages. Some of my clients that are electric companies spent significant dollars toward that effort, and it seems to have worked. Considering how important the Internet has become for our daily lives and for commerce, perhaps it’s time for the FCC to do the same thing.

Google’s Experiment with Cellular Service

As I’m writing this (a week ago), Google opened up the ability to sign up for its Project Fi phone service for a 24-hour period. Until now this has been by invitation only, limited I think by the availability of the Google Nexus phones. But Google is launching the new Nexus 5X phone and so is providing free sign-up for a 24-hour period.

The concept behind the Google phone plan is simple. Google sells unlimited voice and text for $20 per month and sells data at $10 per gigabyte as it’s used. The Google phone can work on WiFi networks or will use either the Sprint or T-Mobile network when a caller is out of range of WiFi. And there is roaming available on other carriers when a customer is not within range of any of the preferred networks.
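Under that plan a monthly bill is a one-line calculation:

```python
def project_fi_bill(gb_used: float) -> float:
    """$20 base for unlimited voice/text plus $10 per gigabyte of data used."""
    return 20.0 + 10.0 * gb_used

print(project_fi_bill(0))    # voice and text only → 20.0
print(project_fi_bill(2.5))  # 2.5 GB of data → 45.0
```

The simplicity is the selling point: no tiers, no contracts, just usage.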

Cellular usage is seamless for customers and Google doesn’t even tell a customer which network they are using at any given time. They have developed a SIM card that can choose between as many as 10 different carriers although today they only have deals with the two cellular carriers. The main point of the phone is that a customer doesn’t have to deal with cellular companies any longer and just deals with Google. There are no contracts and you only pay for what you use.

Google still only supports this on its own Nexus phones for now, although the SIM card could be made to work in numerous other phones. Google is letting customers pay for the phones over time, similar to what the other cellular carriers do.

Google is pushing the product harder in markets where it has gigabit networks. Certainly customers that live with slow or inconsistent broadband won’t want their voice calls routing first to WiFi.

The main issue I see with the product is that it is an arbitrage business plan. I define as arbitrage anything that relies on a primary resource over which the provider has no control. Over the years many of my clients became very familiar with other arbitrage plans that came and went at the whim of the underlying providers. For example, there have been numerous wholesale products sold through Sprint, like long distance, dial tone, and cellular plans, that some of my clients built into a business plan, only to have Sprint eventually decide to pull the plug and stop supporting the wholesale product.

I am sure Google has tied down Sprint and T-Mobile for the purchase of wholesale voice and texting for some significant amount of time. But like with any arbitrage situation, these carriers could change their mind in the future and strand both Google and all of their customers. I’m not suggesting that will happen, but I’ve seen probably a hundred arbitrage opportunities come and go in the marketplace during my career and not one of them lasted as long as promised.

It’s been rumored that Apple is considering a similar plan. If they do, then the combined market power of both Google and Apple might make it harder for the underlying carriers to change their mind. But at the end of the day only a handful of companies own the vast majority of the cellular spectrum and they are always going to be the ones calling the shots in the industry. They will continue with wholesale products that make them money and will abandon things that don’t.

There are analysts who have opined that what Google is doing is the inevitable direction of the industry and that cellular minutes will get commoditized much in the manner as long distance in the past. But I think these analysts are being naive. AT&T and Verizon are making a lot of money selling overpriced cellular plans to people. These companies have spent a lot of money for spectrum and they know how to be good monopolists. I still laugh when I think about how households that used to spend $30 to $50 per month for a landline and long distance now spend an average of $60 per family member for cellphones. These companies have done an amazing job of selling us on the value of the cellphone.

Perhaps the analysts are right and Google, maybe with some help from Apple, will create a new paradigm where the carriers have little choice but to go along and sell bulk minutes. But I just keep thinking back to all of the past arbitrage opportunities where the buyers of the service were also told that the opportunity would be permanent – and none of them were.

New Video Format

Six major tech companies have joined together to create a new video format. Google, Amazon, Cisco, Microsoft, Netflix, and Mozilla have combined to create a new group called the Alliance for Open Media.

The goal of this group is to create a video format that is optimized for the web. Current video formats were created before there was widespread video viewing through web browsers on a host of different devices.

The Alliance has listed several goals for the new format:

Open Source. Current video codecs are proprietary, making it impossible to tweak them for a given application.

Optimized for the Web. One of the most important features of the web is that there is no guarantee that all of the bits of a given transmission will arrive at the same time. This is the cause of many of the glitches one gets when trying to watch live video on the web. A web-optimized video codec would be allowed to plow forward with less than complete data. In most cases a small amount of missing bits won’t be noticeable to the eye, unlike the fits and starts that often come today when video playback is delayed waiting for packets.

Scalable to any Device and any Bandwidth. One of the problems with existing codecs is that they are not flexible. For example, consider a time when you wanted to watch something in HD but didn’t have enough bandwidth. The only option today is to fall all the way back to an SD transmission, at a far lower quality. But between these two standards is a wide range of possible options where a smart codec could analyze the available bandwidth and then maximize the transmission by choosing among the many variables within a codec. This means you could produce ‘almost HD’ rather than defaulting to something of much poorer quality.

Optimized for Computational Footprint and Hardware: This means that the manufacturers of devices would be able to optimize the codec specifically for their devices. Not all smartphones, tablets, or other devices are the same, and manufacturers would be able to tune the video format to maximize the display on each of their devices.

Capable of Consistent, High-quality, Real-time Video: Real-time video is a far greater challenge than streaming video. Video content is not uniform in quality and characteristics, and there can thus be a major difference in quality between two video streams watched on the same device. A flexible video codec could standardize quality much in the same way that a sound system can level out volume differences between different audio streams.

Flexible for Both Commercial and Non-commercial Content: A significant percentage of videos watched today are user-generated and not from commercial sources. It’s just as important to maximize the quality of Vine videos as it is for commercial shows from Netflix.

There is no guarantee that this group can achieve all of these goals immediately, because that’s a pretty tall order. But the combined power of these firms is certainly promising. The potential for a new video codec that meets all of these goals is enormous; it would improve the quality of web video on all devices. Personally, quality matters to me, which is why I tend to watch video from sources like Netflix and Amazon Prime. By definition streamed video can be of much higher and more consistent quality than real-time video. But I’ve noticed that my daughter has a far lower standard of quality than I do and watches videos from a wide variety of sources. Improving web video, regardless of the source, will be a major breakthrough and will make watching video on the web enjoyable to a far larger percentage of users.

Universal Internet Access

While many of us are spending a lot of time trying to find a broadband solution for the unserved and underserved homes in the US, companies like Facebook, Google, and Microsoft are looking at ways of bringing some sort of broadband to everybody in the world.

Mark Zuckerberg of Facebook spoke to the United Nations this past week about the need to bring Internet access to the five billion people on the planet who do not have it. He says that bringing people Internet access is the most immediate way to help lift them out of abject poverty.

And one has to think he is right. Even very basic Internet access, which is what he and those other companies are trying to supply, will bring those billions into contact with the rest of the world. It’s hard to imagine how much untapped human talent resides in those many billions and access to the Internet can let the brightest of them contribute to the betterment of their communities and of mankind.

But on a more basic level, Internet access brings basic needs to poor communities. It opens up ecommerce and ebanking and other fundamental ways for people to become engaged in ways of making a living beyond a scratch existence. It opens up communities to educational opportunities, often for the first time. There are numerous stories already of rural communities around the world that have been transformed by access to the Internet.

One has to remember that the kind of access Zuckerberg is talking about is not the same as what we have in the developed countries. Here we are racing towards gigabit networks on fiber, while in these new places the connections are likely to be slow connections almost entirely via cheap smartphones. But you have to start somewhere.

Of course, there is also a bit of entrepreneurial competition going on here, since each of these large corporations wants to be the face of the Internet for all of these new billions of potential customers. And so we see each of them adopting different tactics and using different technologies to bring broadband to remote places.

Ultimately, the early broadband solutions brought to these new places will have to be replaced with some real infrastructure. As any population adopts Internet access, it will quickly exhaust any limited broadband connection from a balloon, airplane, or satellite. And so there will come a clamor over time for governments around the world to start building backbone fiber networks to get real broadband into the country and the region. I’ve talked to consultants who work with African nations, and it is the lack of this basic fiber infrastructure that is one of the biggest limitations on getting adequate broadband to remote parts of the world.

And so hopefully this early work to bring some connectivity to remote places will be followed by a program to bring more permanent broadband infrastructure to the places that need it. It’s possible that broadband will soon be ranked right after food, water, and shelter as a necessity for a community. I expect people around the world to demand broadband and to push their governments into making it a priority. I don’t even know how well we’ll do at getting fiber to every region of our own country, and the poorer parts of the world face a monumental task over the coming decades to satisfy the desire for connectivity. But when people want something badly enough they generally find a way to get it, and so I think we are only a few years away from a time when most of the people on the planet will be clamoring for good Internet access.
The Gigabit Dilemma

Cox recently filed a lawsuit against the City of Tempe, Arizona for giving Google more preferable terms as a cable TV provider than what Cox has in its franchise with the city. Tempe took the unusual step of creating a new license category of “video service provider” in establishing the contract with Google. This is different from Cox, which is considered a cable TV provider as defined by FCC rules.

The TV offerings from the two providers are basically the same. But according to the Cox complaint Google has been given easier compliance with various consumer protection and billing rules. Cox alleges that Google might not have to comply with things like giving customers notice of rate changes, meeting installation time frames, and even things like the requirement for providing emergency alerts. I don’t have the Google franchise agreement, so I don’t know the specific facts, but if Cox is right in these allegations then they are likely going to win the lawsuit. Under FCC rules it is hard for a city to discriminate among cable providers.

But the issue has grown beyond cable TV. A lot of fiber overbuilders are asking for the right to cherry-pick neighborhoods and to not build everywhere within the franchise area – something that incumbent cable companies are required to do. I don’t know if this is an issue in this case, but I am aware of other cities where fiber overbuilders only want to build in the neighborhoods where enough customers elect to have them, similar to the way that Google builds to fiberhoods.

The idea of not building everywhere is a radical change in the way that cities treat cable companies, but is very much the traditional way to treat ISPs. Since broadband has been defined for many years by the FCC as an information service, data-only ISPs have been free to come to any city and build broadband to any subset of customers, largely without even talking to a city. But cable TV has always been heavily regulated and cable companies have never had that same kind of freedom.

But the world has changed and it’s nearly impossible anymore to tell the difference between a cable provider and an ISP. Companies like Google face several dilemmas these days. If they only sell data they don’t get a high enough customer penetration rate – too many people still want to pay just one provider for a bundle. But if they offer cable TV then they get into the kind of mess they are facing right now in Tempe. To confuse matters even further, the FCC recently reclassified ISPs as common carriers, which might change the rules for ISPs. It’s a very uncertain time to be a broadband provider.

Cities have their own dilemmas. It seems that every city wants gigabit fiber. But if a city allows Google or anybody else in without a requirement to build everywhere within a reasonable amount of time, it is setting itself up for a huge future digital divide. It will have some parts of town with gigabit fiber and the rest of the town with something that is probably a lot slower. Over time that is going to create myriad problems within the city. There will be services available in the gigabit neighborhoods that are not available where there is no fiber. And one would expect that over time property values will tank in the non-fiber neighborhoods. Cities might look around fifteen years from now and wonder how they created new areas of blight.

I have no idea if Google plans to eventually build everywhere in Tempe. But I do know that there are fiber providers who definitely do not want to build everywhere, or more likely cannot afford to build everywhere in a given city. And not all of these fiber providers are going to offer cable TV, and so they might not even have the franchise discussion with the city and instead can just start building fiber.

Ever since the introduction of DSL and cable modems we’ve had digital divides. These divides have either been between rich and poor neighborhoods within a city, or between the city and the suburban and rural areas surrounding it. But the digital divide between gigabit and non-gigabit neighborhoods is going to be the widest and most significant digital divide we have ever had. I am not sure that cities are thinking about this. I fear that many politicians think broadband is broadband and there is a huge current cachet to having gigabit fiber in one’s city.

In the past these same politicians would have asked a lot of questions of a new cable provider. If you don’t think that’s true you just have to look back at some of the huge battles that Verizon had to fight a decade ago to get their FiOS TV into some cities. But for some reason, which I don’t fully understand, this same scrutiny is not always being applied to fiber overbuilders today.

It’s got to be hard for a city to know what to do. If gigabit fiber is the new standard then a city ought to fight hard to get it. But at the same time it needs to be careful that it is not causing a bigger problem a decade from now between the neighborhoods with fiber and those without.