The Future of AT&T and Verizon

The cellphone companies have done such a great job of getting everybody to purchase a smartphone that cellular service in the country is quickly turning into a commodity. And, as is typical with most commodity products, that means less brand loyalty from customers and lower market prices for the products.

We’ve recently seen the cellular market demonstrate the turn toward becoming a commodity. In the first quarter of this year the cellular companies had their worst performance since the industry began. Both AT&T and Verizon posted net losses of post-paid customers for the quarter. T-Mobile added fewer customers than expected and Sprint continued to lose money.

This is a huge turnaround for an industry in which the big two cellular companies were each making over $1 billion per month in profits. The change in the industry comes from two things. First, people are now shopping for lower prices and are ready to change carriers to get lower monthly bills. The trend toward lower prices was started by T-Mobile to gain market share, but low prices are also being pushed by cellular resellers, who are fed wholesale capacity by the big carriers. The cellular industry is only going to get more competitive when the cable companies soon enter the market. That will provide enough big players to make cellular minutes a true commodity. The cable companies have said they will offer low prices as part of packages aimed at making customers stickier, which will put real price pressure on the other cellular providers.

But the downturn in the first quarter was almost entirely due to the rush by all of the carriers to sell ‘unlimited’ data plans – which, as I’ve noted in some earlier blogs, are really not unlimited. But these plans offer lower prices for data and are freeing consumers to use their smartphones without the fear of big overage fees. Again, this move was started by T-Mobile, but it was also driven heavily by public demand. AT&T and Verizon recognized that if they didn’t offer this product set they were going to start bleeding customers to T-Mobile.

It will be really interesting to watch what happens to AT&T and Verizon, who are now predominantly cellular companies that also happen to own networks. The vast majority of revenues for these companies comes from the cellular parts of their companies. When I looked at both of their annual reports last year I had a hard time finding evidence that these companies were even in the landline network business. Discussions of those business lines are buried deeply within the annual reports.

These companies obviously need to find new forms of revenues to stay strong. AT&T is tackling this for now by going in a big way after the Mexican market. But one only has to look down the road a few years to see that Mexico and any other cellular market will also trend towards commoditization.

Both companies have their eyes on the same potential growth plays:

  • Both are making the moves necessary to tackle the advertising business. They look at the huge revenues being made by Facebook and Google and realize that as ISPs they are sitting on customer data that could make them major players in the targeted marketing space. Ad revenues are the predominant revenue source at Google and if these companies can grab even a small slice of that business they will make a lot of money.
  • Both are also chasing content. AT&T’s bid for the purchase of Time Warner is still waiting for government approval. Verizon has made big moves with the purchases of AOL and Yahoo and is rumored to be looking at other opportunities.
  • Both companies have been telling stockholders that there are huge amounts of money to be made from the IoT. These companies want their cellular networks to be the default networks for collecting data from IoT devices. They certainly ought to win the business for things like smart cars, but there will be a real battle between cellular and WiFi/landline connections for most other IoT usage.
  • Both companies are making a lot of noise about 5G. They are mostly concentrating on high-speed wireless connections using millimeter wave spectrum that they hope will make them competitive with the cable companies in urban areas. But even that runs a risk because if we see true competition in urban areas then prices for urban broadband might also tumble. And that might start the process of making broadband into a commodity. On the cellular side it’s hard to think that 5G cellular won’t quickly become a commodity as well. Whoever introduces faster cellphone data speeds might get a bump upward for a few years, but the rest of the industry will certainly catch up to any technological innovations.

It’s hard to foresee any business line where AT&T and Verizon are going to get the same monopoly power that they held in the cellular space for the past few decades. Everything they might undertake is also going to be available to competitors, meaning they are unlikely to make the same kind of huge margins they have historically made with cellular. No doubt they are both going to be huge companies for many decades to come since they own the cellular networks and spectrum. But I don’t think we can expect them to be the cash cows they have been in the past.

Broadband Shorts – March 2017

Today I’m writing about a few interesting topics that are not long enough to justify a standalone blog:

Google Scanning Non-user Emails. There has been an ongoing class action lawsuit against Google for scanning emails from non-Google customers. Google has been open for years about the fact that it scans email that originates through a Gmail account. The company scans Gmail for references to items that might be of interest to advertisers and then sells that condensed data to others. This explains how you can start seeing ads for new cars after emailing that you are looking for a new car.

There are no specific numbers available for how much Google makes from scanning Gmail, but this is part of its overall advertising revenues, which were $79.4 billion for 2016, up 18% over 2015. The class action suit deals with emails that are sent to Gmail users from non-Gmail domains. It turns out that Google scans these emails as well, although non-Gmail users have never agreed to the terms of service that apply to Gmail users. This lawsuit will be an important test of customer privacy rights, particularly if Google loses and appeals to a higher court. This is a germane topic right now since the big ISPs are all expected to do similar scanning of customer data now that the FCC and Congress have weakened consumer privacy rights for broadband.

Verizon FiOS and New York City. This relationship is back in the news since the City is suing Verizon for not meeting its 2008 promise to bring broadband to everybody in the city. Verizon has made FiOS available to 2.2 million of the 3.3 million homes and businesses in the city.

The dispute comes down to the definition of a passing. Verizon says that it has met its obligation and that the gap is due to landlords who won’t allow Verizon into their buildings. But the city claims that Verizon hasn’t built fiber on every street in the city and also that the company has often elected not to enter older buildings due to the cost of distributing fiber inside the buildings. A number of landlords claim that they have asked Verizon into their buildings but that the company either elected not to enter or else insisted on an exclusive arrangement for broadband services as a condition for entering a building.

New Applications for Satellite Broadband. The FCC has received 5 new applications for launching satellite broadband networks, bringing the total requests up to 17. Now SpaceX, OneWeb, Telesat, O3b Networks and Theia Holdings are also asking permission to launch satellite networks that would provide broadband using the V Band of spectrum from 37 GHz to 50 GHz. Boeing also expanded its earlier November request to add the 50.4 GHz to 52.4 GHz bands. I’m not sure how the FCC picks winners from this big pile – and if it doesn’t we are going to see busy skies.

Anonymous Kills 20% of Dark Web. Last month the hackers who work under the name ‘Anonymous’ knocked down about 20% of the web sites on the dark web. The hackers were targeting cyber criminals who profit from child pornography. Of particular interest was a group known as Freedom Hosting, which Anonymous claims has over 50% of its servers dedicated to child pornography.

This was the first known major case of hackers trying to regulate the dark web. This part of the Internet is full of pornography and other kinds of criminal content. The Anonymous hackers also alerted law enforcement about the content they uncovered.

The Fight Over Wireless Pole Attachments

All around the country there are fights going on between pole owners, governments, and wireless carriers over pole attachments and related issues for small cell deployment. Small cells are the first new technology that is mostly interested in non-traditional attachments, but they will soon be followed by a proliferation of companies wanting to hang millimeter wave radios and wireless local loop equipment. The fights cover a wide range of different issues:

Safety. Most current pole rules were created to keep technicians safe when working on poles, particularly during bad weather. Some of the devices that carriers now want to hang on poles are not small. Some are the size of dorm refrigerators or even a bit larger. And these devices are connected to live electric wires. Adding such devices to poles can make it significantly harder for a technician trying to restore power during a rain or snow storm. Just maneuvering around such devices can be a major safety concern even in good weather.

New Poles / Taller Poles. There are reports of wireless carriers asking to install new poles as tall as 120 feet in city rights-of-way. For network deployments that include wireless backhaul it’s vital that each small cell or other device has a clear line-of-sight to other devices in the network – and being higher in the air can create the needed wireless network.

In most towns the poles are no taller than 60 feet and often shorter. Taller poles create a whole new set of problems. They might mean a whole new level of tree trimming or even eliminating taller trees – and many communities take great pride in their trees. And these new poles will need power, meaning stringing more wires in the air, which can detract from the aesthetics of a residential neighborhood and create more issues with downed power lines and trees to keep trimmed.

This also raises the issue of the long-term impact of such new poles. Many cities have moved other utilities underground or have multi-year programs to migrate existing utilities underground. These new wireless-only poles also require a power feed, and at least some of them require a fiber feed. Can a carrier require a wireless pole/tower in a neighborhood where everything else is already underground? Can they insist that their poles be left standing during future conversions of neighborhoods to underground utilities?

There is also the issue of sharing such new poles. Cities fear that they will be swamped with requests for new poles from companies wanting to deploy wireless technologies. It’s not hard to picture an NFL city that might have a dozen different companies wanting to deploy wireless devices – and it’s not hard to picture this resulting in chaos and a proliferation of multiple new poles on the same streets as well as numerous new electric lines to connect all of the new devices.

Right to Say No. Cities largely want the right to decide what goes in their rights-of-way. This often has manifested with requirements that anybody that wants access to rights-of-way get some sort of a franchise. It also has meant the development of local ordinances that define the whole process of using rights-of-way from the permitting process through installation techniques. But the carriers are currently lobbying at the state level and at the FCC to make uniform rules to apply everywhere. If the FCC or a state passes blanket rules there are many cities likely to challenge such rules in court.

Fees for Attachments. The carriers are also lobbying heavily to define the fee structure for attachments of these sorts of new connections. Compensation has always been an issue and my guess is that at some point the FCC will step in here in the same manner they did in the past with other pole attachments.

General Irony. I find it ironic that AT&T is leading the battle to get good terms for attaching wireless devices. AT&T has been the primary entity that has been fighting hard against Google to keep them off AT&T poles. And now AT&T wants the right to force their way onto poles owned by others. But in the regulatory world if we have ever learned any lesson it’s that big companies don’t seem to have a problem with arguing both sides of the same argument when it suits their purposes.

The Challenges of Fixed Gigabit Wireless

We got a preview this week of what fixed wireless service might look like in urban environments. Google announced it is aggressively expanding the footprint of Webpass, the wireless ISP that Google purchased last year. The company has been operating in six cities and will now be expanding to nine more markets. These will all be downtown urban deployments.

The deployment uses high-capacity microwave links to serve high-rise buildings. Webpass already has 20,000 residential customers in the six markets, all of whom live in downtown high-rises, though the company focuses more on serving business customers. This business plan has been around for a long time; I actually helped launch a business with the same plan years ago, which died in the 2000 telecom crash.

The network consists of microwave shots to each building on the network. The first hurdle in getting this to work is to get enough quality radio sites to see buildings. As I noted in a blog last week, access to this kind of real estate is at a premium in urban areas, as cellphone providers have found when trying to deploy small cell sites.

The radios required to make the links are not gigantic, but you need one full radio and a dish at both ends of every link. This means that from any one hub building there will be a limited number of links that can be made to other buildings, just due to space limitations. If you imagine half a dozen companies trying to do this same thing (this will be the same basic deployment method for urban 5G), then you can picture a proliferation of companies fighting over available radio space on roofs.

Webpass in the past has limited their deployment to buildings that are either already wired with category 5 cable or fiber. They face the same issue that any broadband provider faces in bringing broadband into older buildings – only they are starting on the roof rather than from a basement wiring closet like other ISPs. There are very few ISPs yet willing to tackle the rewiring effort needed in large older buildings that serve residences. As you will see from the pricing below, Webpass and other ISPs are a lot more willing to tackle business buildings and absorb some rewiring costs.

The primary thing for the public to understand about this new roll-out is that it’s very limited. This won’t go to single family homes. It will go to downtown residential high-rises, but only to those that are pre-wired or easy to wire. And even in those buildings Webpass won’t go unless they get at least 10 customers. However, they will contract with landlords to serve whole buildings.

The Webpass pricing is interesting. For residential customers the price is $60 per month regardless of the speed achieved. Webpass says it delivers speeds between 100 Mbps and 500 Mbps, but numerous reviews complain that speeds can slow at peak evening times in some buildings (as one would expect when a lot of customers share one radio link).

Webpass’ pricing for businesses varies according to the number of other customers they get in a building. For example, if there are 10 or more business customers in a building they will sell a 100 – 200 Mbps connection for $250 per month with a 10 TB monthly data cap. But prices are much higher for customers in buildings with fewer than 10 customers:

Speed         Cost       Data Cap     Price with no Cap
10 Mbps       $125       1 TB         $375
20 Mbps       $250       2 TB         $750
50 Mbps       $500       5 TB         $1,500
100 Mbps      $1,000     10 TB        $2,000
250 Mbps      –          –            $2,500
500 Mbps      –          –            $4,000
1 Gbps        –          –            $5,500

From a technical perspective Webpass is deploying in line with the way the technology works. The radios are too expensive to deploy to smaller customers or to smaller buildings. A building also needs to be within a mile of the base transmitter (and hopefully closer) to get good speeds. That largely means downtown deployments.
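As a quick sanity check on the business pricing above, here is a short sketch (my own, not from Webpass; the tier numbers come from the table, the function name is hypothetical) showing how steeply the uncapped price per Mbps falls as the tier speed rises:

```python
# Webpass small-building business tiers from the table above, as
# (speed_mbps, capped_price, cap_tb, uncapped_price); None marks tiers
# for which no capped option is listed.
TIERS = [
    (10, 125, 1, 375),
    (20, 250, 2, 750),
    (50, 500, 5, 1500),
    (100, 1000, 10, 2000),
    (250, None, None, 2500),
    (500, None, None, 4000),
    (1000, None, None, 5500),
]

def price_per_mbps(speed_mbps, uncapped=True):
    """Monthly cost per Mbps for a given tier (illustrative helper)."""
    for mbps, capped, _cap, no_cap in TIERS:
        if mbps == speed_mbps:
            price = no_cap if uncapped else capped
            if price is None:
                raise ValueError("no capped option at this speed")
            return price / mbps
    raise ValueError("unknown tier")

print(price_per_mbps(10))    # $37.50 per Mbps at 10 Mbps
print(price_per_mbps(1000))  # $5.50 per Mbps at a gigabit
```

The per-Mbps cost drops almost sevenfold between the slowest and fastest uncapped tiers, which is consistent with the fixed cost of the radio link dominating the price.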

We know there are a number of other companies considering a similar plan. Starry announced almost two years ago that it was deploying something similar in Boston, but it has yet to launch. We know AT&T and Verizon are both exploring something similar to this Google product using 5G radios. But all of these companies are going to be fighting over the same limited markets.

The cellular companies keep hinting in their press releases that they will be able to use 5G to bring gigabit speeds. When they say that, this is the kind of deployment they are talking about. The only way they are going to be able to bring gigabit wireless speeds to single family homes and to suburbs is if they can develop some sort of mini transmitters to go onto utility poles. That technology is going to require building fiber close to each house and the radios are going to replace fiber drops. The above deployment by Webpass is not hype – they already have customers in six markets. But this technology is not the panacea for fast broadband for everyone that you might believe from reading the press releases.

Industry Shorts – July 2016

Here are a few topics I’ve been following but which don’t merit a full blog.

Mediacom Announces Upgrade Plans. Mediacom has announced plans to invest over $1 billion to upgrade its networks. The main thrust of the upgrades would be to increase speeds up to a gigabit in the 1,500 communities it serves in 22 states.

It will be interesting to see how they do this. There are many markets where they don’t have to do a lot more than upgrade to DOCSIS 3.1 and introduce new cable modems for high-bandwidth customers. But a lot of their rural markets will require forklift upgrades involving headend upgrades as well as revamping the coaxial cable plant. In the worst cases they’d have to replace coaxial cables, but in others would have to replace power taps and line amplifiers.

The company also announced it would open public WiFi hotspots in many of its markets. However, their current WiFi program is pretty weak by industry standards and only gives existing broadband subscribers access to 30 free WiFi minutes per month.

Dish Cuts Back on Ad-Skipping. Dish Networks has agreed to largely disable the feature in its new DVRs that lets customers skip ads automatically. This had become such a sticking point in negotiations for content that Dish finally agreed to cut back on the very popular feature. Dish reached agreements with Disney and CBS to disable the feature in order to get new programming for Dish’s Sling TV OTT offering.

Google Launches Undersea Cable. Google and Japanese telecoms have built a new undersea cable joining Portland, Seattle, Los Angeles and San Francisco to two POPs in Japan. The cable can carry 60 terabits of data per second and is now the fastest undersea fiber. Google is also planning to complete a fiber between Florida and Brazil by the end of the year. Facebook and Microsoft are working together on an undersea connection between Virginia Beach and Bilbao, Spain. With the explosive growth of Internet traffic worldwide this is probably just the beginning of the effort to create the needed connectivity between continents.

It’s interesting to see that some of the big traffic generators on the web are willing to spend money on fiber, and one has to suppose this will save them money in the long term by avoiding transport charges on other fiber routes. It’s probably also not a bad time to own a fiber-laying ship.

UN Declares Broadband Access a Universal Human Right. The United Nations recently passed a series of resolutions that makes online access to the Internet a basic human right. Among the key extracts in the resolutions are:

  • That people have the same rights online as offline, “in particular, freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice.”
  • That human rights violations enacted against people due to making their views known online are “condemned unequivocally,” and states are held accountable for any such violations.
  • Any measures to “intentionally prevent or disrupt access” to the internet are also “condemned unequivocally,” and all states should “refrain from and cease such measures.”

While it’s easy to argue that much of what the UN does has no teeth, it has been the forum since its creation for recognizing human rights.

Netflix Users Would Hate Ads. In a survey with mixed results it’s clear that Netflix users have strong feelings about introducing advertising into the popular ad-free service. In a survey given by All Flicks, 75% of Netflix users said they would dump the service if it started carrying ads.

In a somewhat contradictory finding, the poll indicated that most Netflix users would pay a premium price to avoid ads if there were options. Nearly 60% of Netflix users said they would pay $1 per month to avoid ads, with many others saying they would pay even more.

The Changing Face of Advertising

There has been talk for a number of years of advertising dollars shifting from television to the Internet, and it looks like maybe this is finally starting to happen. Consider the recent advertising revenues from Viacom and Facebook.

Viacom is one of a handful of the big programmers and owns such channels as Comedy Central, MTV, and Nickelodeon (along with Paramount Pictures). This has always made Viacom one of the powerhouses in attracting advertisers along with other large programmers like Disney, Fox, Comcast, Warner Brothers, and a few others. Viacom’s ad revenues in the first quarter of this year were $1.123 B, down slightly from $1.172 B a year ago.

But Facebook’s ad revenues were $5.201 B for the first quarter of this year, up from $3.317 B a year ago. It’s pretty obvious that the big web companies are starting to win the advertising battle. For all of 2015 the total advertising for television was $80.4 B, down slightly from $82.0 B in 2014. But in 2015 the advertising revenues for just Facebook and Google had grown to $84.5 B and is still growing rapidly.

This is not particularly surprising since ratings for television as a whole are plummeting. People are watching traditional television less and are watching more and more video on the web. It seems like the battle between television advertising and web advertising has passed a milestone and that web advertising is now dominant for the first time. I have no idea how fast (or by how much) television advertising will fall, but it looks inevitable that it will.

What does this trend mean to small cable providers? I think it matters a lot because advertising revenue is a major source of revenue for programmers. To the extent that advertising revenues drop for them there is going to be more pressure for them to raise programming rates to cable companies even faster to make up for the revenue difference.

But that could lead into a classic death spiral. Rapidly rising cable TV rates are one of the major factors driving people toward alternate programming. Many cord cutters and cord shavers cite the cost of traditional cable as a big reason they are looking for alternatives. The more the programmers raise rates, the more eyeballs they are going to lose, and one assumes the more revenue they will lose.
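The spiral logic can be made concrete with a toy model (entirely my own; the hike and churn percentages are illustrative assumptions, not figures from any programmer): once each rate hike drives away more revenue in lost subscribers than it gains in price, raising rates shrinks total revenue.

```python
def programmer_revenue(subscribers, monthly_rate, annual_hike,
                       churn_per_hike, years):
    """Project annual programming revenue when every rate hike
    also drives some subscribers away (toy model)."""
    for _ in range(years):
        monthly_rate *= 1 + annual_hike      # e.g. an 8% rate increase
        subscribers *= 1 - churn_per_hike    # e.g. 10% of subs walk away
    return subscribers * monthly_rate * 12   # annualized

# Hypothetical baseline: 1M subscribers paying $10/month in per-channel fees.
flat = programmer_revenue(1_000_000, 10.0, 0.00, 0.00, 5)
spiral = programmer_revenue(1_000_000, 10.0, 0.08, 0.10, 5)
print(round(flat), round(spiral))  # with these assumptions, hikes lose money
```

Whether the spiral actually bites depends on whether the churn triggered by each hike exceeds the hike itself; with the assumed numbers above it does.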

Programmers are also starting to get some pushback from small cable operators. There are a handful of smaller cable systems with less than a million customers in total that have dropped Viacom completely in the last year due to the unreasonable rate increases the company is demanding. I have a number of small cable clients who – when they do the math – realize that they are either losing money on cable or are getting close to the time when they will lose money. Once a company gets to that point then dropping programming is a natural response. It’s better to cut costs and lose customers when you are losing money rather than to keep shoveling money out the door to the programmers.

The programmers are also facing an FCC that is leaning more and more towards giving customers more choices in programming. You can see this in the recent NPRM for set-top box reform, where the FCC wants the cable companies to include ‘channel slots’ for alternate programming like Netflix. The FCC has yet to act on the open docket that is looking at the rights of companies to put content onto the Internet – but it’s clear that the FCC favors consumer choice.

And all of the big cable companies are now implementing or looking to implement skinny bundles. These are smaller packages of just the channels that people want to watch, at a much lower cost to consumers than the big traditional packages. The cable companies want to get off the treadmill of paying huge amounts for programming, and skinny bundles reduce and reset the bar. The cable companies also want to offer an alternative to people to stop them from totally dropping the cable company.

It’s a tough time to be a cable company because margins on the cable product keep tumbling. But it’s starting to also be a rough time for the programmers. Probably the best thing that can happen to the programmers is for Wall Street to lower their stock price to reset the expectations for earnings performance. At that point maybe the whole industry can take a pause and see if they can salvage what is looking like a slowly sinking ship.

Why No Redundancy?

I usually load a blog every morning between 7:00 and 8:00 eastern. But today my Internet was down. I first noticed it when I woke up around 2:30. Don’t even ask why I was up then, but that is not unusual for me. My Internet outage was also not that unusual. I have Comcast as my ISP and they seem to go out a few times per month. I’ve always given them the benefit of the doubt and assumed that a few of the late night outages are due to routine network maintenance.

So I grab my cell phone to turn on my mobile hot spot. Most of the outages here last an hour or two and that is the easy way to get through them. But bam! – AT&T is out too. I have no bars on my LTE network. So my first thought is a cable cut. The only realistic way that both carriers go out in this area is if the whole area is isolated by a downed fiber.

I check back and hit a few web sites and I find at about 3:00 that I have a very slow Facebook connection, but that it’s working. I can get Facebook updates and I can post to Facebook, but none of the links outside of Facebook work. And nothing else seems to be working. This tells me that Facebook has a peering arrangement of some kind with Comcast and must come into the area by a different fiber than the one that was cut.

So I start looking around. The first thing I find is that Netflix is working normally, just as fast as ever. So now I have a slow Facebook feed and fast Netflix and still nothing else. After a while Google starts working. It wasn’t working earlier, but it seems that I can search Google, although none of the links work. This tells me that Comcast peers with Google but that the Google links use the open Internet. I force a few links back through the Google URL just to see if that will work and I find that I can read links through Google. No other search engines seem to be working.

The only other thing I found that worked was the NFL highlight films, and I was able to see the walk-off blocked punt in last night’s Ravens – Browns game. It’s highly unlikely that the NFL has a peering relationship with anybody, so they must have a deal with Google.

So now I know a bit about the Comcast Network. They peer with Netflix, Google and Facebook – and since these are three of the largest traffic producers on the web that is not unusual. And at least in my area the peering comes into the area on a different fiber path than the normal Internet backbone that has knocked out both Comcast and AT&T.

But I also now know that in my area Comcast has no redundancy in the network. I find this interesting because most of my small clients insist on having redundancy in their networks. Of course, most of them operate in rural areas that are used to getting isolated when cables get cut – it happened for many years with telephone lines and now with the Internet.

But I can see that Comcast hasn’t bothered creating a redundant network. This particular outage went on for 7 or 8 hours, which is a bit long, so it must have been a major fiber cut. But I look at a map of Florida and it’s a natural candidate for rings. Everybody lives on one of the two coasts and there are several major east-west connector roads. This makes for natural rings. And if our backbone were on a ring we wouldn’t even know there was an outage. But with all of their billions of dollars of profits, neither Comcast nor AT&T wireless cares enough about redundancy to have put our area backbone on a ring.

And I also don’t understand why they don’t have automatic alternate routing to bypass a fiber cut. If Netflix, Facebook and Google were connected everything else could have been routed along those same other fibers. That is something else my clients would have done to minimize outages for customers.

This is honestly unconscionable, and perhaps it’s time we start clamoring to the FCC to require the big companies to plow some of their profits back into a better network. The same sort of outages happened a few times to the power grid a decade ago, and the federal response was that the electric companies had to come up with a better network that could stop rolling outages. I know some of my clients that are electric companies spent significant dollars towards that effort, and it seems to have worked. Considering how important the Internet has become for our daily lives and for commerce, perhaps it’s time for the FCC to do the same thing.

Google’s Experiment with Cellular Service

As I’m writing this (a week ago), Google opened up the ability to sign up for its Project Fi phone service for a 24-hour period. Until now this has been by invitation only, limited I think by the availability of the Google Nexus phones. But they are launching the new Nexus 5X phone and so they are providing free sign-up for a 24-hour period.

The concept behind the Google phone plan is simple. They sell unlimited voice and text for $20 per month and sell data at $10 per gigabyte as it’s used. The Google phone can work on WiFi networks or will use either the Sprint or T-Mobile networks when a caller is out of range of WiFi. And there is roaming available on other carriers when a customer is not within range of any of the preferred networks.
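Based on the pricing just described, a Project Fi bill is trivial to model. This is a minimal sketch assuming only the two published numbers, the $20 base and $10 per gigabyte:

```python
def fi_monthly_bill(data_gb):
    """Monthly Project Fi bill in dollars: a flat fee for unlimited
    voice/text plus metered data."""
    BASE_FEE = 20.0   # unlimited voice and text
    PER_GB = 10.0     # data billed only as it's used
    return BASE_FEE + PER_GB * data_gb

print(fi_monthly_bill(0))    # 20.0 for a WiFi-only month
print(fi_monthly_bill(2.5))  # 45.0
```

The appeal is that a light user riding mostly on WiFi pays close to the $20 floor, which is exactly why Google pushes the product hardest in its gigabit markets.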

Cellular usage is seamless for customers and Google doesn’t even tell a customer which network they are using at any given time. They have developed a SIM card that can choose between as many as 10 different carriers although today they only have deals with the two cellular carriers. The main point of the phone is that a customer doesn’t have to deal with cellular companies any longer and just deals with Google. There are no contracts and you only pay for what you use.
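The pricing described above is simple enough to put in a back-of-envelope calculator. This sketch assumes straight per-gigabyte proration of the data charge (the function name is mine; the numbers just restate the plan as described):

```python
def project_fi_bill(gb_used: float) -> float:
    """Estimate a monthly bill: $20 base for unlimited voice/text plus $10 per GB of data."""
    BASE = 20.0    # unlimited voice and text
    PER_GB = 10.0  # data billed as it's used
    return BASE + PER_GB * gb_used

print(project_fi_bill(2.5))  # 45.0 for a month with 2.5 GB of cellular data
print(project_fi_bill(0.0))  # 20.0 for a month spent entirely on WiFi
```

A customer who offloads most usage to WiFi pays close to the $20 floor, which is exactly why Google pushes the product hardest where it has gigabit networks.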

Google still only supports this on their own Nexus phones for now although the SIM card could be made to work in numerous other phones. Google is letting customers pay for the phones over time similar to what the other cellular carriers do.

Google is pushing the product harder in markets where it has gigabit networks. Certainly customers who live with slow or inconsistent broadband won’t want their voice calls routed first over WiFi.

The main issue I see with the product is that it is an arbitrage business plan. I define as arbitrage anything that relies on a primary resource over which the provider has no control. Over the years many of my clients have become very familiar with other arbitrage plans that came and went at the whim of the underlying providers. For example, Sprint has sold numerous wholesale products such as long distance, dial tone, and cellular plans that some of my clients built into a business plan, only to have Sprint eventually decide to pull the plug and stop supporting the wholesale product.

I am sure Google has tied down Sprint and T-Mobile for the purchase of wholesale voice and texting for some significant amount of time. But as with any arbitrage situation, these carriers could change their minds in the future and strand both Google and its customers. I’m not suggesting that will happen, but I’ve seen probably a hundred arbitrage opportunities come and go in the marketplace during my career, and not one of them lasted as long as promised.

It’s been rumored that Apple is considering a similar plan. If it does, then the combined market power of Google and Apple might make it harder for the underlying carriers to change their minds. But at the end of the day only a handful of companies own the vast majority of the cellular spectrum, and they are always going to be the ones calling the shots in the industry. They will continue with wholesale products that make them money and will abandon the ones that don’t.

There are analysts who have opined that what Google is doing is the inevitable direction of the industry and that cellular minutes will get commoditized much as long distance was in the past. But I think these analysts are being naive. AT&T and Verizon are making a lot of money selling overpriced cellular plans, they have spent a lot of money on spectrum, and they know how to be good monopolists. I still laugh when I think about how households that used to spend $30 to $50 per month for a landline and long distance now spend an average of $60 per family member for cellphones. These companies have done an amazing job of selling us on the value of the cellphone.

Perhaps the analysts are right and Google, maybe with some help from Apple, will create a new paradigm where the carriers have little choice but to go along and sell bulk minutes. But I just keep thinking back to all of the past arbitrage opportunities where the buyers of the service were also told that the opportunity would be permanent – and none of them were.

New Video Format

Six major tech companies have joined together to create a new video format. Google, Amazon, Cisco, Microsoft, Netflix, and Mozilla have formed a new group called the Alliance for Open Media.

The goal of this group is to create a video format optimized for the web. Current video formats were created before there was widespread video viewing in web browsers on a host of different devices.

The Alliance has listed several goals for the new format:

Open Source: Current video codecs are proprietary, making it impossible to tweak them for a given application.

Optimized for the Web: One of the defining features of the web is that there is no guarantee that all of the bits of a given transmission will arrive at the same time. This is the cause of many of the glitches one gets when trying to watch live video on the web. A web-optimized video codec would be allowed to plow forward with less than complete data. In most cases a small number of missing bits won’t be noticeable to the eye, unlike the fits and starts that often come today when video playback stalls waiting for packets.

Scalable to Any Device and Any Bandwidth: One of the problems with existing codecs is that they are not flexible. For example, consider a time when you wanted to watch something in HD but didn’t have enough bandwidth. The only option today is to fall back all the way to an SD transmission at far lower quality. But between these two standards lies a wide range of possible options, where a smart codec could analyze the available bandwidth and then maximize the transmission by choosing among the many variables within the codec. This means it could produce ‘almost HD’ rather than defaulting to something of much poorer quality.

Optimized for Computational Footprint and Hardware: This means that device manufacturers would be able to tune the codec specifically for their devices. Not all smartphones, tablets, or other devices are the same, and manufacturers would be able to choose a video configuration that maximizes the display quality for each of their devices.

Capable of Consistent, High-quality, Real-time Video: Real-time video is a far greater challenge than streaming video. Video content is not uniform in quality and characteristics, so there can be a major difference in quality between two different video streams watched on the same device. A flexible video codec could standardize quality much the way a sound system can level out volume differences between audio streams.

Flexible for Both Commercial and Non-commercial Content: A significant percentage of videos watched today are user-generated rather than from commercial sources. It’s just as important to maximize the quality of Vine videos as it is for commercial shows from Netflix.
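The ‘almost HD’ idea in the scalability goal is essentially a rendition ladder with more rungs than today’s HD-or-SD choice. A toy selector illustrates the logic; the rendition names and bitrates below are made up for illustration:

```python
def pick_rendition(available_mbps: float, renditions: dict) -> str:
    """Choose the highest-bitrate rendition that fits the measured bandwidth."""
    fitting = {name: mbps for name, mbps in renditions.items()
               if mbps <= available_mbps}
    if not fitting:
        # Nothing fits: fall back to the cheapest rendition rather than fail.
        return min(renditions, key=renditions.get)
    return max(fitting, key=fitting.get)

# Hypothetical ladder (Mbps) with intermediate steps between SD and full HD.
ladder = {"480p": 2.0, "720p": 4.0, "900p": 6.0, "1080p": 8.0}
print(pick_rendition(6.5, ladder))  # '900p': almost HD, instead of dropping to SD
print(pick_rendition(1.0, ladder))  # '480p': the floor when bandwidth is scarce
```

A smart codec could go further and tune quantization, frame rate, and resolution independently, but even this coarse ladder avoids the jarring HD-to-SD cliff the Alliance describes.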

There is no guarantee that this group can achieve all of these goals immediately, because that’s a pretty tall task. But the combined power of these firms certainly is promising, and the potential for a new video codec that meets all of these goals is enormous: it would improve the quality of web video on all devices. Personally, quality matters to me, which is why I tend to watch videos from sources like Netflix and Amazon Prime; by definition streamed video can be of much higher and more consistent quality than real-time video. But I’ve noticed that my daughter has a far lower standard of quality than I do and watches videos from a wide variety of sources. Improving web video, regardless of the source, would be a major breakthrough and would make watching video on the web enjoyable to a far larger percentage of users.

Universal Internet Access

While many of us are spending a lot of time trying to find a broadband solution for the unserved and underserved homes in the US, companies like Facebook, Google, and Microsoft are looking at ways of bringing some sort of broadband to everybody in the world.

Mark Zuckerberg of Facebook spoke to the United Nations this past week about the need to bring Internet access to the five billion people on the planet who do not have it. He says that bringing people Internet access is the most immediate way to help lift them out of abject poverty.

And one has to think he is right. Even very basic Internet access, which is what he and those other companies are trying to supply, will bring those billions into contact with the rest of the world. It’s hard to imagine how much untapped human talent resides in those many billions and access to the Internet can let the brightest of them contribute to the betterment of their communities and of mankind.

But on a more basic level, Internet access brings basic needs to poor communities. It opens up ecommerce and ebanking and other fundamental ways for people to become engaged in ways of making a living beyond a scratch existence. It opens up communities to educational opportunities, often for the first time. There are numerous stories already of rural communities around the world that have been transformed by access to the Internet.

One has to remember that the kind of access Zuckerberg is talking about is not the same as what we have in the developed countries. Here we are racing towards gigabit networks on fiber, while in these new places the connections are likely to be slow connections almost entirely via cheap smartphones. But you have to start somewhere.

Of course, there is also a bit of entrepreneurial competition going on here since each of these large corporations wants to be the face of the Internet for all of these new billions of potential customers. And so we see each of them taking different tactics and using different technologies to bring broadband to remote places.

Ultimately, the early broadband solutions brought to these new places will have to be replaced with some real infrastructure. As any population accepts Internet access they will quickly exhaust any limited broadband connection from a balloon, airplane, or satellite. And so there will come a clamor over time for the governments around the world to start building backbone fiber networks to get real broadband into the country and the region. I’ve talked to consultants who work with African nations and it is the lack of this basic fiber infrastructure that is one of the biggest limitations on getting adequate broadband to remote parts of the world.

And so hopefully this early work to bring some connectivity to remote places will be followed by programs to build more permanent broadband infrastructure in the places that need it. The need for broadband may soon rank right after food, water, and shelter as a community necessity, and I expect the people of the world will push their governments to make broadband a priority. I don’t even know how well we’ll do at getting fiber to every region of our own country, and the poorer parts of the world face a monumental task over the coming decades to satisfy the desire for connectivity. But when people want something badly enough they generally find a way to get it, and so I think we are only a few years away from a time when most of the people on the planet will be clamoring for good Internet access.