The Transition to IP Telephony

AT&T reported to the FCC on the progress of its transition of customers from a traditional TDM network to an all-IP network. AT&T had undertaken two trials of such a conversion, in Carbon Hill, AL and Delray Beach, FL.

These were voluntary trials. AT&T had advertised widely and asked customers to move to the new IP-based services. In Carbon Hill 36% of residents and 28% of businesses voluntarily moved to the new service. In Delray Beach the numbers were similar with 38% and 25% converting. AT&T reported there were no reports of degraded service, including the transition of business customers to IP-based Centrex and similar services.

Since the trials were announced AT&T has also grandfathered Centrex and TV1-Analog Video service, meaning they will take no new orders for those services. The company also asked the FCC’s permission to discontinue 13 legacy services that are obsolete. This includes products most people have never heard of, like 4-wire and voice-grade telemetry and various alarm bridging services. The company has also asked permission to discontinue six operator services including collect calling, person-to-person calling, billing to a third party, busy line verification, busy line interrupt and international directory assistance.

These trials need to be put into perspective. From a technical standpoint there is no reason to think that transitioning these services from TDM to IP-based technology wouldn’t work, because much of the rest of the telephony world made that transition years ago. Cable companies like Comcast, and anybody operating an all-fiber network, have been offering IP-based telephone products for many years. AT&T’s offerings include many products that are strictly copper-based, such as the legacy products they want to discontinue.

And that leads to the whole purpose behind these trials. AT&T wants to move customers off old copper networks to either a landline or wireless IP-based solution. Since the company’s goal is to tear down copper, the vast majority of such transitions will be to the company’s cellular network. A minuscule percentage of AT&T’s customers are on fiber – particularly residential customers, since the company has launched very little FTTP in that market.

The trials are largely the result of what happened to Verizon on Fire Island a few years ago after Hurricane Sandy. There Verizon didn’t replace destroyed copper but moved people to a cellular-based service. But unlike these trials, which were meticulously slow and careful, it seems that in many of the Fire Island cases Verizon did not offer equivalent services to what they had offered before the hurricane. Apparently things like burglar alarms, medical monitoring devices, and other services didn’t work on the new wireless connections.

The FCC has already granted these big telcos the ability to tear down copper as long as they follow customer notification processes. My guess is that after these trials are blessed by the FCC that the companies will begin ripping down rural copper all over the country.

I expect that many customers are going to be unhappy when they lose their copper. Anybody who has traveled in rural areas understands that cellular coverage is often spotty, or even non-existent. Customers are worried about being cut off from telephony services inside their homes. It’s a legitimate concern for somebody with poor cellular service and with little or no broadband options, like we see in millions of rural homes and businesses.

But the time is coming soon when these transitions will not be voluntary like they were in these two communities. The big telcos will issue the legally required notices, and then they will proceed to shut off and tear down the copper. In doing so they will have undone the goal set by the Communications Act of 1934, which was to make telephone service available everywhere. There are now going to be homes and communities that are cut off from a workable way to make reliable voice calls.

I honestly never thought I’d see this happen. But I guess it was the pretty obvious end game after it became clear decades ago that the big telcos were not going to properly maintain their rural copper networks. We aren’t too far from the day when copper telephone networks join the list of other technologies that outlived their usefulness and are a thing of the past – at least for the giant telcos. There are still other companies like Frontier and Windstream that are fighting to extend the life of their copper, but we’ll have to see what the future holds for them and their customers.

Who Will Win the Telecom Battle?

Now that Google has pulled back on the expansion of Google Fiber it’s easy to see that the cable companies and telcos think they have won the broadband war. But I think if you look a little closer this might not really be the case.

Tech companies like Google, Facebook and Amazon are still focused on making sure that people have enough bandwidth to take advantage of the many products these giant companies offer or plan to offer in the future. And all three companies are growing in importance as content providers.

Consider first the strength of these companies as content providers. Google owns YouTube which is becoming the most important video destination for the younger generation – and those kids are growing up. We’ve seen young millennial households largely reject traditional cable TV offerings. While Amazon Prime is not nearly as big as Netflix it is a strong second and is continuing to grow. Amazon is also reported to be pouring big money into producing original content for its platform. Facebook is on a trajectory to become the preferred source of news and information. And their Facebook Live is also quickly becoming a huge content platform.

But content isn’t everything. Consider that these companies have amassed enormous private fiber networks. Google doesn’t talk about its network, but back in 2013 it was reported that Google had assembled a network consisting of 100,000 miles of dark fiber. Amazon and Facebook have also built large private networks, though we don’t know their size. We know that Google and Facebook have partnered to build a massive undersea fiber to China and are looking at other undersea fiber routes. Amazon has built a huge network to support its cloud services business. It would not be surprising if together these companies have already amassed a larger fiber network than the telcos and cable companies. If they are not bigger yet, they are on a trajectory to get there soon. With these networks the tech companies could hurt the big ISPs where it hurts most – by taking a huge bite out of their special access and transport businesses.

These companies are also not done with the ISP business. Google Fiber has pulled back from expanding FTTH networks for now, but it acquired Webpass and is looking to expand as an ISP using wireless last mile. And we saw in Huntsville that Google is not afraid to use somebody else’s fiber network – something we have never seen any of the telcos or cable companies consider. It would not be surprising to see Google make deals with other private networks to expand its ISP business without spending the upfront capital. But perhaps Google’s biggest foray into providing data services is Google Fi, its cellular service that routes calls and data over WiFi first rather than the cellular network whenever possible. It’s been rumored that Google is looking for partnerships to expand WiFi access in many markets. And it’s been reported that Amazon is strongly considering becoming an ISP. I’ve not heard any details about how they might do this, but the company has shown the ability to succeed in everything it’s tackled – so it’s an intriguing possibility.

It’s a gigantic task to take on companies like AT&T and Comcast head on. I think Google Fiber learned this the hard way. But at the end of the day content is still king. As these companies continue to grow in influence as content providers they present a real challenge to traditional programmers. But they are also a growing threat to the big ISPs. If these tech companies decide that their best strategy is to deliver their content directly to subscribers, they have a big enough market position to pull along a huge number of customers. It’s clear that consumers like these tech companies far more than they like the big ISPs, and in the end the animus customers have accumulated toward the big ISPs might be those ISPs’ undoing.

This kind of industry shift won’t happen overnight. But it’s already quietly going on behind the scenes. We may not be as far as you might imagine from the day when these companies provide more content than the traditional programmers and also carry more bandwidth on their own networks than the big ISPs. From my perspective that looks a lot like winning the battle.

Time for a New Telecom Act, Part 2

Yesterday’s blog postulated that we would see a new telecom act this year from Congress. That blog looked at what was accomplished by the last Telecommunications Act of 1996. Today I’m looking ahead at the issues that a new Act needs to address.

Last week we learned more about how the process will probably work. A new telecom act would likely be spearheaded by the Energy and Commerce Subcommittee on Communications and Technology. Last week Rep. Marsha Blackburn, chair of that subcommittee, told the press that she favored giving the new FCC a shot at fixing the things under its purview before the House would tackle a new Act. The FCC doesn’t have the authority to make many of the needed changes in telecom regulation, but it does have considerable power. This probably means a new act is at least a year away.

Here are some of the things that I think the FCC and Congress need to address to modernize telecom:

Need for More Spectrum. It’s becoming clear that a lot of big ISPs are thinking of deploying 5G and various other millimeter wave technologies. The FCC needs to continue to open up more spectrum for broadband. There is still a lot of spectrum that has been reserved for government use, and there need to be more attempts to share frequency when possible. There also needs to be a fresh look taken at how frequency is used. Historically many bands of frequency had narrow channels aimed at accommodating voice traffic or a single channel of television. From an engineering perspective we can get a lot more out of spectrum if we can make wider channels in the spectrum bands that are already in use.
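The engineering point about wider channels can be made concrete with the Shannon-Hartley theorem, which caps a channel's throughput by its width and signal-to-noise ratio. This is only a sketch; the 6 MHz and 100 MHz widths and the 20 dB SNR are illustrative numbers, not figures from any actual band plan:

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon-Hartley upper bound: C = B * log2(1 + SNR).

    With bandwidth expressed in MHz the result comes out in Mbps.
    """
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# A legacy 6 MHz TV channel vs. a hypothetical 100 MHz channel, same 20 dB SNR
print(round(shannon_capacity_mbps(6, 20), 1))    # 39.9 (Mbps ceiling)
print(round(shannon_capacity_mbps(100, 20), 1))  # 665.8 (Mbps ceiling)
```

The capacity ceiling scales linearly with channel width, which is why consolidating narrow legacy channels into wider ones yields so much more usable throughput from the same band.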

Tackling Cybersecurity. 2016 was a year when security breaches led the industry news weekly. There is no easy fix for security issues, but there are big steps that can be taken. For example, we are flooding the world with IoT devices that are easily hacked and which can now be used to launch coordinated denial of service attacks. With Congressional backing the FCC could create standards to make IoT devices more secure. The government will never make us free from hacking, but there are a lot of sensible standards and fixes needed for IoT devices.

Expanding Access to Fast Broadband. As somebody who works regularly in rural America I know that lack of broadband there is now one of the biggest problems identified by rural households. We need to find ways to get good broadband to more places, and we have to do this smartly by building infrastructure that will last for decades. We’ve already seen how not to do this with the CAF II program that is being used to expand DSL and LTE wireless – two technologies that are already inadequate today.

Unless fiber is built everywhere this is going to be an ongoing major issue. For example, if we fix broadband for those that have none but ignore the bigger swathe of the country that has only marginally acceptable broadband today, we will be back in a decade looking at how to fix broadband in those places.

We also need rules that unleash anybody willing to spend money on fiber. I see numerous rural counties and towns that are ready to spring for bond issues to get fiber. We need rules that allow anybody willing to invest in fiber to do so – be that local governments, electric cooperatives, rural telcos or anybody else.

Infrastructure Issues. There are still a lot of infrastructure roadblocks to deploying fiber. We have never done a good job of fulfilling the mandate from the 1996 Act to provide access to poles and conduit. And we are now looking at deploying a fiber-fed wireless network that is going to mean bringing both fiber and power to buildings, rooftops, poles and other infrastructure. We need to find a way to get this done without also trampling over the legitimate concerns of local jurisdictions. For example, the FCC can’t just demand that cities allow free and quick fiber construction if that means digging up newly paved streets or overburdening poles – we need to find rules that work. And we need to do a much better job of this than we have done so far.

Programming. It’s now clear that online video content is a competitive alternative to traditional cable TV. We need rules that unleash cable companies and anybody else to sell programming that people really want to buy. That means stepping away from the current rigid cable rules that mandate the giant channel lineups. Companies need to be free to create programming bundles that people want to buy. This might mean allowing a la carte programming. And there must be rules that require content providers to sell to everybody in an unbiased manner.

I don’t know how many of these big issues the current FCC is going to be willing to tackle. It seems like a lot of their agenda for the first six months will be to undo things ordered by the previous FCC. While I understand the desire to mold the FCC to the political persuasion of whatever party is in power, most of the issues on my list above are not partisan. They are just things that we all need to solve if we are to have a telecom infrastructure that serves us all well.

Time for a New Telecom Act, Part 1

Nothing is ever certain in the regulatory world, but it looks like there is a good chance that we will see a new telecom act this year. There are certainly parts of the old Telecommunications Act of 1996 that need to be refreshed and there are a lot of new topics like broadband, OTT and the IoT that need to be addressed by Congress. Today’s blog is going to review the old telecom act and tomorrow I will address the changes that I hope are included in any new act.

It’s hard to believe but the Telecommunications Act of 1996 was enacted 21 years ago. From a technological perspective that was almost the dark ages. 1996 was the year that AOL launched its unlimited dial-up product for $19.95 per month (before then subscribers paid by the minute). This drew millions of people to the Internet and convinced them to pay a monthly fee for access. DSL and cable modems were still in the lab and dial-up access ruled the world.

The main thrust of the 1996 Act was to create more competition with telephone service. Ma Bell had been broken up in 1984 which had resulted in long distance competition. Long distance rates dropped steadily over the years after divestiture. Congress decided that it was time to also create competition for dial tone. They recognized that the roadblock to competition was that the big telcos owned the vast majority of the copper lines going to homes and businesses and that nobody was likely to build a second telecom network.

So the Act implemented new rules to promote competition. Some of the changes mandated by the new Act were:

  • Creating a new regulatory category for telephone competitors that was labeled CLEC (Competitive Local Exchange Carrier).
  • Requiring the big telcos to ‘unbundle’ their copper network. This meant that they had to provide access to their copper plant to CLECs. To accomplish this the FCC mandated that CLECs had the right to interconnect to the big telco networks and to collocate in their central offices when necessary.
  • Mandating that the big telcos offer up telecom services for resale. They basically had to sell bulk services to competitors who could then sell them to customers.
  • Requiring that anybody who wanted to build a new network be given access to poles and conduits and be allowed to connect to the telco network at any reasonable place of their choosing.

The Act was immediately successful and unleashed a flurry of competitive activity. Giant new CLECs were formed that collocated in telco offices and gained access to copper loops. The most popular product was the unbundled T1 that allowed new competitors to sell data and telephone services to businesses over one connection. There were also giant companies formed to tackle resale. I recall that one of my clients in those days, Talk America, got over one million residential customers by reselling local phone service along with cheap long distance. Many consulting firms were formed to help the new competitive companies, including my company, CCG Consulting.

The Act also brought about many other changes, some of the most significant being:

  • The regional Bell companies were allowed to get into the long distance business and compete against AT&T.
  • The Act granted the FCC the right of preemption to allow it to override conflicting state rules.
  • The Act created intercarrier compensation for paying for the exchange of traffic between telcos and CLECs.
  • The Act also shook up the Universal Service Fund and made compensation more directly cost-based.
  • The Act also tackled a number of other regulatory issues such as preempting telecom services from franchise fees, establishing rules to define obscene programming, and enabling the over-the-air transmission of digital TV signals.

In many ways the 1996 Act was a big success. Prices for telecom services plummeted in subsequent years. But over time the effective lobbying of the large telcos reversed some aspects of the Act, like resale and the unbundling of dark fiber. The Act also did not foresee the explosion of cellphones and of landline broadband, and those industries have never gotten the same level of regulatory scrutiny that applies to telephone service. There are still CLECs today making a living by providing DSL over telephone copper. But the increasing need for faster broadband speeds is starting to make that technology irrelevant, and it’s definitely time to consider a new Act to deal with today’s issues.

Regulating the IoT

The FCC has joined other government agencies and private organizations that are concerned about the lack of security with the Internet of Things. The agency issued a 50-page research paper that discussed the issue and came to some troubling conclusions.

From the report: The large and diverse number of IoT vendors, who are driven by competition to keep prices low, hinders coordinated efforts to build security by design into the IoT on a voluntary basis. Left unchecked, the growing IoT widens the gap between the ideal investment from the commercial point of view and from society’s view.

That’s not nearly as strident as the sentiment expressed by most industry experts who understand that most IoT device makers look at security only as an afterthought. It’s been demonstrated repeatedly that almost every IoT device on the market can be hacked, often quite easily. There are exceptions, but a large percentage of devices have little or no defense against hacking.

The Department of Homeland Security is also looking at IoT and has issued a set of guidelines it wants the industry to adopt. DHS believes that unprotected IoT devices are a national security threat. We saw good evidence of this last month when massive denial of service attacks were launched from security cameras and home appliances. The DHS guidelines suggest some common sense requirements like giving devices unique passwords and allowing IoT devices to receive needed software updates.
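As a toy illustration of the unique-password guideline, here is the kind of check a device could run at setup time. The default-credential list and the length rule are hypothetical examples of mine, not drawn from the DHS document:

```python
# Factory-default credentials a device should refuse (hypothetical list).
COMMON_DEFAULTS = {"admin", "password", "root", "12345", "default"}

def credential_acceptable(password: str) -> bool:
    """Reject known factory defaults and trivially short passwords."""
    return password not in COMMON_DEFAULTS and len(password) >= 8

print(credential_acceptable("admin"))      # False: a known factory default
print(credential_acceptable("Xk9!mQ2pL"))  # True: unique and long enough
```

A rule this simple would have blocked much of the botnet traffic in last month's attacks, which relied on devices still running their shipped credentials.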

The Federal Trade Commission is also looking at IoT security issues. The agency recently announced a $25,000 prize to anybody who could offer a security solution for dealing with outdated software in IoT devices.

The Department of Commerce also recently issued IoT guidelines, but the guidelines seem to be aimed internally at the agency and not at the wider world.

This all raises the question of who should be regulating IoT? Right now the answer is nobody – there is no agency that has clear jurisdiction to impose any requirements on the IoT industry. And that is because such authority can only be granted by Congress. We’ve seen this same thing happen many times in the last fifty years as new technologies spring into existence that don’t fit neatly into any existing jurisdictional bucket.

The closest process we have today to what is needed to regulate at least part of the IoT is the way the FCC certifies new wireless and other telecom devices. Most people don’t realize it, but all phones and many other kinds of telecom gear undergo rigorous testing at the FCC to make sure the devices do what they say they do and won’t interfere with the rest of the world. We need a similar process to test and certify IoT devices, because we can’t just take the IoT manufacturers’ word that their devices meet whatever standards are developed.

But the FCC today has zero authority to regulate the IoT. For now the agency has the ability to regulate ISPs through Title II regulations – but that is expected to be reversed or watered down soon. Even that authority doesn’t give it any jurisdiction over the IoT. Like many new technologies, the IoT doesn’t fit into any existing regulatory framework.

It’s not really comforting, but there are a bunch of other new industries with the same situation. There is no agency that has any clear regulatory authority over driverless cars. Nobody has any real authority to regulate artificial intelligence. There are only very minimal regulations for gene-splicing.

I think most of us believe that some level of regulation is good for these big society-changing technologies. Certainly if nobody regulates the IoT we will have disaster after disaster from misuse of the technology. I hope we don’t wait to tackle this until it’s too late and there are billions of poorly manufactured IoT devices in the world that can’t be fixed.

New Technology – February 2017

There has been so much going on in the telecom industry lately that I haven’t published a blog examining promising new technologies for a while. Here are a few new breakthroughs that ought to eventually affect our industry:

Metal that Conducts Electricity but not Heat. Physicists at the Lawrence Berkeley National Lab and UC Berkeley have found a metal that contradicts the Wiedemann-Franz Law. The law states that good conductors of electricity will also be proportionately good conductors of heat. The physicists were working with vanadium dioxide and unexpectedly discovered this property. There are a few other materials that are much better at conducting electricity than heat, but they only do so at temperatures a few hundred degrees below zero. It appears vanadium dioxide can do this at room temperature. The property derives from the fact that electrons move through the metal in a synchronized manner, which is normally observed only in fluids, instead of individually, as is normal in metals.
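For reference, the Wiedemann-Franz law says that for an ordinary metal the ratio of thermal conductivity to electrical conductivity is proportional to temperature: κ/σ = L·T, where L is the Lorenz number. A quick sanity check using textbook values for copper (these are standard reference numbers, not figures from the Berkeley work):

```python
LORENZ = 2.44e-8  # Lorenz number L, in W*ohm/K^2

def wf_thermal_conductivity(sigma_s_per_m: float, temp_k: float) -> float:
    """Thermal conductivity the law predicts for a metal: kappa = L * sigma * T."""
    return LORENZ * sigma_s_per_m * temp_k

# Copper at room temperature: electrical conductivity ~5.96e7 S/m
predicted = wf_thermal_conductivity(5.96e7, 300)
print(round(predicted))  # 436, close to copper's measured ~400 W/(m*K)
```

For a normal metal the prediction lands near the measured value, which is what makes vanadium dioxide's large deviation from the law so surprising.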

There is great potential for a material with this property – it could be used as an insulator in computers to keep components cool and to drastically lower the cooling costs experienced in data centers. On a more macro level this could lead to better insulation in homes and appliances and could drastically improve energy efficiency in a wide range of applications.

Superconducting Graphene. Researchers at the University of Cambridge in the UK have found a way to induce superconductivity in graphene. Today superconducting materials only function at extremely low temperatures, below about -454 degrees Fahrenheit. But the research indicates superconducting graphene will work at much higher temperatures. The researchers created the superconducting properties by layering graphene on an underlying material.

Superconductivity is a big deal because in the ultimate state a superconductor passes electrons with zero resistance. Compare that to normal materials, such as our electric grid, which loses about 7% of generated power getting to homes, and the difference is remarkable. Finding a room-temperature superconductor would be a huge breakthrough because it could mean electric transmission with no power losses and an end to the heat generated in electronics and appliances by resistance.
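To put that 7% grid-loss figure in concrete terms, here is a trivial calculation; the 1,000 kW generation figure is made up purely for illustration:

```python
def delivered_power_kw(generated_kw: float, loss_percent: float) -> float:
    """Power that survives resistive (I^2 * R) losses in transmission."""
    return generated_kw * (100 - loss_percent) / 100

# Today's grid: roughly 7% of generated power is lost as heat in the wires
print(delivered_power_kw(1000, 7))  # 930.0 kW reaches customers
# A room-temperature superconducting grid: zero resistive loss
print(delivered_power_kw(1000, 0))  # 1000.0 kW
```

Scaled to national generation, eliminating that loss would be equivalent to adding a substantial amount of new generating capacity for free.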

Mass Producing Graphene. Scientists at Kansas State have found a cheap way to mass produce graphene. They discovered the process while working with carbon soot aerosol gels. The process is simple and requires only hydrocarbon gas, oxygen and a spark plug. The gases are forced into a chamber and graphene is formed with a spark. This is a low-power way to make graphene since it needs only a spark rather than continuous power.

Until now graphene has been expensive to make in quantities greater than milligrams and the process required caustic chemicals. With this method it’s easy to make graphene in gram quantities and the process ought to be scalable to much larger quantities.

Better Use of Wireless Spectrum. Engineers at UCLA have found a technique that might allow better use of wireless spectrum. They have found a way to use a tiny device called a circulator that allows a chip to use both incoming and outgoing signals of a given spectrum at the same time. Today’s technology only uses spectrum in one direction since dual use of spectrum has caused interference.

Circulators have been tried before, but earlier devices used magnetic materials which can’t be incorporated into chips. The prototype they built uses coaxial cables to route the signals through non-magnetic materials and they believe the design can be built directly into silicon.

The circulator works by sequentially switching signals along different paths, much as a busy train station can handle trains coming and going in both directions. The design uses six transmission lines and five switches which are turned off and on sequentially to allow incoming and outgoing signals to pass each other without interference.

This would be a big breakthrough for cellphones since it would allow for better use of the spectrum. This wouldn’t increase data speeds, but would allow a cell site to handle more phones at the same time.

Cable Industry Shorts – February 2017

Here are a few industry shorts, each not quite long enough to justify a full blog:

New York Takes on Charter. On February 1 the Attorney General of New York sued Time Warner Cable (which is now Charter Spectrum) for delivering inferior products that don’t match what was being advertised to customers.

The specific issue is that the majority of the cable modems provided to customers in the state are not capable of delivering the speeds being sold to customers. For example, in 2013 it was demonstrated that ¾ of the modems sold to supply 20 Mbps service were unable to process that much speed. And it appears that most of those modems still have not been upgraded. The lawsuit accuses the company of never notifying customers that they had inferior modems, and also of recycling inferior modems back to new data customers.

Charter says that the lawsuit isn’t needed because the company has been making improvements since purchasing Time Warner. But the lawsuit alleges that the old practices are still widespread. The lawsuit asks for significant refunds to affected customers.

Comcast Charging for Roku Boxes. In perhaps the best demonstration of why Comcast is rated so poorly by customers, Comcast says they will still charge customers if they use a Roku box to watch TV rather than a Comcast settop box.

Comcast currently has one of the highest settop box charges around at $9.95 per month, per box. They also charge $7.45 for each additional TV in the home using an ‘additional outlet’ charge. Comcast hasn’t announced the rate for using a Roku box, but speculation is that it will be the $7.45 rate. This is clearly a case of a cable company charging for something for which it provides zero value. Perhaps the company has been emboldened by an FCC and Congress that say they will be reducing regulations.
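Using the rates quoted above, and assuming the rumored $7.45 charge would apply to each Roku just like an additional outlet, a household's yearly equipment charges add up quickly. A back-of-the-envelope calculation:

```python
def annual_equipment_charges(primary_rate: float, outlet_rate: float, tvs: int) -> float:
    """One primary settop box plus an 'additional outlet' charge per extra TV."""
    monthly = primary_rate + outlet_rate * max(tvs - 1, 0)
    return round(monthly * 12, 2)

# $9.95 primary box, $7.45 per additional outlet, a three-TV household
print(annual_equipment_charges(9.95, 7.45, 3))  # 298.2 dollars per year
```

Nearly $300 a year in equipment fees alone, before any programming charges, for hardware the customer may have bought themselves.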

For a customer to use the XFINITY TV app on a Roku box they must currently subscribe to Comcast cable TV and broadband service. They must have and pay for at least one settop box and also have a cablecard and a compatible IP gateway in the home.

Esquire Channel Disappearing. There is a lot of pressure by the big cable companies to cut back on the number of channels, and the expectations are that less popular networks are going to start disappearing.

The latest network that will vanish from cable line-ups is the Esquire channel. It’s a channel with content aimed at upscale men, and its ratings rank 82nd out of the 105 major cable networks. It was launched only in 2013 and had grown to 60 million subscribers. But last month AT&T and its DirecTV subsidiary decided to drop the channel, cutting 15 million subscribers. Charter is also considering dropping the channel, so NBC, the owner of Esquire, decided to kill the channel for cable systems. Some remnants of it will remain online.

Esquire joins the millennial channel Pivot and NBC’s Universal Sports as channels that disappeared in the last year. There are likely more to come and there are 23 networks with lower ratings than Esquire including Fox Business, Great American Country, Chiller and the Golf Channel.

Cable Companies Stop Sending Piracy Warnings. Just about every large cable company and telco has stopped forwarding messages to customers about piracy that were sent through the Copyright Alert System (CAS). These alerts were sent to customers who made illegal downloads of movies or music. The main purpose of these alerts was to warn customers that they were violating copyright laws. The content industry has always pressured ISPs to somehow punish habitual content pirates, but that has never happened to any significant degree.

Groups like the RIAA which were pushing ISPs for compliance have said they will look for an alternative. They said for now that they will probably back off from suing end user customers – a tactic that never seemed to make much difference. This is another case where technology outstripped the law. The CAS launched at the heyday of peer-to-peer file sharing, and while that still exists, it’s not the way that most copyrighted material is shared these days. We now live in a more nuanced world where there is copyrighted material on sites like YouTube sitting right next to mountains of non-copyrighted material, and it’s a lot harder to pinpoint copyright violations.

The Challenges of Fixed Gigabit Wireless

We got a preview this week of what fixed wireless service might look like in urban environments. Google announced it is aggressively expanding the footprint of Webpass, the wireless ISP that Google purchased last year. The company has been operating in six cities and will now be expanding to nine more markets. These will all be downtown urban deployments.

The deployment uses high-capacity microwave links to serve high-rise buildings. Webpass already has 20,000 residential customers in the six markets, all of whom live in downtown high-rises. The company focuses more on serving business customers. This business plan has been around for years; I was helping to launch a business with the same plan years ago, which died in the 2000 telecom crash.

The network consists of microwave shots to each building on the network. The first hurdle in making this work is securing enough quality radio sites with clear lines of sight to the buildings being served. As I noted in a blog last week, access to this kind of real estate is at a premium in urban areas, as cellphone providers have found when trying to deploy small cell sites.

The radios required to make the links are not gigantic, but you need a full radio and a dish at both ends of every link. This means that from any given hub building there will be a limited number of links that can be made to other buildings, simply due to space limitations. If you imagine half a dozen companies trying to do this same thing (this will be the same basic deployment method for urban 5G), you can picture a proliferation of companies fighting over available radio space on roofs.

In the past Webpass has limited its deployments to buildings that are already wired with Category 5 cable or fiber. The company faces the same issue that any broadband provider faces in bringing broadband into older buildings – only it starts on the roof rather than from a basement wiring closet like other ISPs. Very few ISPs are yet willing to tackle the rewiring effort needed in large older residential buildings. As you will see from the pricing below, Webpass and other ISPs are far more willing to tackle business buildings and absorb some rewiring costs.

The primary thing for the public to understand about this new roll-out is that it’s very limited. This won’t go to single family homes. It will go to downtown residential high-rises, but only to those that are pre-wired or easy to wire. And even in those buildings Webpass won’t go unless they get at least 10 customers. However, they will contract with landlords to serve whole buildings.

The Webpass pricing is interesting. For residential customers the price is $60 per month regardless of the speed achieved. Webpass says it delivers speeds between 100 Mbps and 500 Mbps, but numerous reviews complain that speeds slow at peak evening times in some buildings (as one would expect when many customers share one radio link).
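The peak-time slowdown is simple arithmetic: every subscriber in a building shares the capacity of the single microwave link feeding it. Here is a minimal sketch using purely illustrative numbers (the link capacity, subscriber count, and activity rates are assumptions, not Webpass figures):

```python
# Back-of-envelope estimate of per-customer throughput on a shared
# microwave link. All numbers below are illustrative assumptions.

def peak_throughput_mbps(link_capacity_mbps, subscribers, active_fraction):
    """Average throughput per active customer when a given fraction
    of a building's subscribers use the link at the same time."""
    active = max(1, round(subscribers * active_fraction))
    return link_capacity_mbps / active

# Hypothetical 1 Gbps link into a building with 40 subscribers:
print(peak_throughput_mbps(1000, 40, 0.1))  # quiet afternoon: 4 active users share the link
print(peak_throughput_mbps(1000, 40, 0.5))  # busy evening: 20 active users share the link
```

With only 10% of subscribers active, each active user averages 250 Mbps; with half active in the evening, that drops to 50 Mbps, which matches the pattern the reviews describe.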

Webpass’ pricing for businesses varies according to the number of other customers they get in a building. For example, if there are 10 or more business customers in a building they will sell a 100 – 200 Mbps connection for $250 per month with a 10 TB monthly data cap. But prices are much higher for customers in buildings with fewer than 10 customers:

Speed        Cost        Data Cap     Price with No Cap

10 Mbps      $125        1 TB         $375

20 Mbps      $250        2 TB         $750

50 Mbps      $500        5 TB         $1,500

100 Mbps     $1,000      10 TB        $2,000

250 Mbps                              $2,500

500 Mbps                              $4,000

1 Gbps                                $5,500

From a technical perspective Webpass is deploying in line with the way the technology works. The radios are too expensive to deploy to smaller customers or smaller buildings. A building also needs to be within a mile of the base transmitter (and preferably closer) to get good speeds. That largely means downtown deployments.
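The one-mile limit reflects basic radio physics: free-space path loss grows with both distance and frequency, and the high frequencies needed for gigabit capacity attenuate quickly. A rough sketch using the standard free-space path loss formula (the 24 GHz carrier is an assumption for illustration; the actual frequencies Webpass uses are not specified here):

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Loss over roughly one mile (~1.61 km) at an assumed 24 GHz carrier:
print(round(fspl_db(1.61, 24_000), 1))   # ~124 dB

# Doubling the distance adds 6 dB of loss, which eats directly into the
# signal margin that supports high-order modulation (and thus speed):
print(round(fspl_db(3.22, 24_000) - fspl_db(1.61, 24_000), 1))
```

Every 3 dB of extra loss halves the received power, so doubling the distance (6 dB) cuts it to a quarter – which is why usable speeds fall off quickly beyond a mile.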

We know a number of other companies are considering a similar plan. Starry announced almost two years ago that it was deploying something similar in Boston but has yet to launch. We know AT&T and Verizon are both exploring something similar to this Google product using 5G radios. But all of these companies are going to be fighting over the same limited markets.

The cellular companies keep hinting in their press releases that they will be able to use 5G to bring gigabit speeds. When they say that, this is the kind of deployment they are talking about. The only way they are going to be able to bring gigabit wireless speeds to single family homes and to suburbs is if they can develop some sort of mini transmitters to go onto utility poles. That technology is going to require building fiber close to each house and the radios are going to replace fiber drops. The above deployment by Webpass is not hype – they already have customers in six markets. But this technology is not the panacea for fast broadband for everyone that you might believe from reading the press releases.

Erosion of Cable Subscribers

A lot has been written about the impact of cord cutting and there are varying estimates of how significant the phenomenon has become. But there is a different way to examine the effects on the cable industry, which is to count the number of US homes that pay to subscribe to each cable channel.

Below I compare subscriber counts from August 2013 to today’s counts for some of the more popular channels. It’s easy to see that almost across the board the networks have lost a lot of customers. I chose August 2013 because somewhere around that date the cable industry peaked in terms of customers. Since then total customers (and also customers for each network) have dropped.

These drops can’t all be attributed to cord cutting – cord shaving (where customers downsize their cable packages) is also a factor in these drops. Some cable systems are also working hard to cut back on the number of channels they carry. To put this chart into perspective, there are currently about 136 million housing units in the US.

(subscribers in thousands; parentheses indicate losses)

Network                August 2013     Current      Change
Weather Channel             99,926      84,683     (15,243)
ESPN                        97,736      87,859      (9,877)
Travel Channel              94,418      84,862      (9,556)
MTV                         97,654      88,137      (9,517)
Nickelodeon                 98,799      89,663      (9,136)
VH1                         96,786      88,085      (8,701)
TV Land                     96,282      87,901      (8,381)
Comedy Central              97,838      89,857      (7,981)
A&E                         98,302      90,478      (7,824)
SYFY                        97,447      89,854      (7,593)
TNT                         98,139      90,586      (7,553)
CNN                         99,292      91,794      (7,498)
Discovery Channel           98,891      91,829      (7,062)
HGTV                        98,229      91,169      (7,060)
AMC                         97,699      90,767      (6,932)
FX                          97,157      90,389      (6,768)
E! Entertainment            96,472      89,887      (6,585)
Disney Channel              98,142      91,611      (6,531)
Bravo                       94,129      87,620      (6,509)
Food Network                99,283      93,062      (6,221)
MSNBC                       94,519      89,764      (4,755)
Oxygen                      78,208      75,651      (2,557)
NFL Network                 70,910      71,252         342
Showtime                    28,094      29,014         920
HBO                         32,445      34,369       1,924
Hallmark Channel            85,897      88,885       2,988
National Geographic         84,446      89,865       5,419

These numbers tell a different story than articles about cord cutting. Industry estimates of cord cutting during this same time frame vary between 2.5 and 4 million homes that have dropped cable altogether. But these figures show that most major networks have lost between 6 and 10 million paying subscribers in a little under three and a half years.
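The chart is simple arithmetic: each change figure is just the current count minus the August 2013 count. A quick spot-check of three rows from the table (figures in thousands):

```python
# Spot-check of the subscriber chart: change = current - August 2013,
# in thousands of subscribers (negative numbers are losses).
subs_2013 = {"Weather Channel": 99_926, "ESPN": 97_736, "HBO": 32_445}
subs_now = {"Weather Channel": 84_683, "ESPN": 87_859, "HBO": 34_369}

for network in subs_2013:
    change = subs_now[network] - subs_2013[network]
    print(f"{network}: {change:+,}")
```

The computed changes match the chart: a 15.2 million loss for The Weather Channel, a 9.9 million loss for ESPN, and a 1.9 million gain for HBO.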

Obviously not every network is experiencing the same changes. For example, the 15 million households lost by The Weather Channel are due to many cable systems changing to a cheaper alternative. And you can see at the bottom of the chart that there are still networks that are growing. These networks are gaining customers by attracting more subscriptions, like the premium movie channels, or by getting added to additional cable systems that didn’t carry them in 2013.

But overall this is a sobering chart, and one that all of the programmers are well aware of. The various factors of cord cutting, cord shaving, and of cable companies trying to cut back their channels are all steadily eroding the number of households that get to watch the various networks.

Michael O’Rielly’s Vision of Broadband Expansion

A whole lot of the telecom industry is anxiously watching the news to see if there will be a federal program to expand rural broadband. We’ve already had new FCC Chairman Pai come out in favor of closing the digital divide and bringing broadband to everyone. And there are those in Congress pushing for money to expand rural broadband.

Last week FCC member Michael O’Rielly entered the fray with a blog post about funding rural broadband expansion. There are things in that blog I heartily agree with, and others that I disagree with (as you might expect).

O’Rielly warns that the government should not shovel money at a rural solution in such a way as to drastically overspend to get a solution. I completely agree and I wrote a series of blogs last year (1, 2, 3, and 4) that make the same point. The government wasted a lot of money when handing out stimulus grants in the past and I’d hate to see them make the same mistakes again. There is a long list of things that were done poorly in that grant program, but a lot of this was because it was cobbled together quickly. Hopefully, if we give out new federal money to help deploy broadband we can take the time to get it right.

O’Rielly suggests that any rural broadband expansion program be handled through the Universal Service Fund. No matter which part of government tackles this there will be a need to staff up to implement a major broadband expansion program. But I agree it makes more sense to hand this to an existing program rather than to hand it to somebody like the NTIA again.

One thing he stated has me scratching my head: he says he has heard of ‘countless’ examples of stimulus middle-mile fiber routes hurting commercial providers. I have hundreds of clients, most of them commercial ISPs, and I have never once heard anyone complain about this. Many of my clients instead are enjoying lower-cost rural transport on the BTOP networks. These complaints have to be coming from AT&T and Verizon, who don’t like lower-cost alternatives to their massively overpriced special access. Special access transport is one of the biggest killers of rural business plans.

It’s clear that O’Rielly has a bias towards having commercial solutions for broadband rather than government ones. I don’t know anybody that disagrees with that concept. But by now it’s pretty obvious that the big commercial ISPs are never going to invest in rural America and it’s disingenuous to keep pretending that if government funds rural broadband that it will somehow harm them. The big ISPs have been working hard to withdraw from rural America and the providers that are left – the independent telcos, cooperatives, and rural governments – are the ones we should trust to deploy the broadband we know is needed.

I take major exception to his contention that “ultra-fast residential service is a novelty and good for marketing, but the tiny percentage of people using it cannot drive our policy decisions.” This statement has two glaring omissions. First, there are many households that need fast speeds today for home-based businesses, education, and reasons beyond just watching videos or playing games. When 10% of homes in the US don’t have broadband those homes are excluded from participating in the benefits of the digital economy. It’s hard to put a dollar value on what that is costing our economy – but it’s huge.

But second – and more importantly – this ignores the inevitable increase in demand over time. US households have been doubling both their need for speed and the total amount of data they download every three years since 1980 – and there is no sign that growth in demand is over. This means any network that is just adequate today is going to feel obsolete within a decade – which also means you don’t make policy for today’s demands, but for the demands we already know will be here in another decade. This is why there has to continue to be a focus on fiber first. As much as O’Rielly might hate some of the worst practices of the stimulus grants, his FCC approved the disastrous giveaway of billions to the big telcos to expand rural DSL in the CAF II program. We can’t take that path again.
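The compounding effect of that three-year doubling is easy to underestimate. A quick sketch (the 25 Mbps starting point is just an illustrative baseline, not a figure from any official standard):

```python
# If household bandwidth demand doubles every three years, a connection
# that is adequate today falls behind on a predictable schedule.

def demand_multiplier(years, doubling_period_years=3):
    """How many times demand grows after the given number of years."""
    return 2 ** (years / doubling_period_years)

# Starting from an illustrative 25 Mbps of demand today:
for years in (3, 6, 10):
    needed = 25 * demand_multiplier(years)
    print(f"in {years} years: ~{needed:.0f} Mbps")
```

After a decade demand is roughly ten times today’s (2^(10/3) ≈ 10.1), which is the arithmetic behind the claim that a network merely adequate today will feel obsolete within ten years.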

Finally, O’Rielly says that the government should not be picking broadband winners and losers. That sounds like a great political sentiment, but if the government is going to supply funding to promote rural broadband that money has to go to somebody – and by definition that is picking winners. But O’Rielly does temper this statement by saying that funding shouldn’t just go to the ‘well-connected’. I hope he really means that and gets behind a plan that doesn’t just hand federal broadband funding to AT&T, Verizon and CenturyLink.