Free Broadband from Facebook

Facebook is talking to the FCC about launching a free Internet service in the US. This would provide a subset of the Internet for free to anybody with a smartphone, offering such things as news, health information, job sites, and of course Facebook.

This would obviously benefit the many people who can’t afford access to the web. Today around 82% of US households have some kind of access to the web. Some of those without broadband live in rural places that don’t have access. Some don’t want Internet access. And the rest would like web access but can’t afford it.

Facebook has launched a similar product around the world in 53 emerging markets in the Middle East, Asia Pacific and Latin America. This is offered under the name Free Basics.

But the free product ran into problems and has been banned in India because it violates net neutrality. The Indian net neutrality laws aren’t too different from our own, and the service is what’s called zero-rated, meaning that usage of the plan is not counted against a data plan from a participating ISP.

In India the biggest complaint about the product was that it was restricted only to those things that Facebook wanted customers to see and not to the wider Internet. But in Facebook’s favor, it was free.

For this to work in the US, Facebook will need to find a US cellular partner which would not count usage of the app against a data plan. I recall that Facebook was close to this a few years ago in a partnership with T-Mobile that would have provided free access to a suite of products called GoSmart.

But more importantly, Facebook needs to convince the FCC that this is not a violation of net neutrality. The FCC has not formally made any pronouncements about zero-rating of wireless content, but it has talked to the major wireless carriers about the zero-rating they are already doing today.

This is the kind of situation that is really tough for regulators. With this kind of product Facebook could be providing some sort of free access to the web for millions of people in the country who might otherwise not have it. Even if it’s a scrubbed and sanitized piece of the web, it’s hard to find anything wrong with the results of that. People could buy a smartphone with no data plan and have access to parts of the web.

But the downside to the FCC is the same one faced by the Indian regulators. Once you let Facebook do this then the genie is out of the bottle and there doesn’t seem to be any way that the FCC could stop other kinds of zero-rating.

The dilemma is that Facebook is not quite like other companies. I am sure that somehow this isn’t costing Facebook too much and they might even make a little money from the idea. But Mark Zuckerberg seems to be on an altruistic mission to bring broadband access to the whole world. He has already used this idea to bring free broadband to many millions, and his goal is to bring it to billions.

But even with the altruism, this has certainly been good for Facebook – they had 1 billion users in 2015 and are now reported to have over 1.7 billion users. That’s a lot of people to advertise to and to gather data from, which is how Facebook makes its money.

And of course, no matter how altruistic Facebook might be, nobody would expect the same motives from other large companies like Comcast, AT&T or Verizon. One of the main fears that drove the creation of net neutrality is that we could end up with a web that is filtered by the biggest ISPs and that the openness of the web would be killed by deals like the one Facebook wants to do. The web brought to you by Comcast is not the same web that we know today – and I think it’s a web that we don’t want as a society. But if we take the first step and let a big company like Facebook filter the web, we could be headed down the path where almost all future web access is filtered.

How to Collect Broadband Lifeline

The Wireline Bureau of the FCC released clarification rules last week in Docket DA 16-1118 that describe how companies can participate in the broadband Lifeline program. This is the program where the Universal Service Fund will compensate ISPs $9.25 per month for broadband customers that qualify for the Lifeline program.

The program requires landline speeds of 10/1 Mbps with a data cap of no less than 150 GB per month. Mobile speeds can be slower and there is also a much lower data cap starting at 500 MB and increasing to 2 GB by the end of 2018. The FCC has established a registry listing eligible participants called the National Eligibility Verifier. Only households in that registry can receive the Lifeline subsidy and only one subsidy is allowed per household.

The new clarification in the docket describes the process for ISPs to participate in the Lifeline Fund. The FCC will require ISPs to register as a Lifeline Broadband Provider (LBP). The FCC is developing an application process for ISPs that want to gain this designation.

The original order said that the FCC had up to six months to act on LBP applications, but there is now the ability to request a streamlined process where the FCC will approve requests within 60 days. Basically an ISP must complete the application, and if they don’t hear back from the FCC then they automatically have the designation on the 60th day after submission of the request. If the FCC asks questions or asks for changes to the submitted information then the request will be approved 60 days after the request filing has been amended and corrected.

In order to qualify for the streamlined and expedited review process an applicant must 1) serve at least 1,000 non-Lifeline voice customers and/or 1,000 Lifeline-eligible broadband Internet access service (BIAS) customers, measured as a snapshot at the time of making the application; and 2) have offered broadband service to the public for at least two years, without interruption. So the expedited process is for established ISPs and not new ones.

Any ISP that doesn’t meet the streamlined review process will still have their application reviewed within six months.

Carriers that are already certified as Eligible Telecommunications Carriers (ETCs) or as Lifeline-only ETCs do not need to seek the LBP status unless they are seeking to ask for Lifeline subsidies in new geographic areas where they were not previously certified.

In a petition to seek LBP status a carrier must:

  • Certify that they will meet all of the service requirements of the Lifeline program.
  • Demonstrate the ability to remain functional during emergency situations, including precautions such as having back-up power.
  • Demonstrate that they will satisfy all applicable consumer protection service quality standards.
  • Demonstrate that they are financially and technically capable of meeting all of the FCC rules needed to provide Lifeline. The FCC will look to see that the company can be viable without receiving the subsidies.
  • Provide the terms and conditions that the ISP will offer to Lifeline subscribers.

The FCC is clearly trying to help as many ISPs as possible to participate in the Lifeline program. If your company is interested in taking part in this program feel free to contact CCG Consulting and we can help you through the application process.

Do We Need 10 Gbps?

We are just now starting to see a few homes nationwide being served by a 1 Gbps data connection. But the introduction of DOCSIS 3.1 cable modems and a slow but steady increase in fiber networks will soon make these speeds available to millions of homes.

Historically we saw home Internet speeds double about every three years, dating back to the 1980s. But Google Fiber and others leapfrogged that steady technology progression with the introduction of 1 Gbps for the home.

There are not a whole lot of home uses today that require a full gigabit of speed – but there will be. Home usage of broadband is still doubling about every three years and homes will catch up to that speed easily within a few years. Cisco recently said that the average home today needs 24 Mbps speeds but by 2019 will need over 50 Mbps. It won’t take a whole lot of doublings of those numbers to mean that homes will expect a lot more speed than we are seeing today.
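The arithmetic behind that projection is easy to sketch. A minimal example, assuming Cisco’s ~24 Mbps baseline and the historical three-year doubling period cited above (a rule of thumb, not a guarantee):

```python
# Project average household bandwidth need, assuming demand
# doubles every three years starting from a 24 Mbps baseline.
def projected_mbps(base_mbps, years, doubling_years=3):
    """Bandwidth need after `years`, given a fixed doubling period."""
    return base_mbps * 2 ** (years / doubling_years)

for years in (0, 3, 6, 9, 12):
    print(f"+{years} years: {projected_mbps(24, years):.0f} Mbps")
```

Three years out, the model lands at 48 Mbps, which roughly matches Cisco’s “over 50 Mbps by 2019” figure; twelve years out it is already near 400 Mbps.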

There is a decent chance that the need for speed is going to accelerate. Phil McKinney of CableLabs created this video that shows what a connected home might look like in the near future. The home owns a self-driving car. The video shows a mother working at home with others using a collaboration wall, with documents suspended in the air. It shows one daughter getting a holographic lecture from Albert Einstein while another daughter is talking with her distant grandmother, seemingly in a meadow somewhere. And it shows the whole family using virtual / enhanced reality goggles to engage in a delightful high-tech game.

This may seem like science fiction, but all of these technologies are already being developed. I’ve written before about how we are at the start of the perfect storm of technology innovation. Our past century was dominated by a few major new technologies and the recent forty years has been dominated by the computer chip. But there are now literally dozens of potentially transformational technologies all being developed at the same time. It’s impossible to predict which ones will have the biggest influence on daily life – but many of them will.

Most of these new technologies are going to require a lot of bandwidth. Whether it’s enhanced reality, video collaboration, robots, medical monitoring, self-driving cars or the Internet of Things, we are going to see a lot of needs for bandwidth much greater than today’s surge due to video. The impact of video, while huge today, will pale against the bandwidth needs of these new technologies – particularly when they are used together as implied in this video.

So it’s not far-fetched to think that we are going to need homes with bandwidth needs beyond the 1 Gbps data speeds we are just now starting to see. I’m always disappointed when I see ISP executives talking about how their latest technology upgrades are making them future proof. There are only two technologies that can meet the kinds of speeds envisioned in McKinney’s video – fiber and cable networks. These speeds are not going to be delivered by telephone copper or wirelessly, and to think so is to ignore the basic physics underlying each technology.

Some of the technologies shown in McKinney’s video are going to start becoming popular within five years, and within twenty years they will all be mature technologies that are part of everyday life. We need to have policies and plans that look towards building the networks we are going to need to achieve that future. We have to stop having stupid government programs that throw away money on expanding DSL and we need to build networks that have use beyond just a few years.

McKinney’s video is more than just an entertaining glimpse into the near-future; it’s also meant to prod us into making sure that we are ready for that future. There are many companies today investing in technologies that can’t deliver gigabit speeds – and such companies will grow obsolete and disappear within a decade or two. And policies that do anything other than promote gigabit networks are a waste of time and resources.

An Argument for Data Caps

A few weeks ago Mediacom sent a letter to the FCC as part of Docket 16-245 that defended data caps. The letter was signed by Joseph E. Young, the Senior Vice President, General Counsel, and Secretary of the company.

Mr. Young lays forth probably the best argument for data caps I have seen. This is from his letter:

Imagine you are out for a walk and experience a sudden, irresistible craving for Oreo® cookies.  You only want to spend two dollars, which means that you will be able to buy a two-pack or maybe even a four-pack but for sure you cannot get the family size of over 40 cookies.  For that many, you have to spend more. Of course, it would be nice if your two dollars bought you the right to eat an unlimited number of cookies, but you know that is not the way our economy works. It is the same for the Starbucks latte you might want to drink with your cookies and for socks, gasoline and just about every single one of the thousands of other products and services that are for sale in the United States, including essentials like water and electricity. 

In the case of virtually everything you buy, the fact that your cost goes up as you consume more will neither surprise you nor set you off on a passionate crusade to get the government to force producers to sell an unlimited quantity at a fixed price. We all know this to be the way things work in our economy and understand at some level that there are valid reasons for why that is so. . . . Remarkably, the only exception to this truism we can think of is bandwidth.

He goes on to say that what ISPs are doing is not greed, but just trying to put broadband on the same basis as other products. He laments that ISPs are thought of as greedy when trying to price their product like everything else in the economy. You have to admit that at least on the surface this sounds reasonable.

However his argument lost a little steam when he went on to say, “A fair number of otherwise intelligent people vociferously complain about ISPs imposing a ‘cap’ on bandwidth usage.” He basically called everyone who is against data caps stupid, and that probably won’t go over well at the FCC, where a lot of staff are against data caps.

But to counter the Mediacom argument you only have to look at how Comcast implemented their data caps earlier this year to see that data caps are really all about greed and greater revenues. Comcast had a data cap of 250 GB for many years, although it was rarely enforced. The company raised the cap to 300 GB and then started enforcing it in various trials around the country. They offered two options to customers who exceeded the cap: either pay $30 more per month to get unlimited data or else pay $10 for every 50 GB over the cap.

Both of those options increased revenues for Comcast significantly. And that’s where the greed came to bear. If this was not about making more money Comcast could have implemented data caps with a rate rebalancing. As an example, they could have lowered all data plan rates by $10, so that people who don’t use a lot of data would save money. Only customers who exceeded the caps would pay more. If the rate rebalancing was done right, then Comcast would keep the same revenues as before and customers would be paying more in line with their usage. To use Mr. Young’s analogy, if Comcast wanted to get prices right they should have started out by first right-pricing the small pack of Oreos. Instead Comcast was satisfied that the small pack of Oreos cost as much as the large pack, and they then jacked up the price of the large pack.
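The pricing described above is simple to model. A minimal sketch of Comcast’s two overage options (300 GB cap, $30/month unlimited add-on, $10 per 50 GB block), assuming a customer always picks the cheaper of the two:

```python
import math

# Comcast's two overage options, as described in the trials above.
CAP_GB = 300          # monthly data cap
UNLIMITED_FEE = 30    # flat fee for unlimited data
BLOCK_FEE = 10        # fee per overage block
BLOCK_GB = 50         # size of each overage block

def overage_charge(usage_gb):
    """Monthly overage cost, assuming the customer picks the cheaper option."""
    if usage_gb <= CAP_GB:
        return 0
    blocks = math.ceil((usage_gb - CAP_GB) / BLOCK_GB)
    return min(blocks * BLOCK_FEE, UNLIMITED_FEE)

for usage in (250, 320, 450, 800):
    print(f"{usage} GB -> ${overage_charge(usage)}")
```

Note that under this structure every customer over the cap pays more than before, and nobody pays less, which is the asymmetry the rebalancing alternative would have corrected.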

This was clearly a money-making scheme for Comcast, and the public outcry was so big that it got a lot of attention from the FCC. Comcast backed down and unilaterally raised the data cap on most plans to one terabyte. But news last week shows that they want to impose the same pricing scheme on the one-terabyte limit. This won’t affect many users today, but within a decade it will affect a significant percentage of Comcast’s users.

If Comcast had rebalanced rates they would have been lauded instead of vilified. While those who paid more might be yelling, the millions who would be paying less would largely offset that. But instead Comcast went straight for the money grab and to their chagrin, everybody was watching.

The other thing that Comcast missed is that, for most products we buy, the prices charged bear some resemblance to their costs. It certainly costs more to make a big pack of Oreos than a small one. But the public gets upset when prices greatly exceed costs – just look at the recent outcry about the EpiPen. Comcast’s big problem is that the public understands that there is very little difference in cost between most Internet users. Yes, those who use huge amounts of data cost an ISP more money, but there is very little difference in cost to Comcast between a household using 200 GB and one using 500 GB in a month. There is no gigabyte spigot at Comcast that is equivalent to a gas pump that would justify a big price differential between these two households. There would have been a lot less public outrage had the overage charges been $5 rather than $30.

As a big user I am obviously not nuts about the idea of paying more for broadband. But I wouldn’t have great qualms if a rate rebalancing brought very cheap prices to my mother (who barely uses any bandwidth) while I am charged more. But that’s not what we are seeing with data caps in the market. Instead we have low bandwidth products that are overpriced and the ISPs wanting to charge even more to somebody who actually uses what they have purchased.

My Thoughts on AT&T AirGig

By now most of you have seen AT&T’s announcement of a new wireless technology they are calling AirGig. This is a technology that can bounce millimeter wave signals along a series of inexpensive plastic antennae perched at the top of utility poles.

The press release is unclear about the speeds that might be delivered by the technology. It says the technology has the potential to deliver multi-gigabit speeds, but at the same time talks about delivering 4G cellular as well as 5G cellular and fixed broadband. The 4G LTE cellular standard can deliver about 15 Mbps while the 5G cellular standard (which is still being developed) is expected to eventually increase cellular speeds to about 50 Mbps. So perhaps AT&T plans to use the technology to deploy micro cell sites while also being able to deliver millimeter wave wireless broadband loops. The link above includes a short video which doesn’t clarify this issue very well.

Like any new radio technology, there is bound to be a number of issues involved with moving the technology from the lab to the field. I can only speculate at this point, but I can foresee the following as potential issues with the millimeter wave part of the technology:

  • The video implies that the antennas will be used to deliver bandwidth using a broadcast hotspot. I’m not entirely sure that the FCC will even approve this spectrum to be used in this manner – this is the same spectrum used in microwave ovens. It can be dangerous to work around for linemen climbing poles and it can create all sorts of havoc by interfering with cable TV networks and TV reception.
  • Millimeter wave spectrum does not travel very far when used as a hot spot. This spectrum has high atmospheric attenuation and is absorbed by gases in the atmosphere. When focused in a point-to-point link the spectrum can work well to about half a mile. But in hot spot mode it’s good, at best, for a few hundred feet and loses bandwidth quickly with distance. The bandwidth is only going to reach homes that are close to the pole lines.
  • Millimeter wave spectrum suffers from rain fade and during a rain storm almost all of the spectrum is scattered.
  • The spectrum doesn’t penetrate foliage, or much of anything else. So there is going to have to be a clear path between the pole unit and the user. America is a land of residential trees and even in the open plains people plant trees closely around their house as a windbreak.
  • The millimeter wave spectrum won’t penetrate walls, so this will require some sort of outdoor receiver to catch millimeter wave signals.
  • I wonder how the units will handle icing. Where cables tend to shake ice off within a few days, hardware mounted on poles can be ice-covered for months.
  • The technology seems to depend on using multiple wireless hops to go from unit to unit. Wireless hops always introduce latency into the signal and it will be interesting to see how much latency is introduced along rural pole runs.
  • For any wireless network to deliver fast speeds it has to be connected somewhere to fiber backhaul. There are still many rural counties with little or no fiber.
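The distance limits described in the list above follow from basic radio physics. As a rough illustration, the standard free-space path loss formula shows how much more signal is lost at millimeter wave frequencies than at familiar Wi-Fi frequencies; the 60 GHz figure here is an assumption for illustration, since AT&T has not published AirGig’s actual operating frequency:

```python
import math

# Free-space path loss in dB: FSPL = 20*log10(d) + 20*log10(f) - 147.55,
# with distance d in meters and frequency f in Hz. This is the best case;
# it ignores rain fade, foliage, and oxygen absorption, all of which make
# real-world millimeter wave losses worse.
def fspl_db(distance_m, freq_hz):
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

for freq_ghz in (2.4, 60):  # Wi-Fi band vs. an assumed mmWave band
    print(f"{freq_ghz} GHz at 100 m: {fspl_db(100, freq_ghz * 1e9):.1f} dB")
```

Even before atmospheric effects, the 60 GHz signal loses roughly 28 dB more than a 2.4 GHz signal over the same 100 meters, which is why the hot spot range is measured in hundreds of feet rather than miles.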

We have always seen that every wireless technology has practical limitations that make it suitable for some situations and not others. This technology will be no different. In places where this can work it might be an incredible new broadband solution. But there are bound to be situations where the technology will have too many problems to be practical.

I’ve seen speculation that one of the major reasons for this press release is to cause a pause to anybody thinking of building fiber. After all, why should anybody build fiber if there is cheap multi-gigabit wireless coming to every utility pole? But with all of the possible limitations mentioned above (and others that are bound to pop up in the real world) this technology may only work in some places, or it might not work well at all. This could be the technology we have all been waiting for or it could be a flop. I guess we’ll have to wait and see.

Wall Street and Programmers

In an intriguing development, analyst Michael Nathanson has downgraded Discovery Networks and Scripps Networks Interactive from ‘neutral’ to ‘sell’. His reason is that he sees a poor future for programmers that don’t carry live TV events like sports or news.

Discovery Networks produces the various Discovery channels along with Animal Planet, TLC, Science, Velocity, OWN and American Heroes Channel. Scripps produces HGTV, the Food Network, DIY Network, the Cooking Channel, Great American Country, the Travel Channel and TVN.

Nathanson believes that advertising is starting to chase live content and is abandoning other content. There is a major trend in the country for people to skip traditional broadcast ads using DVRs and video on demand. He further recognizes that all cable channels are losing viewers to OTT alternatives like Netflix. This all will add up to a significant drop in advertising revenues for traditional cable networks that stream shows paid for by advertising.

These networks are also feeling pressure from cable subscriptions. We know, for example, that ESPN has lost millions of customers since 2015 and one has to think that the same thing is happening to all of the other networks. The ESPN losses seem to be due in part to cord cutting, but even more to cord shaving where customers are downsizing their cable packages. I listen to a lot of radio and I constantly hear ads from DirecTV and others to buy their new skinny bundles. Each time somebody picks a skinny bundle or an alternative like Sling TV, a whole lot of channels lose a monthly subscription.

This might be the first crack in the programmers’ armor. For nearly two decades they have been able to raise rates to cable companies while also enjoying ever-increasing advertising revenues. And this ever-growing revenue made the programmers a favorite of Wall Street which rewards revenues that grow quarter after quarter. But we are starting to see advertising revenues abandoning cable and moving to online venues. This year is the first year when web advertising will eclipse TV advertising.

It seems for these networks we are seeing a perfect storm. Advertising in general is leaving cable – and within that shift, if Nathanson is right, it will leave traditional cable channels much faster than those offering live programming. We are also seeing traditional cable subscriptions shifting to skinny bundles and OTT. There is no doubt that all of this is going to add up to smaller revenues for these networks. And since contracts between programmers and cable companies run for 3 to 5 years the programmers don’t have the ability to raise subscription rates quickly enough to make up for these losses. Even if they tried to maintain growth through rate increases it’s likely today that they would get a lot of pushback from cable companies.

It’s hard to feel any sympathy for the programmers because it is their greed that has made cable too expensive for many homes. Programming rates in recent years have increased nearly 10% per year – many multiples faster than general inflation. Those rate increases were clearly done to please Wall Street, but it didn’t take a crystal ball to see that the increases were not sustainable.

The way that we value large companies in the US is perverse. These networks make a lot of money. And even with all of these changes they are going to continue to make a lot of money for a long time to come. But companies that fall out of favor with Wall Street generally have huge problems. These companies are going to be pressured to somehow fix the situation, but there doesn’t seem to be any way for them to do that. We are likely to see them start ditching unprofitable channels. The companies might be sold or split up into smaller companies. Once Wall Street abandons a company, it’s unlikely to just sit still.

The programmers have held almost all of the power in the industry for a long time – but maybe we are starting to see a change. That can only be a good thing for the industry.

Cable Companies under Regulatory Siege?

Earlier this year Michael Powell (the head of the National Cable Television Association) complained that the FCC has launched a regulatory assault against cable companies – and in some ways he is probably right. Some of the regulations ordered or contemplated are clearly aimed at cable companies – yet much of the new regulation was aimed at somebody else but still affects the cable companies.

Consider all of the changes affecting the cable companies right now:

  • Net neutrality has meant that cable companies and other ISPs can’t make lucrative deals with content providers to bundle content as part of broadband access.
  • But the biggest change from the net neutrality order is the advent of Title II regulation of the Internet. This is resulting in a raft of new regulations for broadband. All of a sudden the FCC is looking at data caps. The agency has demanded that all ISPs disclose all of the details of their broadband connections to customers. Cable companies are suddenly covered by customer privacy regulations – the biggest being that they probably can’t use the information they gather as an ISP without a customer’s approval.
  • The cable companies have become huge sellers of broadband transport and data pipes to businesses. The FCC is about to make major changes in the special access market and that is likely going to lower prices for these products. Special access rates are incredibly high and cable companies and CLECs have made a living out of selling services to businesses at a discount from the published special access rates. The result is that businesses pay a gigantic premium for dedicated broadband connections, and everybody expects the FCC to lower rates across the market.
  • The FCC’s move to somehow eliminate settop boxes is aimed right at the cable companies. To a large extent the industry brought this on themselves as they’ve raised rates to rent a settop box from $5 to $10 or more in most markets. But the idea that there can be some sort of generic solution that can work on every type of network sounds idealistic, at best.
  • The FCC seems to want to allow anybody to carry video content on the Internet without saddling the new providers with the same rules that govern cable companies. So cable companies, for now, are stuck with rules that force them to offer certain kinds of tiers of service while OTT providers can cook up any creative package they can cobble together.

As a telecom guy I find this all to be somewhat ironic. I remember when I first read through the Telecommunications Act of 1996 that my first reaction was that the FCC had let the cable companies completely off the hook. The big telcos were being forced to unbundle their networks to offer voice loops and DSL connections while the cable companies had no corresponding obligation to unbundle for cable modem connections. In the decade following the Act, most state Commissions also excused cable companies from most forms of voice regulation. The cable companies were able to somehow characterize the voice on their networks as VoIP and got out of most voice regulations – but from a customer perspective the cable voice product was indistinguishable from telco voice products. It’s one of the first times that the FCC made an exception for a product based upon the technology used to deliver it – a trend that has since led to some very odd regulatory rulings.

So now it seems that the wheel has turned and the cable companies are being brought back into the regulatory arena with everybody else. I think Powell is right and those in charge of a cable company must feel like they are under regulatory siege. But except for the settop box issue – which is an odd set of regulations clearly aimed at the cable companies – the other regulations can mostly be described as leveling the playing field, something that the cable companies have always said should apply to municipal broadband providers.

But from a regulatory perspective the protections provided to consumers ought to be the same across all broadband technologies. It makes a lot of sense to finally require cable companies to provide privacy protection and to disclose the details and terms of the products they are selling. I have to laugh once in a while about regulation. Five years ago a colleague of mine said he could foresee the end of telecom regulation. But I countered by saying that regulators like to regulate, and sure enough it seems like we have as many – or more! – regulations today as ever.

Verizon Phasing Out Copper Services

Verizon has asked the FCC for permission to discontinue a number of legacy copper-based products in the Northeast. This signals the company’s ongoing plan to pull down copper wires and go to an all-digital network.

Specifically Verizon wants to eliminate their Voice Grade, WATS Access Line, Bonded Digital Link, Digital Data, and DIGIPATH Digital Service II. These are somewhat obscure services, mostly used by businesses, and which for the most part have been supplanted by better products over the years.

What this filing doesn’t specifically say is that Verizon will eventually accompany this tariff change by a request to remove their copper network. That’s what they did earlier this year in New Jersey.

I find it hard to criticize the company for wanting to move customers from copper to fiber. I have a lot of small telco clients who have done the same thing over the last decade. There are a few customers that worry about such a transition because they have some legacy function like fax machines, health monitors, burglar alarms or T1s that they are afraid won’t work with the updated technology. For the most part there are not very many such applications around that can’t be made to work on fiber. Fiber technology can emulate almost every TDM copper-based function.

There comes a point where it doesn’t make economic sense to maintain an old copper network for a tiny handful of customers using old applications. I have a hard time thinking that customers have a right to stay on copper when there is something better available.

But I also think the public unease over these transitions is because the public doesn’t trust Verizon. Verizon got a lot of bad press after Hurricane Sandy hit Fire Island and the company wanted to replace the destroyed copper with cellular service.

The problem is that Verizon doesn’t have fiber everywhere – not even close to everywhere. What happens where there is no fiber availability? When Verizon built FiOS they only built fiber where the costs to do so were low, and this resulted in a patchwork fiber network – where one street or one subdivision has fiber, but the next doesn’t. The company also largely built fiber in the suburbs of the major cities and they largely ignored downtown urban neighborhoods as well as smaller towns and all rural areas.

AT&T is being open about its plans to move homes to a fixed cellular connection. But as Verizon starts pulling down more copper they are either going to have to build new fiber to people or offer the same kind of cellular connections as AT&T. And it doesn’t seem likely that Verizon is going to extend FiOS fiber networks today to neighborhoods they judged too expensive to build fifteen years ago.

Verizon’s union members have been complaining for years that the company has been neglecting the copper plant – and these technicians are right. It’s a behavior we have seen from all of the large telcos for decades. Twenty years ago Verizon started trying to find a buyer for its network in West Virginia. It took more than a decade to finally sell it to Frontier, and in the interim the company cut maintenance to the bone. West Virginia is not a singular example; huge parts of the Verizon, AT&T and CenturyLink networks are in bad shape due to many years of neglect.

The shift away from copper is inevitable. A lot of these networks were built in the decades after World War II and have lasted longer than intended. It’s a testament to the high standards of the old Ma Bell system that these networks are still working today. Critics of Verizon want the company to maintain the copper networks for a few more decades, but that is unrealistic and in many cases becoming technically impossible. The focus instead should be on making sure Verizon doesn’t start quietly dropping homes and leaving them stranded with no communications options.

Regulatory Shorts for September 2016

Today’s blog contains a few items that are cautionary tales of things not to do.

FCC Funding for Broadband. The FCC recently rejected three petitions for reconsideration filed by companies that had been awarded experimental broadband grants last year but then failed to meet the grant criteria. Two of the companies were start-ups that did not have audited financial statements. A third had an audit but filed it after the grant-specified deadline. The FCC refused to give the three companies the grants since they didn’t meet all of the requirements.

The cautionary tale here is that anybody filing for a grant should make certain ahead of time that they meet all of the requirements. There were a lot of applicants who received broadband grants from the stimulus funding a few years ago who did not qualify, but for whom the government accepted waivers. However, that was an extraordinary circumstance due to the huge size of the grants and the government’s desire to use the funding. In more normal circumstances it’s exceedingly hard to get a waiver for a company that doesn’t meet all of the qualifications.

Fines for Unauthorized Wireless Connections. The FCC recently fined AT&T $450,000 for operating wireless links that were deployed without FCC authorization or that had not been properly licensed. These were mostly fixed microwave connections, something that is relatively easy to license.

The cautionary tale is to be sure to take care of the paperwork when deploying wireless systems. It’s becoming fairly routine for companies to deploy microwave links to provide wireless backhaul for point-to-multipoint wireless networks or for serving cellular sites. There are also now licensing requirements for anybody deploying subscriber radios that use the 3.65 GHz spectrum. Failure to obtain microwave licenses could result in fines like the one AT&T just paid. And since there are places in the country where it’s not legal to deploy 3.65 GHz radios, failure to clear a deployment with the FCC first could even result in having your systems shut down.

Blocking WiFi. FCC Commissioner Jessica Rosenworcel has asked the agency to investigate a seeming restriction of WiFi for those attending the first presidential debate. Apparently, Hofstra University had blocked people from establishing WiFi hotspots from their cellphones and instead wanted journalists to buy a $200 connection for the evening. The FCC in the past has come down hard on this kind of blocking by Marriott hotels and others, and has levied large fines in almost every such case brought to its attention.

The cautionary tale here is not to block WiFi. Companies and cities are often tempted to do this as a security measure, and the technology to block WiFi is readily available. But WiFi uses public spectrum, and it’s always against the law to block somebody from establishing their own hotspot.

Using E-Rate Broadband Off-Campus. Two school districts have petitioned the FCC to allow them to extend broadband that is subsidized by the E-Rate program to students and others living close to the campus. Under current rules, if an E-Rate network is used for applications other than the school, then the cost of the system has to be allocated between the school and other uses. These petitions ask that the FCC allow them to use the school broadband to serve students and other parties that can benefit from the broadband, without having to allocate the costs.

There is both a caution here as well as an opportunity. The caution is to beware if an E-Rate broadband connection is used by more than the school. For example, it’s not unusual (particularly with private schools) to have other organizations or entities collocated with the school. In such a case the E-Rate applicant needs to make sure to allocate the costs between the school and the other entities. Failure to do so could end up with a loss of the E-Rate subsidy. But the opportunity also exists with wireless networks to provide home broadband to students who live close to the school. If these petitions are successful it could open up many possibilities for schools to benefit nearby residents.
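The allocation rule described above can be illustrated with a hypothetical example. All of the dollar figures and percentages below are invented for the illustration, and I’m assuming a simple proportional-use split, which is the general idea behind the rule even if the actual FCC mechanics are more involved:

```python
# Hypothetical cost allocation for an E-Rate connection shared between
# a school and a collocated organization. All numbers are invented;
# assumes a simple proportional-use split of the cost.
total_monthly_cost = 2000.00  # full cost of the broadband connection
school_share = 0.60           # fraction of use attributable to the school
erate_discount = 0.80         # hypothetical E-Rate discount rate

eligible_cost = total_monthly_cost * school_share  # only this part qualifies
subsidy = eligible_cost * erate_discount
school_pays = total_monthly_cost - subsidy

print(f"E-Rate covers ${subsidy:.2f}; the school pays ${school_pays:.2f}")
```

The point of the sketch is that the non-school 40% of the use gets no subsidy at all, which is exactly the exposure an applicant creates by failing to allocate.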

Two Approaches to Low-Income Broadband

Large ISPs are taking different approaches to bringing broadband to low-income households. Consider recent news about Comcast and AT&T:

Comcast has a program called Internet Essentials that provides broadband to low-income households, delivering 10 Mbps download speeds to qualifying households for $10 per month. The program was created as an FCC condition of Comcast’s purchase of NBC Universal in 2011. For a long time the program was very low key and the company barely advertised it to customers, but over the years the company added 600,000 households to the program.

Now that the FCC has created a federal Lifeline subsidy for broadband the company has become more vigorous in seeking customers and reported recently that it now has over 3 million customers in the program. The qualifying requirement for Internet Essentials has been to have at least one child in a household eligible for free or reduced price lunches. But recently Comcast has allowed households without children to apply for the program.

AT&T was also required to provide low-cost broadband as the result of its purchase of DirecTV. AT&T uses a different requirement for eligibility and will provide assistance to customers who take part in the SNAP (food stamp) program. Since there are about 21 million households in the country in the SNAP program there are a lot of eligible households in the AT&T footprint.

The FCC conditions from the DirecTV merger required that AT&T would provide broadband at different prices according to the technology they have available in different neighborhoods. The program has several tiers: 10 Mbps download for $10 per month, 5 Mbps download for $10 per month and 3 Mbps download for $5 per month. The company is supposed to provide the fastest of these speeds available in a given area.
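The tier rule above amounts to picking the fastest qualifying tier for whatever speed a neighborhood can support. The tier list comes from the merger conditions as described in the text; the function name and the encoding as a lookup are mine, so treat this as an illustrative sketch rather than AT&T’s actual provisioning logic:

```python
# Illustrative sketch only -- not AT&T's actual provisioning logic.
# Subsidized tiers from the DirecTV merger conditions described above.
TIERS = [(10, 10.00), (5, 10.00), (3, 5.00)]  # (download Mbps, monthly price $)

def low_income_offer(available_mbps):
    """Return the fastest qualifying (speed, price) tier, or None when
    even the 3 Mbps floor can't be delivered in a neighborhood."""
    for speed, price in TIERS:
        if available_mbps >= speed:
            return (speed, price)
    return None

print(low_income_offer(12))   # (10, 10.0)
print(low_income_offer(6))    # (5, 10.0)
print(low_income_offer(1.5))  # None
```

The `None` case is the crux of the dispute that follows: neighborhoods below 3 Mbps fall through every tier.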

But AT&T is now in hot water at the FCC because they are denying the program to a lot of households by using the argument that they have many neighborhoods where they can’t deliver the 3 Mbps speed required for the lowest tier. In such neighborhoods they are not offering the program.

Ironically, in neighborhoods where the fastest speed is 1.5 Mbps they will still sell that broadband for more than $50 per month, but they won’t offer the same product for the reduced price. AT&T is basing the refusal to offer low-income prices on language in the merger agreement that said they would only have to offer subsidized broadband ‘where technically feasible’ – and they are arguing that they are technically unable to deliver the lowest 3 Mbps speeds required by the program.

This is not an isolated problem. For example, the FCC’s broadband mapping system shows that 21% of the census blocks in Detroit can’t get broadband speeds greater than 1.5 Mbps. These large swaths of old and slow DSL are a result of the company’s decision over the years to not invest in faster DSL in poor neighborhoods.

It’s easy to think of very slow broadband as a rural issue. But the FCC’s records make it clear that there are a lot of neighborhoods in urban areas that have been bypassed by ISPs. These are families that connect with old first generation DSL equipment and who live in homes or apartments that are not connected to the cable TV networks.

For many years Comcast fought against the agreement it had made with the FCC to offer low-income broadband. It’s good to see the company finally embrace the plan, and I’m sure that has a lot to do with the federal Lifeline program, which will pay Comcast $9.25 per month for every qualifying customer. Added to what they are getting from customers, the product should be profitable. This would be even better if the company would offer real broadband at 25 Mbps for the same low price.
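The economics here are simple arithmetic. As a back-of-envelope sketch, using only the $10 price and $9.25 Lifeline reimbursement from the text (everything else about Comcast’s actual costs is unknown):

```python
# Back-of-envelope monthly revenue per Internet Essentials customer once
# the federal Lifeline subsidy is layered on top of the customer's payment.
# Both figures come from the text; this says nothing about Comcast's costs.
customer_price = 10.00    # monthly Internet Essentials price
lifeline_subsidy = 9.25   # monthly federal Lifeline reimbursement

monthly_revenue = customer_price + lifeline_subsidy
print(f"${monthly_revenue:.2f} per customer per month")  # $19.25 per customer per month
```

At roughly double the headline $10 price, it’s easy to see why the subsidy changed Comcast’s enthusiasm for signing people up.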

It’s hard to understand why AT&T is weaseling out of their obligation. One would think that this program would generate a lot of revenue from copper that was paid for generations ago. It can’t cost the company very much to provide a broadband connection at any of the speeds they offer in these neighborhoods, particularly those with the 1.5 Mbps speeds. AT&T freely agreed to offer low-income broadband when they bought DirecTV, and it’s hard to think of a valid reason for them to renege. The only reason I can think of for their position is that perhaps they have plans to start tearing down copper in these neighborhoods and don’t want a lot of customers using the networks.