The Recent ALEC Letter to the FCC

Every once in a while I see an idea in this industry that makes me shake my head. Recently ALEC (the American Legislative Exchange Council) wrote a letter to the FCC asking them to override all state and local laws pertaining to the deployment of broadband connections. They specifically cite the issues associated with the placement of small cell sites. They urge the FCC to declare broadband to be an ‘interstate’ service and to use that as the basis for setting nationwide rules.

It’s easy to understand where ALEC is coming from. It’s an organization that is funded by the largest corporations in the country, and the biggest telcos and cable companies are among its funders. Over the years the organization has drafted proposed legislation that benefits the large ISPs, and in recent years ALEC was behind many of the state legislative initiatives to block municipalities from building broadband networks. ALEC has also commissioned various white papers that espouse the positions of the big ISPs. These white papers are generally used to lobby state and federal legislators.

I don’t think anybody in the industry is unsympathetic to some of the worst stories being told about locating small cell sites. There certainly are cases where local rules are a real barrier to deployment. But this is an area where states and cities are allowed to create local rules.

And it’s not hard to understand why ALEC would petition the FCC. This has to be the most big-carrier friendly FCC in the last century. It’s clear that this FCC would grant the ISPs many of the things on their wish list. If the FCC adopted what ALEC is asking for then the big ISPs could solve their problems in this area in one fell swoop – and they could stop the expensive lobbying effort at the state and local level.

But there are a few flaws in ALEC’s arguments. First, many of the FCC’s rules are the result of legislation passed by Congress. For instance, the FCC has no authority to override anything that was required by the Telecommunications Act of 1996 or by other congressional laws, and that Act gives states and localities the right to make local rules concerning rights-of-way and connections on poles or in conduits. I find it doubtful that the FCC can arbitrarily preempt those specific parts of the Act.

But the most important reason this makes no sense is that it comes just a few weeks before the FCC is likely to reverse Title II regulation of broadband. Once the FCC does that they will have effectively taken themselves out of the broadband regulation business. They will be handing off things like broadband privacy to the Federal Trade Commission, but many areas of broadband will become purely unregulated. The FCC can’t declare broadband to be an interstate service if they don’t regulate broadband.

It’s kind of ironic that the only way the FCC could try to do what ALEC is asking would be by maintaining Title II regulation – the only tool it has for regulating broadband. But once they renounce Title II authority the FCC is greatly weakened in making any regulations concerning broadband. I’ve always asserted that the big ISPs need to be regulated in some manner, and this is a perfect example of why. A friendly FCC with the authority to regulate broadband could give the big ISPs the things they most want – but the trade-off of being regulated is that it also means accepting things the ISPs don’t want. This request by ALEC is the perfect example of the ISPs wanting things both ways – they want to be regulated where it helps them and unregulated where it doesn’t – but there is no logical way to have it both ways.

The ISPs’ biggest fear with Title II regulation is that some future FCC could use that authority to impose rules they don’t like. They particularly fear a future FCC that tries to regulate broadband prices. The ISPs don’t really have a lot of concerns about this FCC, but these companies have seen the FCC change over time with changes in administrations and they know that the pendulum always eventually swings the other way.

So I sit here and just scratch my head over this. ALEC is asking this FCC, which wants to reduce regulations, to create new regulations. And they want the FCC to use its authority to regulate this one issue while at the same time not using that same authority to regulate any other broadband issues. I don’t know if I’ve ever seen a better example of somebody who wants to have their cake and eat it too!

The San Francisco Broadband Experiment

The City of San Francisco seems poised to tackle building fiber to everybody in the city. They have conducted several studies looking at the cost of building fiber. The city also created a Blue Ribbon Panel that recommended in a recent report that the city construct a fiber network.

The city is proposing to finance a fiber network in a new way. The city is looking at fiber connections in the same way as any other utility like electricity or water. The concept is that everybody in the city would pay a monthly utility fee that would fund the construction and operation of a fiber network. The number that was tossed around earlier this year was an average monthly fee of $26 per month to be charged to every household and business in the city. It’s hard to tell from the various reports if that’s the number still being considered. The Blue Ribbon report does recommend that the city seek private investment which would be used to lower that number.

The city wants to build the fiber network to everybody in the city, which differs from the typical ISP demand model that only builds to those that buy a broadband connection. The city does not want to be an ISP and wants to emulate some of the large cities in Europe which open up their fiber networks to multiple ISPs. The hope is that multiple ISPs using the network for a minimal charge will create competition and low-price broadband.

It’s an interesting concept. There are smaller municipalities in the country that are financing fiber with municipal bonds – but in most cases the expectation is that the fiber project will generate enough revenue to repay the bonds. But fiber construction is expensive in big cities and the utility fee is needed to finance a network that will cost more than $1 billion in San Francisco.
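It’s easy to see why the fee approach can carry a build of this size. Here is a minimal back-of-the-envelope sketch in Python; the premise count, bond rate, term and total cost are all my own illustrative assumptions, not figures from the city’s studies:

    # Back-of-the-envelope check of the utility-fee financing model.
    # Every input here is an illustrative assumption, not a city figure.
    premises = 450_000             # assumed households + businesses paying the fee
    monthly_fee = 26               # the average monthly fee discussed above ($)
    network_cost = 1_500_000_000   # assumed all-in construction cost ($)
    rate = 0.04                    # assumed municipal bond interest rate
    term_years = 30                # assumed bond term

    annual_fee_revenue = premises * monthly_fee * 12

    # Level annual payment on the bond (standard annuity formula)
    annual_debt_service = network_cost * rate / (1 - (1 + rate) ** -term_years)

    print(f"Annual fee revenue:  ${annual_fee_revenue:,.0f}")   # ~$140 million
    print(f"Annual debt service: ${annual_debt_service:,.0f}")  # ~$87 million
    print(f"Left for operations: ${annual_fee_revenue - annual_debt_service:,.0f}")

At these assumed inputs the fee covers debt service with roughly $50 million a year left over to operate the network, which is why this model doesn’t depend on subscriber revenue the way a traditional bond-financed project does.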

The city’s rationale for considering this is to provide world-class broadband to everybody. This is a city that is in direct economic competition with cities in Japan, Taiwan and South Korea – and the city views fiber as a necessary component of long-term financial success. Comcast is the biggest ISP in the city and offers fast broadband today with speeds now up to a gigabit download. AT&T offers DSL and has also built fiber to large businesses and MDUs. Sonic has been building some fiber to residences in the Bay Area. And like in every large city there has been some fiber built by ISPs and CLECs to selected locations in the city.

But the city is concerned that a significant percentage of the public can’t afford fast broadband access today. The Blue Ribbon Panel notes that the government-sponsored fiber network in Singapore reduced broadband prices from $90 per month down to $30 – $40 today while speeds leaped to a symmetrical gigabit connection.

No NFL city has yet tried to build fiber and this proposal is going to meet a lot of resistance. Certainly Comcast, AT&T and other big ISPs will do everything possible to derail such an effort. The city says that they don’t want to directly compete with commercial ISPs, but if the fiber network really lowers gigabit prices to $30 – $40 that will clearly get most of the customers in the city.

I foresee all sorts of attempts to try to stop this project. The big ISPs are enjoying unprecedented support today in Congress and the FCC, and one ISP tactic might be to legislate against the project – either at the federal or state level. My fear is that a legislative approach might also stop more traditional municipal broadband projects. I would also expect to see numerous lawsuits from ISPs challenging the project. It’s such a new concept that it’s hard to envision the basis for such lawsuits, but I fully expect them. I can also envision a few citizen lawsuits trying to stop a mandatory new utility fee – picture forcing Comcast employees to pay to construct a competing network.

The final big hurdle will be getting enough quality ISPs onto the network to offer real customer choice. The few open access networks in this country have not attracted many quality ISPs. The open access model works in Europe because the old state-monopoly telcos and cable companies were forced into competing with each other by the formation of the European Union. And perhaps quality ISPs will take a chance on a network in an NFL city. But in this country there seems to be an agreement among cable companies not to compete with each other, and it’s unlikely that we would see Charter, Mediacom or others stepping in to compete against Comcast.

This is a really interesting idea and it could be a viable way to get gigabit broadband to everybody in a big city. The city has not yet made the decision to take the leap, and if they do they will certainly face an uphill battle to make it work. But this could be the first attempt to bring the European open access model to the US.

Consolidation of Telecom Vendors

It looks like we might be entering a new round of consolidation of telecom vendors. Within the last year the following consolidations among vendors have been announced:

  • Cisco is paying $1.9 billion for BroadSoft, a market leader in cloud services and software for applications like call centers.
  • ADTRAN purchased CommScope’s EPON fiber equipment business, gear that supports DOCSIS provisioning so that it can work with cable networks.
  • Broadcom is paying $5.9 billion to buy Brocade Communications, a market leader in storage networking as well as a range of telecom equipment.
  • Arris is buying Ruckus Wireless as part of a spinoff from the Brocade acquisition. Arris has a goal to be the provider of wireless equipment for the large cable TV companies.

While none of these acquisitions will have any immediate impact on small ISPs, I’ve been seeing analysts predict that there is a lot of consolidation coming in the telecom vendor space. I think most of my clients were impacted to some degree by the last wave of vendor consolidation back around 2000, and that wave touched a lot of ISPs.

There are a number of reasons why the industry might be ripe for a round of mergers and acquisitions:

  • One important technology trend is the move by a lot of the largest ISPs, cable companies and wireless carriers to software defined networking (SDN). This means putting the brains of the technology into centralized data centers, which allows cheaper and simpler electronics at the edge. The advantages of SDN are huge for these big companies. For example, a wireless company could update the software in thousands of cell sites simultaneously instead of having to make upgrades at each site (see the sketch after this list). But SDN also means the vendors will sell less of the costly and complicated gear.
  • The biggest buyers of electronics are starting to make their own gear. For example, the operators of large data centers like Facebook are working together under the Open Compute Project to create cheap routers and switches for their data centers, which is tanking Cisco’s switch business. In another example, Comcast has designed its own set-top box.
  • The big telcos have made it clear that they are going to be backing out of the copper business. In doing so they are going to drastically cut back on the purchase of gear used in the last mile network. This hurts the vendors that supply much of the electronics for the smaller telcos and ISPs.
  • I think we will see an overall shift over the next few decades toward more customers being served by cable TV and wireless networks. Spending on electronics in those markets will benefit few small ISPs.
  • There are not a lot of vendors left in the industry today, and every merger means a little less competition. Just consider FTTH equipment. Fifteen years ago there were more than a dozen vendors working in this space, but over time that number has been cut in half.
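To make the SDN point in the first bullet concrete, here is a toy sketch of the idea: a centralized controller holds the brains and pushes one software upgrade to thousands of simple edge sites in a single operation. The class and field names here are entirely hypothetical:

    from dataclasses import dataclass

    @dataclass
    class EdgeSite:
        """A dumbed-down edge device; the intelligence lives in the controller."""
        site_id: str
        software_version: str = "1.0"

    class CentralController:
        def __init__(self, sites):
            self.sites = sites

        def push_upgrade(self, new_version):
            # One centralized loop replaces thousands of individual site visits.
            for site in self.sites:
                site.software_version = new_version

    sites = [EdgeSite(f"cell-site-{n}") for n in range(10_000)]
    CentralController(sites).push_upgrade("2.0")
    print(sites[0].software_version, sites[-1].software_version)  # 2.0 2.0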

There are a number of reasons why these trends could foretell future trouble for smaller ISPs, possibly within the next decade:

  • Smaller ISPs have always relied on bigger telcos to pave the way in developing new technology and electronics. But if the trend is towards SDN and towards large vendors designing their own gear then this will no longer be the case. Consider FTTP technology. If companies like Verizon and AT&T shift towards software defined networking and electronics developed through collaboration, there will be less development done with non-SDN technology. One might hope that the smaller companies could ride the coattails of the big telcos in an SDN environment – but as each large telco develops its own proprietary software to control its SDN network, that is not likely to be practical.
  • Small ISPs also rely on larger vendors buying enough volume of electronics to hold down prices. But as the big companies buy fewer of the standard electronics the rest of us use, you can expect either big price increases or, worse yet, no vendors willing to serve the smaller carrier market. It’s not hard to envision smaller ISPs reduced to competing in the grey market for used and reconditioned gear – something some of my clients operating ten-year-old FTTP networks already do.

I don’t want to sound like the voice of gloom and I expect that somebody will step into the voids created by these trends. But that’s liable to mean smaller ISPs will end up relying on foreign vendors that will not come with the same kinds of prices, reliability or service the industry is used to today.

The Net Neutrality Furor

It seems pretty clear now that the FCC is going to reverse the net neutrality decision of a few years ago at its upcoming December meeting. The mechanism they will use to reverse the order is to reverse the decision that placed broadband under Title II regulation. That move will take the FCC out of the business of regulating broadband, meaning that not only would the net neutrality rules be reversed, but the FCC would no longer regulate things like broadband privacy. The FCC expects that washing its hands of broadband sends privacy and other issues to the Federal Trade Commission.

A lot of the public is up in arms over this FCC direction and the topic is all over the news and social media. But unfortunately, I think the public is fighting to maintain net neutrality for the wrong reasons. People seem to fear that without net neutrality the ISPs will begin abusing their customers in dreadful ways. I’ve seen social media warnings that the end of net neutrality means that the ISPs will block or throttle any web site that is not under their economic control. People fear that the ISPs will block content they don’t like, such as porn or political content they disagree with.

I have a hard time buying these arguments. The ISPs have no economic incentive to badly antagonize customers. Removing the net neutrality rules now does not mean that ISPs can’t be regulated again in the future. Congress always has the power to regulate them in any manner desired, and if the ISPs start doing crazy things some future Congress will likely react. The net neutrality rules have only been in place for a few years and the ISPs didn’t abuse customers in these feared ways before these rules. I find it unlikely that they would do the extreme things that people are warning about.

But I still think people are right to support net neutrality. The issue they should really care about, though, is not the net neutrality principles themselves but the underlying Title II regulation. That is the framework the FCC used as the basis for passing the net neutrality rules, and it largely allows the FCC to regulate broadband in the same manner they have regulated telephone service. The ISPs challenged the FCC’s Title II authority in court and the courts upheld the FCC’s right to regulate broadband.

The ISPs hate Title II regulation, but not because it imposes the net neutrality principles. Their real fear is that the FCC will use these rules to regulate broadband prices. A lot of analysts think that the big ISPs are planning significant rate increases over the next few years. While the Wheeler FCC said it would not regulate rates, Title II grants the FCC the authority to do so at any future time. And the FCC can regulate more than just prices – Title II also gives it the authority to regulate things like data caps.

The big ISPs have been working hard to repeal the Title II regulation due to the threat of price regulation – not because they don’t want the net neutrality principles. There are numerous quotes from the CEOs of the big ISPs saying that they could live with the net neutrality principles – and I largely believe them.

Interestingly, there is already at least one ISP that is completely flouting the net neutrality rules. T-Mobile now includes Netflix for free with its cellular plans. This practice is called zero rating and it violates the paid prioritization principle of net neutrality. It’s likely that many T-Mobile customers won’t buy other video content since they are already getting Netflix for ‘free’, and this practice clearly puts other OTT providers at a disadvantage on the T-Mobile network. And yet I don’t hear any public outcry about T-Mobile’s practice, and I suspect their customers really love this feature. This is what a world without net neutrality rules looks like – ISPs will bundle in features that a large percentage of their customers like. The negative consequence is not that customers are directly disadvantaged, but that the ISPs get to pick winners and losers among web companies. My guess is that the ISPs will bundle platforms a lot of people already like, and that the bundling will be largely popular, just as the T-Mobile Netflix bundle is.

I honestly believe that the big ISPs are largely laughing at the public on this issue. The ISPs understand that the public has misread their real reason for attacking Title II regulation. The ISPs want the unfettered ability to raise prices. Without regulation it’s true that the ISPs could probably do the sorts of things the public is so stirred up about – but it would be bad business to do so. Can you imagine the furor if AT&T started blocking web sites? Since the ISPs and the FCC understand the real game they can brush off the public hysteria that is concentrating on the wrong issues, and they can now get down to the business of raising rates.

Why I am Thankful – 2017

Every year at Thanksgiving I take a pause to look at the positive things happening in the small carrier industry. This is not the easiest year to make a list because we currently have an FCC that is clearly in the pocket of the big ISPs like Verizon, AT&T and Comcast. While some of the new FCC policies supporting those big companies will benefit all ISPs, in many cases the FCC decisions are giving the big ISPs a leg up over the competition. But there are still things to be thankful for in our industry:

Demand for Broadband Intensifies. In the work I have been doing in rural communities it’s becoming clear that broadband has moved from a nice-to-have feature to a must-have commodity. I see evidence of this in several different ways. First, rural communities and their citizens are making a lot of noise to politicians about not having broadband. The broadband issue has become the top priority in many communities. I also see evidence of rural broadband demand when looking at the high penetration rates that come from projects being built in areas that didn’t have good broadband. Over the last few years I’ve seen such projects getting customer penetration rates between 65% and 85%. I call this a good news topic for rural carriers since it means there are still lots of opportunities for expansion, and enough customer demand to help pay for broadband projects. It’s not a positive that there are still so many communities with no broadband, but the positive here is that communities are making demands, which is the first step towards finding a solution.

Public Private Partnerships are Thriving. Very few government entities want to be an ISP and they are instead hoping to find commercial partners to bring better broadband to their communities. In just this last year I’ve worked with half a dozen local governments that have contributed funding to public private partnerships, where the government acts like the bank and the ISP owns and operates the network. Since rural broadband projects are often a challenge to finance this is a promising new trend.

ACAM Money is Financing Fiber. The ACAM money from the Universal Service Fund is being used to expand fiber and advance broadband in rural areas all over the country. The fact that some rural communities are getting fiber is helping to drive demand from others who want the same thing. We’ll have to wait until next year to see if the CAF II reverse auction drives similar results.

Wireless Technology Getting a Lot Better. I have a lot of clients who are now deploying point-to-multipoint radios for broadband deployment. Over the last three years these radios have improved dramatically. They are more reliable, almost approaching plug-and-play. By combining multiple frequency bands they deliver bigger broadband pipes, faster speeds and a much-improved customer experience. Depending on customer density the networks can be designed to deliver 25 Mbps to a lot of customers with some speeds as fast as 100 Mbps. There are still big issues with the technology in heavily wooded or hilly areas, but there are a lot of places where the technology is now delivering a great broadband connection.

New Revenue Opportunities Materializing. While voice revenues continue to decline and many of my clients are getting clobbered on cable TV, I see a number of them doing well with new products. I have clients getting decent penetration rates with managed WiFi. I have some clients doing well with security. And I have clients making good margins on smart home technologies. Selling new products is out of the comfort zone for many small ISPs and it requires some new thinking to successfully sell a new product – but I’ve seen enough success stories to know that it can work.

A Managed WiFi Product

A number of my clients are now selling a managed WiFi product. But the product they are offering customers under that name varies widely, and so today I thought I’d discuss a few of the different products being sold under this name.

The simplest product is one that I would call a WiFi network. Historically, ISPs that provided WiFi placed a single WiFi router near where the broadband connection terminated in the home. And it was typical to include the WiFi functionality directly embedded in the DSL or cable modem router. This product has been around for a while, and I got my first WiFi router when Verizon supplied an all-in-one router on my FiOS connection nearly 15 years ago.

But as homes have added numerous connected WiFi devices, a single WiFi router is often inadequate. With today’s greater demand for bandwidth, a single WiFi router often can’t reach all parts of the home or connect smoothly to numerous devices. Most of my clients tell me that WiFi problems are now the biggest cause of customer dissatisfaction and in many cases have surpassed cable TV issues. Many customers supply their own WiFi routers, and ISPs get frustrated when a customer’s inadequate WiFi device or poor router placement ruins a good broadband delivery to the home.

Today there are numerous brands of WiFi network devices available. These systems deploy multiple WiFi routers around the home that are connected with each other to create one ubiquitous network. The routers can be connected wirelessly in a mesh or hard-wired to a broadband connection. These devices are widely available and many customers are now installing these networks – I’ve connected an eero network in my home that has vastly improved my WiFi quality.

I have a number of clients that sell the WiFi networks. They will place the WiFi units in the home in a manner that maximizes WiFi reception. The revenue play for this product is simple equipment rental and they charge each month for the devices. ISPs generally set up the routers so that they can peer into them for troubleshooting since customers inevitably will unplug a router, move one to a less than ideal place or place some big object near one that blocks the WiFi signal. But that’s about all that comes with the product – expert placement of routers and simple troubleshooting or replacement if there are problems.

At the other end of the spectrum are a few clients who really manage the customer WiFi experience. For example, customers can call when they buy a new WiFi device and the NOC technicians will connect the device to the network and optimize the WiFi connection. They will assign devices to different frequencies and channels to maximize the WiFi experience. These ISPs have invested in software that tracks and keeps records of all of the devices connected to the WiFi network, meaning they can see a history of the performance of each customer device over time.

The ISPs monitor the WiFi performance and are usually proactive when they see problems, in the same manner that many ISPs track the performance of fiber ONTs. The WiFi network moves the ISP deeper into the customer home and allows the ISP to make certain that customers are getting the bandwidth they are paying for.

Nobody knows what to charge for this yet, and I see monthly rates for managed WiFi that range from $10 to almost $25 per month. I don’t have enough experience with this yet to suggest the right price. Like any new product, success is going to be due mostly to the marketing effort expended. I have a few clients who have already achieved penetration rates of 25% or more with prices in the $15 – $20 range.

But this product isn’t for everybody. For example, I have clients that don’t want to take on the product due to the extra truck rolls. But almost all of my clients worry about eventually becoming dumb pipe providers, and the managed WiFi product provides a tangible way to maintain contact with a customer and demonstrate the ISP’s value proposition. And like any equipment rental play the revenue stream is good. Once the cost of the hardware and initial installation has been recovered the product is almost all margin.
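To see why the rental economics work, here is a minimal payback sketch. The hardware, installation and support costs below are illustrative assumptions, not numbers from any client:

    # Simple payback model for a managed WiFi rental product.
    hardware_cost = 250    # assumed cost of a multi-router mesh kit ($)
    install_cost = 100     # assumed truck roll and setup labor ($)
    monthly_fee = 15       # mid-range of the $10 - $25 prices noted above
    monthly_support = 2    # assumed ongoing support cost per customer ($)

    net_margin = monthly_fee - monthly_support
    payback_months = (hardware_cost + install_cost) / net_margin
    print(f"Payback in {payback_months:.0f} months")  # ~27 months at these inputs

After the payback point the fee is nearly all margin, which is the appeal of any equipment rental play.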

The Beginning of the End for Copper

The FCC voted last Thursday to relax the rules for retiring copper wiring. This change was specifically aimed at Verizon and AT&T and is going to make it a lot easier for them to tear down old copper wiring.

The change eliminates some of the notification process to customers and also allows the telcos to eliminate old copper wholesale services like resale. But the big consequence of this change is that many customers will lose voice services. This change reverses rules put in place in 2014 that required that the telcos replace copper with service that is functionally as good as the copper facilities that are being removed.

Consider what this change will mean. If the telcos tear down copper in towns then customers will lose the option to buy DSL. While cable modems have clobbered DSL in the market there are still between 15% and 25% of broadband customers on DSL in most markets. DSL, while slower, also offers lower cost broadband options which many customers find attractive.

I don’t envision AT&T and Verizon tearing down huge amounts of copper in towns immediately. But there are plenty of neighborhoods where the copper is dreadful and the telcos can now walk away from that copper without offering an alternative to customers. This will give the cable companies a true monopoly in towns or neighborhoods where the copper is removed. Customers losing low-cost DSL will face a price increase if they want to keep broadband.

The rural areas are a different story. In most of rural America the copper network is used to deliver telephone service and there are still a lot of rural customers buying telephone service. You might think that people can just change to cellular service if they lose their landlines, but it’s not that simple. There are still plenty of rural places that have copper telephone service where there is no good cellular service. And there are a lot more places where the cellular service is too weak to work indoors and customers need to go outside to find the cellular sweet spots (something we all remember doing in airports a decade ago).

A bigger concern in rural areas will be losing access to 911. A lot of homes still keep landlines just for the 911 capability. Under the old rules the carriers had to demonstrate that customers would still have access to reliable 911, but it seems the carriers can now walk away without worrying about this.

The FCC seems to have accepted the big telcos’ arguments completely. For instance, Chairman Pai cited a big telco argument that carriers could save $40 to $50 per home per year by eliminating copper. That may be a real number, but the revenue from somebody buying voice service on copper is far greater than the savings. It seems clear that the big telcos want to eliminate what’s left of their rural work force and get out of the residential business.
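As a quick sanity check on that claim, compare the cited savings to the revenue at stake. The voice price below is my own rough assumption, not a number from any carrier:

    savings_per_home_per_year = 45   # midpoint of the cited $40 - $50 range
    voice_price_per_month = 30       # assumed landline voice rate ($)

    annual_voice_revenue = voice_price_per_month * 12        # $360 per year
    print(annual_voice_revenue - savings_per_home_per_year)  # $315 net loss

Walking away from a paying voice customer to save $45 a year is a money-losing trade, which is why the motivation looks more like shedding the rural workforce than capturing the cited savings.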

This is a change that has been inevitable for years. The copper networks are deteriorating due to age and due even more to neglect. But the last FCC rules forced the telcos to work to find an alternative to copper for customers. Since AT&T and Verizon are cellular companies this largely meant guaranteeing adequate access to cellular service – and that meant beefing up the rural cellular networks where there aren’t a lot of customers. But without the functional equivalency requirement it’s unlikely that the carriers will beef up cellular service in the most remote rural places. And that means that many homes will go dark for voice.

This same ruling applies to other telcos, but I don’t think there will be any rush to tear down copper in the same manner as AT&T and Verizon. Telcos like Frontier and Windstream still rely heavily on their copper networks and don’t have a cellular product to replace landlines. And I don’t know any smaller telcos that would walk away from customers without first providing an alternative service.

It’s hard to think that the FCC is embracing a policy that will leave some households with no voice option. The FCC is purposefully turning a blind eye to the issue, but anybody who knows rural America knows this will happen. There are still a lot of rural places where copper is the only communications option today. Our regulators once prided themselves on the fact that we brought telephone service to every place that had electricity. We had a communications network that was the envy of the world, and connecting everybody was a huge boon to the economy. We could still keep those same universal service policies for cellular service if we had the will to do so. But this FCC clearly sides with the big carriers over the public and they are not going to impose any rules that the big telcos and cable companies don’t want.

Broadband and Education

I’ve always taken it as a given that broadband is important for education. I know as I travel around the country and meet with folks in rural counties that education is at the top of the list of reasons why rural areas want better broadband. I’ve heard countless stories of the major efforts rural families undertake to help their kids keep up with schoolwork.

I recently saw a study that looks at the impact of lack of broadband on education. The study comes from the ICUF (Independent Colleges and Universities of Florida) – a group of 30 universities in the state. This study correlates lack of broadband with lower high school graduation rates, lower percentages of college degrees and lower per capita income.

The study says that 700,000 Floridians don’t have enough broadband to take part in distance learning. Distance learning is used for numerous college degree programs and the ICUF institutions have over 600 distance learning degree programs.

But distance learning is now also a big part of K-12 education and students are expected to be able to use distance learning tools for homework or to make up for work missed during absences. High schools also use distance learning to offer a wider variety of classes to students on subjects where it would otherwise be hard to justify hiring a teacher. My daughter finished high school in Florida last year and she took a distance learning math class when she was unable to otherwise fit it into her schedule.

The study concludes that students need broadband speeds at something similar to the FCC definition of broadband of 25 Mbps down and 3 Mbps up in order to successfully use distance learning. I would also add that distance learning requires low latency in order to maintain a live connection – this is not something that can be done, for example, with a satellite broadband connection.

The study identified 13 counties in the state that have inadequate broadband, ranging from Madison County where 41% of residents can’t get broadband to Dixie County where 99% of households don’t have broadband access. These counties have significantly fewer citizens with college degrees than the 19 counties that are at the top of the list in terms of broadband access.

But the 13 county statistic is misleading because every county has pockets of students without good broadband. As soon as you get outside city limits almost anywhere the availability of broadband quickly diminishes. A few years ago I looked at my own county, Charlotte County, and I found several pockets of homes without broadband even inside suburban neighborhoods.

The state of Florida has a goal to have 55% of its population with a college degree or advanced education certificate by 2025. They think this is needed to keep the state competitive in the global economy. The areas without broadband are far below that target with college graduation rates between 12% and 27%. A few of the urban counties in the state already have as many as 54% of residents with a college degree or certificate.

This study doesn’t reach any conclusions on how to close the rural broadband gap (something a whole lot of us are struggling with). But the authors see the study as a call to develop policies and funding to close the gap. The conclusion of the study is that areas without broadband will fall further behind than they are today unless we can find broadband solutions.

Some Unexpected News

In an attempt to stop the massive bleeding of traditional cable TV customers AT&T has cut the prices for cable on both the DirecTV and U-verse platforms. The company lost almost 400,000 linear TV customers in the recent third quarter.

As an example, DirecTV’s ‘Select’ bundle of 150 channels will now be priced for a two-year contract at $35 for the first year and $76 for the second year, compared to the recent prices of $50 for the first and $90 for the second. All of the other packages have similar drops of $10 to $15 in the first year and lower second year prices.

I call this unexpected news because it goes against every trend in the rest of the industry. The average monthly revenue for the 2-year Select contract just fell from $70 per month to $55.50 per month – more than a 20% discount. From what I know about programming prices it’s hard to think that AT&T has any margin at the new prices and they are clearly under water for the first year, spending more for programming than what they will collect in revenue.
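For anybody who wants to verify that math, the averaging over the two-year contract is straightforward (all prices from the paragraph above):

    old_avg = (50 * 12 + 90 * 12) / 24   # $70.00 per month
    new_avg = (35 * 12 + 76 * 12) / 24   # $55.50 per month
    discount = 1 - new_avg / old_avg
    print(f"{old_avg} -> {new_avg}, a {discount:.1%} drop")  # 20.7%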

This price reduction brings a couple of different ideas to mind. First, it’s clear that AT&T still wants traditional linear cable TV customers. Even at little or no margin they see some value in that, and one benefit might simply be to prop up DirecTV through sheer volume of customers. I think AT&T envisions the future of cable TV to be more in line with the smaller online packages being sold as DirecTV Now, but the general public is largely not yet ready to make the shift to totally online viewing, so perhaps AT&T wants to keep people using its products until that shift is more likely.

But this price drop also says something about the price elasticity of cable TV. We’ve known for years that customers who cut the cord almost all say they are leaving traditional cable TV because of the cost. That was already happening before the plethora of new online alternatives like Sling TV and PlayStation Vue. These new alternative products have created what economists call a substitute. Over 900,000 households changed to one of these online cable products in the recent third quarter, so it’s obvious that many people now view a skinny bundle like Sling TV as a reasonable substitute for the big cable packages.

And this makes sense. We know that most households don’t watch many different channels even on a 200-channel cable offering, and so as long as a smaller lineup has channels a household is comfortable with then skinny bundles become economic substitutes for the traditional big cable bundle.

And of course, all of this is compounded by OTT providers like Netflix, Hulu and Amazon Prime that provide a huge array of online content that competes with cable TV. I can tell you personally that I am far happier with one skinny bundle (currently PlayStation Vue) and access to OTT content than I ever was with the big cable bundle. I remember channel surfing through the big cable packages at 3:00 in the morning (a time I am often awake) and finding nothing but bad programming and infomercials. The choices from online programming are far better for my tastes and style of watching TV.

This change makes me wonder if we aren’t seeing the end of the public’s tolerance for costly cable TV products. If the idea takes hold that traditional cable TV packages are no longer worth the price, we could be seeing a watershed moment in the industry – one where a huge cable provider makes a last stab at keeping customers.

It will be interesting to see if any of the other cable providers react the same way. This is a bold move by AT&T and one would think that those seeking a cheaper alternative might be attracted to these new bundles. But of course, every customer that takes one of these packages will probably be bailing on a traditional package from one of the cable companies. This is going to be an interesting battle to watch.

The Future of WiFi

There are big changes coming over the next few years with WiFi. At the beginning of 2017 a study by Parks Associates showed that 71% of broadband homes now use WiFi to distribute the signal – a percentage that continues to grow. New home routers now use the 802.11ac standard, although there are still plenty of homes running the older 802.11n technology.

But there is still a lot of dissatisfaction with WiFi and many of my clients tell me that most of the complaints they get about broadband connections are due to WiFi issues. These ISPs deliver fast broadband to the home only to see WiFi degrading the customer experience. But there are big changes coming with the next generation of WiFi that ought to improve the performance of home WiFi networks. The next generation of WiFi devices will be using the 802.11ax standard and we ought to start seeing devices using the standard by early 2019.

There are several significant changes in the 802.11ax standard that will improve the customer WiFi experience. First is the use of a wider spectrum channel at 160 MHz, four times larger than the channels commonly used by 802.11ac routers today. A bigger channel means that data can be delivered faster, which will solve many of the deficiencies of current home WiFi networks. This improves network performance through brute strength – pushing more data through a connection faster.

But probably more significant is the use in 802.11ax of 4X4 MIMO (multiple input / multiple output) antennas. These new antennas will be combined with orthogonal frequency division multiple access (OFDMA). Together these new technologies will provide for multiple, separate data streams within a WiFi network. In layman’s terms, think of the new technology as operating four separate WiFi networks simultaneously. By distributing the network load across separate channels the interference on any given channel will decrease.

Reducing interference is important because that’s the cause of a lot of the woes of current WiFi networks. The WiFi standard allows for unlimited access to a signal and every device within the range of a WiFi network has an equal opportunity to grab the WiFi network. It is this open sharing that lets us connect lots of different devices easily to a WiFi network.

But the sharing has a big downside. A WiFi network shares the airwaves by pausing whenever more than one device tries to transmit at the same time. The network pauses for a short period and then serves the first device it notices when it resumes. In a busy WiFi environment the network stops and starts often, causing the total throughput on the network to drop significantly.

But with four separate networks running at the same time there will be far fewer stops and starts, and a user on any one channel should have a far better experience than today. Further, with the OFDMA technology the data from multiple devices can coexist better, meaning that a WiFi router can handle more than one device at the same time, further reducing the negative impact of competing signals. The technology lets the network smoothly mix signals from different devices to avoid network stops and starts.

The 802.11ax technology ought to greatly improve the home WiFi experience. It will have bigger channels, meaning it can send and receive data to WiFi connected devices faster. And it will use the MIMO antennas to make separate connections with devices to limit signal collision.
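To put rough numbers on those two improvements, here is a deliberately crude capacity model in which throughput scales linearly with channel width and the number of spatial streams. The spectral-efficiency figure is a hand-wavy assumption for illustration, not a number from the 802.11 standards:

    def rough_capacity_mbps(channel_mhz, spatial_streams, bits_per_hz=5):
        # First-order model: capacity ~ bandwidth x streams x efficiency
        return channel_mhz * spatial_streams * bits_per_hz

    print(rough_capacity_mbps(80, 1))    # 400 Mbps: one stream, 80 MHz channel
    print(rough_capacity_mbps(160, 4))   # 3200 Mbps: four streams, 160 MHz channel

Real-world throughput will be far lower because of distance, walls and contention, but the multiplicative effect of wider channels and more streams is the point.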

But 802.11ax is not the last WiFi improvement we will see. Japanese scientists have made recent breakthroughs using what is called the terahertz range of frequencies – spectrum above 300 GHz. They’ve used the 500 GHz band to create a 34 Gbps WiFi connection. Until now, work at these higher frequencies has been troublesome because transmission distances have been limited to a few centimeters.

But the scientists have created an 8-array antenna that they think can extend the practical reach of fast WiFi to as much as 30 feet – more than enough to create blazingly fast WiFi in a room. These frequencies will not pass through barriers and would require a small transmitter in each room. But the scientists believe the transmitters and receivers can be made small enough to fit on a chip – making it possible to affordably put the chips into any device, including cell phones. Don’t expect multi-gigabit WiFi for a while, but it’s good to know that scientists are working a generation or two ahead on technologies that we will eventually want.