New Connect America Funds

Our regulatory world is messed up sometimes – that’s the only way to describe it. The FCC last week announced that there would be an auction for the Connect America Fund to provide $2 billion of funding to build rural broadband. The funds are for places where the large telcos elected not to take the Connect America Funds. Verizon seems to have largely decided that it isn’t interested in upgrading its rural networks, and I have to imagine the places passed over by the other large telcos like Windstream were skipped because the cost of building there is too high.

The new funding will be awarded by reverse auction, meaning the company willing to take the least amount of money for a given service area will be awarded the funds. And this is the first area where this whole process is messed up. The FCC handed out $6 billion to the large telcos with no auction and no such low bid requirement and so the big companies get every penny of that FCC funding, without contention.
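The award logic of a reverse auction is easy to sketch. Here is a minimal illustration in Python, assuming hypothetical service areas, bidders and dollar amounts (none of these are actual auction data):

```python
# A reverse auction awards each service area to the bidder willing to
# accept the LEAST support. All names and amounts below are hypothetical.

def award_reverse_auction(bids):
    """bids maps a service area to a list of (bidder, requested_funding).
    Returns the winning (bidder, amount) for each area."""
    awards = {}
    for area, area_bids in bids.items():
        # The lowest requested subsidy wins the area.
        awards[area] = min(area_bids, key=lambda bid: bid[1])
    return awards

bids = {
    "Census Block 1001": [("Coop A", 800_000), ("Telco B", 950_000)],
    "Census Block 1002": [("WISP C", 400_000), ("Coop A", 420_000)],
}
print(award_reverse_auction(bids))
```

This also shows why bidders shade their bids downward: asking for the full available support guarantees losing to anyone who asks for a dollar less.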

But any company bidding in this new reverse auction is going to worry that somebody will bid slightly lower than them to get the funding, and so most bidders are likely to bid for less than the full potential funding. The bottom line of this is that the big telcos got every penny of funding available to them without having to worry about somebody else wanting to use it while the remaining companies are likely to get something less.

The original award of funds should have also been a reverse auction. There are plenty of smaller telcos, electric coops and local governments that would have vigorously bid on the original $6 billion, and in doing so would have brought real broadband to the millions of people in those areas that are going to instead get a lousy DSL upgrade to speeds that aren’t even broadband by today’s standards. The FCC is only requiring speeds of 10 Mbps download and 1 Mbps upload, and even then allows the big telcos six years to get this done.

The original $6 billion award of the Connect America Fund was basically a hand-out to the big telcos. There’s really no other way to characterize it. I saw right after these awards that companies like Frontier got a big bump in stock valuation since they are claiming the Connect America Fund as revenue. I know a number of people who speculate that the big telcos will not upgrade everywhere they are supposed to with this funding and will just shrug and weakly apologize. And there is likely to be no penalty for that.

To make matters even worse, the new funding (as well as the old) allows carriers to impose a 150 GB monthly data usage cap on customers covered by the funding. This is telling rural people, “Here’s the broadband you’ve been waiting many years for, but now, don’t actually use it”. My clients report that the average residential download is already somewhere between 150 GB and 200 GB per month, so that cap is too low even by today’s standards. And we all know that broadband usage in homes keeps increasing exponentially and has been doubling every three years.
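The doubling rule of thumb makes it easy to see how far behind a fixed cap falls. A quick sketch, assuming a 175 GB/month starting average (the midpoint of the range above, used only for illustration):

```python
# Project average household usage forward if it doubles every three
# years. The 175 GB starting point is an assumption for illustration.

def projected_usage(start_gb, years, doubling_period=3):
    return start_gb * 2 ** (years / doubling_period)

for years in (0, 3, 6):
    print(f"year {years}: {projected_usage(175, years):.0f} GB/month")
# By year 6 -- the end of the build-out window -- the average household
# would need more than four times the 150 GB cap.
```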

So there is already $6 billion being used to provide inadequate DSL upgrades from the large incumbent telcos. And when the people in those areas finally get upgraded to 10 Mbps bandwidth sometime during the next five years they will be told there is a 150 GB monthly data cap on monthly usage. We could have instead used that $6 billion to seed hundreds of rural fiber projects that would have brought real broadband to a lot of homes. That is my definition of messed up.

Nationwide Number Portability

In November of last year the FCC asked the North American Numbering Council (NANC) to explore the issues associated with implementing nationwide number portability. That would mean that a customer could move to anywhere in the US and keep their telephone number with no regard to state boundaries, jurisdictional borders or distance from the original rate center. There is already number portability for cellphones and I’ve moved my own cellphone from Washington DC to the Virgin Islands and then to Florida. But there are a lot more restrictions on moving landline numbers.

The NANC returned a report to the FCC last week, and the following are some of their more interesting responses:

  • They don’t think that nationwide number portability would have any significant impact on the North American Numbering Plan or on the FCC’s process for forecasting the need for future numbers (NRUF).
  • They did foresee changes in the way that taxes are assessed. For example, today there are a number of fees and taxes such as 911 fees or surcharges for expanded local calling scopes that telcos assign to customers based upon their phone number. The way of assessing such taxes would have to be changed.
  • They also foresaw that this would change the way that many companies assign jurisdiction to a call. For example, many long distance rating tables rely on the NPA-NXX of a number to assign jurisdiction, and a lot of the carriers providing wholesale long distance also use this number when determining rates.
  • There was some concern that existing 911 processes might have trouble identifying the addresses of ported numbers.
  • They identified that allowing nationwide number portability would cause changes to a large number of industry and carrier processes including the routing and rating databases (LERG and BIRRDS), to telephone switching software, to billing systems and to provisioning systems.
  • They also asked how all of the potential changes might be affected by the upcoming conversion of the PSTN to all-IP. Their conclusion was that an all-IP network would require the same basic changes to databases and systems.
  • They also listed the FCC rules that would need to be changed in order to accommodate nationwide number portability. These were all changes that are allowed under current FCC jurisdiction.
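The rating-table problem in the list above is concrete enough to sketch. Carriers key jurisdiction off the NPA-NXX, the first six digits of a ten-digit number; once a number can port anywhere, that lookup reports where the number was issued, not where the customer now lives. The table entries here are hypothetical:

```python
# A toy NPA-NXX rating table. Real tables (LERG/BIRRDS) are far larger;
# these entries are hypothetical illustrations.
RATE_CENTERS = {
    "410268": "Annapolis, MD",
    "410685": "Baltimore, MD",
    "202555": "Washington, DC",
}

def rate_center_by_prefix(number):
    """Look up the rate center from the first six digits (NPA-NXX)."""
    return RATE_CENTERS.get(number[:6], "unknown")

# A ported number keeps its prefix, so this still reports Annapolis
# even after the customer has moved to another state.
print(rate_center_by_prefix("4102681234"))
```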

Overall NANC made no specific recommendation but instead listed the various issues that the FCC should consider when looking at the issue.

I find it interesting that the cellular carriers have been able to easily accommodate nationwide mobile number portability but that this looks to be such a daunting task for telcos. But there are a lot fewer cellular carriers and their systems and processes are a lot newer than those used by telcos. I am sure that the large telcos look at this kind of change and just shudder since they understand that so many of their internal processes are driven by the telephone number of their customers.

But the telcos have already been allowing limited number portability for a number of years. A customer can usually move somewhere within their local calling scope and retain their number. For instance, you can port numbers between Baltimore and Annapolis, Maryland. But such porting does not change the jurisdiction of the calls, and therein lies the big rub.

I’m not sure what prompted the FCC to look at the issue. With the steady decline of landlines I wouldn’t think that there is a huge public cry for nationwide number portability. Interestingly, the VoIP providers have been assigning numbers outside of the jurisdictional areas for years. I can order a number from any major city in the country to use at my house in Florida. But what I still can’t do is take a local Florida number owned by a telco to somewhere else.

Speed Tests

Netflix just came out with a new speed test at fast.com which is intended to measure the download speed of Internet connections to determine if they are good enough to stream Netflix. The test only measures the speeds between a user and the Netflix servers. This is different from most other speed tests on the web, which also look at upload speeds and latency.

This raises the question of how good speed tests are in general. How accurate are they and what do they really tell a user? There are a number of different speed tests to be found on the web. Over the years I have used the ones at speedtest.net (Ookla), dslreports.com, speed.io, the BandWidthPlace and TestMySpeed.

Probably the first thing to understand about speed tests is that they only test the speed of the path between the user and the test site’s routers and are not necessarily indicative of the speeds for other web activities like downloading files, making a VoIP phone call or streaming Netflix. Each of those activities involves a different type of traffic and the speed test might not accurately report what a user most wants to know.

Every speed test uses a different algorithm to measure speed. For example, the algorithm for speedtest.net operated by Ookla discards the fastest 10% and the slowest 30% of the results obtained. In doing so they might be masking exactly what drove someone to take the speed test, such as not being able to hold a connection to a VoIP call. Ookla also multithreads, meaning that they open multiple paths between a user and the test site and then average the results together. This could easily mask congestion problems a user might be having with the local network.
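The effect of that trimming is easy to demonstrate. A simplified sketch (the actual Ookla algorithm is more involved than this): discard the slowest 30% and fastest 10% of samples, then average what remains.

```python
# Trim outliers the way the article describes: drop the slowest 30%
# and the fastest 10% of samples, then average the rest. This is a
# simplification of the real algorithm.

def trimmed_speed(samples_mbps):
    s = sorted(samples_mbps)
    n = len(s)
    kept = s[int(n * 0.30): n - int(n * 0.10)]
    return sum(kept) / len(kept)

# Ten one-second samples, including a 2 Mbps stall and a 95 Mbps burst.
samples = [2, 48, 49, 50, 50, 51, 52, 53, 54, 95]
print(f"reported: {trimmed_speed(samples):.1f} Mbps")
# Both outliers are discarded, so the stall that may have prompted the
# test never shows up in the reported number.
```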

Another big problem with any speed test is that it measures the connection between a customer device and the speed test site. This means that the customer parts of the network like the home WiFi network are included in the results. A lot of ISPs I know now claim that poor in-home WiFi accounts for the majority of the speed issue problems reported by customers. So a slow speed test doesn’t always mean that the ISP has a slow connection.

The speed of an Internet connection for any prolonged task changes from second to second. Some of the speed tests, like Netflix’s and Ookla’s, show these fluctuations during the test. There are numerous reasons for changing speeds, largely having to do with network congestion at various points in the network. If one of your neighbors makes a big download demand during your speed test you are likely to see a dip in bandwidth. And this same network contention can happen at any one of numerous different parts of the network.

The bottom line is that speed tests are not much more than an indicator of how your network is performing. If you test your speed regularly then a slow speed test result can be an indicator that something is wrong. But if you only check it once in a while, then any one speed test only tells you about the minute that you took the test and not a whole lot more. It’s not yet time to call your ISP after a single speed test.

There have been rumors around the industry that the big ISPs fudge on the common speed tests. It would be relatively easy for them to do this by giving priority routing to anybody using one of the speed test web sites. I have no idea if they do this, but it would help to explain those times when a speed test tells me I have a fast connection and low latency and yet can’t seem to get things to work.

I think the whole purpose of the Netflix speed test is to put pressure on ISPs that can’t deliver a Netflix-capable connection. I don’t know how much good that will do because such connections are likely going to be on old DSL and other technologies where the ISP already knows the speeds are slow.

New Technology – May 2016

The following are some new breakthroughs that might eventually benefit our industry:

Better Long-haul Fiber. Researchers at the Moscow Institute of Physics and Technology, ITMO University in St. Petersburg, and the Australian National University have demonstrated a new technology that can drastically increase the efficiency of long-haul fiber.

They have found that introducing silicon nanoparticles into a fiber optic path can increase what is called the Raman effect. The Raman effect is where light interacts with certain materials to produce wavelengths of different colors. In these materials the light causes the affected molecules to increase in energy, and at the higher energy level the materials then re-emit a photon that has a lower energy than the original light stream.

Today’s lasers use metallic particles to induce the desired wavelength, but the silicon nanoparticles generate light nearly 100 times stronger than the technology used today. While it will take a while to go from lab to production, this has huge potential to improve the efficiency of long-haul fiber and the distance between repeaters.

A New Form of Light. Scientists at Trinity College Dublin in Ireland have been able to produce a new kind of light. Physics has long treated the angular momentum of light as a fixed multiple of Planck’s constant. But the scientists have been able to produce a form of light with an angular momentum that is half the value of Planck’s constant.

Scientists have long theorized that different fractional angular momentum is possible but this is the first time it’s been produced. The first potential use for this new form of light is with fiber optics transmissions. This new kind of light looks to have properties that would allow for the transmission of significantly more bits of data than with normal light.

Faster G.Fast. Israeli chip-maker Sckipio has developed a G.Fast chipset that will double the effective speed of G.Fast. The chip can support symmetrical throughput speeds of 750 Mbps. Sckipio says that they already have another chipset on the drawing board that might double that speed to about 1.5 Gbps.

Their chipset is the first G.Fast design to have fast speeds in both directions and provides greater overall data throughput. While the only trial of G.Fast in the country that I’ve heard about is being done by CenturyLink, it’s been reported that AT&T is thinking about adopting the technology. The company has made numerous announcements about expanding their U-verse product to millions of homes and G.Fast is basically a fiber-to-the-curb product that would let them string fiber in neighborhoods but use the existing copper network to bring the bandwidth into the home.

New Data Storage Technology. IBM Research announced the first successful trial of storing data using phase-change memory (PCM). This technology can store 3 bits of data per cell instead of just one. There are many advantages of PCM storage – it retains memory without power, it allows for faster read/write and it can be overwritten over 10 million times (compared to flash drives, which wear out after around 3,000 write cycles).
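Storing 3 bits per cell means each PCM cell must be read as one of 2^3 = 8 distinct levels rather than 2. A toy sketch of the packing arithmetic (the real PCM encoding and drift compensation are far more sophisticated):

```python
# Pack a bit string into 3-bit cells: each cell holds a value 0-7,
# i.e. one of eight distinguishable resistance levels.

def pack_bits(bits, bits_per_cell=3):
    """Group a binary string into per-cell integer levels."""
    return [int(bits[i:i + bits_per_cell], 2)
            for i in range(0, len(bits), bits_per_cell)]

print(pack_bits("101110001"))   # -> [5, 6, 1]
# Nine bits fit in three cells instead of nine single-bit cells,
# a 3x density gain.
```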

IBM sees PCM being used in conjunction with flash storage to allow for very fast launching of devices like cellphones and computers. It would also allow for much faster queries, speeding up computer processing on any device.


Comcast and Competition

There was a short interview in Fortune recently with Comcast CEO Brian Roberts about Comcast’s views on competition. Roberts’ responses are a very good summary of the state of cable competition in the country in general.

First, when asked about competition today Roberts said that Comcast feels competitive pressure every minute and that he thinks the market is getting increasingly competitive. That’s an interesting comment. There are certainly markets where people are building fiber to compete against Comcast. We see CenturyLink building fiber in a number of large cities. Verizon has announced that it’s going to expand FiOS in Boston. Google is slowly building fiber, although not yet in many Comcast markets. And there are a tiny number of municipal fiber builds, mostly in more rural markets.

But to offset those new fiber competitors it’s obvious that Comcast is crushing DSL in its many markets. The older DSL equipment is just becoming too slow and DSL customers are finally upgrading to faster cable modem service. Overall Comcast added a net of almost 1.4 million new broadband customers in 2015. So for every customer they might have lost to fiber they have picked up many more from competing with DSL. Because Comcast’s cable modems are so superior to DSL, in the vast majority of its markets Comcast now has a virtual monopoly on real broadband.

Roberts was also asked about the company’s interest in competing outside of Comcast’s cable service territories. Unsurprisingly Roberts said that the company has no plans to expand its footprint. The FCC has been bringing pressure on cable companies to become more competitive. But I am not aware of an example where one of the major cable companies has ever competed against one of their neighboring cable providers. In other places like Europe and Canada there are markets where cable companies compete against each other – but in this country there is an obvious tacit agreement among the cable companies to not step on each other’s monopoly turf.

Finally, Roberts was asked about current news that Comcast might be thinking about offering a competitive cellular product. The company tried this a few years ago together with some other cable companies but then ditched the attempt. Roberts says that the company is exploring the concept (something they probably have been doing for a decade) but that – unless the company can find some unique value proposition – they probably won’t enter the cellular market. Interestingly, one of the values of the merger between Charter and Time Warner is that it gives Charter the contracts that allow the company to offer an MVNO cellular product. We’ll have to watch to see if anything comes of that possibility.

In summary, Comcast says they see competition everywhere. It’s an interesting perspective because the company is overall as close to a monopoly as any company can be in the telecom space in this country. There is a lot of public relations and regulatory benefit for Comcast in acting besieged by competition. If Comcast was instead touting its monopoly advantage it would probably land squarely in the crosshairs of the FCC and state regulatory bodies.

But I have a hard time seeing where competition is hurting the company. They are still adding customers like crazy. Revenues and profits are up. The company has made big headway in rolling out new security and home automation products. And in a large percentage of its markets the company is becoming a virtual monopoly.

Using Cellular for Home Broadband

For some time both Verizon and AT&T have been telling the FCC and state Commissions that they want to replace rural telephone lines with cellular connections, which means bringing cellular data plans to rural areas. We’ve now finally seen Verizon’s plans for what rural cellular data plans will look like:

The headline on this Verizon web site is “use the power of the Verizon 4G LTE Network to give you a lightning-fast Internet connection in your home,” followed later on in the offer with the header “Ditch your Low-Speed Internet.”

Those phrases sound great until you then see the offered speeds: “Fast Internet access with average speeds of 5 – 12 Mbps download and 2 – 5 Mbps upload.” I guess for somebody who’s been on dial-up this might be lightning fast, but it’s awfully hard to call this broadband.

But then comes the real kicker when they list the price and the monthly data caps:

  • 10 GB monthly data cap $60
  • 20 GB monthly data cap $90
  • 30 GB monthly data cap $120
  • $10 per additional gigabyte of usage.
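It’s worth doing the arithmetic on these tiers. A quick sketch using the caps, prices and overage rate above, applied to a household at the ~175 GB/month average mentioned earlier (that usage figure is an assumption for illustration):

```python
# Monthly cost on each tier for a given usage level, using the caps,
# prices, and $10/GB overage rate listed above.

PLANS = [(10, 60), (20, 90), (30, 120)]   # (cap in GB, base price in $)
OVERAGE_PER_GB = 10

def monthly_cost(usage_gb, cap_gb, base_price):
    overage_gb = max(0, usage_gb - cap_gb)
    return base_price + overage_gb * OVERAGE_PER_GB

for cap, price in PLANS:
    print(f"{cap} GB plan at 175 GB used: ${monthly_cost(175, cap, price)}")
# Even the biggest tier works out to $120 + 145 * $10 = $1,570 for a
# month of average usage.
```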

Before I totally scoff at this, it’s important to realize that there are already many households trying to get by using today’s cellular plans for home data. Compared to those plans this new offer is a little better. But these new plans are not broadband, and it displays the greed of the cellular companies that they can even offer such a plan to the public with a straight face. What these plans say to anybody living in a rural Verizon or AT&T area is – you’re screwed.

It’s easy to put these plans into perspective. Just last week I was traveling in Minnesota and there was a day that I used my cellular data plan to power my laptop’s broadband. In just one day, doing only normal business things, I used over a gigabyte of data. I didn’t watch video or do anything that was a blatant data hog. And so the $120 plan would not even power my one business laptop for a month and I’d be paying that much per month and facing $10 for every gigabyte I went over 30 GB.

Cellular data in this country is among the most expensive data used anywhere in the world. When you look at charts that are occasionally compiled of worldwide data prices per megabyte, the only places more expensive are Antarctica, some parts of Africa, and remote islands. And Verizon wants to take that ultra-expensive cellular data and extend it into rural homes.

This pricing by Verizon should end once-and-for-all the arguments that I hear all of the time that the future of rural broadband is wireless. Verizon has it within their means to offer an affordable alternative broadband product from rural cell towers – and this is not it.

I can fully understand why cellular companies don’t want to sell broadband connections in urban areas that are used to streaming Netflix – busy cell sites are really not made for that and such a connection ties up a valuable channel for a long time. But in rural areas where there are fewer people using cell towers the wireless carriers could potentially offer an affordable product with a much larger data cap. The fact that they are choosing not to do so says more about their greed than anything else.

I hope the FCC is paying attention to this. A copy of this web site needs to be attached to any filing that the cellular carriers make at the FCC asking to tear down rural copper and replace it with cellular data. If the FCC supports such an idea, even in the slightest – this is what they are agreeing to.

Returning to the Basics

There is a really interesting trend happening with the electronics and the devices we buy and use. For the past decade we have seen the smartphone kill off numerous other industries by turning their products into apps. Consider the implosion of industries like photography and music and the waning of other devices like PCs. As the chipsets in smartphones have become more powerful they have let our little handheld computers take on more and more tasks that were once performed by other devices.

But it looks like the smartphone is starting to lose some of its zing. People are not fired up to run out and buy the latest phones because the new ones are not dramatically better than the old ones. People are holding on to phones longer and it’s been reported that people are even exploring the potential on their phones less and are using fewer apps. Sales of iPhones are down for the first time along with the price of Apple stock.

Along with this downward trend in smartphones is the resurgence of some of the industries that the smartphone helped to decimate. I travel a lot and for a number of years it has been rare to see anybody but a few obvious foreigners carrying a camera through airports. But all of a sudden they are noticeable again. And this is happening at a time when the cameras in smartphones are getting much better. I know the camera in my new Galaxy S7 takes pretty amazing pictures. But there seems to be a desire by people to go back to the past when taking a picture was more complicated, but was also, somehow, more satisfying.

To an old audiophile like me, I am even more blown away by the resurgence of turntables and stereo systems. It’s been reported that the sales of vinyl albums last year were the largest since 1990. In today’s digital music world, a return to analog turntables and vinyl albums is almost like stepping into a time machine.

It seems that the smartphone is transitioning from exciting new technology to everyday tool. I can’t imagine buying a new phone now unless I am having noticeable problems with my current phone. It’s hard to imagine a smartphone improvement amazing enough to lure me to upgrade otherwise.

There are a number of technologists who predict that within a decade or so smartphones will become a thing of the past. Let’s face it, it can be a hassle to always remember to carry your phone everywhere and to always protect it against getting wet or breaking. Sometimes carrying a smartphone feels like more of a burden than a benefit.

Futurists differ in what they predict will replace the smartphone. Some think it will be wearables, some think it will be some sort of virtual reality device such as glasses, and some even think that we’ll transition to implants of some sort. All of these possible futures have one thing in common – the computer will be automatically with us and we’ll barely notice it as a separate device. Many predict that connectivity and technology will become naturally integrated into our daily lives.

I’m now of an age when I’ve seen a lot of technology come and go. There are so many technologies that have grown huge and within a decade or less disappeared to be replaced by something else. I owned a lot of it – reel-to-reel tape recorders, scientific calculators, 8-track tape players, VCRs, digital cameras, and iPods. And along with many of these devices, the companies that made them often faded along with the devices.

It’s a bit hard to think that we could be a decade away from a time when the smartphone as we know it has been replaced by something better. The smartphone has probably been the most disruptive technology in my lifetime, except perhaps for the PC. But its days are most likely numbered like every other technology we’ve seen come and go over the past fifty years.

I have always been intrigued by new technology as it has come along. But I now regret having finally given away my hundreds of albums when I decided that analog music was dead. Why didn’t somebody tell me that turntables would be back?

Politics and Municipal Partnerships

One of the hot topics around the industry today is the creation of Public Private Partnerships (PPPs) with municipalities to provide fiber-based broadband. Today I want to talk a bit about the difference in partnering with a municipality compared to other commercial carriers.

Commercial carriers are often very used to partnering with each other. They will build fiber routes together and routinely share facilities. And many ISPs will outsource functions to another carrier when it makes economic sense. I see ISPs everywhere engaging in some very creative partnerships with other carriers.

But partnering with a municipality is different, mainly due to the very nature of how municipalities work. Any carrier that does not understand the differences and that doesn’t account for those differences in their plans is likely to get very frustrated over time with a municipal partner. Today I look at some political issues that arise in PPPs and I will look at financial and legal issues in subsequent blogs.

Municipalities are (by definition) political entities. The people at the top of the political pyramid are elected officials and that has to be considered when partnering with a municipality. The city you partner with today might not be the same city you find yourself working with in five years after a few elections. Change can happen with a commercial partner as well, but it’s rarely as abrupt or as unexpected.

I know one company that partnered with a city to build fiber and the city was an enthusiastic partner. But the next administration of the city came in with a bias against the city working to ‘enrich’ private businesses, and that partnership then became a lot more difficult to maintain. So the one thing that a good PPP needs is to be insulated from politics as much as possible. You don’t want to have the PPP structured in such a way that future decisions like raising rates or building new facilities must be approved by a city council.

It’s also important for a business to understand how slow municipalities are in making decisions. The whole municipal deliberative process is slow on purpose to give the public a chance to weigh in on things a city does. But it can drive a commercial entity crazy waiting for a municipal partner to make a decision when you are running a commercial business venture.

Something else that often surprises those involved in PPPs is how everything they decide or do as part of the PPP is suddenly in the press. Local ISPs can often go for decades without making the paper for anything bigger than a donation to a local charity. It’s very disturbing to see your business decisions discussed in the press, and often incorrectly.

Engaging in a PPP also can subject an ISP to an unusual kind of attack from the larger incumbent providers. They will make the argument that anything that a municipality provides as part of partnering with an ISP ought to be extended to all carriers. These arguments are labeled as ‘level playing field’ issues and incumbents can sound incredibly persuasive when talking about the unfair advantages given to one of their competitors (while ignoring the monopoly power they probably held over the city for decades before).

All of these issues can be managed as long as a carrier walks into a PPP arrangement fully aware of each of them and with a strategy for dealing with each one. Once a carrier has joined with a municipal partner they can never be free of these sorts of political issues – but they can structure the business arrangement in such a way as to minimize the practical impact of them.

The FCC’s Special Access Order

The FCC started the process last week of changing the way that the large telcos sell transport products on their networks. Transport products are products like T1s and larger circuits that are used to send data from one point to another, and the telco market for these products is called special access.

I’ve often heard people ask how the big telcos can still make so much money since they have lost most of the voice lines that were their bread and butter products for a hundred years. And a big part of the answer is in special access. There is still a gigantic market today for transport for things like connecting schools to each other, for connecting bank ATM networks, for connecting cellular towers or for connecting all of the locations in a market together for a government or large business.

While there is a lot of special access sold to businesses the majority of special access is sold to other carriers. Special access is what gives most CLECs and other competitive carriers the ability to compete – it’s how they actually make a connection to buildings in a market. Very few companies own wireline networks that connect to all of the businesses in a community. In most larger cities there is some fiber owned by non-telcos, but except for the rare city that has built fiber everywhere, such fiber is generally limited to fiber strung along major roads or to business parks.

There is still a surprisingly large amount of the country where the telco is the only one that has ever constructed wires to reach all of the nooks and crannies of a city. It’s not hard to understand why this is so since it’s expensive to build fiber. You read in this blog all of the time about companies that are building fiber to serve residential customers. But it’s even harder to justify building fiber to serve only businesses – unless the businesses are really large or unless there are a lot of them in a single place – like in a business high-rise or in a central business district.

The telcos have been adept at taking advantage of their monopoly position in the special access market. They have priced transport products high and also imposed scads of rules on these products over the years that are all in their favor. The recent order took the first swipe at abolishing some of the more unfair rules.

For one, the FCC got rid of termination charges. The telcos have been selling special access only under term contracts. If a competitive carrier was buying a special access circuit for a two- or three-year period and their end user customers stopped paying them – because they moved or shut down – the carrier was still on the hook to pay for the circuit for the rest of the contract period. The FCC also abolished a number of rules that twisted the arms of carriers to buy special access – rules called tie-ins. If a carrier wanted to buy a large quantity of circuits in one market they were pressured into buying special access in other markets where they might have instead found a competitive alternative.

The FCC has also started the process of trying to regulate special access by market. They are contemplating doing this by determining first if various markets are competitive, and then imposing stricter rules for markets considered non-competitive. And my guess is that will be most markets, since there are not many markets where somebody else owns a lot of alternative wires. Interestingly, this is happening at the time when Verizon is trying to buy XO Communications – one of the largest alternative fiber providers. The fear among other carriers is that the merger will take a big pile of competitive fiber transport off the market and turn it into special access.

The FCC also said that they are considering making the cable companies subject to the new transport rules. The cable companies are latecomers to the transport world since they largely shunned building to businesses when they first built cable networks – not enough businesses bought TV to justify the construction. But cable companies have extended fiber in many markets to cover business parks and business districts and have become a major player in the transport business. But as with other duopoly competition, they often 'compete' with prices and terms that are not that different from special access, so that they and the telcos can split the lucrative revenue stream. So there will probably be non-competitive markets where the FCC's new pricing rules apply to the cable companies as well.

The Need for Networked WiFi

The way that broadband providers wire homes continues to evolve, and ISPs are always looking for ways to provide good broadband while cutting down the amount of time they have to spend in a home for an installation.

Historically a lot of ISPs connected each of the triple play services in a home to the existing wiring for that service. But this meant dealing with any existing inside wire issues, and it also meant stringing new wires when there weren't wires where the customer wanted service. It was not unusual for a FTTP installation to take 4 to 5 hours for a crew of two installers.

Today some ISPs have gone to the other extreme. Comcast brought a coaxial cable into my house, tied it into the existing coax in the house, and put a WiFi router where the coax entered the house. One installer came to my house, installed the drop and wired up the services in about an hour. (It should have taken a bit longer because he was not the person who buries drops, so he just laid it across my lawn.)

A lot of fiber providers are taking the same path as Comcast. They drop the cable TV onto existing coax and place a WiFi router. While this has cut down on installation time, I have clients who are now re-examining this decision because a large percentage of their trouble calls are now about WiFi problems and not network problems.

There are ISPs that use the powerlines in the home to move data from room to room. But companies are abandoning that for the same reason that WiFi is having problems today. The problem is the big increase in bandwidth demand in homes. Customers today want big bandwidth and they want mobility within the home. Every room in my house needs to have broadband today. We have the typical array of desktops, laptops, tablets, smartphones, a smart TV, some IoT devices, and two Amazon Echos. And we often use them all at the same time.
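To put a rough number on that kind of simultaneous use, here is a quick back-of-the-envelope tally. The per-device figures below are illustrative assumptions, not measurements from any particular home:

```python
# Rough, illustrative tally of concurrent in-home bandwidth demand.
# Every per-device number here is an assumption for illustration only.
demands_mbps = {
    "4K stream on the smart TV": 25,
    "HD stream on a tablet": 5,
    "video call on a laptop": 3,
    "cloud backup on a desktop": 10,
    "two smartphones browsing/streaming": 8,
    "IoT devices and smart speakers": 1,
}

total = sum(demands_mbps.values())
print(f"Concurrent demand: ~{total} Mbps")
```

Even with these modest assumed numbers, a single evening of ordinary use adds up to roughly 50 Mbps of concurrent demand – and all of it has to squeeze through whatever WiFi signal reaches each room.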

Some ISPs have started to battle the bandwidth demand by changing the way they wire. One new strategy is to run a wire directly to the devices that use a lot of bandwidth to lessen overall demand on the WiFi network. For instance, I have clients that offer centralized DVR service and they are now running a category 5 or 6 cable to the primary settop box which can have a huge bandwidth demand.

But the one thing almost no ISP is doing yet is making WiFi work right. A WiFi signal of any power deteriorates when it passes through solid impediments like walls. It doesn’t really matter if you are starting at the WiFi router with 50 Mbps or a gigabit, the WiFi signal will usually die by the time it gets to the far reaches of the house.
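The deterioration through walls is easy to see with a simple link-budget sketch. The calculation below uses the standard free-space path loss formula plus an assumed per-wall attenuation; the transmit power, frequency, and 4 dB wall figure are illustrative assumptions (real wall losses vary a lot by material):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (standard FSPL formula)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Assumptions: 20 dBm transmit power, 2.4 GHz band,
# and ~4 dB of loss per interior wall (varies widely in practice).
tx_dbm, freq_hz, wall_loss_db = 20, 2.4e9, 4

for distance_m, walls in [(3, 0), (10, 2), (20, 4)]:
    rx_dbm = tx_dbm - fspl_db(distance_m, freq_hz) - walls * wall_loss_db
    print(f"{distance_m:>2} m, {walls} walls: received signal ~{rx_dbm:.0f} dBm")
```

Under these assumptions the signal drops from around -30 dBm in the same room to around -62 dBm across the house – right at the edge where WiFi gear starts falling back to much slower data rates, regardless of how fast the connection is at the router.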

ISPs are already reconsidering the strategy of deploying only a single WiFi router. And they are coming to different conclusions. I know one ISP that no longer supplies WiFi routers and leaves that to the customer – this way they can blame poor WiFi performance on the customer. But most companies would see that as very bad customer service.

Some ISPs are going to the other extreme and installing the best WiFi router they can find. But one router, no matter how good it is, is not a good solution for a significant percentage of homes. The solution that is needed is to install a WiFi network consisting of several hotspots all sharing the same core router.

I think the industry is missing a big new revenue stream from charging to set up and maintain networked WiFi. Companies have always known that they make money leasing out boxes – settop boxes and modems are some of the most profitable products they sell.

Since customers want their broadband to work all over the house they are probably willing to pay to make it work right. Networked WiFi requires spending more time during the installation, but it's probably worth that to get happy customers. And it ought to cut down on future truck rolls that are really WiFi problems. I know of a handful of ISPs starting to sell the service and they say it's very popular. It's not hard to see why – the majority of homes are unhappy with their broadband, at least in some parts of the home.