The Pros and Cons of Microtrenching

Laying fiber is still the most expensive part of bringing broadband to new places, so any construction technique that lowers that cost is worth a look. While it’s been around for a few years, microtrenching is gaining acceptance as a less expensive way to lay fiber.

Microtrenching involves cutting a narrow trench, one to two inches wide and up to two feet deep, which can then hold multiple conduits for fiber. The technique can be used on open highways and is most easily explained by this short video from Ditch Witch. There are also smaller units that can be used to cross sidewalks, parking lots and driveways.

A few cities are now encouraging microtrenching. The first was New York City, which adopted the technique in 2013 and now requires it unless there is a good reason to use some other method. San Francisco proposed just a few weeks ago to require the technique for all future fiber construction there. Google used microtrenching in some of its city fiber builds, such as Austin and Charlotte, and is trying to get cities like San Antonio to approve the technique.

The pros for this technique are significant and mostly come down to cost. The alternative to microtrenching for crossing sidewalks, driveways and parking lots is boring, which involves digging a hole 3 to 5 feet deep and then using equipment to bore sideways underneath the concrete. There is significant labor involved in the process and there is always a danger of hitting other utilities, particularly when boring away from public rights-of-way. Google says that microtrenching is vastly more efficient in areas with urban sidewalks and that it can microtrench to as many as 50 customers in the time it used to take to bore to one.

There are places where this might be the only sensible technique. For example, I have a client that wanted to run fiber along national forest roads to connect two parts of a fiber network. The Forest Service would not allow any technique that disturbed the soil off the paved asphalt, but perhaps it would have allowed microtrenching.

But there are certainly downsides. Probably the biggest is that microtrenching is a lot shallower than other kinds of fiber construction. That is going to present a problem in later years when it’s time to repave streets. Most city streets around the country are on 30 – 40 year replacement cycles. During that time they usually get repaved with asphalt a few times, but at the end of the cycle the old paving must be excavated and the process started over. That replacement process generally digs down anywhere from 18 inches to three feet depending upon local soil and substrate conditions, and that could mean digging up all of the microtrenching. The replacement lifecycle for streets is so long that it would be easy for future road construction crews to be unaware of the unusually shallow depth of the microtrenched fiber.

The same concerns apply to parking lots and sidewalks, although there is less chance of these being completely excavated to any great depth for replacement. But there is often localized construction to replace or extend gas, water or electric utilities, and that future work could more easily cut microtrenched fiber.

Most cities have specific expectations for utilities. They expect each utility to be buried at a specified depth, and anything that falls outside those expectations is likely to cause problems in the long run. This means that any locality that allows microtrenching needs to create ordinances that allow for it and to change the rules for other utility work and road work to account for it.

Another practical issue for ISPs considering the technique is that there are not many contractors doing microtrenching today, which can make it harder to find somebody to do the work. But the potential cost savings are so large that microtrenching should be regularly considered as part of any major fiber construction plan.

The End of Data Privacy?

Congress just passed a law that reverses the privacy rules that were ordered by the prior FCC. Those rules were recently put on hold by the current FCC, and this new law makes sure they never go into effect. Congress did this to ensure that a future FCC cannot implement privacy rules without Congressional approval. It’s important to note that this law applies equally to both terrestrial and cellular broadband.

On paper this law doesn’t change anything since the FCC privacy rules never went into effect. However, even before the prior FCC adopted the privacy rules it had been confronting ISPs over privacy issues, which kept the biggest ISPs from going too far with customer data. Just the threat of regulation has curbed the worst abuses.

How will the big ISPs be likely to now use customer data? We don’t have to speculate too hard because some of them have already used customer data in various ways in the recent past, all of which seem to be allowable under this new law.

Selling Data to Marketers. This is the number one opportunity for big ISPs. Companies like Facebook and Google have been mining customer data, but they can only do that when somebody is inside their platforms – they have no idea what else you do outside their domains. But your ISP can know every keystroke you make, every email you write, every website you visit, and with a cellphone, every place you’ve been. With deep data mining ISPs can know everything about your online life.

We know some of the big ISPs have already been mining customer data. For example, last year AT&T offered, for a premium price, connections that were not monitored. AT&T also has a product that sells masses of customer phone and data usage records to federal and local law enforcement. Other ISPs have probably been doing this as well, but it has been a well-guarded secret.

Inserting Ads. This is another big revenue opportunity for the ISPs. The companies will be able to create detailed profiles of customers and then sell targeted advertising to reach specific customers. Today Google and a few other large advertising companies dominate the online advertising business of inserting ads into web sites. With the constraints off, the big ISPs can enter this business since they will have better customer profiles than anybody else. We know that both AT&T and Charter have already been doing this.

Hijacking Customer Searches. Back in 2011 a number of large ISPs, including Charter and Frontier, were caught hijacking customer DNS queries. When customers clicked buttons on web sites or embedded links in articles, the ISPs would sometimes send them to a different web site than the one they thought they were selecting. The FCC told these companies to stop the practice then, but the new law probably allows it again.
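
For readers who want to check their own connection, here is a minimal sketch of the idea, assuming Python with the dnspython package installed: it compares the answers from the system (ISP-assigned) resolver against an independent public resolver. A difference is not proof of redirection, since CDNs legitimately vary responses by resolver, but this is the kind of comparison that exposed the 2011 behavior.

```python
# pip install dnspython
import socket
import dns.resolver

def compare_resolution(hostname: str, public_ns: str = "8.8.8.8") -> None:
    # Addresses from the system resolver (normally the one your ISP assigns).
    isp_ips = {info[4][0] for info in
               socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)}

    # Addresses from an independent public resolver.
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [public_ns]
    public_ips = {rr.address for rr in resolver.resolve(hostname, "A")}

    if isp_ips.isdisjoint(public_ips):
        print(f"{hostname}: no overlap in answers - worth a closer look")
        print(f"  system resolver: {sorted(isp_ips)}")
        print(f"  public resolver: {sorted(public_ips)}")
    else:
        print(f"{hostname}: answers overlap, nothing unusual")

compare_resolution("example.com")
```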

Inserting Supercookies. Verizon Wireless inserted supercookies on cellphones back in 2014. AT&T started to do this as well but quickly backed off when the FCC came down hard on Verizon. These were undetectable and undeletable identifiers that allowed the company to track customer behavior. The advantage of supercookies is that they bypass most security schemes, since they tag customer traffic before it can be encrypted or sent through a secure connection. For example, this let the company easily track customers with iPhones.
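
There is a simple way to check whether a carrier is tagging your traffic today. This is a rough sketch, assuming Python and a connection over cellular data: it fetches a plain-HTTP echo service (httpbin.org/headers) and looks for tracking headers. The header names below are examples that have been reported in the past, Verizon’s X-UIDH among them, not an exhaustive or current list.

```python
import json
import urllib.request

# Header names reported in past carrier tracking programs; illustrative only.
SUSPECT_HEADERS = {"x-uidh", "x-acr", "x-up-subno"}

def check_for_injected_headers() -> None:
    # Must be plain HTTP: header injection only happens to unencrypted traffic.
    with urllib.request.urlopen("http://httpbin.org/headers", timeout=10) as resp:
        arrived = json.load(resp)["headers"]

    flagged = {name: value for name, value in arrived.items()
               if name.lower() in SUSPECT_HEADERS}
    if flagged:
        print("Possible carrier-injected headers:", flagged)
    else:
        print("No known tracking headers seen in this request.")

check_for_injected_headers()
```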

Pre-installing Tracking Software on Cellphones. Even more powerful than supercookies is putting software on new phones that directly captures data before it can be encrypted. AT&T, T-Mobile and Sprint all did this in the past, just using a different approach than supercookies. The pre-installed software would log things like every website visited and send the data back to the cellular carriers.

Who Wins with Cable Deregulation?

There has been a lot of press lately discussing what might happen if the FCC does away with Title II regulation of broadband. But broadband isn’t the only battle to be fought and we are also in for big changes in the cable industry. Since our new FCC is clearly anti-regulation I think the future of cable TV is largely going to be won by whoever best copes with a deregulated cable world.

Cable companies today are governed by arcane rules that rigidly define how to provide terrestrial cable TV. These rules, for example, define the three tiers of cable service – basic, expanded basic and premium – and it is these rules that have led us to the big channel line-ups that are quickly falling out of favor. Most households watch a dozen or so different channels regularly and even big cable users rarely watch more than 30 channels, yet we have all been sucked into paying for 150 – 300 channel line-ups.

It’s likely that the existing rules governing cable will either be relaxed or ignored by the FCC. A lot of the cable rules were defined by Congress in bills like the Telecommunications Act of 1996, so only Congress can change those rules. But the FCC can achieve deregulation by inaction. Already today we see some of the big cable providers violating the letter of those rules. For example, Verizon has a ‘skinny’ package that does not fit the FCC’s defined structure of cable tiers. The FCC has turned a blind eye to these kinds of changes, and if it becomes more overt about this then we can expect cable providers everywhere to start offering line-ups people want to watch – and at more affordable prices if the cable companies can avoid paying for networks they don’t want to carry.

The cable companies are now in a battle with OTT providers like Netflix, Sling TV and others. It’s clear to the cable companies that if they don’t fight back they are going to bleed customers faster and faster, similar to what happened with landline voice.

One way cable companies can fight back is to introduce programming packages that are similar to what the OTT providers are offering. This is going to require a change in philosophy because the larger companies have taken to nickel-and-diming customers to death in the last few years. They sell a package at a low advertised price and then load on a $10 settop box fee and a number of other fees that are made to look like taxes, so the actual price ends up $20 higher than advertised. That’s not going to work when competing head-to-head with an OTT competitor that doesn’t add any fees.

The cable companies are also going to have to get nimble. I can currently connect and disconnect from a web service like Sling TV at will. Two or three clicks and I can disconnect. And if I come back they make it easy to reconnect. The cable companies have a long way to go to get to this level of customer ease.

Of course, the big ISPs can fight back in other ways. For example, I’ve seen speculation that they will try to get taxes levied on OTT services to become more price competitive. Certainly the big ISPs have a powerful lobbying influence in Washington and might be able to pull that off.

There is also speculation that the big ISPs might try to charge ‘access fees’ to OTT providers. They might try to charge somebody like Netflix to get to their customers, much in the same manner that the big telcos charge long distance carriers for using their networks. That might not be possible without Congressional action, but in today’s political world something like this is conceivable.

Another tactic the cable companies could take would be to reintroduce low data caps. If the FCC eliminates Title II regulation that is a possibility. The cable companies could make it costly for homes that want to watch a lot of OTT content.

And perhaps the best way for the cable companies to fight back against OTT is to join them. Just last week Comcast announced that it will be introducing its own OTT product. The cable companies already have the programming relationships – this is what made it relatively easy for Dish Network to launch Sling TV.

It’s impossible to predict where this might all go. But it seems likely that we are headed towards a time of more competition – which is good for consumers. But some of these tactics could harm competition and make it hard for OTT providers to be profitable. Whichever way it goes it’s going to be an interesting battle to watch.

Is Ultrafast Broadband a Novelty?

FCC Commissioner Michael O’Rielly recently said that ultrafast broadband is a novelty. He specifically said, “The outcry for things like ultrahigh-speed service in certain areas means longer waits for those who have no access or still rely on dialup service, as providers rush to serve the denser and more profitable areas that seek upgrades to this level. . . Today, ultrafast residential service is a novelty and good for marketing, but the tiny percentage of people using it cannot drive our policy decisions.”

These statements are not surprising coming from Commissioner O’Rielly. Two years ago he voted against setting the current 25/3 Mbps definition of broadband, arguing that the number was too high. In his dissent to that ruling he called the 25/3 definition unrealistically high and wrote, “While the statute directs us to look at ‘advanced’ telecommunications capability, this stretches the concept to an untenable extreme. Some people, for example, believe, probably incorrectly, that we are on the path to interplanetary teleportation. Should we include the estimated bandwidth for that as well?”

I don’t understand why Commissioner O’Rielly is still taking this position today. Most of the big ISPs have climbed on board the big-bandwidth wagon. Comcast, Cox and other cable companies are upgrading their cable networks to DOCSIS 3.1 in order to provide gigabit speeds. CenturyLink built fiber past almost a million homes last year. Altice says it is tearing out its coaxial networks and replacing them with fiber. AT&T claims to have plans to build fiber to pass 12 million homes and businesses. Numerous small overbuilders around the country are offering gigabit speeds.

You don’t have to go back too many years to a time when the big ISPs all agreed with O’Rielly. The big cable companies in particular repeatedly made it clear that people didn’t need any more bandwidth than what the cable companies were delivering. The cable companies fiercely resisted increasing data speeds for many years, and many cable networks kept data speeds in the 6 Mbps download range even though their networks were capable of delivering higher speeds without the need for upgrades.

Part of the old reasoning for that position was that the ISPs were afraid that if they gave people faster speeds, customers would use those speeds and swamp the networks. But Google came along and upset the whole ISP world by offering an inexpensive gigabit product. The cable companies in cities like Kansas City and Austin had little choice and increased speeds across the board. And once they raised speeds in those markets they found it hard not to improve speeds everywhere.

The cable companies found the same thing that all of my clients have found when increasing data speeds. Generally a unilateral increase in customer data speeds does not cause a big increase in data usage unless the customers were throttled and constrained before the increase. Most customers don’t use any more data when speeds get faster – they just enjoy the experience more.

Of course, customers want to download more data every year, and total household download doubles about every three years. But that phenomenon is separate from data speeds. All of the things we do on the web require more bandwidth over time. You scroll through a Facebook page today and you encounter dozens of videos, for example. But having faster speeds available does not directly lead to increased data usage. Speed just gets things done faster and more enjoyably.
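
As a quick back-of-the-envelope illustration, doubling every three years works out to roughly 26% compound growth per year. Here is a small sketch; the starting usage figure is just an assumption for illustration, not a measured number.

```python
# Doubling every 3 years implies an annual growth rate of 2**(1/3) - 1, about 26%.
annual_growth = 2 ** (1 / 3) - 1

usage_gb = 190.0  # assumed average household monthly download today, in GB
for year in range(0, 13, 3):
    projected = usage_gb * (1 + annual_growth) ** year
    print(f"year {year:2d}: ~{projected:,.0f} GB per month")
```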

Commissioner O’Rielly thinks it would be better if ISPs would somehow invest to bring mediocre data speeds to everybody in the country rather than investing in ultrafast speeds in urban areas. No doubt the FCC’s life would be easier if rural people all had broadband. But it’s fairly obvious that big ISPs wouldn’t be investing in their urban networks unless those investments made them more money. And it’s just as obvious that the big ISPs have figured out that they can’t make the profits they want in rural America.

I’m not sure what constituency Commissioner O’Rielly is trying to please with these statements. It’s certainly not the urban customers who are happily buying the ultrafast speeds he refers to. And the ISPs investing in faster data speeds clearly think it’s a good business decision.

I think Commissioner O’Rielly and others at the FCC would like to see the rural broadband issue go away. They hope that the CAF II investments being made by the big telcos will make the rural areas happy and that the issue will evaporate. They want to be able to claim that they fixed the broadband problems in America by making sure that everybody gets at least a little bit of bandwidth.

But it’s not going to work that way. Certainly many rural customers who have had no broadband will be happy to finally get speeds of 10 – 15 Mbps from the CAF II program. Those kinds of speeds will finally allow rural homes to take some part in the Internet. But then those folks will look around and see that they still don’t enjoy the same Internet access as folks in the urban areas. Instead of solving the rural broadband problem I think the CAF II program is just going to whet the rural appetite for faster broadband, and then rural folks will begin yelling even louder for better broadband.

Technology Right Around the Corner

Every once in a while I like to review technologies outside of telecom that are going to be impacting most of us in the near future. Today I’m writing about some technologies that seem likely to become commonplace within the next five years. Of course, as with any new innovation, the way these ideas are marketed and implemented will likely mean that some will become bigger than expected and others might fizzle.

Self-driving Trucks. It seems inevitable that we are going to eventually live in a world of smart cars that can drive themselves. But before we get to that place many industry experts believe that the first mass-adopted use of the new technologies will appear in long-haul trucking. The challenges for using self-driving trucks for local deliveries are a lot more complex and may not be solved until trucks are somehow paired with robots to load and unload local goods.

We spend a huge amount of money in this country moving things from one place to another, and our current system of using human drivers has some built-in inefficiencies. Trucking today is limited to a large extent by the number of hours a driver is allowed to drive per day under safety regulations. Self-driving trucks can drive around the clock and only need to stop occasionally to refuel. The combination of eliminating truck-driver salaries and extending the hours of daily drive time provides a huge economic incentive to make this work. There have already been trials of self-driving trucks. Another strategy being tried in Europe is to create truck convoys, with a live driver in the first truck leading a pack of self-driving trucks.
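
A rough way to see the size of the incentive is to compare transit times. This sketch assumes roughly an 11-hour daily driving limit for a human driver and near-continuous operation for a self-driving truck; the haul length and speed are illustrative assumptions, not industry figures.

```python
HAUL_MILES = 2000              # assumed long-haul trip length
AVG_MPH = 55                   # assumed average highway speed
HUMAN_HOURS_PER_DAY = 11       # roughly the regulated daily driving limit
AUTONOMOUS_HOURS_PER_DAY = 22  # around the clock, minus fueling and inspections

driving_hours = HAUL_MILES / AVG_MPH
print(f"total driving time: {driving_hours:.0f} hours")
print(f"human-driven truck: {driving_hours / HUMAN_HOURS_PER_DAY:.1f} days in transit")
print(f"self-driving truck: {driving_hours / AUTONOMOUS_HOURS_PER_DAY:.1f} days in transit")
```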

Enhanced Vision. IBM predicts that soon there will be inexpensive technology available that will enable us to ‘see’ in a wide range of spectrum including microwaves, millimeter waves and infrared. There have been infrared goggles available for decades, but IBM says that there will be glasses or small handheld devices that will operate in a similar manner and that will let us see in these other frequencies.

This opens up a wide range of products that will let people see at night, will let cars see through fog and rain, and will let workers and technicians see their work environment in a different and useful manner. In telecom picture a technician able to ‘see’ a millimeter-wave microwave beam to more efficiently install receivers. Imagine linemen able to climb and fix aerial cables easily at night.

But the possibilities for better vision are immense. Imagine policemen knowing at a glance if somebody is carrying a concealed weapon. Or consider a metal worker who can ‘see’ flaws in metal work that are not detectable with normal light. And perhaps best of all, imagine being able to hike in the woods at night and see with the same clarity as in the daytime.

Practical Quantum Computers. These have been on many lists of future technologies, but it looks like 2017 is the year that will finally see some practical developments of this new technology. There have been tiny steps taken in the field, with D-Wave Systems of Canada now selling a precursor machine that uses a technology known as quantum annealing. But there is a lot of big money being put into the technology by Google, IBM, Microsoft and others, which might soon lead to a working quantum computer, including the needed chips, the complex circuitry and the control software.

The challenge in building workable quantum computers has been the fact that qubits – the basic units of quantum information – are susceptible to interference. For qubits to work they must be able to achieve the dual states of quantum superposition (seeming to be in two physical states at the same time) and entanglement (the linking of a pair of qubits such that when something happens to one it simultaneously changes the paired qubit as well). The reward for making this work is computers that far exceed the reach of today’s best supercomputers. Various scientists working in the field say that breakthroughs are imminent.
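
For readers who like to see the notation, the two properties can be written compactly. This is standard textbook notation, not anything specific to the machines mentioned above.

```latex
% Superposition: a single qubit is a weighted blend of the 0 and 1 states.
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1

% Entanglement: a Bell pair whose halves are perfectly correlated, so a
% measurement on one qubit immediately fixes the state of the other.
\lvert \Phi^{+} \rangle = \frac{1}{\sqrt{2}}
  \bigl( \lvert 00 \rangle + \lvert 11 \rangle \bigr)
```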

The Cell Atlas. There have been great strides over the last decades in deciphering DNA and other chemical processes within the human body. The next big challenge now being tackled is to create what is being called a cell atlas that will map all of the different types of cells in the human body. The goal is to understand in detail the exact function and location within the body of different kinds of cells as a way to understand how cells interact with each other. It’s a huge undertaking since the human body contains over 37 trillion cells. Teams of scientists in the US, the UK, Sweden, Israel, Japan, and the Netherlands are undertaking this task. They are planning to catalog the different kinds of cells, assign each a distinct molecular signature and then map each kind of cell in a three-dimensional model of the body.

Many of the kinds of cells in our bodies have been studied in detail. But scientists expect the mapping process to uncover many additional kinds of cells and to let them begin to understand the way that cells interact with the rest of the body. They are certain that this process will lead to many new discoveries and a far better understanding of the human body.

The process relies on three different technologies. The first is cellular microfluidics, which allows scientists to isolate and manipulate individual cells for detailed analysis. The second is new machines that can rapidly decode individual cells for just a few cents per cell; these machines can decode as many as 10,000 cells per day. Finally, there are new technologies that allow for labeling different kinds of cells on the basis of gene activity and for ‘mapping’ the location of each kind of cell within the body.

Broadband Shorts – March 2017

Today I’m writing about a few interesting topics that are not long enough to justify a standalone blog:

Google Scanning Non-user Emails. There has been an ongoing class action lawsuit against Google for scanning emails from non-Google customers. Google has been open for years about the fact that it scans email that originates through a Gmail account. The company scans Gmail for references to items that might be of interest to advertisers and then sells that condensed data to others. This explains how you can start seeing ads for new cars after emailing that you are looking for a new car.

There are no specific numbers available for how much Google makes from scanning Gmail, but it is part of the company’s overall advertising revenues, which were $79.4 billion for 2016, up 18% over 2015. The class action suit deals with emails that are sent to Gmail users from non-Gmail domains. It turns out that Google scans these emails as well, although non-Gmail users have never agreed to the terms of service that apply to Gmail users. This lawsuit will be an important test of customer privacy rights, particularly if Google loses and appeals to a higher court. This is a germane topic right now since the big ISPs are all expected to do similar scanning of customer data now that the FCC and Congress have weakened consumer privacy rights for broadband.
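
To make the idea concrete, here is a deliberately toy sketch of keyword-based interest extraction from an email body. It is not how Google’s systems actually work; the categories and keyword lists are made up purely for illustration.

```python
# A toy interest extractor; real ad-targeting systems are vastly more sophisticated.
INTEREST_KEYWORDS = {
    "autos": {"new car", "dealership", "test drive", "lease"},
    "travel": {"flight", "hotel", "itinerary", "boarding pass"},
    "home improvement": {"contractor", "renovation", "kitchen remodel"},
}

def extract_interests(email_body: str) -> set:
    text = email_body.lower()
    return {category for category, phrases in INTEREST_KEYWORDS.items()
            if any(phrase in text for phrase in phrases)}

print(extract_interests("I'm looking for a new car and want to schedule a test drive."))
# {'autos'}
```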

Verizon FiOS and New York City. This relationship is back in the news since the City is suing Verizon for not meeting the promise it made in 2008 to bring broadband to everybody in the city. Verizon has made FiOS available to 2.2 million of the 3.3 million homes and businesses in the city.

The argument comes down to the definition of a passing. Verizon says that it has met its obligation and that the gap is due to landlords that won’t allow Verizon into their buildings. But the city claims that Verizon hasn’t built fiber on every street in the city and also that the company has often elected not to enter older buildings due to the cost of distributing fiber inside the buildings. A number of landlords claim that they have asked Verizon into their buildings but that the company either elected not to enter the buildings or else insisted on an exclusive arrangement for broadband services as a condition for entering a building.

New Applications for Satellite Broadband. The FCC has received 5 new applications for launching satellite broadband networks, bringing the total number of requests up to 17. Now SpaceX, OneWeb, Telesat, O3b Networks and Theia Holdings are also asking permission to launch satellite networks that would provide broadband using the V Band of spectrum from 37 GHz to 50 GHz. Boeing also expanded its earlier November request to add the 50.4 GHz to 52.4 GHz bands. I’m not sure how the FCC picks winners from this big pile – and if it doesn’t we are going to see busy skies.

Anonymous Kills 20% of Dark Web. Last month the hackers who work under the name ‘Anonymous’ knocked about 20% of the web sites on the dark web offline. The hackers were targeting cyber criminals who profit from child pornography. Of particular interest was Freedom Hosting, a group that Anonymous claims has over 50% of its servers dedicated to child pornography.

This was the first known major case of hackers trying to regulate the dark web. This part of the Internet is full of pornography and other kinds of criminal content. The Anonymous hackers also alerted law enforcement about the content they uncovered.

Is it Too Late to Save the Web?

Advocates of net neutrality say that we need to take a stand to protect the open web, and for those that have been using the web since its early days that sounds like a noble goal. But when I look at the trends, the statistics, and the news about the web, I have to wonder if it’s too late to save the web as we’ve known it.

The web was originally going to be a repository of human knowledge, and people took the time to post all sorts of amazing content. But nobody does that very much anymore, and over time those old interesting web sites are dying. Mozilla says that people no longer search the web much and that 60% of all non-video web traffic goes to a small handful of giant web companies like Facebook.

The average web user today seeks out a curated web experience like Facebook or other social platforms where content is brought to them instead of them searching the web. And within those platforms people create echo chambers by narrowing their focus over time until they only see content that supports their world view. People also use the web for a few other things like watching Netflix, paying bills, shopping at Amazon and searching on Google.

I don’t point out that trend as a criticism because this is clearly what people want from the web, and they vote in giant numbers to use the big platforms. But it’s hard to argue that the web is still open for the hundreds of millions of people who use it in this manner. People are choosing to use a restricted subset of the web, giving even more power to a handful of giant companies.

The trends are for the web to get even more restricted and condensed. Already today there are only two cellphone platforms – Android and iOS. People on cellphones visit even fewer places on the web than with landline connections. You don’t have to look very far into the future to see an even more restricted web. We are just now starting to talk to the web through Amazon Alexa and Apple Siri, and the industry expects a large percentage of web interaction to soon happen through voice interfaces. And beyond that we are moving towards a world of wearables that will replace our cellphones. At some point most people’s web experience will be completely curated, and the web we know today will largely become a thing of the quaint past.

It’s not hard to understand why people lean towards curated platforms. Many of them hear the constant news of hacking and ransomware and don’t feel safe going to unknown websites. The echo chamber has been around as long as modern civilization – people tend to do things they like with people they know and trust. The echo chamber seems magnified by current social media because it can give the perception that people are part of something larger than themselves – but unless people take actions outside the web that’s largely an illusion.

There are those who don’t want to take part in the curated web. They don’t like the data gathering and the targeted marketing from the big companies. They tend towards platforms that are encrypted end-to-end like WhatsApp. They use browsers that don’t track them. And they stick as much as possible to websites using HTTPS. They are hopeful that the new TLS 1.3 protocol (transport layer security) is going to give them more anonymity than today. But it’s hard work to stay out of the sight of the big companies, and it’s going to get even harder now that the big ISPs are free again to gather and sell data on their customers’ usage.
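
For the curious, it is easy to see which TLS version a site actually negotiates with you. Here is a minimal sketch using Python’s standard ssl module; TLS 1.3 will only show up if both the site and your local OpenSSL build support it.

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443) -> str:
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            return tls_sock.version()  # e.g. 'TLSv1.2' or 'TLSv1.3'

for site in ("example.com", "mozilla.org"):
    print(site, negotiated_tls_version(site))
```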

Even though I’ve been on the web seemingly forever, I don’t necessarily regret the changes that are going on. I hate to see the big companies with such power and I’m one of the people who avoids them as much as I can. But I fully believe that within a few decades the web as we know it will become a memory. Artificial intelligence will be built into our interfaces with the web and we will rely on smart assistants to take care of things for us. When the web is always with you and when the interfaces are all verbal, it’s just not going to be the same web. I’m sure at some point people will come up with a new name for it, but our future interfaces with computers will have very little in common with our web experiences of today.

Wireless Networks Need Fiber

As I examine each of the upcoming wireless technologies it looks like future wireless technology is still going to rely heavily on an underlying fiber network. While the amount of fiber needed will be less than building fiber to every customer premises, supporting robust wireless networks is still going to require significant construction of new fiber.

This is already true today for the traditional cellular network, and most existing towers are fiber-fed, although some have microwave backhaul. The amount of bandwidth needed at traditional cell sites is already outstripping the 1 or 2 Gbps capacity of wireless backhaul technologies. Urban cell sites today are fed with as much as 5 – 10 Gbps pipes and most rural ones have (or would like to have) a gigabit feed. I’ve seen recent contractual negotiations for rural cell sites asking for as much as 5 Gbps of backhaul within the next 5 – 10 years.

The specifications for future 5G cell sites make it clear that fiber will soon be the only backhaul solution for cell sites. They call for a single cell site to be capable of as much as 20 Gbps download and 10 Gbps upload. The cellular world is currently exploring mini-cell sites, although that effort has slowed to some degree due to the issues with placing these devices closer to customers. To be practical these small cell sites must be placed on poles (existing or newly built), on rooftops and in other locations near areas with high usage demand. The majority of these small sites will require new fiber construction. Today these sites can probably use millimeter-wave radio backhaul, but as bandwidth needs increase this is going to mean bringing fiber to poles and rooftops.

Millimeter-wave radios are also being touted as a way to bring gigabit speeds to consumers. But delivering fast speeds means getting the radios close to customers. These radios use extremely high frequencies, and their signals travel only short distances. Used as a hot spot, a millimeter-wave radio is only good for a little over 100 feet. Formed into a tight beam the reach is a little over a mile, and that requires true line-of-sight. These radios will be vying for the same transmitter locations as mini-cell sites.
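
The distance problem falls straight out of the physics. A simple sketch of free-space path loss shows how much harder millimeter-wave frequencies are hit over the same distance; real deployments fare worse still, since rain fade and oxygen absorption are ignored here.

```python
import math

SPEED_OF_LIGHT = 3.0e8  # meters per second

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / SPEED_OF_LIGHT)

for freq_ghz in (2.4, 28, 60):
    for distance_m in (30, 500, 1600):   # roughly 100 ft, 0.3 mile, 1 mile
        loss = free_space_path_loss_db(distance_m, freq_ghz * 1e9)
        print(f"{freq_ghz:5.1f} GHz at {distance_m:5d} m: {loss:6.1f} dB")
```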

Because of the short distances that can be covered by millimeter-wave radios, this technology is going to be of most interest initially in the densest urban areas. Perhaps as the radios get cheaper there will be more of a model for suburban areas. But the challenge of deploying wireless in urban areas is that this is where fiber is the most expensive to build. It’s not unusual to see new fiber construction costs of $150,000 to $200,000 per mile in downtown areas. Urban wireless deployment faces the challenge of getting both fiber and power to poles, rooftops and the sides of buildings. This is the issue that has already stymied the deployment of mini-cell sites, and it’s going to become more of an issue as numerous companies want to build competing wireless networks in our cities. I’m picturing the four major cellular companies and half a dozen wireless ISPs all wanting access to the same prime transmitter sites. All of these companies will have to deal with the availability of fiber, or will need to build expensive fiber to support their networks.

Even rural wireless deployments need a lot of fiber. A quality point-to-point wireless network today needs fiber at each small tower. When that is available, current technologies can deliver speeds between 20 Mbps and 100 Mbps. But using wireless backhaul instead of fiber drastically cuts the performance of these networks, and there are scads of rural WISPs delivering bandwidth products of 5 Mbps or less. As the big telcos tear down their remaining rural copper, the need for rural fiber is going to intensify. But the business case is often difficult to justify when building fiber to supply bandwidth to only a small number of potential wireless or wireline customers.

All of the big companies that are telling Wall Street about their shift to wireless technologies are conveniently not talking about this need for lots of fiber. But when they go to deploy these technologies at any scale they are going to run smack into the current lack of fiber. And until the fiber issue is solved, these wireless technologies are not going to deliver the kinds of speeds, or be available everywhere as quickly, as the many press releases and articles about our wireless future imply. I have no doubt that there will eventually be a lot of customers using wireless last mile – but only after somebody first makes the investment in the fiber networks needed to support the wireless networks.

Ready or Not, IoT is Coming

We are getting very close to the time when just about every appliance you buy is going to be connected to the IoT, whether you want it or not. Chips are getting so cheap that manufacturers will soon see the benefit of adding them to most of the things they sell. While this will bring some clear benefits to consumers it also brings new security risks.

IoT in everything is going to redefine privacy. What do I mean by that? Let’s say you buy a new food processor. Even if the manufacturer doesn’t make the device voice-controlled they are going to add a chip. That chip is going to give the manufacturer the kind of feedback they have never had before. It’s going to tell them everything about how you use your food processor – how long before you take it out of the box, how often you use it, how you use the various settings, and whether the device has any problems. They’ll also be able to map where all of their customers are, but more importantly they will know who uses their food processor the most. And even if you never register the device, GPS will tell them where it is being used, and from that, who you are.
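
To make that concrete, the ‘feedback’ could be as simple as a few lines of firmware posting a usage record whenever the motor stops. This is a hypothetical sketch; the endpoint, field names and device ID scheme are all made up for illustration.

```python
import json
import time
import urllib.request

TELEMETRY_URL = "https://telemetry.example.com/v1/usage"  # hypothetical endpoint

def report_usage(device_id: str, setting: str, run_seconds: float) -> None:
    """Send one usage record; a real device would queue records and retry."""
    payload = json.dumps({
        "device_id": device_id,        # often derived from a hardware serial number
        "setting": setting,
        "run_seconds": run_seconds,
        "reported_at": int(time.time()),
    }).encode("utf-8")
    request = urllib.request.Request(
        TELEMETRY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(request, timeout=5)

# Example call (commented out because the endpoint above is fictional):
# report_usage("fp-00123", "puree", 42.0)
```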

Picture that same thing happening with everything you buy. Remember that Tostitos just found it cost-effective to add a chip to a million bags of chips for the recent Super Bowl. So chips might not just be added to appliances, but could be built into anything where the manufacturer wants more feedback about the use of its product.

Of course, many devices are going to go beyond this basic marketing feedback and will also include interactions of various kinds with customers. For instance, it shouldn’t be very long until you can talk to that same food processor through your Amazon Alexa and tell it what you are making. It will know the perfect settings to make your guacamole and will help you blend a perfect bowlful. Even people who are leery of home automation are going to find many of these features to be too convenient to ignore.

There is no telling at this early stage which IoT applications will be successful. For instance, I keep hearing every year about smart refrigerators and I can’t picture one ever fitting into my lifestyle. But as with any consumer product, the public will quickly pick the winners and losers. When everything has a chip that can communicate with a whole-house hub like Alexa, each of us will find at least a few functions we love so much that we will wonder how we lived without them.

But all of this comes with a big price. The big thing we will be giving up is privacy. Not only will the maker of each device in our house know how we use that device, but anybody that accumulates the feedback from many appliances and devices will know a whole lot more about us than most of us want strangers to know. If you are even a little annoyed by targeted marketing today, imagine what it’s going to be like when your house is blaring everything about you to the world. And there may be no way to stop it. The devices might all talk to the cellular cloud and be able to bypass your home WiFi and security – that’s why both AT&T and Verizon are hyping the coming IoT cloud to investors.

There is also the added security risk of IoT devices being used in nefarious ways. We’ve already learned that our TVs and computers and other devices in the house can listen to our private conversations. But even worse than that, devices that can communicate with the world can be hacked. That means a hacker might be able to listen to what is happening in your home. Or it might mean a new kind of ransomware that locks up your whole house and appliances and holds them hostage for a payment, as happens today with PCs.

One of the most interesting things about this is that it’s going to happen to everybody unless you live in some rural place out of range of cell service. Currently we all have choices about letting IoT devices into our house, and generally only the tech savvy are using home automation technology. But when there are chips embedded in most of the things you buy it will spread IoT to everybody. It’s probably going to be nearly impossible to neutralize it. I didn’t set out to sound pessimistic in writing this blog, but I really don’t want or need my toaster or blender or food processor talking to the world – and I suspect most of you feel the same way.

The Death of WiFi Hotspots?

I’ve been thinking about the new unlimited data plans and wondering what impact they will have on public WiFi. As I wrote in a recent blog, none of the plans from the major cellular carriers are truly unlimited. But they include enough data that somebody who isn’t trying to use one of these plans as a home landline replacement will now have far more data than ever before.

The plans from the big four carriers have soft monthly download caps of 22 gigabytes or higher, at which point speeds are throttled. But 22 to 30 GB is a huge cap for anybody who has been living with caps under 5 GB or sharing family plans at 10 GB. And to go along with these bigger caps, the cellular companies are also now offering zero-rated video that customers can watch without touching the data caps. That combination is going to let cellphone users consume a mountain of data during a month.
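
Some quick arithmetic shows why a 22 GB cap feels enormous for normal phone use. The bitrates below are assumed, typical mobile streaming rates, and remember that zero-rated video would not count against the cap at all.

```python
CAP_GB = 22
BITRATES_MBPS = {"audio streaming": 0.32, "480p video": 1.5, "HD video": 5.0}

for activity, mbps in BITRATES_MBPS.items():
    gb_per_hour = mbps / 8 * 3600 / 1000   # Mbps -> GB per hour
    hours = CAP_GB / gb_per_hour
    print(f"{activity:16s}: ~{gb_per_hour:.2f} GB/hour, about {hours:.0f} hours within the cap")
```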

So I wonder how many people who buy these plans will bother to log onto WiFi in coffee shops, airports and hotels any longer? I know I probably will not. For the last few years I’ve seen articles almost weekly warning of the dangers of public WiFi and I’ve become wary of using WiFi in places like Starbucks. And WiFi in other public places has largely grown to be unusable. WiFi can be okay in business hotels in the early afternoon or at 3:00 in the morning, but is largely worthless in the prime time evening hours. And free airport WiFi in the bigger airports is generally already too slow to use.

If you think forward a few years you have to wonder how long it’s going to take before public WiFi wanes as a phenomenon. Huge numbers of restaurants, stores, doctors’ offices and the like spend money today on broadband and on WiFi routers for their customers, and you have to wonder why they would continue to do that if nobody is asking for it. That would mean a big decrease in sales of industrial-grade WiFi routers and landline broadband connections. Many of these places already buy a second data connection for the public, and those connections will probably be canceled in droves.

I wonder how much sense it makes for Comcast and others to keep pouring money into outdoor hotspots if people stop using them. You only have to go back a few years to remember when building the biggest outdoor hotspot network was the goal for some of the largest cable companies. Already today my wife has to turn off her WiFi when running in the neighborhood because her phone keeps dropping her music stream as it tries to attach to each Comcast WiFi hotspot she runs past. How many people with these unlimited plans will even bother to turn on their WiFi?

I also wonder if the cellular networks are really ready for this shift. There is a huge amount of data shifted today from cellphones to hotspots. As a business traveler I’m already thinking about how hard it might soon be to get a cellular data connection during busy hours if nobody is using the hotel WiFi. I know that 5G is going to fix this issue by offering many more connections per cell site, but we aren’t going to see widespread 5G cell sites for at least five years and probably a little longer.

I’ve always found it interesting how quickly changes seem to hit and sweep the cellular industry. There was virtually no talk a year ago about unlimited data plans. In fact, at that time both AT&T and Verizon were punishing those with legacy unlimited plans to try to drive them to some other plan. But the industry has finally plateaued on customer growth and cellular service is quickly becoming a commodity. I think a lot of us saw that coming, but I never suspected that the way it would manifest would be with competition over unlimited data plans and the possible death of public WiFi. I don’t know if this industry will ever stop surprising us.

I guess a day could come soon when kids will have no memory of public hotspots. I can remember fondly when traveling to places like Puerto Rico or the Caribbean that the first thing you did on landing was find the locations of the Internet cafes. I remember back when our company decided to move out of our offices that one of my partners practically lived in a Starbucks for the next year. It was an interesting phase of our industry, but one whose days are probably now numbered.