Technology Right Around the Corner

Every once in a while I like to review technologies outside of telecom that are going to impact most of us in the near future. Today I’m writing about some technologies that seem likely to become commonplace within the next five years. Of course, as with any new innovation, the way these ideas are marketed and implemented will likely mean that some will become bigger than expected and others might fizzle.

Self-driving Trucks. It seems inevitable that we are going to eventually live in a world of smart cars that can drive themselves. But before we get to that place, many industry experts believe that the first mass-adopted use of the new technology will appear in long-haul trucking. The challenges of using self-driving trucks for local deliveries are a lot more complex and may not be solved until trucks are somehow paired with robots that load and unload local goods.

We spend a huge amount of money in this country moving things from one place to another, and our current system of using human drivers has some built-in inefficiencies. Trucking today is constrained largely by safety regulations that limit the number of hours a driver is allowed to drive per day. Self-driving trucks can drive around the clock and only need to stop occasionally to refuel. The combination of eliminating truck-driver salaries and extending the hours of daily drive time provides a huge economic incentive to make this work. There have already been trials of self-driving trucks. Another strategy being tried in Europe is truck convoys, with a live driver in the first truck leading a pack of self-driving trucks.

Enhanced Vision. IBM predicts that soon there will be inexpensive technology available that will enable us to ‘see’ in a wide range of spectrum including microwaves, millimeter waves and infrared. There have been infrared goggles available for decades, but IBM says that there will be glasses or small handheld devices that will operate in a similar manner and that will let us see in these other frequencies.

This opens up a wide range of products that will let people see at night, will let cars see through fog and rain, and will let workers and technicians see their work environment in a different and useful manner. In telecom, picture a technician able to ‘see’ a millimeter-wave beam to more efficiently install receivers. Imagine linemen able to climb and fix aerial cables easily at night.

But the possibilities for better vision are immense. Imagine policemen knowing at a glance if somebody is carrying a concealed weapon. Or consider a metal worker who can ‘see’ flaws in metal work that are not detectable with normal light. And perhaps best of all, imagine being able to hike in the woods at night and see with the same clarity as in the daytime.

Practical Quantum Computers. These have been on many lists of future technologies, but it looks like 2017 is the year that is finally going to see some practical developments of this new technology. There have been tiny steps taken in the field, with D-Wave Systems of Canada now selling a precursor machine that uses a technology known as quantum annealing. But a lot of big money is being put into the technology by Google, IBM, Microsoft and others, which might soon lead to a working quantum computer – the needed chips and complex circuitry along with the control software.

The challenge in building workable quantum computers has been the fact that qubits – the basic unit of quantum information – are susceptible to interference. For qubits to work they must be able to achieve the dual states of quantum superposition (seeming to be in two physical states at the same time) and entanglement (the linking of a pair of qubits such that when something happens to one it simultaneously changes the paired qubit as well). The reward for making this work is the development of computers that far exceed the reach of today’s best supercomputers. Various scientists working in the field say that breakthroughs are imminent.
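For those who like to see the notation, these two properties have a compact textbook form (this is standard quantum mechanics, not anything specific to a particular vendor’s machine):

    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1    (superposition: a weighted blend of 0 and 1)
    |\Phi^+\rangle = (|00\rangle + |11\rangle)/\sqrt{2}    (an entangled Bell pair: measuring either qubit as 0 or 1 instantly fixes its partner to match)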

The Cell Atlas. There have been great strides over the last decades in deciphering DNA and other chemical reactions within the human body. The next big challenge now being tackled is to create what is being called a cell atlas that will map all of the different types of cells in the human body. The goal is to understand in detail the exact function and location of the different kinds of cells within the body as a way to understand how cells interact with each other. It’s a huge undertaking since the human body contains over 37 trillion cells. Teams of scientists in the US, the UK, Sweden, Israel, Japan, and the Netherlands are undertaking this task. They are planning to catalog the different kinds of cells, assign each a distinct molecular signature and then place each kind of cell on a three-dimensional map of the body.

Many of the kinds of cells in our bodies have been studied in detail. But scientists expect the mapping process to uncover many additional kinds of cells and to begin to reveal how cells interface with the rest of the body. They are certain that this process will lead to many new discoveries and a far better understanding of the human body.

The process relies on three different technologies. The first is cellular microfluidics, which allows scientists to isolate and manipulate individual cells for detailed analysis. The second is new machines that can rapidly decode individual cells for just a few cents per cell – as many as 10,000 cells per day. Finally, there are new technologies that allow for labeling different kinds of cells on the basis of gene activity and for ‘mapping’ the location of each kind of cell within the body.
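To get a feel for the scale, here is a back-of-envelope sketch in Python using the figures above; the 10-million-cell sample size and the 3-cent cost are my assumptions for illustration, not numbers from the project:

    # Back-of-envelope only; sample size and per-cell cost are assumed.
    cells_to_profile = 10_000_000     # a hypothetical sample, a tiny fraction of 37 trillion
    cells_per_machine_day = 10_000    # decoding throughput cited above
    cost_per_cell = 0.03              # "a few cents" -- assumed 3 cents

    print(f"{cells_to_profile / cells_per_machine_day:,.0f} machine-days")  # 1,000 machine-days
    print(f"${cells_to_profile * cost_per_cell:,.0f}")                      # $300,000

Even a sample that small adds up to nearly three years of around-the-clock work for a single machine, which helps explain why the effort is spread across teams in six countries.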

Broadband Shorts – March 2017

Today I’m writing about a few interesting topics that are not long enough to justify a standalone blog:

Google Scanning Non-user Emails. There has been an ongoing class action lawsuit against Google for scanning emails from non-Google customers. Google has been open for years about the fact that they scan email that originates through a Gmail account. The company scans Gmail for references to items that might be of interest to advertisers and then sells that condensed data to others. This explains how you can start seeing ads for new cars after emailing that you are looking for a new car.

There are no specific numbers available for how much Google makes from scanning Gmail, but it is part of their overall advertising revenues, which were $79.4 billion for 2016, up 18% over 2015. The class action suit deals with emails that are sent to Gmail users from non-Gmail domains. It turns out that Google scans these emails as well, although non-Gmail users have never agreed to the terms of service that apply to Gmail users. This lawsuit will be an important test of customer privacy rights, particularly if Google loses and appeals to a higher court. This is a germane topic right now since the big ISPs are all expected to do similar scanning of customer data now that the FCC and Congress have weakened consumer privacy rights for broadband.

Verizon FiOS and New York City. This relationship is back in the news since the City is suing Verizon for not meeting the promise it made in 2008 to bring broadband to everybody in the city. Verizon has made FiOS available to 2.2 million of the 3.3 million homes and businesses in the city.

The argument comes down to the definition of a ‘passing.’ Verizon says that it has met its obligation and that the gap is due to landlords who won’t allow Verizon into their buildings. But the city claims that Verizon hasn’t built fiber on every street in the city and that the company has often elected not to enter older buildings due to the cost of distributing fiber inside them. A number of landlords claim that they asked Verizon into their buildings but that the company either elected not to enter or else insisted on an exclusive arrangement for broadband services as a condition for entering a building.

New Applications for Satellite Broadband. The FCC has received 5 new applications for launching non-geostationary satellite networks, bringing the total number of requests up to 17. Now SpaceX, OneWeb, Telesat, O3b Networks and Theia Holdings are also asking permission to launch satellite networks that would provide broadband using the V band of spectrum from 37 GHz to 50 GHz. Boeing also expanded its earlier November request to add the 50.4 GHz to 52.4 GHz bands. I’m not sure how the FCC picks winners from this big pile – and if they don’t we are going to see busy skies.

Anonymous Kills 20% of Dark Web. Last month the hackers who work under the name ‘Anonymous’ knocked down about 20% of the web sites on the dark web. The hackers were targeting cyber criminals who profit from child pornography. Of particular interest was a group known as Freedom Hosting, which Anonymous claims has over 50% of its servers dedicated to child pornography.

This was the first known major case of hackers trying to regulate the dark web. This part of the Internet is full of pornography and other kinds of criminal content. The Anonymous hackers also alerted law enforcement about the content they uncovered.

Is it Too Late to Save the Web?

Advocates of net neutrality say that we need to take a stand to protect the open web, and for those that have been using the web since its early days that sounds like a noble goal. But when I look at the trends, the statistics, and the news about the web, I have to wonder if it’s too late to save the web as we’ve known it.

The web was originally going to be a repository of human knowledge, and people originally took the time to post all sorts of amazing content on the web. But few people do that anymore, and over time those old interesting web sites are dying. Mozilla says that people rarely search the open web anymore and that 60% of all non-video web traffic goes to a small handful of giant web companies like Facebook.

The average web user today seeks out a curated web experience like Facebook or other social platforms where content is brought to them instead of them searching the web. And within those platforms people create echo chambers by narrowing their focus over time until they only see content that supports their world view. Beyond that, people use the web for a few other things like watching Netflix, paying bills, shopping at Amazon and searching on Google.

I don’t point out that trend as a criticism, because this is clearly what people want from the web, and they vote in giant numbers to use the big platforms. But it’s hard to argue that the web is even open any longer for the hundreds of millions of people who use it this way. People are choosing to use a restricted subset of the web, giving even more power to a handful of giant companies.

The trends are for the web to get even more restricted and condensed. Already today there are only two cellphone platforms – Android and iOS. People on cellphones visit even fewer places on the web than they do on landline connections. You don’t have to look very far into the future to see an even more restricted web. We are just now starting to talk to the web through Amazon Alexa and Apple Siri, and the industry expects a large percentage of web interaction to soon be accomplished through voice interfaces. And beyond that we are moving towards a world of wearables that will replace our cellphones. At some point most people’s web experience will be completely curated, and the web we know today will largely become a thing of the quaint past.

It’s not hard to understand why people lean towards curated platforms. Many of them hear the constant news of hacking and ransomware and don’t feel safe going to unknown websites. The echo chamber has been around as long as modern civilization – people tend to do things they like with people they know and trust. The echo chamber seems magnified by current social media because it can give the perception of being part of something larger than oneself – but unless people take actions outside the web that’s largely an illusion.

There are those who don’t want to take part in the curated web. They don’t like the data gathering and the targeted marketing from the big companies. They tend towards platforms that are encrypted end-to-end like WhatsApp. They use browsers that don’t track them. And they stick as much as possible to websites using HTTPS. They are hopeful that the new TLS 1.3 protocol (transport layer security) will give them more privacy than they have today. But it’s hard work to stay out of sight of the big companies, and it’s going to get even harder now that the big ISPs are free again to gather and sell data on their customers’ usage.

Even though I’ve been on the web seemingly forever, I don’t necessarily regret the changes that are going on. I hate to see the big companies with such power, and I’m one of the people who avoids them as much as I can. But I fully believe that within a few decades the web as we know it will become a memory. Artificial intelligence will be built into our interfaces with the web and we will rely on smart assistants to take care of things for us. When the web is always with you and the interfaces are all verbal, it’s just not going to be the same web. I’m sure at some point people will come up with a new name for it, but our future interfaces with computers will have very little in common with our web experiences of today.

Wireless Networks Need Fiber

As I examine each of the upcoming wireless technologies, it looks like future wireless technology is still going to rely heavily on an underlying fiber network. While the amount of needed fiber will be less than building fiber to every customer premises, supporting robust wireless networks is still going to require significant construction of new fiber.

This is already true today for the traditional cellular network, and most existing towers are fiber-fed, although some have microwave backhaul. The amount of bandwidth needed at traditional cell sites is already outstripping the 1 or 2 Gbps capacity of wireless backhaul technologies. Urban cell sites today are fed with as much as 5 – 10 Gbps pipes and most rural ones have (or would like to have) a gigabit feed. I’ve seen recent contractual negotiations for rural cell sites asking for as much as 5 Gbps of backhaul within the next 5 – 10 years.

The specifications for future 5G cell sites make it clear that fiber will soon be the only backhaul solution. The specifications require that a single cell site be capable of as much as 20 Gbps download and 10 Gbps upload. The cellular world is currently exploring mini-cell sites, although that effort has slowed down to some degree due to the issues with placing these devices closer to customers. To be practical these small cell sites must be placed on poles (existing or newly built), on rooftops and on other locations near areas with high usage demand. The majority of these small sites will require new fiber construction. Today these sites can probably use millimeter-wave radio backhaul, but as bandwidth needs increase, this is going to mean bringing fiber to poles and rooftops.

Millimeter-wave radios are also being touted as a way to bring gigabit speeds to consumers. But delivering fast speeds means getting the radios close to customers. These radios use extremely high frequencies, and their signals travel only short distances. As a hot spot a millimeter-wave radio is only good for a little over 100 feet, and even when formed into a tight beam the reach is only a little over a mile – and that also requires true line-of-sight. These radios will be vying for the same transmitter locations as mini-cell sites.
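A rough way to see why the reach is so short is free-space path loss, which grows with both distance and frequency. Here is a minimal Python sketch; the 39 GHz example frequency is my assumption, and real millimeter-wave links lose additional signal to rain and oxygen absorption on top of this:

    import math

    def fspl_db(distance_km, freq_ghz):
        # Standard free-space path loss in dB (distance in km, frequency in GHz)
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    print(round(fspl_db(0.0305, 39)))   # ~94 dB at 100 feet
    print(round(fspl_db(1.609, 39)))    # ~128 dB at 1 mile

That extra ~34 dB of loss between hotspot range and a mile is why the longer reach only works as a tightly focused, true line-of-sight beam.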

Because of the short distances that millimeter-wave radios can deliver, this technology is initially going to be of most interest in the densest urban areas. Perhaps as the radios get cheaper there will be more of a model for suburban areas. But the challenge of deploying wireless in urban areas is that those are the places where fiber is most expensive to build. It’s not unusual to see new fiber construction costs between $150,000 and $200,000 per mile in downtown areas. The urban wireless deployment faces the challenge of getting both fiber and power to poles, rooftops and sides of buildings. This is the issue that has already stymied the deployment of mini-cell sites, and it’s going to become more of an issue as numerous companies want to build competing wireless networks in our cities. I’m picturing the four major cellular companies and half a dozen wireless ISPs all wanting access to the same prime transmitter sites. All of these companies will have to deal with the availability of fiber, or will need to build expensive fiber to support their networks.

Even rural wireless deployments need a lot of fiber. A quality point-to-point wireless network today needs fiber at each small tower. When that is available, current technologies can deliver speeds between 20 Mbps and 100 Mbps. But using wireless backhaul instead of fiber drastically cuts the performance of these networks, and there are scads of rural WISPs delivering bandwidth products of 5 Mbps or less. As the big telcos tear down their remaining rural copper, the need for rural fiber is going to intensify. But it is often difficult to justify the business case for building fiber to supply bandwidth to only a small number of potential wireless or wireline customers.

All of the big companies that are telling Wall Street about their shift to wireless technologies are conveniently not talking about this need for lots of fiber. But when they go to deploy these technologies on any scale they are going to run smack into the current lack of fiber. And until the fiber issue is solved, these wireless technologies are not going to deliver the promised speeds, or become available everywhere as quickly as the many press releases and articles about our wireless future imply. I have no doubt that there will eventually be a lot of customers using wireless last mile – but only after somebody first makes the investment in the fiber networks needed to support the wireless networks.

Ready or Not, IoT is Coming

We are getting very close to the time when just about every appliance you buy is going to be connected to the IoT, whether you want it or not. Chips are getting so cheap that manufacturers are soon going to see the benefits of adding them to most things that you buy. While this will bring some clear benefits to consumers, it also brings new security risks.

IoT in everything is going to redefine privacy. What do I mean by that? Let’s say you buy a new food processor. Even if the manufacturer doesn’t make the device voice-controlled they are going to add a chip. That chip is going to give the manufacturer the kind of feedback they never had before. It’s going to tell them everything about how you use your food processor – how long before you take it out of the box, how often you use it, how you use the various settings, and if the device has any problems. They’ll also be able to map where all of their customers are, but more importantly they will know who uses their food processor the most. And even if you never register the device, with GPS they are going to know who you are.
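To make that concrete, here is the kind of telemetry record such a chip might phone home with. This is purely illustrative – every field name and value below is invented, not taken from any real product:

    import json, time

    # Hypothetical appliance telemetry; all fields are made up for illustration.
    telemetry = {
        "model": "FP-2000",                         # invented model number
        "firmware": "1.3.2",
        "reported_at": int(time.time()),
        "location": {"lat": 38.90, "lon": -77.04},  # from an onboard GPS fix
        "last_session": {"setting": "puree", "seconds": 95},
        "lifetime_uses": 142,
        "fault_codes": [],
    }
    print(json.dumps(telemetry))  # phoned home whether or not you ever registered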

Picture that same thing happening with everything you buy. Remember that Tostitos just found it cost-effective to add a chip to a million bags of chips for the recent Super Bowl. So chips might not just be added to appliances, but could be built into anything where the manufacturer wants more feedback about the use of their product.

Of course, many devices are going to go beyond this basic marketing feedback and will also include interactions of various kinds with customers. For instance, it shouldn’t be very long until you can talk to that same food processor through your Amazon Alexa and tell it what you are making. It will know the perfect settings to make your guacamole and will help you blend a perfect bowlful. Even people who are leery of home automation are going to find many of these features to be too convenient to ignore.

There is no telling at this early stage which IoT applications will be successful. For instance, I keep hearing every year about smart refrigerators and I can’t picture one ever fitting into my lifestyle. But as with any consumer product, the public will quickly pick the winners and losers. When everything has a chip that can communicate with a whole-house hub like Alexa, each of us will find at least a few functions we love so much that we will wonder how we lived without them.

But all of this comes with a big price: the loss of privacy. Not only will the maker of each device in our house know how we use that device, but anybody who accumulates the feedback from many appliances and devices will know a whole lot more about us than most of us want strangers to know. If you are even a little annoyed by targeted marketing today, imagine what it’s going to be like when your house is blaring everything about you to the world. And there may be no way to stop it. The devices might all talk to the cellular cloud and bypass your home WiFi and security – that’s why both AT&T and Verizon are hyping the coming IoT cloud to investors.

There is also the added security risk of IoT devices being used in nefarious ways. We’ve already learned that our TVs and computers and other devices in the house can listen to our private conversations. But even worse, devices that can communicate with the world can be hacked. That means a hacker might be able to listen to what is happening in your home. Or it might mean a new kind of hacking that locks your whole house and appliances and holds them hostage for a payment, as ransomware does today with PCs.

One of the most interesting things about this is that it’s going to happen to everybody, unless you live in some rural place out of range of cell service. Currently we all have choices about letting IoT devices into our homes, and generally only the tech-savvy are using home automation technology. But when there are chips embedded in most of the things you buy, IoT will spread to everybody, and it’s probably going to be nearly impossible to neutralize. I didn’t set out to sound pessimistic in writing this blog, but I really don’t want or need my toaster or blender or food processor talking to the world – and I suspect most of you feel the same way.

The Death of WiFi Hotspots?

I’ve been thinking about the new unlimited data plans and wondering what impact they will have on public WiFi. As I wrote in a recent blog, none of the plans from the major cellular carriers are truly unlimited. But they include enough data that somebody who isn’t trying to use one of these plans as a home landline replacement will now have a lot more data available than ever before.

The plans from the big four carriers have soft monthly download caps of 22 Gigabytes or higher, at which point they throttle to slower speeds. But 22 to 30 GB is a huge cap for anybody that’s been living with caps under 5 GB or sharing family plans at 10 GB. And to go along with these bigger caps, the cellular companies are also now offering zero-rated video that customers can watch without touching the data caps. That combination is going to let cellphone users use a mountain of data during a month.
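Some rough arithmetic puts a 22 GB cap in perspective; the per-hour figures below are my assumed, typical mobile-quality bitrates, not carrier numbers:

    cap_gb = 22
    assumed_gb_per_hour = {
        "music streaming": 0.06,
        "SD video": 0.7,
        "HD video": 1.5,
    }
    for activity, gb in assumed_gb_per_hour.items():
        print(f"{activity}: about {cap_gb / gb:.0f} hours per month")
    # music streaming: ~367 hours -- SD video: ~31 hours -- HD video: ~15 hours

And since zero-rated video doesn’t count against the cap at all, the effective ceiling for most users is higher still.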

So I wonder how many people who buy these plans will bother to log onto WiFi in coffee shops, airports and hotels any longer? I know I probably will not. For the last few years I’ve seen articles almost weekly warning of the dangers of public WiFi and I’ve become wary of using WiFi in places like Starbucks. And WiFi in other public places has largely grown to be unusable. WiFi can be okay in business hotels in the early afternoon or at 3:00 in the morning, but is largely worthless in the prime time evening hours. And free airport WiFi in the bigger airports is generally already too slow to use.

If you think forward a few years you have to wonder how long it’s going to take before public WiFi wanes as a phenomenon. Huge numbers of restaurants, stores, doctors’ offices, etc. spend money today on broadband and on WiFi routers for their customers, and you have to wonder why they would continue to do that if nobody is asking for it. That’s going to mean a big decrease in sales of industrial-grade WiFi routers and landline broadband connections. Many of these places buy a second data connection for the public today, and those connections will probably be canceled in droves.

I wonder how much sense it makes for Comcast and others to keep pouring money into outdoor hotspots if people stop using them. You only have to go back a few years to remember when building the biggest outdoor hotspot network was the goal for some of the largest cable companies. Already today my wife has to turn off her WiFi when running in the neighborhood, since her phone keeps dropping her music stream as it tries to attach to each Comcast WiFi hotspot she runs past. How many people with these unlimited plans will even bother to turn on their WiFi?

I also wonder if the cellular networks are really ready for this shift. A huge amount of data is shifted today from cellphones to hotspots. As a business traveler I’m already thinking about how hard it might soon be to get a cellular data connection during business hours if nobody is using the hotel WiFi. I know that 5G is going to fix this issue by offering many more connections per cell site, but we aren’t going to see widespread 5G cell sites for at least five years and probably a little longer.

I’ve always found it interesting how quickly changes seem to hit and sweep the cellular industry. There was virtually no talk a year ago about unlimited data plans. In fact, at that time both AT&T and Verizon were punishing those with legacy unlimited plans to try to drive them to some other plan. But the industry has finally plateaued on customer growth and cellular service is quickly becoming a commodity. I think a lot of us saw that coming, but I never suspected that it would manifest as competition over unlimited data plans and the possible death of public WiFi. This industry never seems to stop surprising us.

I guess a day could come soon when kids will have no memory of public hotspots. I remember fondly that when traveling to places like Puerto Rico or the Caribbean, the first thing you did on landing was find the locations of the Internet cafes. I remember when our company decided to move out of our offices, and one of my partners practically lived in a Starbucks for the next year. It was an interesting phase of our industry, but one whose days are probably now numbered.

More on 5G Standards

I wrote a blog last week about the new 5G standard being developed by the International Telecommunications Union (ITU), which is expected to be approved this November. However, this standard is not the end of the standards process, but rather the beginning. The ITU IMT-2020 standard sets out the big targets for a fully developed 5G product. Basically it’s the wish list, and a fully-compliant 5G product will meet the full standard.

But within 5G there are already a number of specific use cases being developed. The most immediate three are eMBB (enhanced mobile broadband, or better-functioning cellphones), URLLC (ultra-reliable low-latency communications, to enhance data connectivity) and mMTC (massive machine-type communications, to communicate with hordes of IoT devices). Each use case requires a unique set of standards to define how those parts of the 5G network will operate. And there will be other use cases.

The primary body working on these underlying standards is the 3GPP (3rd Generation Partnership Project). This group brings together seven other standards bodies – ARIB, ATIS, CCSA, ETSI, TSDSI, TTA, TTC – which demonstrates how complicated it is to develop a new wireless technology that will be accepted worldwide. I could talk about what each group does, but that would take a whole blog. Each standards group looks at specific aspects of radio communications, such as the modulation schemes to be used or the format of information to be passed so that devices can talk to each other. But the involvement of this many different standards groups explains a bit about why it takes so long to go from a new technology concept like 5G to functioning wireless products.

There is currently a lot of work being done to create the specific standards for different portions of a 5G network. This includes the Radio Access Network (RAN), Services and System Aspects (SA) and Core Network and Terminals (CT).

The 5G RAN group, which looks at radio architecture, began work in 2015. Their first phase of work (referred to as Release 15) is looking at both the eMBB and the URLLC use cases. The goal is to define the specific architecture and feature set needed to meet the 5G specification. This first phase is expected to be finished in the fourth quarter of 2018. The 5G RAN group is also working on Release 16, which looks more specifically at radios that can comply with all aspects of IMT-2020 and is targeted to be completed in December of 2019.

The 5G SA group has already been actively working on the services and systems aspects of 5G. The preliminary work from this group was finished last year, and their phase 1 work just received final approval at the Mobile World Congress. But the SA group and the RAN group worked independently, and it’s expected that work will be needed at the end of each RAN phase to bring the two groups into sync.

The work on the core network has begun with some preliminary testing and concepts, but most of it can’t start until the RAN group finishes its work in 2018 and 2019.

The reason I am writing about this is to demonstrate the roadblocks that still remain to rolling out any actual 5G products. Manufacturers will not commit to making any mass-produced hardware until they are sure it’s going to be compatible with all parts of the 5G network. And it doesn’t look like any real work can be done in that area until about 2020.

Meanwhile there is a lot of talk from AT&T, Verizon and numerous vendors about 5G trials, and these press releases always make it sound like 5G products will quickly follow. But for the most part these trials are breadboard tests of some of the concepts of the 5G architecture. They provide valuable feedback on problems encountered in the field and on what works and doesn’t work.

And these companies are also making 5G claims about some technologies that aren’t really 5G yet. Most of the press releases these days are talking about point-to-point or point-to-multipoint radios using millimeter wave frequencies. But in many cases these technologies have been around for a number of years and the ‘tests’ are attempts to use some of the 5G concepts to goose more bandwidth out of existing technology.

And that’s not a bad thing. AT&T, Verizon, Google and Starry, among others, are looking for ways to use high-bandwidth wireless technologies in the last mile. But as you can see by the progress of the standards groups defining 5G, the radios we see in the next few years are not going to be 5G radios, no matter what the marketing departments of those companies call them.

A Regulatory Definition of Broadband

In one of the more bizarre filings I’ve seen at the FCC, the National Cable Television Association (NCTA) asked the FCC to abandon the two-year-old definition of broadband set at 25 Mbps down and 3 Mbps up. NCTA is the lobbying and trade association of the largest cable companies like Comcast, Charter, Cox, Mediacom, Altice, etc. Smaller cable companies along with smaller telephone companies have a different trade association, the American Cable Association (ACA). This was a short filing that followed up an ex parte meeting, and rather than paraphrase what they said, here is the gist of the letter:

We urged the Commission to state clearly in the next report that “advanced telecommunications capability” simply denotes an “advanced” level of broadband, and that the previously adopted benchmark of 25 Mbps/3 Mbps is not the only valid or economically significant measure of broadband service. By the same token, we recommended that the next report should keep separate its discussion of whether “advanced telecommunications capability” is being deployed in a reasonable and timely manner, on the one hand, and any discussion of the state of the “broadband” marketplace on the other.  We noted that the next report presents an opportunity for the Commission to recognize that competition in the broadband marketplace is robust and rapidly evolving in most areas, while at the same time identifying opportunities to close the digital divide in unserved rural areas.

The reason I call it bizarre is that I can’t fathom the motivation behind this effort. Let me look at each of the different parts of this statement. First, they don’t think that the 25/3 threshold is the ‘only valid or economically significant measure of broadband service.’ I would think the 25/3 threshold would please the big cable companies because they almost universally already deploy networks capable of delivering speeds greater than that threshold. And in many markets their competition, mostly DSL, does not meet these speeds. So why are they complaining about a definition of broadband that they clearly meet?

They don’t offer an alternative standard, and it’s hard to think of a standard other than broadband speed. It seems to me that eliminating the speed standard would help their competition, since it would allow DSL providers and WISPs to claim to have the same kind of broadband as a cable modem.

They then ask the FCC to not link discussions about broadband being deployed in a reasonable and timely manner with any actual state of the broadband marketplace. The FCC has been ordered by Congress to report on those two things and it’s hard to think of a way to discuss one without the other. I’m not sure how the FCC can talk about the state of the broadband industry without looking at the number of consumers buying broadband and showing the broadband speeds that are made available to them. Those FCC reports do a great job of highlighting the regional differences in broadband speeds, and more importantly the difference between urban and rural broadband speeds.

But again, why do the cable companies want to break that link in the way that the FCC reports broadband usage? The cable companies are at the top of the heap when it comes to broadband speeds. Comcast says they are going to have gigabit speeds available throughout their footprint within the next few years. Cox has announced major upgrades. Even smaller members like Altice say they are upgrading to all fiber (which might get them tossed out of NCTA). These FCC reports generally highlight the inadequacy of DSL outside of the cable company footprints and don’t show urban broadband in a bad light.

Finally, they want the FCC to recognize that there is robust competition in broadband. And maybe this is what is bothering them, because more and more the cable companies are being referred to as monopolies. The fact is there is not robust competition for broadband. Verizon has FiOS in the northeast, and a few other major cities have a fiber competitor in addition to the cable and telephone incumbents. But in other markets the cable companies are killing the telephone companies. Cable companies continue to add millions of new customers annually at the expense of DSL. AT&T and Verizon are currently working to tear down rural copper, and in another decade they will begin tearing down urban copper. At that point the cable companies will have won the landline broadband war completely, unless there is a surprising upsurge in building urban fiber.

The only other reason the cable companies might be asking for this is that both Comcast and Charter are talking about getting into the wireless business. As such they could begin selling rural LTE broadband – a product that does not meet the FCC’s definition of broadband. I can’t think of any other reason, because for the most part the big cable companies have won the broadband wars in their markets. This filing would have been business as usual coming from the telcos, but it’s a surprising request from the cable companies.

AT&T’s Broadband Trials

John Donovan, the chief strategy officer for AT&T, spoke at the Mobile World Congress recently and said that the company was trying five different technologies for the last mile: WLL (wireless local loop), G.Fast, 5G, AirGig and fiber-to-the-premises. He said the company would be examining the economics of all of the different technologies. Let me look at each one in relation to AT&T.

Wireless Local Loop (WLL). The technology uses the company’s LTE bandwidth but utilizes a point-to-multipoint network configuration. By using a small dish on the house to receive the signal, the company gets better bandwidth than can be received from normal broadcast cellular. The company has been running trials on various versions of the technology for many years, but there are a few recent trials of the newest technology that AT&T will be using for much of its deployment in rural America as part of the CAF II plan. That plan requires the ISP to deliver at least 10/1 Mbps. AT&T says the technology is delivering speeds of 15 to 25 Mbps, and that even at the edge of a cellular network a customer can get 10 Mbps about 90% of the time.

G.Fast. This is a technology that uses high frequencies to put more bandwidth on telephone copper wire. Speeds are reported to be as high as 500 Mbps, but only for very short distances under 200 feet. AT&T recently announced a G.Fast trial in an apartment building in Minneapolis. The technology is also being tested by CenturyLink and Windstream. All of these trials use the existing telephone copper inside apartment buildings to deliver broadband, so this is not really a last mile technology. AT&T brings fiber to the apartment complex and then uses G.Fast as an inside-wire technology. If they find it to be reliable this would be a great alternative to rewiring apartments with fiber.

5G. AT&T recently announced a few trials of early 5G technologies in Austin. They are looking at several technology ideas such as carrier aggregation (combining many frequencies). But these are just trials, and AT&T is one of the companies helping to test pre-5G ideas as part of the worldwide effort to define the 5G specifications. These are not tests of market-ready technologies, but field trials for various concepts needed to make 5G work. There is no doubt that AT&T will eventually replace LTE wireless with 5G wireless, but that transition is still many years in the future. The company is claiming to be testing 5G for the press-release benefits – but these are not tests of a viable last mile technology, just tests that are moving lab concepts to early field trials.

AirGig. This one remains a mystery. AT&T says it will begin trialing the technology later this year with two power companies. There has been a little bit of clarification of the technology since the initial press release. This is not a broadband-over-powerline technology – it’s completely wireless and uses the open lines-of-sight on top of power poles to create a clear path for millimeter-wave radios. The company has also said that it doesn’t yet know which wireless technology will be used to go from the poles into the home – the whole range of licensed spectrum is under consideration, including the LTE frequencies. If that’s the case, then AirGig is a fiber replacement, but the delivery to homes would be about the same as WLL.

FTTP. Donovan referred to fiber-to-the-home as a trial, but by now the company understands the economics of fiber. The company keeps stretching the truth a bit about its fiber deployments, saying it has deployed fiber to 4 million homes with 8 million more coming in the next three years. But as disclosed on their own web site, they have actually only passed the 4 million homes that they can market to. The twelve-million-home target was dictated by the FCC as part of the settlement allowing the company to buy DirecTV.

We don’t know how many fiber customers AT&T has. They are mostly marketing this to apartment buildings, although there are residential customers around the country saying they have it. But they have not sold big piles of fiber connections like Verizon FiOS has. This can be seen in the steady drop in total AT&T data customers – 16.03 million in 2014, 15.78 million in 2015 and 15.62 million at the end of the third quarter of 2016. AT&T’s fiber is not really priced to be super-competitive, except in markets where they compete with Google Fiber. Their normal prices elsewhere on fiber are $70 for 100 Mbps, $80 for 300 Mbps and $99 for a gigabit.
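Simple division on those listed prices shows how the pricing skews toward the faster tiers:

    tiers = {100: 70, 300: 80, 1000: 99}   # Mbps tier: monthly price, from the figures above
    for mbps, dollars in tiers.items():
        print(f"{mbps} Mbps: ${dollars / mbps:.2f} per Mbps")
    # 100 Mbps: $0.70 -- 300 Mbps: $0.27 -- 1000 Mbps: $0.10 per Mbps

A gigabit customer pays about one-seventh as much per Mbps as a 100 Mbps customer.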

OTT News, March 2017

There is a lot of activity going on with web-based video. There are offerings that are starting to look like serious contenders to traditional cable packages.

Comcast Integrates YouTube. Comcast has made a deal with Google to integrate YouTube into the Comcast X1 set-top box. This follows last year’s announcement that Comcast is also integrating Netflix. Comcast also says they are working to integrate other SVOD platforms.

Comcast is making a lot of moves to keep themselves relevant for customers and to make the X1 box a key piece of electronics in the home. The box also acts as the hub for their smart home product, Xfinity Home.

One has to think that Comcast has worked out some sort of revenue-sharing arrangement with Google and Netflix, although the details of these arrangements have not been reported. The most customer-friendly aspect of these integrations is that the Comcast X1 box is now voice-activated and customers can surf Netflix and YouTube by talking to the box.

Sling TV Adds More Sports. Sling TV has made another move that will make it attractive to more customers by adding the Comcast regional sports networks (RSNs) to its line-up. This includes CSN California, CSN Bay Area, CSN Chicago and CSN Mid-Atlantic. These networks carry a lot of unique sports content that is not easily available anywhere else online today – pro basketball, pro baseball and a number of college sports. For example, CSN Bay Area is the home station for the popular Golden State Warriors, and CSN Mid-Atlantic is the home station for the Baltimore Orioles.

I know from talking to my sports-centric friends that the narrow range of sports content online is the number one issue holding them back from switching to an OTT package. There are still other networks that Sling TV would need to add, like the Big Ten Network and the NFL Network, to be a fully rounded sports provider. But they have already built a credible sports line-up that includes all the ESPN channels, the SEC Network, the ACC Network, NBA TV, the NHL Network, the Pac-12 Network and a few other sports networks like Univision TDN.

YouTube Launching an OTT Line-up. Cable TV just got another new OTT competitor. The new service is called YouTube TV and becomes the fourth major OTT offering alongside Sling TV, PlayStation Vue, and DirecTV Now. The platform is going to launch sometime in the next few months, with no firm release date yet. The basic product will be $35 per month and allows customers to turn the service off and on at will.

YouTube TV will carry the typical network channels as well as ESPN, Disney, Bravo and Fox News – a line-up that sounds similar to its competition. The service will come with unlimited cloud DVR storage and will allow 3 simultaneous streams and 6 user profiles per account. They will first launch in a few major urban markets (probably due to the availability of local affiliates for the various network channels).

If YouTube has any advantage in the marketplace it’s that it is becoming the preferred content choice for a lot of millennials. The company says it is now delivering over a billion hours of content per day. Millennials are leading the trend of cord cutting (and even more so of cord nevers), and if YouTube can tap that market they should do great.

Dish Network Predicts OTT Will Replace Traditional TV. For the first time, Dish Network’s Chairman and CEO said he thought that OTT programming is the real future of video. Until now the company, which owns Sling TV, has said that the product was aimed at bringing video to cord cutters.

But Sling TV and the other OTT products are getting a lot better. Sling TV now has over 100 channels that provide a wide set of options for customers. And these channels are not packed into a giant must-take line-up like traditional cable packages; instead there are a number of smaller packages that a customer can add to the Sling TV base package. Sling TV and the other providers also make it easy for customers to add or subtract packages, or come and go from the whole platform at will – something that can’t be done with cable companies.

Certainly Sling TV has made a difference for Dish. The company has been bleeding satellite customers for the last ten quarters, but had a small overall gain of 28,000 customers in the fourth quarter due to the popularity of Sling TV. The company does not report satellite and OTT customers separately, so we don’t know the specific numbers.