Categories
The Industry

Happy Birthday Wi-Fi

This year is the twentieth anniversary of the formation of the Wi-Fi Alliance and the launch of commercial Wi-Fi. Wi-Fi has become so ubiquitous in our lives that it’s hard to believe that it’s only been twenty years since all broadband connections came with wires.

In 1999 most people were still using dial-up, and that’s the year when early adopters started buying DSL. I remember having incredibly long phone cords so that I could use my laptop in different places around the house. When I bought DSL, I became tied to the desk with the DSL modem because I couldn’t find equally long cords to carry DSL all over the house.

I remember the day I bought my first Linksys Wi-Fi router. At that time, I think the only device in my home that would talk to Wi-Fi was my laptop. I was able to use that laptop everywhere around the house, and I remember how liberating it felt to be able to use the laptop on the front porch. I got my first carrier-class Wi-Fi router when I upgraded to fiber on Verizon FiOS. Even then I think the only devices in my house that communicated with the Wi-Fi router were a desktop and some laptops – the world had not yet started to build Wi-Fi into numerous devices. Today my home is crammed full of Wi-Fi-capable devices and it’s hard to imagine going without the wireless technology.

There’s an article in the current Wired by Jeff Abramowitz discussing how Wi-Fi as we know it almost didn’t happen. At the time that 802.11b was introduced there was a competing technology called HomeRF that was being pushed as a home wireless solution. We easily could have ended up with HomeRF used in the home and 802.11b used in the office. That would have meant no easy transition of devices from office to home, which would likely have stymied the ubiquitous Wi-Fi we have today.

The growth of Wi-Fi required free spectrum to thrive, and for that, we can thank microwave ovens. Microwave ovens were first developed in the 1940s and emit radiation at 2.45 GHz. Over the following decades practically every home bought a microwave oven, and the early devices didn’t have great shielding. Since microwave ovens polluted the spectrum on both sides of the 2.45 GHz band, the FCC decided in 1985 to open the frequencies around that band for unlicensed use, creating the ISM band that is open for anybody to use. With the radio technology available at the time, nobody wanted to put any commercial usage too close to leaky microwave ovens. Since then, microwave ovens have gained better shielding, radios have become more accurate at pinpointing narrow channels, and we can now use most of what the FCC considered in 1985 to be junk spectrum.

I am amused every time I hear somebody in the industry say that broadband is going wireless – and by that, they mean 5G cellular. Today the average cellphone customer uses about 6 GB of cellular data per month. What the cellphone companies don’t talk about is that the average cellphone user also consumes three times that much data each month on Wi-Fi connections. The fact is that our cellphones are mostly Wi-Fi devices that switch to cellular data when we’re out of reach of our homes, schools, and offices.

Wi-Fi is about to take another big leap forward as Wi-Fi 6 is officially released this month. This newest version of Wi-Fi uses less energy, reduces latency, increases performance in crowded wireless environments, and allows for faster speeds. Wi-Fi has gotten a lot more sophisticated with the introduction of techniques like beamforming, and the technology is light years ahead of what first came out in 1999. In those early days, a Wi-Fi modem was just good enough to handle the 1 Mbps DSL and cable modem broadband of the day.

Device manufacturers love Wi-Fi. Estimates vary, but there are predictions that there will be something like 10 billion worldwide Wi-Fi connected devices in 2020 and 22 billion by 2025 – which would be nearly three Wi-Fi devices for every person on the planet. Those are unbelievable numbers for a technology that only came into existence twenty years ago. The manufacturers must be thrilled knowing that we’ll all be buying new devices to upgrade to Wi-Fi 6 over the next few years.

If Wi-Fi was a person, I’d bake them a cake or buy them a drink to honor this birthday. I’ll have to settle for thanking all of those who have contributed over the years to turn the Wi-Fi concept into the robust products that have changed all of our lives.

Categories
The Industry

The Downside to Smart Cities

I read almost daily about another smart city initiative somewhere in the country as cities implement ideas that they think will improve the quality of life for citizens. I just saw a statistic that says that over two-thirds of cities have now implemented some form of smart city technology. Some of the applications make immediately noticeable differences like smart electric grids to save power, smart traffic lights to improve traffic flow, and smart streetlights to save electricity.

But there are a few downsides to smart city technology that can’t be ignored. The two big looming concerns are privacy and security. There was an article in Forbes earlier this year that asked the question, “Are Privacy Concerns Halting Smart Cities Indefinitely?” Citizens are pushing back against smart city initiatives that indiscriminately gather data about people. People don’t trust the government not to misuse personal data.

Some smart city initiatives don’t gather data. For instance, having streetlights that turn off when there is nobody in the area doesn’t require gathering any data on people. But many smart city applications gather mountains of data. Consider smart traffic systems, which might gather massive amounts of data if implemented poorly. Smart traffic systems make decisions about when to change lights by looking at images of the cars waiting at intersections. If the city captures and stores those images, it accumulates a massive database of where drivers were at specific times. If those images are instantly discarded – never stored and never available for city officials to view – then a smart traffic system would not invade citizen privacy. But the natural inclination is to save this information. For instance, analysts might want to go back after a traffic accident to see what happened. And once the records are saved, law enforcement might want to use the data to track criminal behavior. It’s tempting for a city to collect and store data – all for supposedly good reasons – but eventually, the existence of the data can lead to abuse.

Many people are very leery of systems that capture public video images. If you look at smart city sales literature, it’s hard to find sensor systems that don’t toss in video cameras as part of any street sensor device. I just saw a headline saying that over 400 police departments now partner with Ring, the video cameras people install at their front doors – partnerships that give police access to massive numbers of security cameras around a city. It’s incredibly easy for such systems to be abused. Nobody objects to using surveillance systems to see who broke into somebody’s home, but it’s highly disturbing if a policeman uses the same system to stalk an ex-wife. And video surveillance isn’t the only sensitive issue – smart city technology can gather all sorts of data about citizens.

What I find scarier is security, since smart city systems can be hacked. Security experts recently told Wired that smart city networks are extremely vulnerable to hacking. Municipal computer systems tend to be older and not updated as regularly, and they suffer from the same problems seen in corporations – weak passwords, outdated and ignored security patches, and employees who click on spam emails.

Smart city networks are more vulnerable to attack than corporate networks that sit behind layered firewalls because a smart city network can be attacked at the sensor edge devices. It’s well known that IoT devices are not updated for security as rigorously as other components of computer networks. I’ve seen numerous articles about hackers who were able to quickly defeat the security of IoT devices.

While there might be a concern that city employees will abuse citizen data, there is no doubt that hackers will. It’s not hard to envision hackers causing mischief by messing with traffic lights. It’s not hard to envision terrorists paralyzing a city by shutting down everything computer-related.

But the more insidious threat is hackers who quietly gain access to city systems and don’t overtly cause damage. I have one city client that recently found a system they believe has been compromised for over a decade. It’s not hard to envision bad actors using access to video data as a tool for burglary or car theft. It’s not hard to imagine a bad actor selling the data gathered on city networks to players on the dark web.

I’m not against smart city technology, and that’s not the point of this blog. But before a city deploys networks of hundreds of thousands of sensors, it needs a solid plan to protect citizen data from misuse by city employees and from abuse by hackers. That sounds like a huge challenge to me, and I have to wonder how many cities are capable of doing it right. We’ve seen numerous large corporations get hacked. Smart city networks with huge numbers of sensors are far less secure and look to be an open invitation to hackers.

Categories
The Industry

Trusting Big Company Promises

When AT&T proposed to merge with Time Warner in 2016, attorneys at the Justice Department argued against the merger and said that the combined company would have too much power since it would be both a content provider and a content purchaser. Justice Department lawyers and various other antitrust lawyers warned that the merger would result in rate hikes and blackouts. AT&T counterargued that they are good corporate citizens and that the merger would be good for consumers.

In retrospect, it looks like the Justice Department lawyers were right. Soon after the merger, AT&T raised the prices for DirecTV and its online service DirecTV Now by $5 per month. The company raised the rates on DirecTV Now again in April of this year by $10 per month. AT&T accompanied the price increases with a decision to no longer negotiate promotional prices with TV customers. In the first two quarters of this year DirecTV lost over 1.3 million customers as older pricing packages expired and the company insisted that customers move to the new prices. AT&T says they are happy to be rid of customers that were not contributing to their bottom line.

In July of this year, CBS went dark for 6.5 million DirecTV and AT&T U-verse cable customers. AT&T said that CBS wanted too much money to renew a carriage deal. The two companies resolved the blackout in August.

Meanwhile, AT&T and Dish Network got into a dispute in late 2018 that resulted in HBO and Cinemax going dark on Dish. This blackout has carried into 2019 and the two sides still have not resolved the issue. The dispute cost Dish a lot of customers when the company was unable to carry Game of Thrones. Dish says that half of its 334,000 customer losses in the fourth quarter of 2018 were due to not having Game of Thrones.

I just saw headlines that AT&T is headed towards a rate fight with ESPN and warns there could be protracted blackouts.

It’s hard to fully fault any one of the AT&T decisions since they can be justified to some degree as smart business practices. But that’s how monopoly abuses generally work. AT&T wants to pay as little as possible when buying programming from others and wants to charge as much as possible when selling content. In the end, it’s consumers who pay for the AT&T practices – something the company had promised would not happen just months before the blackouts.

Programming fights don’t have to be so messy. Consider Comcast, which is both a programmer and the biggest cable TV company. Comcast has gotten into a few disputes over programming, particularly with regional sports programming. In a few of these disputes, Comcast was leveraging its programming power since it also owns NBC and other programming. But these cases mostly got resolved without blackouts.

Regulators are most worried about AT&T’s willingness to allow prolonged blackouts because during blackouts the public suffers. Constantly increasing programming costs have caused a lot of angst for cable TV providers, and yet most disputes over programming don’t result in turning off content. AT&T is clearly willing to flex its corporate muscles since it is operating from a position of power in most cases, as either an owner of valuable content or as one of the largest buyers of content.

From a regulatory perspective this raises the question of how the government can trust the big companies that have grown to have tremendous market power. The Justice Department sued to challenge the AT&T and Time Warner merger even after the merger was approved. That was an extraordinary suit that asked to undo the merger. The Justice Department argued that the merger was clearly against the public interest. The courts quickly ruled against that suit and it’s clear that it’s nearly impossible to undo a merger after it has occurred.

The fact is that companies with monopoly power almost always eventually abuse that power. It’s incredibly hard for a monopoly to decide not to act in its own best interest, even if those actions are considered monopoly abuses. Corporations are made up of people who want to succeed, and it’s human nature for people to exploit any market advantages their corporation might have. I have to wonder if AT&T’s behavior will make regulators hesitate before the next big merger. Probably not, but AT&T barely let the ink dry on the Time Warner merger before doing things it promised it wouldn’t do.

Categories
The Industry

The Hidden World of Undersea Fiber

Since the first undersea cable was completed in 1858 to deliver telegraph messages between the US and England, we’ve built an extensive network of undersea cables that enables communications between continents.

Earlier this year there were 378 undersea fiber cables in place, stretching over 745,000 miles. Here’s an interactive map that shows all of the cables and also allows highlighting of individual cables. What’s most intriguing about the map is that there are a few cities around the world where numerous cables terminate. One such place is New York City; when Superstorm Sandy cut several of those cables, the connection between the US and Europe went dark for a few hours. The companies building the cables are now considering diversifying the terminal locations of the fiber cables.

Cables also routinely get cut by other events such as earthquakes, underwater mudslides, ship anchors, and even, in a tiny number of cases, sharks. There are an average of about 25 undersea fiber cuts per year. Repairs are made by ships that pull the cut ends of the fiber to the surface and splice them back together. There have been a few fiber cuts where sabotage was suspected, but it’s never been proven. There is no real way to provide security for undersea cables, and the routes of the cables are well known. It’s a poorly kept secret that spy agencies around the world tap into various cables to monitor traffic.

Undersea fibers are made differently than other fiber. Since the biggest danger of fiber cuts is in shallow water, the cable for shallow locations is as thick as a Coke can and is routinely buried under the seabed. At depths below 8,000 feet, where the dangers of fiber cuts are minimal, the cables are only as thick as a magic marker. There are cables laid as deep as 25,000 feet below the surface. One unusual aspect of undersea fibers is an underlying copper layer that carries the electricity needed to power the repeaters along the long undersea paths. The cables can be powered with as much as 10,000 volts to push power along the longest Pacific routes.

The undersea fiber paths carry over 99% of the traffic between continents, with the small remainder carried by satellites. Satellites are never expected to carry more than a tiny fraction of the traffic due to the gigantic and constantly growing volume of worldwide data traffic. The FCC estimated that only 0.37% of US international data traffic is carried by satellite. The capacity of the newer cables is mind-boggling – the Marea cable completed between Spain and Virginia in 2018 has a capacity of 208 terabits per second. No satellite network is ever going to carry more than a tiny fraction of that kind of capacity. Worldwide bandwidth usage is exploding as the number of Internet users continues to grow (1 million new users were added to the Web every day in 2018). And just like in the US, usage per person is growing everywhere at an exponential rate.

One thing that is obvious from the fiber map is that some routes simply don’t exist. The companies that fund the cables build them to satisfy existing broadband needs, which is why there are so many cables between places like the US and Europe or between countries in the Mediterranean. There are no routes between places like Australia and South America because there is not enough traffic between the two to justify the cost of a new cable route. While cable routes already terminate in India and China, one would expect to see more fibers added there in coming years. These two countries are currently adding the largest numbers of new Internet users (in 2018 there were 100 million new users in India and 80 million in China).

The cables have traditionally been built and owned by the world’s biggest telecom companies. But in recent years, companies like Google, Facebook, Microsoft, and Amazon have been investing in new undersea fibers. This will allow them to carry their own traffic between continents in the same way they are also now carrying terrestrial traffic.

Undersea cables are designed for a 25-year life, so cables are regularly being retired and replaced. Many cables don’t reach that 25-year life because the built-in repeaters become obsolete and it’s often more profitable to lay a newer, larger-capacity cable.

Categories
Regulation - What is it Good For?

Funding the USF

The Universal Service Fund (USF) has a bleak future if the FCC continues to ignore the crisis in the funding mechanism that supports the program. The fund is supported by a fee levied against the combined interstate and international portion of landlines, cellphones, and certain kinds of traditional data connections sold by the big telcos. The ‘tax’ on interstate services has grown to an indefensible 25% of the retail cost of the interstate and international portion of these products.

The FCC maintains arcane rules to determine the interstate portion of something like a local phone bill or a cellular bill. Only a tiny handful of consultants who specialize in ‘separations’ – the allocation of costs between jurisdictions – understand the math behind the FCC’s determination of the base for assessing USF fees.
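To make the mechanics concrete, here’s a rough sketch of how the fee lands on a single bill. The bill amount and the interstate allocation factor below are hypothetical placeholders, not actual separations results:

```python
# Rough illustration of how the USF fee lands on a single bill. The bill
# amount and the interstate allocation factor are hypothetical examples,
# not actual FCC separations results.

def usf_fee(retail_bill: float, interstate_share: float, contribution_factor: float) -> float:
    """Fee = (interstate/international portion of the bill) x (contribution factor)."""
    return retail_bill * interstate_share * contribution_factor

# Example: a $50 cell bill, assuming 40% of it is deemed interstate/international
# and a 25% contribution factor.
fee = usf_fee(50.00, interstate_share=0.40, contribution_factor=0.25)
print(f"USF fee: ${fee:.2f}")  # -> USF fee: $5.00
```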

The USF has done a lot of good in the past and is poised to do even more. The segment of the program that brings affordable broadband to poor schools and libraries is a success in communities across the country. The USF is also used to subsidize broadband to non-profit rural health clinics and hospitals. I would argue that the Lifeline program that provides subsidized phone service has done a huge amount of good. The $9.25 per month savings on a phone or broadband bill isn’t as effective today as it once was because the subsidy isn’t pegged to inflation. But I’ve seen firsthand the benefits from this plan that provided low-cost cellphones to the homeless and connected them to the rest of society. There are numerous stories of how the subsidized cellphones helped homeless people find work and integrate back into society.

The biggest potential benefit of the fund is bringing broadband solutions to rural homes that still aren’t connected to workable broadband. We’ve gotten a hint of this potential in recent grant programs like the CAF II reverse auction. We’re ready to see the USF create huge benefits as the FCC starts awarding $20.4 billion in grants from the USF, to be disbursed starting in 2021. If that program is administered properly, then huge numbers of homes are going to get real broadband.

This is not to say that the USF hasn’t had some problems. There are widespread stories about fraud in the Lifeline program, although many of those stories have been exaggerated in the press. A decent amount of what was called fraud was due to the ineptitude of the big phone companies, which continued to collect USF funding for people who had died or who were no longer eligible for the subsidy. The FCC has taken major steps to fix this problem by creating a national database of those who are eligible for the Lifeline program.

The biggest recent problem with the USF came when the FCC used the fund to award $11 billion to the big telcos in the CAF II program to upgrade rural broadband to speeds of at least 10/1 Mbps. I’ve heard credible rumors that some of the telcos pocketed much of that money and only made token efforts to tweak rural DSL speeds up to a level that households still don’t want to buy. It’s hard to find anybody in the country who will defend this colossal boondoggle.

However, we’ve learned that, used in a smart way, the USF can bring permanent broadband to rural America. Every little pocket of customers that gets fiber due to this funding can be taken off the list of places with no broadband alternatives. Areas that get fixed wireless are probably good for a decade or more, and hopefully the companies operating those networks will pour profits back into bringing fiber (which I know some USF fund recipients are doing).

But the USF is in real trouble if the FCC doesn’t fix the funding mechanism. As traditional telephone products with an interstate component continue to disappear, the funds going into the USF will shrink. If the funding shrinks, the FCC is likely to respond by cutting awards. Somebody might win $1 million from the upcoming grant program but then collect something less as the fund decreases over time.

The fix for the USF is obvious and easy. If the FCC expands the funding base to include broadband products, the percentage contribution would drop significantly from the current 25% and the fund could begin growing again. The current FCC has resisted this idea vigorously, and it’s hard to ascribe any motivation other than that they want to see the USF shrink over time. This FCC hates the Lifeline program and would love to kill it. This FCC would prefer not to be in the business of handing out grants. At this point, I don’t think there is any alternative other than waiting for the day when there is a new FCC in place that embraces the good done by the USF rather than fighting against it.
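Here’s a back-of-the-envelope sketch of why broadening the base lowers the percentage. All of the dollar figures are hypothetical placeholders; the point is only the ratio:

```python
# Hypothetical illustration: the contribution factor is roughly the
# program's funding need divided by the revenue base that can be assessed.

def contribution_factor(annual_usf_need: float, assessable_revenue: float) -> float:
    return annual_usf_need / assessable_revenue

need = 9e9           # placeholder for the annual size of the fund

narrow_base = 36e9   # hypothetical: today's shrinking interstate telecom base
broad_base = 180e9   # hypothetical: base expanded to include broadband revenues

print(f"Narrow base: {contribution_factor(need, narrow_base):.1%}")  # -> 25.0%
print(f"Broad base:  {contribution_factor(need, broad_base):.1%}")   # -> 5.0%
```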

Categories
Technology The Industry

Farms Need Broadband Today

I recently saw a presentation by Professor Nicholas Uilk of South Dakota State University. He is the head of the first bachelor’s degree program in the country for Precision Agriculture. That program does just what the name suggests – it teaches budding farmers how to use technology in farming to increase crop yields – and those technologies depend upon broadband.

Precision agriculture is investigating many different aspects of farming. Consider the following:

  • There has been a lot of progress creating self-driving farm implements. These machines have been tested for a few years, but there are not a lot of farmers yet willing to set machines loose in the field without a driver in the cab. But the industry is heading towards the day when driverless farming will be an easily achievable reality.
  • Smart devices have moved past tractors and now include things like automated planters, fertilizer spreaders, manure applicators, lime applicators, and tillage machines.
  • The most data-intensive farming need is the creation of real-time variable rate maps of fields. Farmers can use smart tractors or drones to measure and map important variables that affect a current crop, like the relative amounts of key nutrients, moisture content, and the amount of organic matter in the soil. This mapping creates massive data files that are sent off-farm. Expert agronomists review the data and prepare a detailed plan to get the best yields from each part of the field. The problem farms have today is getting the data to and from the experts promptly. Without fast broadband, the time required to exchange these files renders the data useless, because the crop grows too large for machines to make the suggested changes.
  • Farmers are measuring yields as they harvest so they can record exactly which parts of their fields produced the best results.
  • SDSU is working with manufacturers to develop and test soil sensors that could wirelessly transmit real-time data on pH, soil moisture, soil temperature, and transpiration. These sensors are too expensive today to be practical – but the cost of sensors should drop over time.
  • Research is being done to create low-cost sensors that can measure the health of individual plants.
  • Using sensors for livestock is the most technologically advanced area and there are now dairy farms that measure almost everything imaginable about every milking cow. The sensors for monitoring pigs, chickens, and other food animals are also advanced.
  • The smart farm today measures an immense amount of data on all aspects of running the business. This includes gathering data for non-crop parts of the business such as the performance of vehicles, buildings, and employees. The envisioned future is that sensors will be able to detect a problem in equipment and order a replacement part before a working machine fails.
  • One of the more interesting trends in farming is to record and report on every aspect of the food chain. After the whole country stopped eating romaine last year because of contamination at one farm, the industry started to develop a process where each step of the production of a crop is recorded, with the goal of reporting the history of food to the consumer. In the not-too-distant future, a consumer will be able to scan a package of lettuce and know where the crop was grown, how it was grown (organically or not), and when it was picked, shipped, and brought to the store. This all requires creating a blockchain with an immutable history of each crop, from farm to store (a minimal sketch of the idea follows this list).
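Here is that minimal sketch, with invented field names. A real supply-chain system would be far more elaborate, but the core idea is the same: each record is chained to the hash of the one before it, so any after-the-fact edit is detectable.

```python
# Minimal sketch of a hash-chained crop history, one record per step
# (planting, harvest, shipping, etc.). The field names are invented
# purely for illustration.
import hashlib
import json

def add_record(chain: list, record: dict) -> list:
    """Append a record whose hash covers the previous entry's hash,
    so tampering with any earlier record breaks the chain."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    payload = {"prev_hash": prev_hash, "record": record}
    payload["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    chain.append(payload)
    return chain

chain = []
add_record(chain, {"step": "planted", "field": "Field 12", "crop": "romaine"})
add_record(chain, {"step": "harvested", "date": "2019-06-03"})
add_record(chain, {"step": "shipped", "carrier": "Example Logistics"})

# A consumer-facing scan could walk the chain and re-verify every hash.
print(chain[-1]["hash"][:16])
```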

The common thread of all of these developments in precision agriculture is the need for good broadband. Professor Uilk says that transmitting the detailed map scans for crop fields realistically requires a 100 Mbps upload to get the files to and from the experts in a timely exchange. That means fiber to the farm.
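As a rough illustration of why the upload speed matters, here’s the arithmetic for a hypothetical field-map file (the 25 GB size is my assumption, not a figure from SDSU):

```python
# Time to upload a field-mapping data set at different speeds. The 25 GB
# file size is an assumption for illustration; real files vary widely.

def upload_hours(file_gigabytes: float, upload_mbps: float) -> float:
    bits = file_gigabytes * 8e9            # decimal gigabytes -> bits
    seconds = bits / (upload_mbps * 1e6)   # bits / (bits per second)
    return seconds / 3600

for mbps in (3, 10, 100):
    print(f"{mbps:>3} Mbps upload: {upload_hours(25, mbps):5.1f} hours")
# ->   3 Mbps upload:  18.5 hours
# ->  10 Mbps upload:   5.6 hours
# -> 100 Mbps upload:   0.6 hours
```

At typical rural DSL upload speeds the data would be stale before the agronomists ever saw it; at 100 Mbps the exchange fits comfortably inside a working day.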

A lot of the other applications require reliable wireless connections around the farm, and that implies a much better use of rural spectrum. Today the big cellular carriers buy the rights to most spectrum and then let it lie fallow in rural areas. We need to find a way to bring spectrum to the farm to take advantage of measuring sensors everywhere and for directing self-driving farm equipment.

Categories
Regulation - What is it Good For? The Industry

Gaining Access to Multi-tenant Buildings

In 2007 the FCC banned certain kinds of exclusivity arrangements between ISPs and owners of multi-tenant buildings. At the time of the order, the big cable companies had signed contracts with apartment owners giving them exclusive access to buildings. The FCC order in 2007 got rid of the most egregious types of contracts – in many cases, cable company contracts were so convoluted that building owners didn’t even understand the agreements were exclusive.

However, the FCC order was still a far cry from ordering open access for ISPs to buildings, and many landlords still won’t allow in competitors. The arrangement most liked by landlords is a revenue share, where the building owner makes money from an arrangement with an ISP. While such arrangements aren’t legally exclusive, they can be lucrative enough to make a landlord favor an ISP and give it exclusive access.

WISPA, the industry association for wireless ISPs, has asked the FCC to force apartment owners to allow access to multiple ISPs. WISPA conducted a survey of its members and found that wireless companies are routinely denied access to apartment buildings. Some of the reasons for denying access include:

  • Existing arrangements with ISPs that make the landlord not want to grant access to an additional ISP.
  • Apartment owners often deny access because wireless ISPs (WISPs) aren’t considered to be telephone or cable companies – many WISPs offer only broadband and have no official regulatory status.
  • Building owners often say that an existing ISP serving the building has exclusive rights to the existing wiring, including conduits that might be used to string new wiring to reach units. This is often the case if the original cable or telephone company paid for the inside wiring when the building was first constructed.
  • Many landlords say that they already have an existing marketing arrangement with an ISP, meaning they get rewarded for sending tenants to that ISP.
  • Many landlords will only consider revenue sharing arrangements since that’s what they have with an existing ISP. Some landlords have insisted on a WISP signing a revenue-sharing arrangement before even negotiating pricing and logistics.

These objections by landlords fall into two categories. One is compensation-based, where a landlord is happy with the financial status quo with an existing ISP. The other is some contractual relationship with an existing ISP that is hard or impossible for a landlord to preempt.

The concerns of WISPs are all valid, and in fact, the same list can be made by companies that want to build fiber to apartment buildings. However, landlords seem more open to fiber-based ISPs since saying that their building has fiber adds cachet and is valued by many tenants.

WISPs sometimes have unusual issues not faced by other ISP overbuilders. For example, one common wireless model is to beam broadband to a roof of an apartment building. That presents a challenge for gaining access to apartments since inside wiring generally begins in a communications space at the base of a building.

The issue is further clouded by the long history of FCC regulation of inside wiring. The topic of ownership and rights for inside wiring has been debated in various dockets since the 1990s and there are regulatory rulings that can give ammunition to both sides of wiring arguments.

The WISPs are facing an antagonistic FCC on this issue. The agency recently preempted a San Francisco ordinance that would have made all apartment buildings open access – meaning available to any ISP. This FCC has been siding with large incumbent cable and telephone companies on most issues and is not likely to go against them by allowing open access to all apartment buildings.

Categories
The Industry

The Digital Redlining of Dallas

In 2018 Dr. Brian Whitacre, an economist from Oklahoma State University, looked in detail at the broadband offered by AT&T in Dallas County, Texas. It’s an interesting county in that it includes all of the City of Dallas as well as wealthy suburban areas. Dr. Whitacre concluded that AT&T has engaged for years in digital redlining – providing faster broadband only in the more affluent parts of the county.

Dr. Whitacre looked in detail at the Form 477 data AT&T provided to the FCC at the end of 2017. AT&T reports the technology used in each census block as well as the ‘up-to’ maximum speed offered in each census block.

AT&T offers three technologies in Dallas County:

  • Fiber-to-the-home with marketed speeds up to 1 Gbps download. AT&T offers fiber in 6,287 out of 23,463 census blocks (26.8% of the county). The average maximum speed offered in these census blocks in late 2017, according to the 477 data, was 300 Mbps.
  • VDSL, which brings fiber deep into neighborhoods and in Dallas offers speeds as fast as 75 Mbps download. AT&T offers this in 10,399 census blocks in Dallas (44.3% of the county). AT&T lists census blocks with maximum speeds of 18, 24, 45, and 75 Mbps. The average maximum speed listed in the 477 data is 56 Mbps.
  • ADSL2 or ADSL2+, an older form of DSL that is mostly deployed from central offices. The technology theoretically delivers speeds up to 24 Mbps, but speeds decrease rapidly for customers more than a mile from a central office. AT&T still uses ADSL2 in 6,777 census blocks (28.9% of the county). It lists the maximum speeds of various census blocks at 3, 6, 12, and 18 Mbps. The average maximum speed of all ADSL2 census blocks is 7.26 Mbps.

It’s worth noting before going further that the above speed differences, while dramatic, don’t tell the whole story. The older ADSL technology sees a dramatic drop in customer speeds with distance, and speeds are also influenced by the quality of the copper wires. Dr. Whitacre noted that he had anecdotal evidence that some of the homes listed at 3 Mbps or 6 Mbps might have speeds under 1 Mbps.

Dr. Whitacre then overlaid the broadband availability against poverty levels in the county. His analysis started by looking at census blocks that have at least 35% of households below the poverty level. In Dallas County, 6,777 census blocks have poverty rates of 35% or higher.
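For readers who work with this kind of data, here is a minimal sketch of how such an overlay can be done. The file names and column names are hypothetical placeholders, not the actual data sets from the study:

```python
# Sketch of overlaying Form 477 technology data with census-block poverty
# rates. File names and column names are hypothetical placeholders.
import pandas as pd

f477 = pd.read_csv("att_form477_dallas.csv")        # block_fips, technology, max_down_mbps
poverty = pd.read_csv("census_poverty_dallas.csv")  # block_fips, poverty_rate

blocks = f477.merge(poverty, on="block_fips")
blocks["high_poverty"] = blocks["poverty_rate"] >= 0.35

# Share of census blocks whose best AT&T technology is ADSL, split by poverty level.
adsl_share = (
    blocks.assign(is_adsl=blocks["technology"].eq("ADSL2"))
          .groupby("high_poverty")["is_adsl"]
          .mean()
)
print(adsl_share)
```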

The findings were as follows:

  • Areas with high poverty were twice as likely to be served by ADSL – 56% of high-poverty areas versus 24% of other parts of the city.
  • VDSL coverage was also roughly 2:1 with 25% of areas with high poverty served by VDSL while 48% of the rest of the city had VDSL.
  • Surprisingly, 19% of census blocks with high poverty were served with fiber. I’m going to conjecture that this might include large apartment complexes where AT&T delivers one fiber to the whole complex – which is not the same product as fiber-to-the-home.

It’s worth noting that the findings are somewhat dated and rely upon 477 data from November 2017. AT&T has likely not upgraded any DSL since then, but it has been installing fiber in more neighborhoods over the last two years in a construction effort that recently concluded. It would be interesting to see if the newer fiber also went to more affluent neighborhoods.

I don’t know that I can write a better conclusion of the findings than the one written by Dr. Whitacre: “The analysis for Dallas demonstrates that AT&T has withheld fiber-enhanced broadband improvements from most Dallas neighborhoods with high poverty rates, relegating them to Internet access services which are vastly inferior to the services enjoyed by their counterparts nearby in the higher-income Dallas suburbs…”

This study was done as a follow-up to work done earlier in Cleveland, Ohio, and the same situation can likely be found in almost every large city in the country. It’s not hard to understand why ISPs like AT&T do this – they want to maximize the return on their investment. But this kind of redlining is not in the public interest and is possibly the best argument that can be made for regulating broadband networks. We regulated telephone companies starting in 1934, and that regulation resulted in the US having the best telephone networks in the world. But we’ve decided not to regulate broadband in the same way, and until we change that decision we’re going to have patchwork networks that create side-by-side haves and have-nots.

Categories
Technology

Unlicensed Millimeter Wave Spectrum

I haven’t seen it talked about a lot, but the FCC has set aside millimeter wave spectrum that can be used by anybody to provide broadband. That means that entities will be able to use the spectrum in rural America in areas that the big cellphone companies are likely to ignore.

The FCC set aside the V band (60 GHz) as unlicensed spectrum. This band provides 14 GHz of contiguous spectrum that is available for anybody to use. It’s an interesting band because it comes with a drawback: 60 GHz coincides with a natural resonance of oxygen, so signals are absorbed more readily in open air than in other bands of millimeter wave spectrum. In practice, this will shorten bandwidth delivery distances a bit for the V band.
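To see why the distances are short, here’s a back-of-the-envelope calculation using the standard free-space path loss formula plus an assumed 15 dB/km of extra oxygen absorption near 60 GHz (the exact figure varies with conditions):

```python
# Back-of-the-envelope path loss at 60 GHz versus a lower band.
# Free-space path loss in dB, with distance in km and frequency in GHz,
# plus an assumed ~15 dB/km of extra oxygen absorption near 60 GHz.
import math

def path_loss_db(distance_km: float, freq_ghz: float, o2_db_per_km: float = 0.0) -> float:
    fspl = 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45
    return fspl + o2_db_per_km * distance_km

for d in (0.2, 0.5, 1.0):
    loss_5ghz = path_loss_db(d, 5.8)
    loss_60ghz = path_loss_db(d, 60, o2_db_per_km=15)
    print(f"{d:3.1f} km: 5.8 GHz {loss_5ghz:6.1f} dB   60 GHz {loss_60ghz:6.1f} dB")
```

At a kilometer the 60 GHz link gives up roughly 35 dB more than a 5.8 GHz link, which is why the band is best suited to short, focused hops.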

The FCC also established the E band (70/80 GHz) for public use. This band has a few more rules than the 60 GHz band, including light licensing requirements. Those licenses are fairly easy for carriers to get, but it’s not so obvious that anybody else can get them. The FCC will get involved with interference issues in the band – but the short carriage distances of the spectrum make interference somewhat theoretical.

There are several possible uses for the millimeter wave spectrum. First, it can be focused in a beam and used to deliver 1-2 gigabits of broadband for up to a few miles. There have been 60 GHz radios on the market for several years that operate for point-to-point connections. These are mostly used to beam gigabit broadband in places where that’s cheaper than building fiber, like on college campuses or in downtown highrises.

This spectrum can also be used for hotspots, as is being done by Verizon in cities. In the Verizon application, the millimeter wave spectrum is put on pole-mounted transmitters in downtown areas to deliver data to cellphones at speeds as fast as 1 Gbps. This can also be deployed in more traditional hot spots like coffee shops. The problem with using 60 GHz spectrum this way is that there are almost no devices yet that can receive the signal. This isn’t going to get widespread acceptance until somebody builds it into laptops or develops a cheap dongle. My guess is that cellphone makers will ignore 60 GHz in favor of the licensed bands owned by the cellular providers.

The spectrum could also be used to create wireless fiber-to-the-curb like Verizon demonstrated in a few neighborhoods in Sacramento and a few other cities earlier this year. The company is delivering residential broadband at speeds of around 300 Mbps. These two frequency bands are higher than what Verizon is using and so won’t carry as far from the curb to homes, so we’ll have to wait until somebody tests this to see if it’s feasible. The big cost of this business plan will still be the cost of building the fiber to feed the transmitters.

The really interesting use of the spectrum is for indoor hot spots. The spectrum can easily deliver multiple gigabits of speed within a room, and unlike WiFi spectrum it won’t go through walls and interfere with neighboring rooms. This spectrum would eliminate many of the problems with WiFi in homes and in apartment buildings – but again, it first needs to be built into laptops, smart TVs, and other devices.

Unfortunately, the vendors in the industry are currently focused on developing equipment for the licensed spectrum that the big cellular companies will be using. You can’t blame the vendors for concentrating their efforts on the 24, 28, and 39 GHz ranges before looking at these alternate bands. There is always a bit of a Catch-22 when introducing any new spectrum – a vendor needs to make the equipment available before anybody can try it, and vendors won’t make the equipment until they have a proven market.

Electronics for millimeter wave spectrum are not as easily created as equipment for lower frequency bands. For instance, in the lower spectrum bands, software-defined radios can easily change between nearby frequencies with no modification of hardware. However, each band of millimeter wave spectrum has different operating characteristics and specific antenna requirements, and it’s not nearly as easy to shift between a 39 GHz radio and a 60 GHz radio – the requirements are different for each.

And that means that equipment vendors will need to enter the market if these spectrum bands are ever going to find widespread public use. Hopefully, vendors will find this worth their while, because this looks like a new WiFi-sized opportunity. Wireless vendors have made their living in the WiFi space, and they need to be convinced that they have the same opportunity with these widely available spectrum bands. I believe that if some vendor builds indoor multi-gigabit routers and receivers, the users will come.

Categories
The Industry

Broadband Price Increases

Back in late 2017 Wall Street analyst Jonathan Chaplin of New Street predicted that ISPs would begin flexing their market power and within three or four years would raise broadband rates to $100. His prediction was a little aggressive, but not by much. He also predicted that we’re going to start seeing perpetual annual broadband rate increases.

Stop the Cap! reports that Charter will be raising rates in September, only ten months after their last rate increase in November 2018. The company will be increasing the price of unbundled broadband by $4 per month, from $65.99 to $69.99. Charter is also increasing the cost of using their WiFi modem from $5.00 to $7.99. This brings their total cost of standalone broadband for the base product (between 100 and 200 Mbps) with WiFi to $77.98, up from $70.99. Charter also announced substantial price increases for cable TV.

Even with this rate increase Charter still has the lowest prices for standalone broadband among the major cable companies. Stop the Cap! reports that the base standalone broadband product plus WiFi costs $93 with Comcast, $95 with Cox and $106.50 with Mediacom.

Of course, not everybody pays those full standalone prices. In most markets we’ve studied, around 70% of customers bundle products and get bundling discounts. However, the latest industry statistics show that millions of customers are now cutting the cord annually; as they lose those discounts, they will face the standalone broadband prices.

MoffettNathanson LLC, the leading analysts in the industry, recently compared the average revenue per user (ARPU) for four large cable companies – Comcast, Charter, Altice, and Cable ONE. The most recent ARPU figures for the four companies are: Comcast ($60.86), Charter ($56.57), Altice ($64.58), and Cable ONE ($71.80). You might wonder why the ARPU is so much lower than the price of standalone broadband. Some of the difference comes from bundling and promotional discounts. There are also customers on older, slower, and cheaper broadband products who are hanging on to their old bargain prices.

The four companies have seen broadband revenue growth over the last two years between 8.1% and 12%. The reason for the revenue growth varies by company. A lot of the revenue growth at Comcast and Charter still comes from broadband customer growth and both companies added over 200,000 new customers in the second quarter of this year. In the second quarter, Comcast grew at an annualized rate of 3.2% and Charter grew at 4%. This contrasts with the smaller growth at Altice (1.2%) and Cable ONE (2%), and the rest of the cable industry.

The ARPU for these companies increased for several reasons. Each of the four companies has had at least one rate increase during the last two years, and some of the ARPU growth comes from cord cutters who lose their bundling discount.

For the four cable companies:

  • Comcast revenues grew by 9.4% over the two years and that came from a 4.4% growth in ARPU and 5% due to subscriber growth.
  • Charter broadband revenues grew by 8.1% over two years. That came from a 3.2% increase in ARPU and 4.9% due to subscriber growth.
  • Altice saw a 12% growth in broadband revenues over two years that comes from a 9.8% growth in ARPU and 2.2% due to customer growth.
  • Cable ONE saw a 9.7% growth in broadband revenues over two years due to a 7.5% growth in ARPU and 2.2% increase due to customer growth.
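Strictly speaking, revenue growth compounds ARPU growth and subscriber growth rather than simply adding them, though the cross-term is small. A quick check using the Comcast figures above:

```python
# Revenue growth compounds ARPU growth and subscriber growth; the simple
# sum used in the bullets above ignores only a small cross-term.

def revenue_growth(arpu_growth: float, subscriber_growth: float) -> float:
    return (1 + arpu_growth) * (1 + subscriber_growth) - 1

# Comcast over two years: ~4.4% ARPU growth and ~5% subscriber growth.
print(f"Additive:   {0.044 + 0.05:.1%}")                 # -> 9.4%
print(f"Compounded: {revenue_growth(0.044, 0.05):.1%}")  # -> 9.6%
```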

Altice’s story is perhaps the most interesting and offers a lesson for the rest of the industry. The company says that it persuades 80% of new cord cutters to upgrade to a faster broadband product. This tells us that homes cutting the cord believe they’ll use more broadband and are open to the idea of buying a more robust broadband product. This is something I hope all of my clients reading this blog will notice.

Cable ONE took a different approach. They have been purposefully raising cable TV prices for the last few years and do nothing to try to stop customers from dropping the cable product. The company is benefitting largely from increases due to customers who give up their bundling discount.

MoffettNathanson also interprets these numbers to indicate that we will be seeing more rate increases in the future. Broadband growth is slowing for the whole industry, including Comcast and Charter. This means that for most cable companies, the only way to continue to grow revenues and margins will be by broadband rate increases. After seeing this analysis, I expect more companies will put effort into upselling cord cutters to faster broadband, but ultimately these large companies will have to raise broadband rates annually to meet Wall Street earnings expectations.
