Robocalls and Small Carriers

In July, NTCA filed comments in the FCC docket that is looking at an industry-wide solution to fight against robocalls. The comments outline some major concerns about the ability of small carriers to participate in the process.

The industry solution to stop robocalls, which I have blogged about before, is being referred to as SHAKEN/STIR. This new technology will create an encrypted token that verifies that a given call really originated with the phone number listed in the caller ID. Robocalls can’t get this verification token. Today, robocallers spoof telephone numbers, meaning that they insert a calling number into the caller ID that is not real. These bad actors can make a call look like it’s coming from any number – even your own!
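Under the hood this works like any other signed-token scheme. The real framework signs a small attestation (called a PASSporT) with certificate-based keys carried in SIP headers. The sketch below is only meant to show the shape of the idea – it substitutes a simple shared-secret HMAC for the certificate machinery, and the field names and numbers are made up for illustration.

```python
import base64, hashlib, hmac, json, time

# Illustrative only: real SHAKEN/STIR signs a PASSporT (RFC 8225) with
# ES256 and an X.509 certificate chain. A shared-secret HMAC stands in
# for that signature here so the sketch stays standard-library only.
CARRIER_SECRET = b"originating-carrier-signing-key"  # hypothetical key

def sign_call(orig_number: str, dest_number: str) -> str:
    """The originating carrier attests that the caller owns this number."""
    claims = {
        "orig": orig_number,
        "dest": dest_number,
        "iat": int(time.time()),  # issued-at, so old tokens can't be replayed
        "attest": "A",            # full attestation: carrier knows this customer
    }
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(CARRIER_SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_call(token: str) -> bool:
    """The terminating carrier checks the signature before trusting caller ID."""
    payload, _, sig = token.partition(".")
    expected = hmac.new(CARRIER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = sign_call("+13125551234", "+12025556789")
print(verify_call(token))                # True: caller ID can be trusted
print(verify_call(token[:-4] + "0000"))  # False: spoofed or tampered call
```

A spoofer who inserts a fake calling number has no way to produce a valid signature for it, which is the whole point of the scheme.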

On phones with visual caller ID, like cellphones, a small token will appear to verify that the call really comes from the number shown. Once the technology has been in place for a while, people will learn to ignore calls that don't come with the token. If the industry does this right, it will become easier to spot robocalls, and I imagine a lot of people will use apps that automatically block calls without a token.

NTCA is concerned that small carriers will be shut out of this system, causing huge harm to them and their customers. Several network prerequisites must be in place to handle the SHAKEN/STIR token process. First, the originating telephone switch must be digital. Most, but not all, small carriers now use digital switches. Any telco or CLEC still using an older non-digital switch will be shut out of the process, and to participate they'd have to buy a new digital switch. After the many-year decline in telephone customers, such a purchase might be hard to cost-justify. I suspect this will also be a problem for older PBXs – the switches operated by private businesses. The world is full of large legacy PBXs operated by universities, cities, hospitals, and large businesses.

Second, the SHAKEN/STIR solution is likely to require an expensive software upgrade for carriers using digital switches. Again, given the shrinking demand for voice, many small carriers are going to have a hard time justifying the cost of a software upgrade. Anybody using an off-brand digital switch (several switch vendors folded over the last decade) might not have a workable software solution at all.

The third requirement to participate in SHAKEN/STIR is that the entire path connecting a switch to the public switched telephone network (PSTN) must be end-to-end digital. This is a huge problem because most small telcos, CLECs, cable companies, and other carriers connect to the PSTN using older TDM technology (based upon multiples of T1s).

You might recall that a decade ago there was a big stir about what the FCC termed the 'digital transition'. The FCC at the time wanted to migrate the whole PSTN to a digital platform based largely upon SIP trunking. While there was a huge industry effort at the time to figure out how to implement the transition, the effort quietly died, and the PSTN is still largely based on TDM technology.

I have clients who have asked for digital trunking (the connection between networks) for years, but almost none of them have succeeded. The large telcos like AT&T, Verizon, and CenturyLink don't want to spend the money at their end to put in new technology for this purpose. A request to go all-digital is either flatly refused, or else the small carrier is told it must pay to transport its network traffic to some distant major switching point in a place like Chicago or Denver – an expensive proposition.

What happens to a company that doesn't participate in SHAKEN/STIR? It won't be pretty, because calls originating from such a carrier won't get a token verifying that they are legitimate. This could be devastating to rural America. Once SHAKEN/STIR has been in place for a while, a lot of people will refuse to accept unverified calls – and that means calls coming from small carriers won't be answered. This will also affect a lot of cellular calls, because in rural America those calls often originate behind TDM trunking.

We already have a problem with rural call completion, meaning there are often problems placing calls to rural places. If small carriers can't participate in SHAKEN/STIR, after a time their customers will have real problems placing calls, because much of the world won't accept calls that aren't verified with a token.

The big telcos have assured the FCC that this can be made to work. It’s my understanding that the big telcos have mistakenly told the FCC that the PSTN in the country is mostly all-digital. I can understand why the big telcos might do this because they are under tremendous pressure from the FCC and Congress to tackle the robocall issue. These big companies are only looking out for themselves and not the rest of the industry.

I already had my doubts about the SHAKEN/STIR solution because my guess is that bad actors will find ways to fake the tokens. One only has to look back at the decades-old battles against spam email and hackers to understand that solving robocalling will be a long back-and-forth battle – the first stab at SHAKEN/STIR is not going to fix the problem. The process is even less likely to work if it doesn't function for large parts of the country and for whole rural communities. The FCC needs to listen to NTCA and other rural voices and not create another disaster for rural America.

Is There a Business Case for Fast Cellular?

We've gotten a glimpse of the challenges of marketing faster cellular service since the major cellular providers in South Korea made a big push to offer ultrafast cellular broadband. Two of the country's primary carriers – SK Telecom and KT – have implemented cellular products using millimeter wave spectrum in Seoul and other dense urban areas.

The technology is nearly identical to the technology introduced by Verizon in small sections of major US cities. It uses millimeter wave hot spots on small cell sites to beam broadband to phones equipped to use the ultra-high spectrum. In South Korea, both companies are selling a millimeter wave version of the Samsung Galaxy. In the US there are still barely any handset options.

5G hotspot data is not the same as traditional cellular data. The small cells blast out gigabit broadband that carries for only short distances of 500 to 800 feet. The signals can bounce off buildings in the right circumstances and be received sporadically at greater distances from the transmitters. But millimeter wave spectrum won't pass through obstacles, and reception can be blocked by almost anything in the environment, including the body of the person using the cellphone.
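To get a feel for why the signal fades so quickly, consider free-space path loss, which grows with the square of the frequency. Here's a quick sketch comparing an assumed 28 GHz millimeter wave band against a traditional 1.9 GHz cellular band; note that this formula doesn't even count the obstacles and rain fade that hurt millimeter waves most.

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    # Standard free-space formula: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

distance_km = 0.2  # roughly 650 feet, inside the 500-800 foot range cited above
for label, freq_mhz in [("1.9 GHz traditional cellular", 1_900),
                        ("28 GHz millimeter wave", 28_000)]:
    loss = free_space_path_loss_db(distance_km, freq_mhz)
    print(f"{label}: {loss:.1f} dB of free-space loss at {distance_km} km")

# The ~23 dB gap means the 28 GHz signal arrives roughly 200x weaker,
# before counting the obstacles that block millimeter waves entirely.
```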

Even with those limitations, the speeds delivered with this technology are far faster than traditional cellular data speeds. Verizon has reported peak speeds as fast as 600 Mbps in its trials in US cities. That's an amazing amount of bandwidth to deliver to a cellphone, since a cellphone is, by definition, a single-user device. Since the average 4G LTE data speed is less than 25 Mbps, our cellphone apps are not designed to be bandwidth hogs. Current 4G speeds are more than adequate to stream video, and with the small screens there's no benefit to streaming in 4K or even in 1080p. All of the major cellular carriers already chop down the quality of video streams, using only a fraction of the bandwidth needed to deliver a video stream to a home. Cellphones are also not designed to handle multiple simultaneous bandwidth-heavy tasks.

For now, the biggest benefit of millimeter wave spectrum for cellphones looks to be the ability to quickly download big files like movies, apps, or software updates. There is certainly an appeal to downloading a big movie to watch later in less than 30 seconds rather than the more normal 10 minutes. But with data caps on even most 'unlimited' plans, I have to wonder how many people routinely download big movie files when they aren't connected to WiFi.
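The 30-seconds-versus-10-minutes claim is easy to sanity check with back-of-the-envelope math, assuming an illustrative 2 GB movie file:

```python
def download_seconds(file_gigabytes: float, speed_mbps: float) -> float:
    # Convert gigabytes to megabits (1 GB = 8,000 Mb), then divide by line speed.
    return file_gigabytes * 8_000 / speed_mbps

movie_gb = 2.0  # illustrative size for an HD movie download
for label, mbps in [("4G LTE at 25 Mbps", 25), ("millimeter wave at 600 Mbps", 600)]:
    secs = download_seconds(movie_gb, mbps)
    print(f"{label}: {secs / 60:.1f} minutes ({secs:.0f} seconds)")
# 25 Mbps -> ~10.7 minutes; 600 Mbps -> ~27 seconds
```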

Another way that faster cellular speeds could be beneficial is for faster web browsing. However, the slow cellphone browsing we experience today is not due to 4G LTE speeds, which are adequate for a decent browsing experience. The painfully slow browsing on cellphones is due to operating systems that favor display over functionality – the cellular companies have chosen to downplay browsing speed in favor of maximizing the display for phone apps. Faster millimeter wave spectrum won't overcome this inherent and deliberate software limitation.

There is another use for faster broadband. South Korea likely has a much higher demand for high-speed cellular because the country is game-crazy. A large majority of the population, including adults, is heavily involved in intensive gaming. There is an obvious appeal to a fast gaming connection when away from a desktop.

South Korean market analysts are looking at the cost of millimeter wave deployment and the potential revenue stream and are already wondering if this is a good investment. SK Telecom expects to have 2 million customers for the faster broadband by the end of this year, and sales of millimeter wave phones are going well. (These can't really be called 5G phones, because they don't handle network slicing or the slew of other 5G features that won't be introduced for at least three more years.)

If the analysts in South Korea don't see the financial benefits, it's much harder to see the benefits here. Remember that in South Korea urban homes can already buy gigabit broadband for the equivalent of $30 per month. Moreover, the two big ISPs are in the process of upgrading everybody to 10 Gbps within the next five years. This is a country where everybody has been trained to expect an instant response online – and faster cellular speeds bring that expected response to mobility.

The business plan here in the US is a lot more challenging. In South Korea a lot of people live in dense urban city centers, unlike our spread-out population with far-stretching suburbs around cities. The network cost to deploy millimeter wave technology here will be significantly higher to achieve the same kind of coverage seen in South Korea. At least for now, it's also a lot harder to picture large numbers of US users willing to pay extra for faster cellular data. Several recent surveys indicate that US consumers think faster 5G data speeds should be offered at the same prices we already pay for cellular broadband (and the US has some of the highest cellular data prices among industrial countries).

I can't see a major play here for ultra-fast cellular broadband outside of dense city centers and perhaps places like stadiums and convention centers. It's hard to see how deploying this technology in the suburbs could ever be cost-justified. We are likely to upgrade cellular data to the more normal 5G using mid-range spectrum, and that will in time nudge cellular data speeds up to 100 Mbps. I think most users here will love somewhat faster speeds but won't be willing to pay extra for them. It's hard to believe there are enough people in the US willing to pay even more for millimeter wave speeds to justify the cost of deploying the networks. This is further compounded by the fact that these millimeter wave networks are outdoors only – the spectrum doesn't penetrate buildings at all. The US has become an indoor society. At least where I live you rarely see teenagers outdoors in their home neighborhoods – they are consuming broadband indoors. Does anybody really care about a fast outdoor network?

Why Aren’t We Talking about Technology Disruption?

One of the most interesting aspects of modern society is how rapidly we adapt to new technology. Perhaps the best illustration of this is the smartphone. In the short period of a decade, we went from a new invention to the point where the large majority of the American public has a smartphone. Today the smartphone is so pervasive that recent statistics from Pew show that 96% of those between 18 and 29 have a smartphone.

Innovation is exploding in nearly every field of technology, and the public has gotten so used to change that we barely notice announcements that would have made worldwide headlines a few decades ago. I remember as a kid when Life Magazine dedicated much of an issue to nylon and polymers and had the world talking about something that wouldn't even get noticed today. People seem to accept miracle materials, gene splicing, and self-driving cars as normal technical advances. People now give DNA test kits as Christmas presents. Nobody blinks an eye when big data is used to profile and track us all. We accept cloud computing as just another computer technology. In our little broadband corner of the technology world, the general public has learned that fiber and gigabit speeds are the desired broadband technology.

What I find perhaps most interesting is that we don't talk much about upcoming technologies that will completely change the world. A few technologies get talked to death, such as 5G and self-driving cars. But technologists now understand that 5G is not, in itself, a disruptive technology – although it might unleash other disruptive technologies such as ubiquitous sensors throughout our environment. The idea of self-driving cars no longer seems disruptive since I can already achieve the same outcome by calling an Uber. The advent of self-driving semi trucks will be far more disruptive, lowering the cost of the nationwide supply chain once fleets of self-driving electric trucks are on the road.

I've always been intrigued by those who peer into the future, and I read everything I can find about upcoming technologies. From what I read, there are a few truly disruptive technologies on the horizon. Consider the following innovations that aren't too far in the future:

Talking to Computers. This will be the most important breakthrough in history in terms of the interface between humans and technology. In a few short generations we've gone from typing on keyboards, to using a mouse, to using cellphones – but the end game will be talking directly to our computers in natural conversational language. We've already seen significant progress with natural language processing and are on a path to converse with computers the same way we communicate with other people. That will trigger a huge transition in society. Computers will fade into the background since we'll have the full power of the cloud anywhere we're connected. Today we get a tiny inkling by watching how people use Apple's Siri or Amazon's Alexa – but these are rudimentary voice recognition systems. It's nearly impossible to predict how mankind will react to having the full power of the web with us all of the time.

Space Elevator. In 2012 the Japanese announced a nationwide goal of building a space elevator by 2050. That goal has since been pulled forward to 2045. A space elevator will be transformational since it will free mankind from the confines of planet Earth. With a space elevator we can cheaply and safely move people and materials to and from space. We can drag up the raw materials needed to build huge space factories that can then take advantage of the mineral riches of the asteroid belt. From there we can colonize the Moon and Mars, build huge space cities, and build spaceships to explore nearby stars. The cost of the space elevator is still estimated at only around $90 billion, about the same as the high-speed rail system between Osaka and Tokyo.

Alternate Energy. We are in the process of weaning mankind from fossil fuels. While there is a long way to go, several countries in Europe have the goal of being off carbon fuels within the coming decade. The EU already gets 30% of its electricity from alternate energy sources. The big breakthrough might finally come from fusion power. Fusion has been '30 years away' for my whole adult life, but scientists at MIT and elsewhere have developed the magnets needed to contain the plasma for a fusion reaction, and some scientists are now predicting fusion power is only 15 years away. Fusion would supply unlimited non-polluting energy, which would transform the whole world, particularly the third world.

An argument can be made that there are other equally disruptive technologies on the horizon, like artificial intelligence, robotics, gene-editing, virtual reality, battery storage, and big data processing. Nothing on the list would be as significant as a self-aware computer – but many scientists still think that's far into the future. What we can be sure of is that breakthroughs in technology and science will continue to come at us rapidly from all directions. I wonder if the general public will even notice the most important breakthroughs, or if change has gotten so ho-hum that it's just an expected part of life.

Terahertz WiFi

While labs across the world are busy figuring out how to implement the 5G standards, scientists are already working in higher frequency spectrum looking to achieve even faster speeds. The frequencies just now being explored are labeled the terahertz range, at 300 GHz and above. This spectrum sits at the upper range of radio spectrum, just below infrared light.

Research in these frequencies started around 2010, and since then the achieved transmission speeds have progressed steadily. The first big announced breakthrough came in 2016, when scientists at the Tokyo Institute of Technology achieved speeds of 34 Gbps using the WiFi standard in the 500 GHz spectrum range.

In 2017, researchers at Brown University School of Engineering were able to achieve 50 Gbps. Later that year a team of scientists from Hiroshima University, the National Institute of Information and Communications Technology and Panasonic Corporation achieved a speed of 105 Gbps. This team has also subsequently developed a transceiver chip that can send and receive data at 80 Gbps – meaning these faster speeds could be moved out of the lab and into production.

As with all radio spectrum, the higher the frequency, the shorter the distance a transmission travels through the air before it scatters. That makes short transmission distances the biggest challenge for using these frequencies. However, several of the research teams have shown that transmissions perform well when bounced off walls, and the hope is to eventually achieve distances as long as 10 meters (about 33 feet).

The real benefit of superfast bandwidth will likely be for super-short distances. One of the uses of these frequencies could be to beam data into computer processors. One of the biggest impediments to faster computing is the physical act of getting data to where it’s needed on time, and terahertz lasers could be used to speed up chips.

Another promising use of the faster lasers is to create faster transmission paths within fiber. Scientists have already been experimenting, and it looks like these frequencies can be channeled through extremely thin fibers to achieve speeds much faster than anything available today. Putting this application into the field is probably a decade or more away – but it's a breakthrough that's needed. Network engineers have been predicting that we will exhaust the capabilities of current fiber technology on the Internet transmission paths between major POPs. As the volume of bandwidth we use keeps doubling, in a decade or two we will be transmitting more data between places like New York and Washington DC than all of the existing fibers can theoretically carry. When fiber routes get that full, the problem can't easily be fixed by adding more fibers – not when volumes double every few years. We need solutions that fit more data into existing fibers.
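The arithmetic of doubling shows why engineers worry. Here's a hedged sketch, with every number invented for illustration: a route carrying 10 Tbps today against a theoretical ceiling of 500 Tbps fills up in just over a decade at a two-year doubling pace.

```python
import math

def years_until_full(current_tbps: float, ceiling_tbps: float,
                     doubling_years: float) -> float:
    # Demand grows as current * 2^(t / doubling_years); solve demand == ceiling for t.
    return doubling_years * math.log2(ceiling_tbps / current_tbps)

# All figures are invented for illustration, not measurements of a real route.
print(f"2-year doubling: full in {years_until_full(10, 500, 2):.1f} years")  # ~11.3
print(f"3-year doubling: full in {years_until_full(10, 500, 3):.1f} years")  # ~16.9
```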

There are other applications that could use the higher frequencies today. For example, specific applications like real-time medical imaging and real-time processing for intricate chemical engineering need faster bandwidth than is possible with 5G. The automated factories that will create genetic-based drugs will need much faster bandwidth. There are also more mundane uses for the higher frequencies. For example, they could be used to replace X-rays and reduce radiation risks in doctors' offices and airports.

No matter what else the higher frequencies can achieve, I’m holding out for Star Trek holodecks. The faster terahertz frequencies could support creation of the complex real-time images involved in truly immersive entertainment.

These frequencies will become the workhorse for 6G, the next generation of wireless technology. The early stages of developing a 6G standard are underway, with expectations of having a standard by perhaps 2030. Of course, the hype for 6G has also already begun. I've seen several tech articles that talk about the potential for ultrafast cellular service using these frequencies. The authors don't seem to grasp that we'd need a cell site every twenty feet – but facts don't seem to get in the way of good wireless hype.

Are You Ready for 10 Gbps?

Around the world we're seeing some migration to 10 Gbps residential broadband. During the last year, broadband providers in South Korea, Japan, and China began upgrading to next-generation PON and are offering the blazingly fast broadband products to consumers. South Korea is leading the pack and expects to offer the 10 Gbps speed to about 50% of subscribers by the end of 2022.

In the US there are a handful of ISPs offering a 10 Gbps product, mostly for the publicity – but they stand ready to install the faster product. Notable are Fibrant in Salisbury, NC and EPB in Chattanooga; EPB was also among the first to offer a 1 Gbps residential product a few years ago.

I have a lot of clients who already offer 10 Gbps connections to carrier customers and large businesses like data centers and hospital complexes. However, except for the few pioneers, these larger bandwidth products are delivered directly to a single customer using active Ethernet technology.

There are a few hurdles to offering speeds over a gigabit in the US. Perhaps foremost is that there are no off-the-shelf customer electronics that can handle speeds over a gigabit – typical WiFi routers and computers work at slower speeds. The biggest hurdle for an ISP continues to be the cost of the electronics. Today the cost of next-generation PON equipment is high and will remain so until the volume of sales brings per-unit prices down. The market research firm Ovum predicts that we'll see widespread 10 Gbps consumer products starting in 2020, but not gaining traction until 2024.

In China, Huawei leads the pack. The company has a 10 Gbps PON system that is integrated with a 6 Gbps WiFi 6 router for the home. The system is an easy overlay on top of the company's traditional GPON network gear. In South Korea the largest ISP, SK Broadband, has worked with Nokia to develop a proprietary PON technology used today only inside South Korea. Like Huawei's, it overlays onto the existing GPON network. In Japan the 10 Gbps PON network is powered by Sumitomo, a technology only being sold in Japan. None of these technologies has made a dent in the US market, with Huawei currently banned due to security concerns.

In the US there are two technologies being trialed. AT&T is experimenting with XGS-PON and plans to offer 2 Gbps broadband, upgradable to 10 Gbps, in the new high-tech community of Walsh Ranch being built outside Ft. Worth. AT&T is currently trialing the technology at several locations within its FTTP network, which now covers over 12 million passings. Verizon is trying NG-PON2 technology but is mostly planning to use it to power cell sites. It's going to be hard for any ISP to justify deploying the new technologies until somebody buys enough units to pull down the cost.
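It's worth remembering that PON bandwidth is shared among the homes on a splitter. A rough sketch, using typical (but here assumed) split ratios, shows the worst-case per-home share:

```python
# Per-home bandwidth if every home on a PON splitter pulled data at once.
# Split ratios are typical industry choices, assumed here for illustration.
pon_systems = {
    "GPON (2.4 Gbps down)": 2_400,
    "XGS-PON (10 Gbps down)": 10_000,
}
for name, capacity_mbps in pon_systems.items():
    for homes in (32, 64):
        share = capacity_mbps / homes
        print(f"{name}, 1:{homes} split -> {share:.0f} Mbps per home at full load")
# In practice homes rarely peak simultaneously, which is why a shared
# 10 Gbps PON can still deliver a credible multi-gigabit product.
```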

Interestingly, CableLabs is also working on a DOCSIS upgrade that will allow speeds up to 10 Gbps. The problem most cable networks will have is finding space on their networks for the channels needed to support the faster speeds.

There are already vendors and labs exploring 25 Gbps and 50 Gbps PON. These products will likely be used for backhaul and transport at first. The Chinese vendors think the leap forward should be to 50 Gbps, while other vendors are considering a 25 Gbps upgrade path.

The real question that needs to be answered is whether there is any market for 10 Gbps bandwidth outside the normally expected uses like cellular towers, data centers, and large business customers. This same question was asked when EPB in Chattanooga and LUS in Lafayette, Louisiana rolled out the earliest 1 Gbps residential bandwidth. Both companies were a bit surprised when they got a few instant takers for the faster products – in both markets from doctors who wanted to analyze MRIs and other big files at home. There are likely a few customers who need speeds above 1 Gbps, with doctors again being good candidates. Just as broadband speeds have advanced, the medical imaging world has grown more sophisticated in the last decade and is creating huge data files. The ability to download these files quickly offsite will be tempting to doctors.

I think we are finally on the verge of seeing data use cases that can eat up most of a gigabit of bandwidth in the residential environment. For example, uncompressed virtual and augmented reality can require masses of downloaded data in nearly real-time. As we start seeing use cases for gigabit speeds, the history of broadband has shown that the need for faster speeds is probably not far behind.

The End of the Central Office?

One of the traditional costs of bringing fiber to a new market has always been the creation of some kind of central office space. This might mean modifying space in an existing building or constructing a new building or large hut. In years past a central office required a lot of physical space, but we are finally at the point with technology where the need for a big central office is often disappearing.

A traditional central office starts with the need to house the fiber-terminating electronics that connect the new market to the outside world. There is also the need to house and light the electronics facing the customers – although in some network designs some of the customer-facing electronics can be housed in remote huts in neighborhoods.

A traditional central office needs room for a lot of other equipment. First is significant space for the batteries that provide short-term backup during power outages; for safety reasons the batteries are often placed in a separate room. There is also space for the power plant that converts AC power to DC, plus the significant air conditioning these rooms usually require. If the fiber network terminating at the central office is large enough, there is also a fiber management system to separate the individual fibers in a neat and sensible way. Finally, if all of the above meant building a large enough space, many ISPs also built working and office space for technicians.

Lately I've seen several fiber deployments that don't require the large traditional central office. This is largely due to the evolution of the electronics used to serve customers in a FTTP network. For example, OLT (optical line termination) electronics have been significantly compressed in size and density, and a shelf of equipment can now perform the same functions that would have required much of a full rack a decade ago. As the equipment has shrunk, the power requirements have also dropped, reducing the size of the power plant and the batteries.
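The battery side of that shrinkage is simple arithmetic – runtime is stored energy divided by load – so cutting the DC load cuts the battery plant proportionally. Here's a sketch with invented but plausible numbers:

```python
def backup_hours(battery_volts: float, battery_amp_hours: float,
                 load_watts: float) -> float:
    # Stored energy (watt-hours) divided by the steady DC load gives runtime.
    # Real designs would also derate for conversion loss and depth of discharge.
    return battery_volts * battery_amp_hours / load_watts

# Illustrative figures only: the same 48V, 400 Ah battery string against
# an older rack versus a compact modern OLT shelf doing the same job.
for label, load_watts in [("legacy rack", 2_400), ("modern OLT shelf", 600)]:
    hours = backup_hours(48, 400, load_watts)
    print(f"{label}: {hours:.1f} hours of holdover")
# The shelf runs four times longer on the same batteries - or, turned around,
# needs a quarter of the battery plant for the same holdover target.
```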

I’ve seen several markets where a large cabinet provides enough room to replace what would have required a full central office a decade ago. These are not small towns, and two of the deployments are for towns with populations over 20,000.

As the footprint of the 'central office' has decreased, there's been a corresponding drop in costs. Several supply houses will now pre-install everything needed into the smaller cabinet or hut and deliver the whole unit complete and ready to go once power is connected and the fiber is spliced.

What I find interesting is that I still see some new markets built in the more traditional way. In that same market of 20,000 people it’s possible to still use a configuration that constructs several huts around the city to house the OLT electronics. For purposes of this blog I’ll refer to that as a distributed configuration.

There are pros and cons to both configurations. The biggest benefit of having one core hut or cabinet is lower cost. That means one pre-fab building instead of having to build huts or cabinets at several sites.

The distributed design also has advantages. A redundant fiber ring can be established with a network of at least three huts, meaning fewer parts of the market will lose service due to a fiber cut near the core hub. But the distributed network also means more electronics, since there is now the need for electronics to light the fiber ring.

The other advantage of a distributed network is that there are fewer fibers terminating to each hut compared to having all customer fibers terminating to a single hut. The distributed network likely also has smaller fibers in the distribution network since fiber can be sized for a neighborhood rather than for the whole market. That might mean less splicing required during the initial construction.

Anybody building a new fiber network needs to consider both options. If the market is large enough, the distributed network becomes mandatory. However, many engineers seem stuck on the idea that they need multiple huts and a fiber ring even for smaller towns. That means paying a premium price for more protection against customer outages. Since raising the money to build a fiber network is often the number one business consideration, the ability to save on electronics can be compelling – it would not be unusual to see the single-hub configuration save half a million dollars or more. There is no configuration that is the right choice for all situations. Just be sure, if you're building FTTP in a new market, that you consider the options.

Millimeter Wave 5G is Fiber-to-the-Curb

I've been thinking and writing about 5G broadband using millimeter wave spectrum for over a year. This is the broadband product that Verizon launched as a trial in Sacramento and a few other markets last year. I don't know why it never struck me before that this technology is the newest permutation of fiber-to-the-curb.

That's an important distinction to make, because naming it this way makes it clear to anybody hearing about the technology that the network is mostly fiber, with wireless used only for the last few hundred feet.

I remember seeing a trial of fiber-to-the-curb back in the very early 2000s. A guy from the horse country in Virginia had developed the technology of delivering broadband from the pole into the home using radios. He had a working demo of the technology at his rural home. Even then he was beaming fast speeds – his demo delivered an uncompressed video signal from curb to home. He knew that the radios could be made capable of a lot more speed, but in those days I’m sure he didn’t think about gigabit speeds.

The issues that stopped his idea from being practical remained barriers until recently. First was getting the needed spectrum. He wanted to use what we now call midrange spectrum, which was considered a high spectrum band in 2000 – he would have had to convince the FCC to carve out a slice of spectrum for his application, something that's always been difficult. He also didn't have any practical way of getting the needed bandwidth to the pole. ISPs were still selling T1s, 1 Mbps DSL, and 1 Mbps cable modem service, and while fiber existed, the cost of the electronics for terminating fiber to devices on multiple poles was astronomical. Finally, even then, this guy had a hard time explaining how it would be cheaper to use wireless to get to the home rather than building a drop wire.

Verizon press releases would make you think the company will be conquering the world with millimeter wave radios and deploying the technology everywhere. However, once you think of this as fiber-to-the-curb, that business plan quickly makes no sense. The cost of a fiber-to-the-curb network is mostly in the fiber. Any savings from using millimeter wave radios apply only to the last few hundred feet, so for this technology to be compelling, the savings on those last few hundred feet have to be significant. Do the radio electronics really cost less than fiber drops and fiber electronics?
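Framed as arithmetic, this is a simple break-even question. Every number below is an invented placeholder – as I note later, nobody outside Verizon really knows these costs – but the shape of the comparison is the point:

```python
# Break-even sketch for the last few hundred feet. All costs are invented
# placeholders, and this is capex only - the truck rolls discussed in the
# next paragraph would shift the math further.
pole_radio_cost = 3_000   # assumed installed cost of one mmWave pole unit
wireless_cpe_cost = 300   # assumed receiver on each home
fiber_drop_cost = 700     # assumed aerial/buried drop per home
fiber_ont_cost = 150      # assumed fiber terminal per home

def cost_per_home_wireless(homes_per_pole: int) -> float:
    return pole_radio_cost / homes_per_pole + wireless_cpe_cost

for homes in (2, 4, 8):
    wireless = cost_per_home_wireless(homes)
    fiber = fiber_drop_cost + fiber_ont_cost
    verdict = "wireless wins" if wireless < fiber else "fiber wins"
    print(f"{homes} homes per pole: ${wireless:,.0f} vs ${fiber:,.0f} -> {verdict}")
# With these made-up numbers the radio only wins once a single pole unit
# serves several homes - which matches the cherry-picking concern below.
```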

Any such comparison must consider all the costs of each technology – meaning the cost of installations, repairs, maintenance, and periodic replacement of electronics. And the comparisons need to be honest. For example, every other wireless technology I know requires more maintenance truck rolls than fiber-based technologies, due to the squirrelly nature of how wireless behaves in the wild.

Even should the radios become much cheaper than fiber drops, the business case for the technology might still have no legs. There is no way to get around the underlying fact that fiber-to-the-curb means building fiber along residential streets. Verizon has always said that they didn’t extend their fiber FiOS network to neighborhoods where the construction costs were too high. Verizon still seems to be the most cautious of the big ISPs and it’s hard to think that they’ve changed this philosophy. Perhaps the Verizon business plan is to cherry pick in markets outside their footprint, but only where they have the low-cost option of overlashing fiber. If that’s their real business plan then they will not be conquering the world with 5G, but just cherry picking neighborhoods that meet their price profile – a much smaller footprint and business plan than most of the industry is expecting.

My hope is that the rest of the industry starts referring to this technology as fiber-to-the-curb instead of calling it 5G. The wireless companies have gained great advantage from using the 5G name for multiple technologies. They have constantly used the speeds from the fiber-to-the-curb trials and the hot spot trials to make the public think the future means gigabit cellular service. It’s time to start demystifying 5G and using a different name for the different technologies.

Once this is understood, it ought to be clear that millimeter wave fiber-to-the-curb is not coming everywhere. The technology sounds incredibly expensive to build in neighborhoods with buried utilities. Where density is low, fiber-to-the-curb might turn out to be more expensive than fiber-to-the-home. The big cost advantage seems to come from hitting multiple homes from one pole transmitter. Over time, when anybody can buy the components of the technology, the best business case will become apparent to us all – for now the whole industry is guessing about what Verizon is doing because we don't understand the basic costs of the technology.

At the end of the day this is just another new technology to put into the quiver when designing last mile networks. There will undoubtedly be places where fiber-to-the-curb has a cost advantage over fiber drops. Assuming that Verizon or somebody else builds enough of the technology to pull hardware prices down, I picture that a decade from now fiber overbuilders will consider fiber-to-the-curb as part of the mix when designing the last few hundred feet.

We Need Public 5G Spectrum

Last October the FCC issued a Notice of Proposed Rulemaking that proposes expanding WiFi into the 6 GHz band of spectrum (5.925 to 7.125 GHz). WiFi has been a huge economic boon to the country, and the FCC recognizes that providing more free public spectrum is a vital piece of the spectrum puzzle. Entrepreneurs have found a myriad of inventive ways to use WiFi that go far beyond what carriers have provided with licensed spectrum.

In much of the country the 6 GHz spectrum is likely to be limited to indoor use due to possible outdoor interference with the Broadcast Auxiliary Service, where remote crews transmit news feeds to radio and TV stations, and the Cable Television Relay Service, which cable companies use to transmit data within their own networks. The biggest future needs for WiFi are going to be indoors, so restricting this spectrum to indoor use doesn't feel like an unreasonable limitation.

However, WiFi has some inherent limitations. The biggest problem with the WiFi standard is that a WiFi network pauses to give every device a chance to use the bandwidth. In a crowded environment with a lot of devices, the constant pausing adds latency and delay, and in heavy-use environments like a business hotel the constant pauses can nearly shut down a WiFi network. Most of us don't feel that contention inside our homes today, but as we add more and more devices over time, we will start to notice this inherent WiFi interference in our networks. The places where WiFi interference is already a big concern are heavy wireless environments like hospitals, factories, airports, business hotels, and convention centers.
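A toy model makes the problem visible. This is not the real 802.11 listen-before-talk algorithm, just a simplified slotted model where a moment of airtime is wasted whenever zero or more than one device tries to transmit:

```python
# Toy slotted-contention model (not real 802.11 CSMA/CA): in each time slot
# every device transmits with some probability, and a slot only carries data
# when exactly one device talks. More devices -> more collisions and silence.
def useful_airtime(devices: int, attempt_prob: float = 0.1) -> float:
    p = attempt_prob
    # Probability that exactly one of n devices transmits in a slot.
    return devices * p * (1 - p) ** (devices - 1)

for n in (2, 10, 25, 50):
    share = useful_airtime(n)
    print(f"{n:>2} devices: {share:.0%} of airtime carries data, "
          f"{share / n:.1%} per device")
```

With these assumed numbers, the share of airtime carrying real data peaks and then collapses as the device count grows, which is exactly the business-hotel experience described above.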

Many of our future computing needs are going to require low latency. For instance, creating home holograms from multiple transmitters is going to require timely delivery of packets to each transmitter. Using augmented reality to assist in surgery will require delivery of images in real time. WiFi promises to get better with the introduction of WiFi 6 using the 802.11ax standard, but the new standard does not eliminate the innate limitations of WiFi.

The good news is that we already have a new wireless standard that can create low-latency dedicated signal paths to users. Fully implemented 5G with network slicing can satisfy the situations where WiFi doesn't meet the need. It's not hard to picture a future indoor network where a single router satisfies some needs using the WiFi standard and others using 5G – the router choosing the best standard for each need.

To some degree the cellular carriers have this same vision. They talk of 5G taking over IoT needs instead of WiFi. They talk about using 5G for low-latency uses like augmented reality. But comparing the history of cellular networks and WiFi, it's clear that WiFi has been used far more creatively. There are thousands of vendors working in today's limited WiFi spectrum who have developed a wide array of wireless services. Comparatively, the cellular carriers have been quite vanilla, using their networks to deliver voice and data.

I have no doubt that AT&T and Verizon have plans to offer million-dollar 5G solutions for smart factories, hospitals, airports and other busy wireless environments. But in doing so they will tap only a tiny fraction of the capability of 5G. If we want 5G to actually meet the high expectations that the industry has established, we ought to create a public swath of spectrum that can use 5G. The FCC could easily empower the use of the 6 GHz spectrum for both WiFi and 5G, and in doing so would unleash wireless entrepreneurs to come up with technologies that haven’t even been imagined.

The current vision of the cellular carriers is to somehow charge everybody a monthly subscription to use 5G – and there will be enough devices using the spectrum that most people will eventually give in and buy the subscription. However, the big carriers are not going to be particularly creative, and instead are likely to be very restrictive on how we use 5G.

The alternate vision is to set aside a decent slice of public spectrum for indoor use of 5G. The public would gain use of the spectrum by buying a 5G router, with no monthly subscription fee – because it's public spectrum. After all, 5G is just a standard, developed worldwide; it is not the proprietary property of the big cellular companies. Entrepreneurs would jump on the opportunity to develop great uses for the spectrum and the 5G standard. Rather than being held captive by the limited vision of AT&T and Verizon, we'd see a huge number of devices using 5G creatively. This could truly unleash things like augmented reality and virtual presence. Specialty vendors would develop applications that make great strides in hospital health care. We'd finally see smart shopping holograms in stores.

The public probably doesn't understand that the FCC has complete authority over how each swath of spectrum is used. Only the FCC can determine which spectrum can or cannot be used for WiFi, 5G, and other standards. The choice ought to be an easy one. The FCC can let a handful of cellular companies decide how society will use 5G, or it can unleash the creativity of thousands of developers to come up with a myriad of 5G applications. We know that public spectrum creates immense societal and economic good. If the FCC hadn't set aside public spectrum for WiFi, we'd all still have wires to our home broadband devices, and many of the things we now take for granted would never have come to pass.

The Resurgence of Wireless Mesh?

I’ve had several calls recently from clients asking about wireless mesh networks. Those that have been in the industry for a while probably remember the mesh network craze in the late 1990s. At that time large cities all over the country considered building WiFi mesh networks to try to bring broadband to everybody in their cities. Many cities deployed pilot systems, but in the end, the technology never panned out. The technology had the squirrely problems often associated with wireless technology and never delivered the bandwidth that the manufacturers promised.

Apparently, the technology is back. I went to the web for a quick investigation, and sure enough there are carrier-class outdoor mesh radios available from a number of manufacturers. In case you aren't familiar with the concept, a mesh network is comprised of multiple radios, each of which connects to multiple other radios. Most mesh networks are dynamically linked, meaning the radios work autonomously to find the most efficient routing path for traffic within the mesh. The easiest way to picture this is the classic network diagram from Cisco, which has manufactured mesh gear for many years, showing each radio interconnected with its neighboring radios.

The biggest flaw in the technology two decades ago was that mesh networks didn't scale well, for two reasons. First, by definition, a wireless link loses half of its bandwidth with every hop to another radio. Mesh networks with too many hops don't deliver much bandwidth to the most remote nodes in the network.
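That halving compounds fast, as a few lines of arithmetic make plain:

```python
# Bandwidth halves with every wireless hop, so remote mesh nodes starve fast.
feed_mbps = 1_000  # a gigabit feed into the first node, as discussed below
for hops in range(7):
    print(f"after {hops} hops: {feed_mbps / 2 ** hops:,.0f} Mbps")
# A node 6 hops out sees ~16 Mbps from a gigabit feed; the few-T1 feeds of
# the 1990s left essentially nothing at the network edge.
```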

Large mesh networks also developed an unexpected problem. One of the characteristics of a mesh network is that the radios constantly coordinate with each other. If a given node is temporarily overloaded by a big bandwidth demand from an end user, the network dynamically routes other traffic around the bottleneck. Unfortunately, it turned out that in large networks the radios spent the majority of the bandwidth communicating with each other, at the expense of the bandwidth left for end users. As mesh networks grew in size, throughput decreased significantly. Technicians determined that this internode chatter could be reduced by limiting the number of nodes any radio could communicate with, but in doing so the network was no longer a real mesh.

The other big problem in the 1990s was that the networks were deployed as outdoor radios, meaning that very little bandwidth actually made it into homes. I remember working one day at a client's office where I could see a nearby mesh radio through a window. As long as I sat with a direct line of sight to the radio I could use the WiFi, but if I moved to another part of the room the signal completely died. Broadcasting WiFi from outside radios is an inefficient way to provide bandwidth indoors.

Those inherent problems are still an issue today. There is no way to eliminate the bandwidth lost with each hop. However, the difference between today and the 1990s is that we can feed a mesh network with gigabits of broadband instead of a few T1s. To some degree that means we can overpower the system so that at least some bandwidth makes it to the furthest nodes in the network.

One of the other weaknesses of a mesh network is that most networks use WiFi spectrum. Practically every wired home uses WiFi today to move bandwidth around the house. Superimposing a mesh WiFi network on a neighborhood means a lot more high-power WiFi sources interfering with every other WiFi device. Anybody who has ever tried to maintain a WiFi signal in a crowded business hotel understands the issue.

Even with those limitations, I can see some great uses for a mesh network. The vendors are pushing the technology as a way to bring bandwidth more easily to outdoor spaces like parks. There is a brand of outdoor mesh devices being marketed as a way to spread WiFi around a farmhouse to the outdoor buildings. While nobody seems to be marketing the idea yet, a mesh network might be a good way to spread WiFi signals to fields and pastures to track the small bandwidth sensors being used to collect data from fields and herds.

What my clients really wanted to know is whether a mesh network can be used to provide residential broadband. There might be situations where this makes sense. Rather than trying to beam bandwidth from outside hotspots, each radio could feed a wire into a home. But mesh networks still have the same inherent problems as in the past, and in most cases other solutions can probably produce faster and more consistent bandwidth. As a consultant I always keep an open mind, but having seen the technology crash and burn once before, I'd want to see it working in practice before buying into the resurgence.

Why 5G Won’t Be Here Tomorrow

I just saw another article yesterday written by a major-city newspaper telling the public that 5G is coming in 2020. I hate to see reporters who have accepted the nonsense being peddled by the carriers without digging a little deeper to find the truth. At some point in the near future, the public will finally realize that the 5G talk has mostly been hype.

I don't mean to always sound like a 5G critic, because over time 5G will vastly improve the cellular experience. However, many of the improvements being suggested by the cellular companies – like gigabit cellular service – may never happen. Of more immediate concern is the fact that there won't be any major improvements to cellular networks from 5G for at least three to five years. The carriers have the country and politicians fully convinced that 5G is right around the corner – but it's not.

There was a recent article written by Sue Marek in FierceWireless that is a great example of why 5G is not going to be here tomorrow. Titled Network Slicing is a Security Nightmare for Operators, Marek explains how complicated it’s going to be to implement network slicing – perhaps the most important new aspect of 5G cellular service.

Network slicing is the ability of the cellular network to size the transmission path to exactly match a customer's bandwidth need. It is one of the changes that will enable a cell site to communicate with many more customers at the same time. Today, every customer gets the same-sized data channel, meaning a lot of bandwidth is wasted when customers use less than a full channel.
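A simple allocation sketch shows why this matters. Assume, purely for illustration, a cell with 1,000 Mbps to hand out, fixed 50 Mbps channels today, and a mix of customers who mostly need far less:

```python
# Illustrative only: compare fixed-size channels to demand-sized slices.
cell_capacity_mbps = 1_000
fixed_channel_mbps = 50

# Hypothetical mix of customer demands: a few heavy users, many IoT trickles.
demands_mbps = [40, 25, 10, 5] + [1] * 60   # 64 would-be users

# Fixed channels: every user costs a full channel regardless of need.
fixed_served = min(len(demands_mbps), cell_capacity_mbps // fixed_channel_mbps)

# Slicing: each user only consumes a slice sized to their actual demand.
sliced_served, used = 0, 0
for d in sorted(demands_mbps):
    if used + d > cell_capacity_mbps:
        break
    used += d
    sliced_served += 1

print(f"fixed channels serve {fixed_served} users")    # 20
print(f"sliced network serves {sliced_served} users")  # all 64
```

With these made-up numbers the same cell goes from serving 20 customers to serving all 64 – which is why slicing is worth the security headache Marek describes.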

Marek points out the difficult technical challenge of providing security for every slice of bandwidth. She says that getting this right is going to take two to three years. Until network slicing is viable, there really is nothing that can be called 5G. The important takeaway from her article is how difficult it is to implement new technology. 5G is a drastic change from 4G in many ways – there are thirteen major changes in the 5G specification compared to 4G, and implementing each of them will be a technical challenge.

What is annoying about the 5G marketing hype is that we’ve always known it would take up to a decade to fully implement 5G, just as it did to implement 4G. The cellular companies can’t seem to help themselves from overhyping new technology, but the 5G hype is many times worse than the 4G hype a decade ago. This mostly seems due to the fact that the cellular carriers decided to use the 5G hype as a way to cram through regulatory changes they’ve wanted for a long time. That forced them to really crank up the 5G rhetoric.

5G will take the same path used by all other electronic technologies – there is a tried-and-true method for introducing upgrades. New breakthroughs start in a lab. They then go to a 'breadboard' process where working models are developed. Once the breadboards have been thoroughly tested they go into prototype chips, which are retested to make sure the performance survived the conversion to silicon. Finally, the chip design is approved and the new breakthrough goes into production. At the very fastest this process might be done in 12 to 18 months, though it can take as long as three years. Breaking in new changes in the cellular world is doubly complicated because the same changes also have to be introduced into cellphone handsets.

The likely progression is that some new aspect of the 5G specification will make it into chipsets each year. As that happens, only the newest phones will be able to use the upgrades, while earlier versions of 5G phones won't recognize the new features. The idea that handset manufacturers are introducing 5G handsets in 2020 is laughable, because practically none of the important 5G upgrades are yet in chip production. Those handsets will be 5G in name only (and still priced ridiculously high).

Marek is pointing out the complexity of getting 5G security right. There are dozens of other equally difficult technical challenges to fully realizing 5G, and there are scientists in labs working on all of them. The labs will plow through them over time, and long after the hype is in the past, we'll get 5G phones that implement most of the 5G specification. It's worth noting that there may never be a phone that meets the entire specification – because the specification for a new technology is a wish list, and some parts of it may never practically work in the field.