Unlicensed Millimeter Wave Spectrum

I haven’t seen it talked about a lot, but the FCC has set aside millimeter wave spectrum that can be used by anybody to provide broadband. That means that entities will be able to use the spectrum in rural America in areas that the big cellphone companies are likely to ignore.

The FCC set aside the V band (60 GHz) as unlicensed spectrum. The band provides 14 GHz of contiguous spectrum available for anybody to use. It’s an interesting band because it comes with a notable drawback: 60 GHz coincides with a natural resonance of oxygen, so the signal is absorbed in the open air more readily than other bands of millimeter wave spectrum. In practice, this shortens bandwidth delivery distances a bit for the V band.
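
To put rough numbers on that, here’s a back-of-the-envelope link-loss calculation in Python. The free-space path loss formula is the standard one; the roughly 15 dB/km of extra oxygen absorption at 60 GHz and the 39 GHz comparison value are illustrative assumptions, not vendor specifications.

```python
import math

def total_path_loss_db(distance_km, freq_ghz, absorption_db_per_km):
    """Free-space path loss plus atmospheric absorption, in dB.
    FSPL(dB) = 20*log10(d_km) + 20*log10(f_GHz) + 92.45
    """
    fspl = 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45
    return fspl + absorption_db_per_km * distance_km

# 60 GHz sits near the oxygen resonance (~15 dB/km of extra absorption);
# 39 GHz suffers almost none. Both absorption values are rough assumptions.
for d_km in (0.1, 0.5, 1.0, 2.0):
    loss_60 = total_path_loss_db(d_km, 60, 15.0)
    loss_39 = total_path_loss_db(d_km, 39, 0.1)
    print(f"{d_km:3.1f} km: 60 GHz {loss_60:6.1f} dB vs 39 GHz {loss_39:6.1f} dB")
```

At 2 km the oxygen absorption alone adds about 30 dB of loss compared to a lower millimeter wave band, which is why V band links are typically engineered shorter than links in other bands.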

The FCC also established the E band (70/80 GHz) for public use. This spectrum carries a few more rules than the 60 GHz band, including light licensing requirements. The licenses are fairly easy for carriers to get, but it’s not so obvious that anybody else can get them. The FCC will get involved in interference issues, but the short carriage distances of the spectrum make interference somewhat theoretical.

There are several possible uses for the millimeter wave spectrum. First, it can be focused in a beam and used to deliver 1–2 Gbps of broadband for up to a few miles. There have been 60 GHz radios on the market for several years that provide point-to-point connections. These are mostly used to beam gigabit broadband in places where that’s cheaper than building fiber, like on college campuses or in downtown highrises.

This spectrum can also be used for hotspots, as is being done by Verizon in cities. In the Verizon application, the millimeter wave spectrum is put on pole-mounted transmitters in downtown areas to deliver data to cellphones as fast as 1 Gbps. This could also be deployed in more traditional hotspots like coffee shops. The problem with using 60 GHz spectrum this way is that there are almost no devices yet that can receive the signal. This isn’t going to get widespread acceptance until somebody builds the capability into laptops or develops a cheap dongle. My guess is that cellphone makers will ignore 60 GHz in favor of the licensed bands owned by the cellular providers.

The spectrum could also be used to create wireless fiber-to-the-curb, as was demonstrated by Verizon in a few neighborhoods in Sacramento and a few other cities earlier this year. The company is delivering residential broadband at speeds of around 300 Mbps. These two frequency bands are higher than what Verizon is using and so won’t carry as far from the curb to homes – we’ll have to wait until somebody tests this to see if it’s feasible. The big cost of this business plan will still be the cost of building the fiber to feed the transmitters.

The really interesting use of the spectrum is for indoor hotspots. The spectrum can easily deliver multiple gigabits of speed within a room, and unlike WiFi spectrum it won’t go through walls and interfere with neighboring rooms. This spectrum would eliminate many of the problems with WiFi in homes and in apartment buildings – but again, this needs to first be built into laptops, smart TVs and other devices.

Unfortunately, the vendors in the industry are currently focused on developing equipment for the licensed spectrum that the big cellular companies will be using. You can’t blame the vendors for concentrating their efforts on the 24, 28, and 39 GHz ranges before looking at these alternate bands. There is always a bit of a catch-22 when introducing any new spectrum – a vendor needs to make the equipment available before anybody can try it, and vendors won’t make the equipment until they have a proven market.

Electronics for millimeter wave spectrum are not as easily created as equipment for lower frequency bands. For instance, in the lower spectrum bands, software-defined radios can easily change between nearby frequencies with no modification of hardware. However, each band of millimeter wave spectrum has different operating characteristics and specific antenna requirements, and it’s not nearly as easy to shift between a 39 GHz radio and a 60 GHz radio – the requirements are different for each.

And that means that equipment vendors will need to enter the market if these spectrum bands are ever going to find widespread public use. Hopefully, vendors will find this worth their while, because this is a new WiFi opportunity. Wireless vendors have made their living in the WiFi space, and they need to be convinced that they have the same opportunity with these widely available spectrum bands. I believe that if some vendor builds indoor multi-gigabit routers and receivers, the users will come.

The Busy Skies

I was looking over the stated goals of the broadband satellite companies and was struck by the sheer numbers of satellites that are being planned. The table further down in the blog shows plans for nearly 15,000 new satellites.

To put this into perspective, consider the number of satellites ever shot into space. The United Nations Office for Outer Space Affairs (UNOOSA) has been tracking space launches for decades. They report that there have been 8,378 objects put into space since the first Sputnik in 1957. As of the beginning of 2019, there were 4,987 satellites still in orbit, although only 1,957 of them were still operational.

There was an average of 131 satellites launched per year between 1964 and 2012. Since 2012 we’ve seen 1,731 new satellites, with 2017 (453) and 2018 (382) seeing the most satellites put into space.

The logistics of getting this many new satellites into space are daunting. We’ve already seen OneWeb fall behind schedule. In addition to these satellites, there will continue to be numerous satellites launched for other purposes. I note that a few hundred of these are already in orbit. In the following table, “Current” means satellites planned for the next 3-4 years.

               Current   Future    Total
Starlink         4,425    7,528   11,953
OneWeb             650    1,260    1,910
Telesat            117      512      629
Samsung                   4,600    4,600
Kuiper                    3,326    3,326
Boeing                      147      147
Kepler             140               140
LeoSat              78       30      108
Iridium Next        66                66
SES O3b             27                27
Facebook             1                 1
Total            5,192    9,300   14,492

While space is a big place, there are some interesting challenges from having this many new objects in orbit. One of the biggest concerns is space debris. Low earth orbit satellites travel at a speed of about 17,500 miles per hour to maintain orbit. When satellites collide at that speed, they create a large number of new pieces of space junk, also traveling at high speed. NASA estimates there are currently over 128 million pieces of orbiting debris smaller than 1 centimeter and 900,000 objects between 1 and 10 centimeters.
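
The 17,500 mph figure is easy to sanity-check from the standard circular-orbit formula; the altitudes below are just representative of the LEO broadband constellations.

```python
import math

MU_EARTH = 3.986e14        # Earth's gravitational parameter (m^3/s^2)
EARTH_RADIUS_M = 6.371e6   # mean Earth radius (m)

def circular_orbit_speed_mph(altitude_km):
    """Speed needed to maintain a circular orbit: v = sqrt(mu / r)."""
    r = EARTH_RADIUS_M + altitude_km * 1_000
    return math.sqrt(MU_EARTH / r) * 2.23694  # m/s -> mph

for alt_km in (340, 550, 1_200):  # representative LEO broadband altitudes
    print(f"{alt_km:5,} km altitude: {circular_orbit_speed_mph(alt_km):,.0f} mph")
```

Every altitude in the LEO range works out to roughly 16,000–17,500 mph, and two satellites on crossing orbits can collide at even higher relative speeds.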

NASA scientist Donald Kessler described the dangers of space debris in 1978 in what’s now called the Kessler syndrome. Every space collision creates more debris, and eventually there will be a cloud of circling debris that will make it nearly impossible to maintain satellites in space. While scientists think that such a cloud is almost inevitable, some worry that a major collision between two large satellites, or malicious destruction by a bad-actor government, could accelerate the process and quickly knock out all of the satellites in a given orbit. It would be ironic if the world solves the rural broadband problem using satellites, only to see those satellites disappear in a cloud of debris.

Having so many satellites in orbit also concerns another group of scientists. The International Dark-Sky Association has been fighting against light pollution that makes it hard to use earth-based telescopes. The group now also warns that a large number of new satellites will forever change our night sky. From any given spot on Earth, the human eye can see roughly 1,300 stars. These satellites are all visible, and once they’re launched, mankind will never again see a natural sky that doesn’t contain numerous satellites at any given moment.

Satellite broadband is an exciting idea. The concept of bringing good broadband to remote people, to ships, and to airplanes is enticing. For example, Kepler, listed above, today connects scientific monitors in places like the lips of volcanoes and on ocean buoys and is helping us to better understand our world. However, in launching huge numbers of satellites for broadband we’re possibly polluting space in a way that could make it unusable for future generations.

Robocalls and Small Carriers

In July, NTCA filed comments in the FCC docket that is looking at an industry-wide solution to fight against robocalls. The comments outline some major concerns about the ability of small carriers to participate in the process.

The industry solution to stop robocalls, which I have blogged about before, is being referred to as SHAKEN/STIR. This new technology will create an encrypted token that verifies that a given call really originated with the phone number listed in the caller ID. Robocalls can’t get this verification token. Today, robocallers spoof telephone numbers, meaning that they insert a calling number into the caller ID that is not real. These bad actors can make a call look like it’s coming from any number – even your own!

On phones with visual caller ID, like cellphones, a small token will appear to verify that the calling party really is calling from the number shown. Once the technology has been in place for a while, people will learn to ignore calls that don’t come with the token. If the industry does this right, it will become easier to spot robocalls, and I imagine a lot of people will use apps that automatically block calls without a token.
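
To make the token idea concrete, here’s a stripped-down sketch of the sign-and-verify flow in Python. Real SHAKEN carries an ES256-signed PASSporT token (RFC 8225) in SIP signaling, with signing certificates issued under the industry’s STI governance framework – the field names, key handling, and functions below are a simplified illustration of the concept, not the actual standard.

```python
import base64, json, time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_call(private_key, orig_tn, dest_tn, attest="A"):
    """The originating carrier attests that the caller really owns orig_tn."""
    claims = {"attest": attest, "orig": {"tn": orig_tn},
              "dest": {"tn": [dest_tn]}, "iat": int(time.time())}
    body = json.dumps(claims, sort_keys=True).encode()
    signature = private_key.sign(body, ec.ECDSA(hashes.SHA256()))
    return b64url(body) + "." + b64url(signature)

def verify_call(public_key, token):
    """The terminating carrier checks the token before trusting caller ID."""
    body_b64, sig_b64 = token.split(".")
    pad = lambda s: s + "=" * (-len(s) % 4)
    body = base64.urlsafe_b64decode(pad(body_b64))
    signature = base64.urlsafe_b64decode(pad(sig_b64))
    try:
        public_key.verify(signature, body, ec.ECDSA(hashes.SHA256()))
        return json.loads(body)   # verified claims -> show the checkmark
    except Exception:
        return None               # no valid token -> flag as unverified

carrier_key = ec.generate_private_key(ec.SECP256R1())
token = sign_call(carrier_key, orig_tn="+13035551234", dest_tn="+12125556789")
print(verify_call(carrier_key.public_key(), token))
```

The key point is that a spoofer without the originating carrier’s private key can’t produce a signature that verifies, so a spoofed caller ID arrives without a valid token.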

NTCA is concerned that small carriers will be shut out of this system, causing huge harm to them and their customers. Several network prerequisites must be in place to handle the SHAKEN/STIR token process. First, the originating telephone switch must be digital. Most, but not all, small carriers now use digital switches. Any telco or CLEC using an older non-digital switch will be shut out of the process, and to participate they’d have to buy a new digital switch. After the many-year decline in telephone customers, such a purchase might be hard to cost-justify. I’m picturing that this might also be a problem for older PBXs – the switches operated by private businesses. The world is full of large legacy PBXs operated by universities, cities, hospitals and large businesses.

Second, the SHAKEN/STIR solution is likely to require an expensive software upgrade for the carriers using digital switches. Again, due to the shrinking demand for selling voice, many small carriers are going to have a hard time justifying the cost of a software upgrade. Anybody using an off-brand digital switch (several switch vendors folded over the last decade) might not have a workable software solution.

The third requirement to participate in SHAKEN/STIR is that the entire path connecting a switch to the public switched telephone network (PSTN) must be end-to-end digital – in practice, IP-based, since the verification token rides in the call’s SIP signaling. This is a huge problem because most small telcos, CLECs, cable companies, and other carriers connect to the PSTN using the older TDM technology (based upon multiples of T1s).

You might recall a decade ago there was a big stir about what the FCC termed a ‘digital transition’. The FCC at the time wanted to migrate the whole PSTN to a digital platform largely based upon SIP trunking. While there was a huge industry effort at the time to figure out how to implement the transition, the effort quietly died and the PSTN is still largely based on TDM technology.

I have clients who have asked for digital trunking (the connection between networks) for years, but almost none of them have succeeded. The large telcos like AT&T, Verizon, and CenturyLink don’t want to spend the money at their end to put in new technology for this purpose. A request to go all-digital is either flatly refused, or else a small carrier is told that they must pay to transport their network traffic to some distant major switching point in a place like Chicago or Denver – an expensive proposition.

What happens to a company that doesn’t participate in SHAKEN/STIR? It won’t be pretty because all of the calls originating from such a carrier won’t get a token verifying that the calls are legitimate. This could be devastating to rural America. Once SHAKEN/STIR is in place for a while a lot of people will refuse to accept unverified calls – and that means calls coming from small carriers won’t be answered. This will also affect a lot of cellular calls because in rural America those calls often originate behind TDM trunking.

We already have a problem with rural call completion, meaning that calls placed to rural areas often fail to go through. If small carriers can’t participate in SHAKEN/STIR, after a time their callers will have real problems placing calls because a lot of the world won’t accept calls that are not verified with a token.

The big telcos have assured the FCC that this can be made to work. It’s my understanding that the big telcos have mistakenly told the FCC that the PSTN in the country is mostly all-digital. I can understand why the big telcos might do this because they are under tremendous pressure from the FCC and Congress to tackle the robocall issue. These big companies are only looking out for themselves and not the rest of the industry.

I already had my doubts about the SHAKEN/STIR solution because my guess is that bad actors will find a way to fake the tokens. One only has to look back at the decades-old battles against spam email and against hackers to understand that solving robocalling will require a long back-and-forth battle – the first stab at SHAKEN/STIR is not going to fix the problem. The process is even more unlikely to work if it doesn’t function for large parts of the country and for whole rural communities. The FCC needs to listen to NTCA and other rural voices and not create another disaster for rural America.

Is There a Business Case for Fast Cellular?

We’ve gotten a glimpse of the challenges of marketing faster cellular service since the major cellular providers in South Korea made a big push to offer ultrafast cellular broadband. Two of South Korea’s primary cellular carriers – SK Telecom and KT – have implemented cellular products using millimeter wave spectrum in Seoul and other dense urban areas.

The technology is nearly identical to the technology introduced by Verizon in small sections of major US cities. It uses millimeter wave hot spots on small cell sites to beam broadband to phones that are equipped to use the ultra-high spectrum. In South Korea, both companies are selling a millimeter wave spectrum version of the Samsung Galaxy. In the US there are still barely any handset options.

5G hotspot data is not the same as traditional cellular data. The small cells blast out gigabit broadband that carries only for short distances of 500 to 800 feet. The signals can bounce off buildings in the right circumstances and can be received sporadically at greater distances from the transmitters. Millimeter wave spectrum won’t pass through obstacles, and reception can be blocked by anything in the environment, including the body of the person using the cellphone.

Even with those limitations, the speeds delivered with this technology are far faster than traditional cellular data speeds. Verizon has reported peak speeds as fast as 600 Mbps in trials in US cities. That’s an amazing amount of bandwidth to deliver to a cellphone, since a cellphone is, by definition, a single-user device. Since the average 4G LTE data speed is less than 25 Mbps, our cellphone apps are not designed to be bandwidth hogs. Current 4G speeds are more than adequate to stream video, and with the small screens there’s no benefit to streaming in 4K or even in 1080p. All of the major cellular carriers already chop down the quality of video streams and thus use only a fraction of the bandwidth needed to deliver a single video stream to a home. Cellphones are also not designed to handle multiple simultaneous tasks.

For now, the biggest benefit of millimeter wave spectrum for cellphones looks to be the ability to quickly download big files like movies, apps or software updates. There is certainly an appeal to downloading a big movie to watch later in less than 30 seconds rather than the more normal 10 minutes. But with data caps on even most unlimited plans I have to wonder how many people routinely download big movie files when they aren’t connected to WiFi.
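
The 30-seconds-versus-10-minutes comparison is simple arithmetic; in the sketch below, the 2 GB movie size is an illustrative assumption.

```python
def download_seconds(file_gb, speed_mbps):
    """Seconds to move a file, ignoring protocol overhead (1 GB = 8,000 Mb)."""
    return file_gb * 8_000 / speed_mbps

movie_gb = 2.0  # a typical compressed HD movie; the size is an assumption
for label, mbps in [("average 4G LTE", 25), ("mmWave peak", 600)]:
    secs = download_seconds(movie_gb, mbps)
    print(f"{label} ({mbps} Mbps): {secs / 60:4.1f} minutes")
```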

Another way that faster cellular speeds could be beneficial is faster web browsing. However, the slow cellphone browsing we experience today is not due to 4G LTE speeds, which are adequate for a decent browsing experience. The painfully slow browsing on cellphones is due to operating systems that favor display over functionality – the cellular companies have chosen to downplay browsing speed in favor of maximizing the display for phone apps. Faster millimeter wave spectrum won’t overcome this inherent and deliberate software limitation.

There is another use for faster broadband. South Korea likely has a much higher demand for high-speed cellular because the country is game-crazy. A large majority of the population, including adults, are heavily involved in intensive gaming. There is obviously some appeal for having a fast gaming connection when away from a desktop.

South Korean market analysts are looking at the cost of millimeter wave deployment and the potential revenue stream and are already wondering if this is a good investment. SK Telecom expects to have 2 million customers for the faster broadband by the end of this year, and sales of millimeter wave spectrum phones in South Korea are going well. (These can’t properly be called 5G phones because they don’t handle frequency slicing or the slew of other 5G features that won’t be introduced for at least three more years.)

If the analysts in South Korea don’t see the financial benefits, it’s much harder to see the benefits here. Remember that in South Korea urban homes can already buy gigabit broadband at home for the equivalent of $30 per month. Moreover, the two big ISPs are in the process of upgrading everybody to 10 Gbps within the next five years. This is a country where everybody has been trained to expect an instant response online – and the faster cellular speeds can bring that expected response to mobility.

The business plan here in the US is a lot more challenging. In South Korea, a lot of people live in dense urban city centers unlike our spread-out population with far-stretching suburbs around cities. The network cost to deploy the millimeter wave technology here will be significantly higher to achieve the same kind of coverage seen in South Korea. At least for now, it’s also a lot harder to paint a picture in the US for large numbers of users willing to pay extra for faster cellular data. Several recent surveys indicate that US consumers think faster 5G data speeds should be offered at the same high prices we already pay for cellular broadband (the US has some of the highest cellular data prices among industrial countries).

I can’t see a major play here for ultra-fast cellular broadband outside of dense city centers and perhaps in places like stadiums and convention centers. It’s hard to think that deploying this technology in the suburbs could ever be cost-justified. We are likely to upgrade cellular data to the more normal 5G using mid-range spectrum, and that will nudge cellular data speeds up to 100 Mbps over time. I think most users here will love somewhat faster speeds but won’t be willing to pay extra for them. It’s hard to believe there are enough people in the US willing to pay a premium for millimeter wave speeds to justify the cost of deploying the networks. This is further compounded by the fact that these millimeter wave networks are outdoor-only and the spectrum doesn’t penetrate buildings at all. The US has become an indoor society. At least where I live you rarely see teenagers outdoors in their home neighborhood – they are consuming broadband indoors. Does anybody really care about a fast outdoor network?

Why Aren’t We Talking about Technology Disruption?

One of the most interesting aspects of modern society is how rapidly we adapt to new technology. Perhaps the best illustration of this is the smartphone. In the short period of a decade, we went from a new invention to the point where the large majority of the American public has a smartphone. Today the smartphone is so pervasive that recent statistics from Pew show that 96% of those between 18 and 29 have a smartphone.

Innovation is exploding in nearly every field of technology, and the public has gotten so used to change that we barely notice announcements that would have made worldwide headlines a few decades ago. I remember as a kid when Life Magazine had an issue largely dedicated to nylon and polymers and had the world talking about something that wouldn’t even get noticed today. People seem to accept miracle materials, gene splicing, and self-driving cars as normal technical advances. People now give DNA test kits as Christmas presents. Nobody blinks an eye when big data is used to profile and track us all. We accept cloud computing as just another computer technology. In our little broadband corner of the technology world, the general public has learned that fiber and gigabit speeds are the desired broadband technology.

What I find perhaps particularly interesting is that we don’t talk much about upcoming technologies that will completely change the world. A few technologies get talked to death such as 5G and self-driving cars. But technologists now understand that 5G is, in itself, not a disruptive technology – although it might unleash other disruptive technologies such as ubiquitous sensors throughout our environment. The idea of self-driving cars no longer seems disruptive since I can already achieve the same outcome by calling an Uber. The advent of self-driving semi trucks will be far more disruptive and will lower the cost of the nationwide supply chain when we use fleets of self-driving electric trucks.

I’ve always been intrigued by those who peer into the future, and I read everything I can find about upcoming technologies. From what I read, there are a few truly disruptive technologies on the horizon. Consider the following innovations that aren’t too far in the future:

Talking to Computers. This will be the most important breakthrough in history in terms of the interface between humans and technology. In a few short generations, we’ve gone from typing on keyboards, to using a mouse, to using cellphones – but the end game will be talking directly to our computers using natural conversational language. We’ve already seen significant progress with natural language processing and are on a path to be able to converse with computers in the same way we communicate with other people. That will trigger a huge transition in society. Computers will fade into the background since we’ll have the full power of the cloud anywhere we have a connection. Today we get a tiny inkling of this from Apple Siri or Amazon Alexa – but these are rudimentary voice recognition systems. It’s nearly impossible to predict how mankind will react to having the full power of the web with us all of the time.

Space Elevator. In 2012 the Japanese announced a nationwide goal of building a space elevator by 2050. That goal has now been pulled forward to 2045. A space elevator will be transformational since it will free mankind from the confines of planet Earth. With a space elevator we can cheaply and safely move people and materials to and from space. We can drag up the raw materials needed to build huge space factories that can then take advantage of the mineral riches of the asteroid belt. From there we can colonize the Moon and Mars, build huge space cities and build spaceships to explore nearby stars. The cost of the space elevator is still estimated at only around $90 billion – about the same as the cost of the high-speed rail system between Osaka and Tokyo.

Alternate Energy. We are in the process of weaning mankind from fossil fuel energy sources. While there is a long way to go, several countries in Europe have the goal to be off carbon fuels within the coming decade. The EU already gets 30% of its electricity from alternate energy sources. The big breakthrough might finally come from fusion power. This is something that has been 30 years away for my whole adult life, but scientists at MIT and other places have developed the magnets needed to contain the plasma for a fusion reaction, and some scientists now predict fusion power is only 15 years away. Fusion power would supply unlimited non-polluting energy, which would transform the whole world, particularly the third world.

An argument can be made that there are other equally disruptive technologies on the horizon like artificial intelligence, robotics, gene-editing, virtual reality, battery storage, and big data processing. Nothing on the list would be as significant as a self-aware computer – but many scientists still think that’s likely to be far into the future. What we can be sure of is that breakthroughs in technology and science will continue to come at us rapidly from all directions. I wonder if the general public will even notice the most important breakthroughs or if change has gotten so ho-hum that it’s just an expected part of life.

Terahertz WiFi

While labs across the world are busy figuring out how to implement the 5G standards, scientists are already working in higher frequency spectrum looking to achieve even faster speeds. The frequencies just now being explored are labeled the terahertz range – 300 GHz and higher. This is the upper range of the radio spectrum, lying just below infrared light.

Research in these frequencies started around 2010, and since then the achieved broadband transmission speeds have progressed steadily. The first big announced breakthrough in the spectrum came in 2016 when scientists at the Tokyo Institute of Technology achieved speeds of 34 Gbps using the WiFi standard and the 500 GHz spectrum range.

In 2017, researchers at Brown University School of Engineering were able to achieve 50 Gbps. Later that year a team of scientists from Hiroshima University, the National Institute of Information and Communications Technology and Panasonic Corporation achieved a speed of 105 Gbps. This team has also subsequently developed a transceiver chip that can send and receive data at 80 Gbps – meaning these faster speeds could be moved out of the lab and into production.

As with all radio spectrum, when transmitted through the air, the higher the frequency the shorter the distance before a transmission scatters. That makes the short transmission distances the biggest challenge for using these frequencies. However, several of the research teams have shown that transmissions perform well when bounced off walls, and the hope is to eventually achieve distances as long as 10 meters (about 33 feet).

The real benefit of superfast bandwidth will likely be for super-short distances. One of the uses of these frequencies could be to beam data into computer processors. One of the biggest impediments to faster computing is the physical act of getting data to where it’s needed on time, and terahertz lasers could be used to speed up chips.

Another promising use of the faster lasers is to create faster transmission paths on fiber. Scientists have already been experimenting, and it looks like these frequencies can be channeled through extremely thin fibers to achieve speeds much faster than anything available today. Putting this application into the field is probably a decade or more away – but it’s a breakthrough that’s needed. Network engineers have already been predicting that we will exhaust the capabilities of current fiber technology on the primary Internet transmission paths between major POPs. As the volume of bandwidth we use keeps doubling, within a decade or two we will be transmitting more data between places like New York and Washington DC than all of the existing fibers can theoretically carry. When fiber routes get that full, the problem can’t be easily fixed by adding more fibers – not when volumes double every few years. We need solutions that fit more data into existing fibers.
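
The compounding is worth spelling out. If backbone traffic doubles every three years (the exact doubling period is an assumption – published estimates vary), demand grows roughly tenfold per decade:

```python
# Growth factor after `years` if traffic doubles every `doubling_period_years`.
def traffic_growth(years, doubling_period_years=3):
    return 2 ** (years / doubling_period_years)

for years in (5, 10, 20):
    print(f"In {years:2d} years: {traffic_growth(years):7,.0f}x today's traffic")
```

A hundredfold increase over two decades is why fitting more bits into existing glass matters more than pulling new cables.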

There are other applications that could use higher frequencies today. For example, there are specific applications like real-time medical imaging and real-time processing for intricate chemical engineering that need faster bandwidth than is possible with 5G. The automated factories that will create genetic-based drug solutions will need much faster bandwidth. There are other more mundane uses of the higher frequencies. For example, these frequencies could be used to replace X-rays and reduce radiation risks in doctors’ offices and airports.

No matter what else the higher frequencies can achieve, I’m holding out for Star Trek holodecks. The faster terahertz frequencies could support creation of the complex real-time images involved in truly immersive entertainment.

These frequencies will become the workhorse for 6G, the next generation of wireless technology. The early stages of developing a 6G standard are underway, with expectations of having a standard by perhaps 2030. Of course, the hype for 6G has also already begun. I’ve already seen several tech articles that talk about the potential for having ultrafast cellular service using these frequencies. The authors of these articles don’t seem to grasp that we’d need a cell site every twenty feet – but facts don’t seem to get in the way of good wireless hype.

Are You Ready for 10 Gbps?

Around the world, we’re seeing some migration to 10 Gbps residential broadband. During the last year the broadband providers in South Korea, Japan, and China began upgrading to next-generation PON and are offering blazingly fast broadband products to consumers. South Korea is leading the pack and expects to have 10 Gbps speeds available to about 50% of subscribers by the end of 2022.

In the US there are a handful of ISPs offering a 10 Gbps product, mostly for the publicity – but they stand ready to install the faster product. Notable are Fibrant in Salisbury, NC and EPB in Chattanooga; EPB was also among the first to offer a 1 Gbps residential product a few years ago.

I have a lot of clients who already offer 10 Gbps connections to large business and carrier customers like data centers and hospital complexes. However, except for the few pioneers, these larger bandwidth products are being delivered directly to a single customer using active Ethernet technology.

There are a few hurdles to offering speeds over a gigabit in the US. Perhaps foremost is that there are no off-the-shelf customer electronics that can handle speeds over a gigabit – the typical WiFi routers and computers work at slower speeds. The biggest hurdle for an ISP continues to be the cost of the electronics. Today the cost of next-generation PON equipment is high and will remain so until the volume of sales brings per-unit prices down. The industry market research firm Ovum predicts that we’ll see widespread 10 Gbps consumer products starting in 2020 but not gaining traction until 2024.

In China, Huawei leads the pack. The company has a 10 Gbps PON system that is integrated with a 6 Gbps WiFi 6 router for the home. The system is an easy overlay on top of the company’s traditional GPON network gear. In South Korea the largest ISP, SK Broadband, has worked with Nokia to develop a proprietary PON technology used today only inside South Korea. Like Huawei’s, this overlays onto the existing GPON network. In Japan the 10 Gbps PON network is powered by Sumitomo, a technology only being sold in Japan. None of these technologies has made a dent in the US market, with Huawei currently banned due to security concerns.

In the US there are two technologies being trialed. AT&T is experimenting with XGS-PON technology. They plan to offer 2 Gbps broadband, upgradable to 10 Gbps, in the new high-tech community of Walsh Ranch being built outside Ft. Worth. AT&T is currently trialing the technology at several locations within its FTTP network, which now covers over 12 million passings. Verizon is trialing NG-PON2 technology but is mostly planning to use it to power cell sites. It’s going to be hard for any ISP to justify deploying the new technologies until somebody buys enough units to pull down the cost.
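
A quick way to see what these PON generations mean for a neighborhood is to divide the shared downstream capacity by the split ratio. The 1:32 split below is a common design choice, not a requirement:

```python
def worst_case_mbps_per_home(pon_downstream_gbps, split_ratio=32):
    """Each home's share if every home on the PON transmits at once."""
    return pon_downstream_gbps * 1_000 / split_ratio

# Nominal downstream capacities per PON generation, in Gbps
for tech, gbps in [("GPON", 2.5), ("XGS-PON", 10), ("25G PON", 25)]:
    share = worst_case_mbps_per_home(gbps)
    print(f"{tech:8s}: {share:6,.0f} Mbps per home at a 1:32 split")
```

Homes rarely peak simultaneously, so ISPs can sell tiers well above the worst-case share – but selling gigabit tiers on GPON already leans on that statistical sharing, which is part of the pressure to move to 10 Gbps PON.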

Interestingly, CableLabs is also working on a DOCSIS upgrade that will allow speeds up to 10 Gbps. The problem most cable networks will have is finding space on their networks for the channels needed to support the faster speeds.

There are already vendors and labs exploring 25 Gbps and 50 Gbps PON. These products will likely be used for backhaul and transport at first. The Chinese vendors think the leap forward should be to 50 Gbps, while other vendors are considering a 25 Gbps upgrade path.

The real question that needs to be answered is whether there is any market for 10 Gbps bandwidth outside the normally expected uses like cellular towers, data centers, and large business customers. This same question was asked when EPB in Chattanooga and LUS in Lafayette, Louisiana rolled out the earliest 1 Gbps residential bandwidth. Both companies were a bit surprised when they got a few instant takers for the faster products – in both markets from doctors who wanted to be able to analyze MRIs and other big files at home. There are likely a few customers who need speeds above 1 Gbps, with doctors again being good candidates. Just as broadband speeds have advanced, the medical imaging world has grown more sophisticated in the last decade and is creating huge data files. The ability to download these files quickly offsite will be tempting to doctors.

I think we are finally on the verge of seeing data use cases that can eat up most of a gigabit of bandwidth in the residential environment. For example, uncompressed virtual and augmented reality can require masses of downloaded data in nearly real-time. As we start seeing use cases for gigabit speeds, the history of broadband has shown that the need for faster speeds is probably not far behind.

The End of the Central Office?

One of the traditional costs of bringing fiber to a new market has always included the creation of some kind of central office space. This might mean modifying space in an existing building or building a new building or large hut. In years past a central office required a lot of physical space, but we are finally at the point where technology is making the need for a big central office disappear.

A traditional central office started with the need to house the fiber-terminating electronics that connect the new market to the outside world. There is also a need to house and light the customer-facing electronics – although in some network design configurations some of the customer-facing electronics can be housed in remote huts in neighborhoods.

A traditional central office needs room for a lot of other equipment. First is significant space for the batteries that provide short-term backup during power outages; for safety reasons the batteries are often placed in a separate room. Central offices also need space for the power plant that converts AC power to DC power. They usually need significant air conditioning as well, along with room to house the cooling units. If the fiber network terminating at a central office is large enough, there is also a requirement for a fiber management system to separate the individual fibers in a neat and sensible way. Finally, if the above needs meant building a large enough space, many ISPs also included working and office space for technicians.
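
The battery room is a good example of how the space requirement scales with load. Here’s a rough -48 VDC battery-string sizing calculation; the loads, the 8-hour backup target, and the 80% usable-capacity limit are all illustrative assumptions:

```python
def battery_string_amp_hours(load_watts, backup_hours,
                             bus_volts=48, usable_fraction=0.8):
    """Rough amp-hour capacity needed on a -48 VDC plant for a given load."""
    return load_watts * backup_hours / bus_volts / usable_fraction

# Both load figures are illustrative, not measurements of any real site
for label, watts in [("modern cabinet electronics", 1_500),
                     ("legacy central office load", 15_000)]:
    ah = battery_string_amp_hours(watts, backup_hours=8)
    print(f"{label}: ~{ah:,.0f} Ah for 8 hours of backup")
```

Cut the electrical load by a factor of ten and the battery plant shrinks by the same factor – a large share of the floor space a traditional central office devoted to batteries.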

Lately I’ve seen several fiber deployments that don’t require a large traditional central office. This is largely due to the evolution of the electronics used for serving customers in a FTTP network. For example, OLT (optical line termination) electronics have been significantly compressed in size and density, and a shelf of equipment can now perform the same functions that would have required much of a full rack a decade ago. As the equipment has shrunk, the power requirements have also dropped, reducing the size of the power plant and the batteries.

I’ve seen several markets where a large cabinet provides enough room to replace what would have required a full central office a decade ago. These are not small towns, and two of the deployments are for towns with populations over 20,000.

As the footprint of the ‘central office’ has decreased there’s been a corresponding drop in costs. Several supply houses will now pre-install everything needed into the smaller cabinet or hut and deliver the whole unit complete and ready to go after connecting power and splicing the fiber.

What I find interesting is that I still see some new markets built in the more traditional way. In that same market of 20,000 people it’s possible to still use a configuration that constructs several huts around the city to house the OLT electronics. For purposes of this blog I’ll refer to that as a distributed configuration.

There are pros and cons to both configurations. The biggest benefit of having one core hut or cabinet is lower cost. That means one pre-fab building instead of having to build huts or cabinets at several sites.

The distributed design also has advantages. A redundant fiber ring can be established with a network of at least three huts, meaning that fewer parts of the market will lose service due to a fiber cut near the core hub. But the distributed network also means more electronics, since there is now a need for electronics to light the fiber ring.

The other advantage of a distributed network is that there are fewer fibers terminating to each hut compared to having all customer fibers terminating to a single hut. The distributed network likely also has smaller fibers in the distribution network since fiber can be sized for a neighborhood rather than for the whole market. That might mean less splicing required during the initial construction.

Anybody building a new fiber network needs to consider these two options. If the market is large enough then the distributed network becomes mandatory. However, many engineers seem to be stuck with the idea that they need multiple huts and a fiber ring even for smaller towns. That means paying a premium price to achieve more safety against customer outages. However, since raising the money to build a fiber network is often the number one business consideration, the ability to save electronics costs can be compelling. It would not be unusual to see the single-hub configuration save half a million dollars or more. There is no configuration that is the right choice for all situations. Just be sure if you’re building FTTP in a new market that you consider the options.
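
A toy cost model shows the shape of that tradeoff. Every dollar figure below is a hypothetical placeholder rather than a real quote – the structural point is that ring electronics multiply with the hut count:

```python
def design_cost(hut_count, cost_per_hut, ring_electronics_per_hut=0):
    """Total cost of huts plus the electronics needed to light a ring."""
    return hut_count * (cost_per_hut + ring_electronics_per_hut)

# All dollar amounts are made-up placeholders for illustration
single_hub = design_cost(hut_count=1, cost_per_hut=250_000)
distributed = design_cost(hut_count=3, cost_per_hut=200_000,
                          ring_electronics_per_hut=75_000)
print(f"Single hub:   ${single_hub:>9,}")
print(f"Distributed:  ${distributed:>9,}")
print(f"Premium paid for ring redundancy: ${distributed - single_hub:,}")
```

Under these made-up numbers the redundancy premium is over half a million dollars, consistent with the savings described above; plug in real vendor quotes to make the comparison for an actual market.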

Millimeter Wave 5G is Fiber-to-the-Curb

I’ve been thinking about and writing about 5G broadband using millimeter wave spectrum for over a year. This is the broadband product that Verizon launched in Sacramento and a few other markets as a trial last year. I don’t know why it never struck me that this technology is the newest permutation of fiber-to-the-curb.

That’s an important distinction to make because naming it this way makes it clear to anybody hearing about the technology that the network is mostly fiber with wireless only for the last few hundred feet.

I remember seeing a trial of fiber-to-the-curb back in the very early 2000s. A guy from the horse country in Virginia had developed the technology of delivering broadband from the pole into the home using radios. He had a working demo of the technology at his rural home. Even then he was beaming fast speeds – his demo delivered an uncompressed video signal from curb to home. He knew that the radios could be made capable of a lot more speed, but in those days I’m sure he didn’t think about gigabit speeds.

The issues that stopped his idea from being practical have been a barrier until recently. First was the issue of getting the needed spectrum. He wanted to use what we now call midrange spectrum, which was considered a high spectrum band in 2000 – he would have had to convince the FCC to carve out a slice of spectrum for his application, something that’s always been difficult. He also didn’t have any practical way of getting the needed bandwidth to the pole. ISPs were still selling T1s, 1 Mbps DSL, and 1 Mbps cable modem service, and while fiber existed, the cost of the electronics for terminating fiber to devices on multiple poles was astronomical. Finally, even then, this guy had a hard time explaining how it would be cheaper to use wireless to get to the home rather than building a drop wire.

Verizon press releases would make you think that they will be conquering the world with millimeter wave radios and deploying the technology everywhere. However, once you think of this as fiber-to-the-curb, that business plan quickly makes no sense. The cost of a fiber-to-the-curb network is mostly in the fiber. Any savings from using millimeter wave radios apply only to the last few hundred feet. For this technology to be compelling, the savings over the last few hundred feet have to be significant. Do the radio electronics really cost less than fiber drops and fiber electronics?

Any such comparison must consider all the costs of each technology – installations, repairs, maintenance, and periodic replacement of electronics. And the comparisons need to be honest. For example, every other wireless technology I know requires more maintenance truck rolls than fiber-based technologies due to the squirrelly nature of how wireless behaves in the wild.
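
Framed as a simple model, the comparison looks like the sketch below. Every number is a made-up placeholder – the structure is the point: the pole radio’s cost gets divided across the homes it serves, while differences in truck rolls compound over the life of the network.

```python
def per_home_lifecycle_cost(shared_equipment, homes_sharing, premise_equipment,
                            truck_rolls_per_year, truck_roll_cost, years=10):
    """Capex per home plus expected maintenance visits over the study period."""
    capex = shared_equipment / homes_sharing + premise_equipment
    opex = truck_rolls_per_year * truck_roll_cost * years
    return capex + opex

# All inputs below are hypothetical placeholders, not real vendor pricing
mmwave = per_home_lifecycle_cost(shared_equipment=8_000, homes_sharing=12,
                                 premise_equipment=300,
                                 truck_rolls_per_year=0.3, truck_roll_cost=150)
fiber_drop = per_home_lifecycle_cost(shared_equipment=0, homes_sharing=1,
                                     premise_equipment=850,  # drop + ONT
                                     truck_rolls_per_year=0.1, truck_roll_cost=150)
print(f"mmWave curb-to-home, per home over 10 years: ${mmwave:,.0f}")
print(f"Fiber drop, per home over 10 years:          ${fiber_drop:,.0f}")
```

Under these particular assumptions the fiber drop wins; change the homes-per-radio or drop-cost assumptions and the answer flips – which is exactly why the industry can’t judge Verizon’s plan without knowing its real costs.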

Even should the radios become much cheaper than fiber drops, the business case for the technology might still have no legs. There is no way to get around the underlying fact that fiber-to-the-curb means building fiber along residential streets. Verizon has always said that they didn’t extend their fiber FiOS network to neighborhoods where the construction costs were too high. Verizon still seems to be the most cautious of the big ISPs and it’s hard to think that they’ve changed this philosophy. Perhaps the Verizon business plan is to cherry pick in markets outside their footprint, but only where they have the low-cost option of overlashing fiber. If that’s their real business plan then they will not be conquering the world with 5G, but just cherry picking neighborhoods that meet their price profile – a much smaller footprint and business plan than most of the industry is expecting.

My hope is that the rest of the industry starts referring to this technology as fiber-to-the-curb instead of calling it 5G. The wireless companies have gained great advantage from using the 5G name for multiple technologies. They have constantly used the speeds from the fiber-to-the-curb trials and the hot spot trials to make the public think the future means gigabit cellular service. It’s time to start demystifying 5G and using a different name for the different technologies.

Once this is understood it ought to finally be clear that millimeter wave fiber-to-the-curb is not coming everywhere. This sounds incredibly expensive to build in neighborhoods with already-buried utilities. Where density is low it might turn out that fiber-to-the-curb is more expensive than fiber-to-the-home. The big cost advantage seems to come from hitting multiple homes from one pole transmitter. Over time, when anybody can buy the needed components of the technology the best business case will become apparent to us all – for now the whole industry is guessing about what Verizon is doing because we don’t understand the basic costs of the technology.

At the end of the day this is just another new technology to put into the quiver when designing last mile networks. There will undoubtedly be places where fiber-to-the-curb has a cost advantage over fiber drops. Assuming that Verizon or somebody else builds enough of the technology to pull hardware prices down, I picture that a decade from now fiber overbuilders will consider fiber-to-the-curb as part of the mix in designing the last few hundred feet.

We Need Public 5G Spectrum

Last October the FCC issued a Notice of Proposed Rulemaking that proposed expanding WiFi into the 6 GHz band of spectrum (5.925 to 7.125 GHz). WiFi has been a huge economic boon to the country and the FCC recognizes that providing more free public spectrum is a vital piece of the spectrum puzzle. Entrepreneurs have found a myriad of inventive ways to use WiFi that go far beyond what carriers have provided with licensed spectrum.

In much of the country the 6 GHz spectrum is likely to be limited to indoor usage due to possible outdoor interference with the Broadcast Auxiliary Service, where remote crews transmit news feeds to radio and TV stations, and the Cable Television Relay Service, which cable companies use to transmit data within their own networks. The biggest future needs for WiFi are going to be indoors, so restricting this spectrum to indoor use doesn’t feel like an unreasonable limitation.

However, WiFi has some inherent limitations. The biggest problem with the WiFi standard is that a WiFi network pauses to give every user a chance at the bandwidth. In a crowded environment with a lot of devices, the constant pausing adds latency and delay to the system, and in heavy-use environments like a business hotel the constant pauses can nearly shut down a WiFi network. Most of us don’t feel that interference today inside our homes, but as we add more and more devices over time, we will recognize this inherent WiFi interference in our networks. The place where WiFi interference is already a big concern is in heavy wireless environments like hospitals, factories, airports, business hotels, and convention centers.
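
A toy queueing simulation makes the pausing problem visible. This is not the real 802.11 CSMA/CA algorithm – just a first-come-first-served shared channel where each frame waits for the frames already in the air, with made-up traffic parameters – but it shows how latency balloons as devices are added:

```python
import random

def average_wait_ms(device_count, frames_per_device=50,
                    airtime_ms=1.0, window_ms=1_000):
    """Average queueing delay on a shared channel serving one frame at a time."""
    arrivals = sorted(random.uniform(0, window_ms)
                      for _ in range(device_count * frames_per_device))
    channel_free, total_wait = 0.0, 0.0
    for t in arrivals:
        start = max(t, channel_free)        # pause while someone else transmits
        total_wait += start - t
        channel_free = start + airtime_ms   # channel busy for the frame's airtime
    return total_wait / len(arrivals)

random.seed(1)
for n in (5, 15, 25):
    print(f"{n:3d} devices: average wait {average_wait_ms(n):7.2f} ms")
```

Once the offered load passes the channel’s capacity, waits grow without bound – the convention-center effect – while a scheduled system like fully implemented 5G can, in principle, keep latency bounded for the traffic it admits.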

Many of our future computing needs are going to require low latency. For instance, creating home holograms from multiple transmitters is going to require timely delivery of packets to each transmitter. Using augmented reality to assist in surgery will require delivery of images in real time. WiFi promises to get better with the introduction of WiFi 6 using the 802.11ax standard, but the new standard does not eliminate the innate limitations of WiFi.

The good news is that we already have a new wireless standard that can create low-latency, dedicated signal paths to users. Fully implemented 5G with frequency slicing can satisfy the situations where WiFi doesn’t meet the need. It’s not hard to picture a future indoor network where a single router satisfies some user needs using the WiFi standard and others using 5G – the router would choose the best standard for each need.

To some degree the cellular carriers have this same vision. They talk of 5G being used to take over IoT needs instead of WiFi. They talk about using 5G for low latency uses like augmented reality. But when comparing the history of the cellular networks and WiFi it’s clear that WiFi has been used far more creatively. There are thousands of vendors working in today’s limited WiFi spectrum that have developed a wide array of wireless services. Comparatively, the cellular carriers have been quite vanilla in their use of cellular networks to deliver voice and data.

I have no doubt that AT&T and Verizon have plans to offer million-dollar 5G solutions for smart factories, hospitals, airports and other busy wireless environments. But in doing so they will tap only a tiny fraction of the capability of 5G. If we want 5G to actually meet the high expectations that the industry has established, we ought to create a public swath of spectrum that can use 5G. The FCC could easily empower the use of the 6 GHz spectrum for both WiFi and 5G, and in doing so would unleash wireless entrepreneurs to come up with technologies that haven’t even been imagined.

The current vision of the cellular carriers is to somehow charge everybody a monthly subscription to use 5G – and there will be enough devices using the spectrum that most people will eventually give in and buy the subscription. However, the big carriers are not going to be particularly creative, and instead are likely to be very restrictive on how we use 5G.

The alternate vision is to set aside a decent slice of public spectrum for indoor use of 5G. The public would gain use of the spectrum by buying a 5G router, with no monthly subscription fee – because it’s public spectrum. After all, 5G is just a standard, developed worldwide, and is not the proprietary property of the big cellular companies. Entrepreneurs will jump on the opportunity to develop great uses for the spectrum and the 5G standard. Rather than being held captive by the limited vision of AT&T and Verizon, we’d see huge numbers of devices using 5G creatively. This could truly unleash things like augmented reality and virtual presence. Specialty vendors would develop applications that make great strides in hospital health care. We’d finally see smart shopping holograms in stores.

The public probably doesn’t understand that the FCC has complete authority over how each swath of spectrum is used. Only the FCC can determine which spectrum can or cannot be used for WiFi, 5G and other standards. The choice ought to be an easy one. The FCC can let a handful of cellular companies decide how society will use 5G or they can unleash the creativity of thousands of developers to come up with a myriad of 5G applications. We know that creating public spectrum creates immense societal and economic good. If the FCC hadn’t set aside public spectrum for WiFi we’d all still have wires to all our home broadband devices and many of the things we now take for granted would never have come to pass.