Many of the planned uses for 5G rely upon millimeter wave spectrum, and as with every wireless technology, the characteristics of the spectrum define both the benefits and the limitations of the technology. Today I’m going to take a shot at explaining the physical characteristics of millimeter wave spectrum without using engineering jargon.
Millimeter wave spectrum falls in the range of 30 GHz to 300 GHz, although there has been no discussion yet in the industry of using anything higher than 100 GHz. The term millimeter wave describes the shortness of the radio waves, which range from about ten millimeters down to one millimeter in length. The 5G industry is also using spectrum with slightly longer wavelengths, such as 24 GHz and 28 GHz, but these frequencies share a lot of the same operating characteristics.
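If you want to check the wavelength arithmetic yourself, here’s a short Python snippet (my own illustration; wavelength is just the speed of light divided by the frequency):

```python
# Back-of-the-envelope wavelength check: wavelength = speed of light / frequency.
C = 299_792_458  # speed of light in meters per second

def wavelength_mm(freq_ghz):
    """Return the wavelength in millimeters for a frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 1000  # meters -> millimeters

for f in (24, 28, 30, 60, 100, 300):
    print(f"{f} GHz -> {wavelength_mm(f):.1f} mm")
```

Running this shows why 30 GHz to 300 GHz earns the name “millimeter wave” (about 10 mm down to 1 mm), and why 24 GHz and 28 GHz sit just outside the band with slightly longer waves.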
The main reason millimeter wave spectrum is attractive for transmitting data is that it can carry a lot of data, which is what prompts discussion of using millimeter wave spectrum to deliver gigabit wireless service. If you think of radio in terms of waves, then the higher the frequency, the greater the number of waves emitted in a given period of time. For example, if each wave carried one bit of data, then a 30 GHz transmission could carry more bits in one second than a 10 GHz transmission and far more bits than a 30 MHz transmission. It doesn’t work exactly like that, but it’s a decent analogy.
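For anyone who wants to see the analogy with actual numbers, here’s a quick toy calculation in Python. This is purely my own illustration of the one-bit-per-wave idea; real radios encode many bits per wave using modulation:

```python
# Toy model of the wave-counting analogy: pretend each wave cycle carries
# exactly one bit, so the bit rate simply equals the frequency in hertz.
def toy_bits_per_second(freq_hz):
    """One imaginary bit per wave cycle (not how real modulation works)."""
    return freq_hz

for name, f in (("30 MHz", 30e6), ("10 GHz", 10e9), ("30 GHz", 30e9)):
    print(f"{name}: {toy_bits_per_second(f):.0e} 'bits' per second")

# 30 GHz sends 3x the waves of 10 GHz, and 1,000x the waves of 30 MHz.
print(toy_bits_per_second(30e9) / toy_bits_per_second(30e6))  # 1000.0
```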
This wave analogy also illustrates the biggest limitation of millimeter wave spectrum: the much shorter effective distances for using this spectrum. All radio waves naturally spread out from a transmitter, and here waves in a swimming pool are a good analogy. The further across the pool a wave travels, the more dispersed its strength becomes. When you send a big wave across a swimming pool it’s still pretty big at the other end, but a small wave is often impossible to notice by the time it reaches the other side. The small waves at millimeter length die off faster. At higher frequencies the waves are also closer together. Using the pool analogy, waves that are packed tightly together can more easily bump into each other and become hard to distinguish as individual waves by the time they reach the other side of the pool. This is part of the reason why shorter millimeter waves don’t carry as far as other spectrum.
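For the curious, engineers quantify this spreading with a standard textbook formula called free-space path loss. The Python sketch below is my own illustration using that formula, not anything specific to 5G equipment:

```python
import math

# Free-space path loss (FSPL): the standard textbook formula for how much
# a radio signal weakens as it spreads out, in decibels (dB). Loss grows
# with both distance and frequency.
def fspl_db(distance_m, freq_hz):
    """FSPL in dB: 20 * log10(4 * pi * distance * frequency / c)."""
    c = 299_792_458  # speed of light in meters per second
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Over the same 100-meter path, a 30 GHz signal arrives about 20 dB
# (100 times) weaker than a 3 GHz signal would.
print(fspl_db(100, 30e9) - fspl_db(100, 3e9))  # ~20 dB
```

The 20 dB gap is the formula’s way of saying the same thing as the pool analogy: at the same distance, the shorter waves arrive far weaker.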
It would be possible to send millimeter waves further by using more power, but the FCC limits the allowed power for all radio frequencies to reduce interference and for safety reasons. High-power radio waves can be dangerous (think of the radio waves in your microwave oven). The FCC’s low-power limits greatly reduce the carrying distance of this short spectrum.
The delivery distance for millimeter waves can also be impacted by a number of local environmental conditions. In general, shorter radio waves are more susceptible to disruption than longer spectrum waves. All of the following can affect the strength of a millimeter wave signal:
- Mechanical resonance. Molecules in the atmosphere naturally resonate (think of this as vibrating molecules) at millimeter wave frequencies, with the strongest natural interference coming at 24 GHz (from water vapor) and 60 GHz (from oxygen).
- Atmospheric absorption. The atmosphere naturally absorbs (soaks up) millimeter waves. For example, oxygen absorption is highest at 60 GHz.
- Scattering. Millimeter waves are easily scattered. For example, a millimeter wavelength is roughly the same size as a raindrop, so rain will scatter the signal.
- Brightness temperature. This refers to the natural thermal radiation that air and water molecules emit at millimeter wave frequencies; that radiation shows up as noise at the receiver and degrades the signal.
- Line-of-sight. Millimeter wave spectrum doesn’t pass through obstacles and will be stopped by leaves and almost everything else in the environment. This happens to some degree with all radio waves, but at lower frequencies (with longer wavelengths) the signal can still be delivered by passing through or bouncing off objects in the environment (such as a neighboring house) and still reach the receiver. Millimeter waves, however, are so short that they can’t recover from a collision with an object between the transmitter and receiver, so the signal is lost when it hits almost anything.
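To get a feel for how absorption adds up over distance, here’s a short Python sketch. The roughly 15 dB-per-kilometer figure for oxygen absorption near 60 GHz is a commonly cited textbook value I’m using for illustration, not a measurement from any particular deployment:

```python
# Absorption compounds with distance. Attenuation is usually quoted in
# decibels per kilometer; this converts it to the fraction of signal
# power that survives a given path length.
def surviving_fraction(atten_db_per_km, distance_km):
    """Fraction of signal power remaining after traveling distance_km."""
    return 10 ** (-atten_db_per_km * distance_km / 10)

# Assumed ~15 dB/km for oxygen absorption near 60 GHz (textbook figure):
print(surviving_fraction(15, 1.0))  # ~3% of the power left after 1 km
print(surviving_fraction(15, 0.1))  # ~70% left after 100 meters
```

The steep drop between 100 meters and 1 kilometer is why these atmospheric effects matter so much more for millimeter waves than for traditional cellular spectrum.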
One interesting aspect of this spectrum is that the antennas used to transmit and receive millimeter waves are tiny; you can squeeze a dozen or more antennas into a square inch. One drawback of using millimeter wave spectrum for cellphones is that it takes a lot of power to operate multiple antennas, so this spectrum won’t be practical for cellphones until we get better batteries.
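The dozen-antennas-in-a-square-inch claim is easy to sanity-check. The sketch below assumes each antenna element is about half a wavelength across, which is a common rule of thumb, so treat the exact counts as illustrative:

```python
# Rough check of how many antenna elements fit in a square inch,
# assuming each element is about half a wavelength on a side
# (a common rule of thumb, not an exact antenna design).
C = 299_792_458  # speed of light in meters per second

def elements_per_square_inch(freq_ghz):
    wavelength_mm = C / (freq_ghz * 1e9) * 1000
    element_mm = wavelength_mm / 2      # half-wave element size
    per_side = int(25.4 // element_mm)  # one inch = 25.4 mm
    return per_side ** 2

print(elements_per_square_inch(28))  # 16 elements at 28 GHz
print(elements_per_square_inch(60))  # 100 elements at 60 GHz
```

Even at the low end of the band the count clears a dozen, and it grows rapidly at higher frequencies as the waves (and the antennas) shrink.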
However, the primary drawback of small antennas is the small target area available to receive a signal. It doesn’t take much spreading and dispersion of the signal to miss the receiver. For spectrum in the 30 GHz range, the full signal strength (and the maximum achievable bandwidth) carries only about 300 feet to a receiver. At greater distances the signal continues to spread and weaken, and the physics shows that the maximum distance to get any decent bandwidth at 30 GHz is about 1,200 feet. It’s worth noting that a receiver at 1,200 feet receives significantly less data than one at a few hundred feet. With higher frequencies the distances are even shorter. For example, at 60 GHz the signal dies off after only 150 feet, and at 100 GHz it dies off in 4 to 6 feet.
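The relationship between the 300-foot and 1,200-foot figures follows the inverse-square spreading rule; here’s a quick check in Python (my own arithmetic, using the distances quoted above):

```python
import math

# Inverse-square spreading: quadrupling the distance spreads the same
# transmitted power over 16 times the area.
def relative_power(d_near_ft, d_far_ft):
    """Received power at d_far as a fraction of the power at d_near."""
    return (d_near_ft / d_far_ft) ** 2

frac = relative_power(300, 1200)
print(frac)                       # 0.0625 -> 1/16 of the power
print(10 * math.log10(1 / frac))  # ~12 dB weaker
```

So a receiver at 1,200 feet is working with roughly one sixteenth of the signal that a receiver at 300 feet sees, which is why the bandwidth falls off so sharply with distance.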
To sum all of this up, millimeter wave transmission requires a relatively open path without obstacles. Even in ideal conditions, a pole-mounted 5G transmitter isn’t going to deliver decent bandwidth past about 1,200 feet, with the effective amount of bandwidth decreasing once the signal travels more than 300 feet. Higher frequencies mean even shorter distances. Millimeter waves will perform better in places with few obstacles (like trees) or with low humidity. Using millimeter wave spectrum presents a ton of challenges for cellphones: the short distances are a big limitation, as is the extra battery drain needed to support multiple antennas. Any carrier that talks about deploying millimeter wave in a way that doesn’t fit the basic physics is exaggerating its plans.