An Update on ATSC 3.0

This is the year when we’ll finally start seeing the introduction of ATSC 3.0, the newest upgrade to broadcast television and the first big upgrade since TV converted to all-digital over a decade ago. ATSC 3.0 is the latest standard released by the Advanced Television Systems Committee, the body that creates the standards used by over-the-air broadcasters.

ATSC 3.0 will bring several upgrades to broadcast television that should make it more competitive with cable company video and Internet-based programming. For example, the new standard will make it possible to broadcast over-the-air in 4K quality. That’s four times as many pixels as 1080i TV and rivals the best quality available from Netflix and other online content providers.

ATSC 3.0 also will support the HDR (high dynamic range) protocol that enhances picture quality by creating a better contrast between light and dark parts of a TV screen. ATSC 3.0 also adds additional sound channels to allow for state-of-the-art surround sound.

Earlier this year, Cord Cutters News reported that the new standard was to be introduced in 61 US markets by the end of 2020 – however, that has slowed a bit due to the COVID-19 pandemic. But the new standard should appear in most major markets by sometime in 2021. Homes will either have to buy ATSC 3.0-capable TVs, which are just now hitting the market, or buy an external ATSC 3.0 tuner to receive the enhanced signals.

One intriguing aspect of the new standard is that a separate data path is created with TV transmissions. This opens up some interesting new features for broadcast TV. For example, a city could selectively send safety alerts and messages to homes in just certain neighborhoods. This also could lead to targeted advertising that is not the same in every part of a market. Local advertisers have often hesitated to advertise on broadcast TV because of the cost and waste of advertising to an entire market instead of just the areas where they sell service.

While still in the early stages of exploration, it’s conceivable that ATSC 3.0 could be used to create a 25 Mbps data transmission path. This might require several stations joining together to create that much bandwidth. While a 25 Mbps data path is no longer a serious competitor to much faster cable broadband speeds, it opens up a lot of interesting possibilities. For example, this bandwidth could offer a competitive alternative for providing data to cellphones and could present a major challenge to cellular carriers and their stingy data caps.
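
As a rough illustration of the pooling idea, here’s a quick sketch of how many stations might have to contribute spare capacity to reach a 25 Mbps path. The per-station numbers are purely hypothetical – the actual payload available depends on how much capacity each broadcaster reserves for its video channels.

```python
# Rough sketch: how many stations might need to pool spare capacity to offer
# a 25 Mbps data path. The per-station spare-capacity numbers are hypothetical
# illustrations, not ATSC 3.0 specification values.

TARGET_MBPS = 25.0

# Hypothetical spare capacity each station could dedicate after its video services.
spare_capacity_mbps = [10.0, 8.0, 6.0, 12.0]

pooled = 0.0
stations_used = 0
for spare in sorted(spare_capacity_mbps, reverse=True):
    pooled += spare
    stations_used += 1
    if pooled >= TARGET_MBPS:
        break

print(f"{stations_used} stations pooling {pooled:.0f} Mbps meet the {TARGET_MBPS:.0f} Mbps target")
```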

ATSC 3.0 data could also be used to bring broadband into the home of every urban school student. If this broadband was paired with computers for every student, this could go a long way towards solving the homework gap in urban areas. Unfortunately, like most other new technologies, we’re not likely to see the technology in rural markets any time soon, and perhaps never. The broadband signals from tall TV towers will not carry far into rural America.

The FCC voted on June 16 on a few issues related to the ATSC 3.0 standard. In a blow to broadcasters, the FCC decided that TV stations could not use close-by vacant channels to expand ATSC 3.0 capabilities. The FCC instead decided to maintain vacant broadcast channels to be used for white space wireless broadband technology.

The FCC also took a position that isn’t going to sit well with the public. As homeowners have continued to cut the cord, there have been record sales in the last few years of indoor antennas for receiving over-the-air TV. Over-the-air broadcasters are going to be allowed to sunset the older ATSC 1.0 standard in 2023. That means that homes will have to replace TVs or install an external ATSC 3.0 tuner if they want to continue to watch over-the-air broadcasts.

Who Owns Your Connected Device?

It’s been clear for years that IoT companies gather a large amount of data from customers. Everything from a smart thermometer to your new car gathers and reports data back to the cloud. California has tried to tackle customer data privacy through the California Consumer Privacy Act that went into effect on January 1.

Web companies must provide California consumers the ability to opt out of having their personal information sold to others. Consumers must be given the option to have their data deleted from the site. Consumers must be provided the opportunity to view the data collected about them. Consumers also must be shown the identity of third parties that have purchased their data. The new law defines personal data broadly to include things like name, address, online identifiers, IP addresses, email addresses, purchasing history, geolocation data, audio/video data, biometric data, or any effort made to classify customers by personality type or trends.

However, there is one area that the new law doesn’t cover. There are examples over the last few years of IoT companies making devices obsolete and nonfunctional. Two examples that got a lot of press involve Charter security systems and Sonos smart speakers.

When Charter purchased Time Warner Cable, the company decided that it didn’t want to support the home security business it had inherited. Charter ended its security business line earlier this year and advised customers that the company would no longer provide alarm monitoring. Unfortunately for customers, this means their security devices become non-functional. Customers probably felt safe in choosing Time Warner Cable as a security company because the company touted that they were using off-the-shelf electronics like Ring cameras and Abode security devices – two of the most common brands of DIY smart devices.

Unfortunately for customers, most of the devices won’t work without being connected to the Charter cloud because the company modified the software to only work in a Charter environment. Customers can connect some of the smart devices like smart thermostats and lights to a different hub, but customers can’t repurpose the security devices, which are the most expensive parts of most systems. When the Charter service ended, homeowners were left with security systems that can’t connect to a monitoring service or law enforcement. Charter’s decision to exit the security business turned the devices into bricks.

In a similar situation, Sonos notified owners of older smart speakers that it will no longer support the devices, meaning no more software upgrades or security upgrades. The older speakers will continue to function but can become vulnerable to hackers. Sonos offered owners of the older speakers a 30% discount on newer speakers.

It’s not unusual for older electronics to become obsolete and to no longer be serviced by the manufacturer – it’s something we’re familiar with in the telecom industry. What is unusual is that Sonos told customers that they cannot sell their older speakers without permission from the company. Sonos has this ability because the speakers communicate with the Sonos cloud, and Sonos is not going to allow the old speakers to be registered by somebody else. If I were a Sonos customer I would also assume this to mean that the company is likely to eventually block old speakers from its cloud. The company’s notification told customers that their speakers are essentially a worthless brick. This is a shock to folks who spent a lot of money on top-of-the-line speakers.

There are numerous examples of similar incidents in the smart device industry. Google shut down the Revolv smart hub in 2016, making the device unusable. John Deere has the ability to shut off farm equipment costing hundreds of thousands of dollars if farmers use somebody other than John Deere for service. My HP printer gave me warnings that the printer would stop working if I didn’t purchase an HP ink-replacement plan.

This raises the question of whether consumers really own a device if the manufacturer or some partner of the manufacturer has the ability at some future time to shut the device down. Unfortunately, when consumers buy smart devices they never get any warning of the manufacturer’s right to kill the devices in the future.

I’m sure the buyers of the Sonos speakers feel betrayed. People likely expect decent speakers to last for decades. I have a hard time imagining somebody taking Sonos up on the offer to buy new speakers at a discount to replace the old ones, because in a few years the company is likely to obsolete the new speakers as well. We all have gotten used to the idea of planned obsolescence. Microsoft stops supporting older versions of Windows and users continue to use the older software at their own risk. But Microsoft doesn’t shut down computers running old versions of Windows as Charter is doing. Microsoft doesn’t stop a customer from selling a computer loaded with an old version of Windows to somebody else, as Sonos is doing.

These two examples provide a warning to consumers that smart devices might come with an expiration date. Any device that continues to interface with the original manufacturer through the cloud can be shut down. It would be an interesting lawsuit if a Sonos customer sued the company for essentially stealing their device.

It’s inevitable that devices grow obsolete over time. Sonos says the older speakers don’t contain enough memory to accept software updates. That’s probably true, but the company went way over the line when they decided to kill old speakers rather than let somebody sell them. Their actions tell customers that they were only renting the speakers and that they always belonged to Sonos.

The Evolution of 5G

Technology always evolves and I’ve been reading about where scientists envision the evolution of 5G. The first generation of 5G, which will be rolled out over the next 3-5 years, is mostly aimed at increasing the throughput of cellular networks. According to Cisco, North American cellular data volumes are growing at a torrid 36% per year, and even faster than that in some urban markets where the volumes of data are doubling every two years. The main goal of first-generation 5G is to increase network capacity to handle that growth.
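
A quick back-of-the-envelope check shows how those two growth figures relate – 36% annual growth works out to a doubling roughly every 27 months, while doubling every two years implies about 41% annual growth, which is consistent with some urban markets growing faster than the continental average.

```python
import math

# 36% annual growth: how long until traffic doubles?
annual_growth = 0.36
years_to_double = math.log(2) / math.log(1 + annual_growth)
print(f"36% per year doubles traffic every {years_to_double:.1f} years")

# "Doubling every two years" implies this annual growth rate:
implied_growth = 2 ** (1 / 2) - 1
print(f"Doubling every 2 years implies about {implied_growth:.0%} annual growth")
```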

However, if 5G is deployed only for that purpose, we won’t see the giant increases in speed that the public thinks are coming with 5G. Cisco is predicting that the average North American cellular speed in 2026 will be around 70 Mbps – a far cry from the gigabit speed predictions you can find splattered all over the press.

There is already academic and lab work looking into what is being labeled as 6G. That will use terahertz spectrum and promises wireless speeds of up to 1 terabit per second. I’ve already seen a few articles touting this as a giant breakthrough, but the articles didn’t mention that the effective distance for this spectrum can be measured in a few feet – this will be an indoor technology and will not be the next cellular replacement for 5G.

This means that, to some degree, 5G is the end of the line in terms of cellular delivery. This is likely why the cellular carriers are gobbling up as much spectrum as they can. That spectrum isn’t all needed today but will be needed by the end of the decade. The cellular carriers will use every spectrum block now to preserve the licenses, but the heavy lifting for most of the spectrum being purchased today will come into play a decade or more from now – the carriers are playing the long game so that they aren’t irrelevant in the not-too-distant future.

This doesn’t mean that 5G is a dead-end, and the technology will continue to evolve. Here are a few of the ideas being explored in labs today that will enhance 5G performance a decade from now:

  • Large Massive Network MIMO. This means expanding the density and capacity of cellular antennas so that they can handle multiple spectrum bands simultaneously. We need much better antennas if we are to get vastly greater data volumes into and out of cellular devices. For now, data speeds on cellphones are being limited by the capacity of the antennas.
  • Ultra Dense Networks (UDN). This envisions the end of cell sites in the way we think about them today. This would come first in urban networks, with a hyper-dense deployment of devices that would likely incorporate small cells, WiFi routers, femtocells, and M2M gateways. In such an environment, cellphones can interact with the cloud rather than with a traditional cell site. This eliminates the traditional cellular standard of one cell site controlling a transaction. In a UDN network, a cellular device could connect anywhere.
  • Device-to-Device (D2D) Connectivity. The smart 5G network in the future will let nearby devices communicate with each other without having to pass traffic back and forth to a data hub. This would move some cellular transactions to the edge, and would significantly reduce logjams at data centers and on middle-mile fiber routes.
  • A Machine-to-Machine (M2M) Layer. A huge portion of future web traffic will be communications between devices and the cloud. This research envisions a separate cellular network for such traffic that maximizes M2M communications separately from traffic used by people.
  • Use of AI. Smart networks will be able to shift and react to changing demands and will be able to shuffle and share network resources as needed. For example, if there is a street fair in a neighborhood that usually sees only routine traffic, the network would smartly reconfigure to recognize the changed demand for connectivity.
  • Better Batteries. None of the improvements come along until there are better ‘lifetime’ batteries that can allow devices to use more antennas and process more data.

Wireless marketing folks will be challenged to find ways to describe these future improvements in the 5G network. If the term 6G becomes associated with terahertz spectrum, marketers are going to have to find something other than a ‘G’ term to over-hype the new technologies.

Are You Ready for 400 Gb?

AT&T recently activated a 400-gigabit fiber connection between Dallas and Atlanta and claimed it is the first such connection in the country. This is a milestone because it represents a major upgrade in fiber speeds in our networks. While scientists in the labs have created multi-terabit lasers, our fiber network backbones for the last decade have mostly relied on 100-gigabit or slower laser technology.

Broadband demand has grown by a huge amount over the last decade. We’ve seen double-digit annual growth in residential broadband, business broadband, cellular data, and machine-to-machine data traffic. Our backbone and transport networks are busy and often full. AT&T says it’s going to need the faster fiber transport to accommodate 5G, gaming, and ever-growing video traffic volumes.

I’ve heard concerns from network engineers that some of our long-haul fiber routes, such as the ones along the east coast, are overloaded and in danger of being swamped. Having the ability to upgrade long-haul fiber routes from 100 Gb to 400 Gb is a nice improvement – but not as good as you might imagine. If a 100 Gb fiber route is nearly full and is upgraded to 400 Gb, the life of that route is only stretched another six years if network traffic volumes are doubling every three years. But upgrading is a start and a useful stopgap measure.
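
For anybody who wants to see the arithmetic behind that six-year figure, here it is worked out: a jump from 100 Gb to 400 Gb is a 4x capacity increase, which equals two doublings of traffic, and at one doubling every three years that buys roughly six years.

```python
import math

# The headroom math from the paragraph above: a 100 Gbps to 400 Gbps upgrade
# is a 4x capacity jump, i.e. two doublings of traffic.
current_gbps = 100
upgraded_gbps = 400
doubling_period_years = 3        # assumed rate of traffic growth

capacity_multiple = upgraded_gbps / current_gbps
doublings = math.log2(capacity_multiple)
extra_years = doublings * doubling_period_years
print(f"A {capacity_multiple:.0f}x upgrade buys roughly {extra_years:.0f} more years of headroom")
```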

AT&T is also touting that they used white box hardware for this new deployment. White box hardware uses inexpensive generic switches and routers controlled by open-source software. AT&T is likely replacing a 100 Gb traditional electronics route with a much cheaper white box solution. Folks who don’t work with long-haul networks probably don’t realize the big cost of electronics needed to light a long fiber route like this one between Dallas and Atlanta. Long-haul fiber requires numerous heated and cooled huts placed along the route that house repeaters needed to amplify the signal. A white box solution doesn’t just mean less expensive lasers at the end points, but at all of the intermediate points along the fiber route.

AT&T views 400 Gb transport as the next generation of technology needed in our networks, and the company submitted specifications to the Open Compute Project for an array of different 400 Gb chassis and backbone fabrics. The AT&T specifications rely on Broadcom’s Jericho2 family of chips.

100 Gb electronics are not only used today in long-haul data routes. I have a lot of clients that operate fiber-to-the-home networks that use a 100 Gb backbone to provide the bandwidth to reach multiple neighborhoods. In local networks that are fiber-rich there is always a trade-off between the cost of upgrading to faster electronics and the cost of lighting additional fiber pairs. As an existing 100 Gb fiber starts getting full, network engineers will consider the cost of lighting a second 100 Gb route versus upgrading to the 400 Gb technology. The fact that AT&T is pushing this as a white box solution likely means that it will be cheaper to upgrade to a new 400 Gb network than to buy a second traditional 100 Gb set of electronics.

There are other 400 Gb solutions hitting the market from Cisco, Juniper, and Arista Networks – but all will be more expensive than a white box solution. Network engineers always talk about chokepoints in a network – places where the traffic volume exceeds the network capability. One of the most worrisome chokepoints for ISPs is the long-haul fiber network that connects communities – because those routes are out of the control of the last-mile ISP. It’s reassuring to know there are technology upgrades that will let the industry keep up with demand.

Expect a New Busy Hour

One of the many consequences of the coronavirus is that networks are going to see a shift in busy hour traffic. Busy hour traffic is just what it sounds like – it’s the time of the day when a network is busiest, and network engineers design networks to accommodate the expected peak amount of bandwidth usage.

Verizon reported on March 18 that, in the week since people started moving to work from home, they’ve seen a 20% overall increase in broadband traffic. Verizon says that gaming traffic is up 75% as those stuck at home turn to gaming for entertainment. They also report that VPN (virtual private network) traffic is up 34%. A lot of connections between homes and corporate and school WANs use a VPN.

These are the kind of increases that can scare network engineers, because Verizon just saw a typical year’s growth in traffic happen in a week. Unfortunately, the announced Verizon traffic increases aren’t even the whole story since we’re just at the beginning of the response to the coronavirus. There are still companies figuring out how to give secure access to company servers, and work-from-home traffic is bound to grow in the next few weeks. I think we’ll see a big jump in video conference traffic on platforms like Zoom as more meetings move online as an alternative to live meetings.

For most of my clients, the busy hour has been in the evening when many homes watch video or play online games. The new paradigm has to be scaring network engineers. There is now likely going to be a lot of online video watching and gaming during the daytime in addition to the evening. The added traffic for those working from home is probably the most worrisome traffic since a VPN connection to a corporate WAN will tie up a dedicated path through the Internet backbone – bandwidth that isn’t shared with others. We’ve never worried about VPN traffic when it was a small percentage of total traffic – but it could become one of the biggest continual daytime uses of bandwidth. All of the work that used to occur between employees and the corporate server inside of the business is now going to traverse the Internet.

I’m sure network engineers everywhere are keeping an eye on the changing traffic, particularly to the amount of broadband used during the busy hour. There are a few ways that the busy hour impacts an ISP. First, they must buy enough bandwidth to the Internet to accommodate everybody. It’s typical to buy at least 15% to 20% more bandwidth than is expected for the busy hour. If the size of the busy hour shoots higher, network engineers are going to have to quickly buy a larger pipe to the Internet, or else customer performance will suffer.
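
Here’s a minimal sketch of that provisioning rule, using made-up traffic numbers: size the Internet pipe 15% to 20% above the expected busy-hour peak, and re-run the math when the peak jumps.

```python
def required_pipe_gbps(busy_hour_peak_gbps: float, margin: float = 0.20) -> float:
    """Size the Internet transit pipe a margin above the expected busy-hour peak."""
    return busy_hour_peak_gbps * (1 + margin)

expected_peak = 8.0  # hypothetical busy-hour peak for a small ISP, in Gbps
print(f"15% margin: buy {required_pipe_gbps(expected_peak, 0.15):.1f} Gbps")
print(f"20% margin: buy {required_pipe_gbps(expected_peak, 0.20):.1f} Gbps")

# If work-from-home traffic pushes the peak up 20% in a week, the same rule
# calls for a correspondingly bigger pipe.
new_peak = expected_peak * 1.20
print(f"After a 20% jump in the peak: buy {required_pipe_gbps(new_peak):.1f} Gbps")
```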

Network engineers also keep a close eye on their network utilization. For example, most networks operate with some rule of thumb, such as it’s time to upgrade electronics when any part of the network hits some pre-determined threshold like 85% utilization. These rules of thumb have been developed over the years as warning signs to provide time to make upgrades.

The explosion of traffic due to the coronavirus might shoot many networks past these warning signs, and networks could start experiencing chokepoints that weren’t anticipated just a few weeks earlier. Most networks have numerous possible chokepoints – and each is monitored. For example, there is usually a chokepoint going into neighborhoods. There are often chokepoints on fiber rings. There might be chokepoints on switch and router capacity at the network hub. There can be a chokepoint on the data pipe going to the outside world. If any one part of the network gets overly busy, then network performance can degrade quickly.

What is scariest for network engineers is that traffic from the reaction to the coronavirus is being layered on top of networks that have already been experiencing steady growth. Most of my clients have been seeing year-over-year traffic volume increases of 20% to 30%. If Verizon’s experience is indicative of what we’ll all see, then networks will see a year’s typical growth happen in just weeks. We’ve never experienced anything like this, and I’m guessing there aren’t a lot of network engineers who are sleeping well this week.

Introducing 6 GHz into WiFi

WiFi is already the most successful deployment of spectrum ever. In its recent Annual Internet Report, Cisco predicted that by 2022 WiFi will cross the threshold and carry more than 50% of global IP traffic. Cisco also predicts that by 2023 there will be 628 million WiFi hotspots – most used for home broadband.

These are amazing statistics when you consider that WiFi has been limited to using 70 MHz of spectrum in the 2.4 GHz spectrum band and 500 MHz in the 5 GHz spectrum band. That’s all about to change as two major upgrades are being made to WiFi – the upgrade to WiFi 6 and the integration of 6 GHz spectrum into WiFi.

The Impact of WiFi 6. WiFi 6 is the new consumer-friendly name given to the next generation of WiFi technology (it replaces the term 802.11ax). Even without the introduction of new spectrum, WiFi 6 will significantly improve performance over WiFi 5 (802.11ac).

The problem with current WiFi is congestion. Congestion comes in two ways – from multiple devices trying to use the same router, and from multiple routers trying to use the same channels. My house is probably typical, and we have a few dozen devices that can use the WiFi router. My wife’s Subaru even connects to our network to check for updates every time she pulls into the driveway. With only two of us in the house, we don’t overtax our router – but we can when my daughter is home from college.

Channel congestion is the real culprit in our neighborhood. We live in a moderately dense neighborhood of single-family homes and we can all see multiple WiFi networks. I just looked at my computer and I see 24 other WiFi networks, including the delightfully named ‘More Cowbell’ and ‘Very Secret CIA Network’. All of these networks are using the same small number of channels, and WiFi pauses whenever it sees a demand for bandwidth from any of these networks.

Both kinds of congestion slow down throughput due to the nature of the WiFi specification. The demands for routers and for channels are queued and each device has to wait its turn to transmit or receive data. Theoretically, a WiFi network can transmit data quickly by grabbing a full channel – but that rarely happens. The existing 5 GHz band has six 80-MHz and two 160-MHz channels available. A download of a big file could go quickly if a full channel could be used for the purpose. However, if there are overlapping demands for even a portion of a channel then the whole channel is not assigned for a specific task.

WiFi 6 introduces a few major upgrades in the way that WiFi works to decrease congestion. The first is the introduction of orthogonal frequency-division multiple access (OFDMA). This technology allows devices to transmit simultaneously rather than wait for a turn in the queue. OFDMA divides channels into smaller sub-channels called resource units. The analogy used in the industry is that this will turn WiFi from a single-lane road into a multi-lane freeway. WiFi 6 also uses other techniques like improved beamforming to make a focused connection to a specific device, which lowers the chances of interference from other devices.
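
Here’s a toy model of the OFDMA idea – greatly simplified, since real 802.11ax resource units come in several fixed sizes rather than neat 20 MHz slices. The point is only to contrast one-device-at-a-time access with several devices sharing sub-channels in the same transmission window.

```python
# A toy contrast between queued access to a whole channel and OFDMA-style
# sharing. Greatly simplified: real 802.11ax resource units come in several
# fixed sizes, and the scheduler is far smarter than this.

CHANNEL_MHZ = 80
RESOURCE_UNIT_MHZ = 20  # simplified, equal-size resource units

devices = ["laptop", "phone", "thermostat", "tv", "car"]

# Pre-OFDMA: each device waits its turn for the full channel.
print("Sequential access (one device per transmission):")
for slot, dev in enumerate(devices, start=1):
    print(f"  slot {slot}: {dev} uses all {CHANNEL_MHZ} MHz")

# OFDMA: several devices share one transmission window on resource units.
per_slot = CHANNEL_MHZ // RESOURCE_UNIT_MHZ
print("\nOFDMA access (devices share each slot):")
for slot, start in enumerate(range(0, len(devices), per_slot), start=1):
    group = devices[start:start + per_slot]
    print(f"  slot {slot}: {', '.join(group)} each on a {RESOURCE_UNIT_MHZ} MHz resource unit")
```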

The Impact of 6 GHz. WiFi performance was already getting a lot better due to WiFi 6 technology. Adding the 6 GHz spectrum will drive performance to yet another level. The 6 GHz spectrum adds seven 160 MHz channels to the WiFi environment (or, alternately, fifty-nine 20 MHz channels). For the typical WiFi environment, such as a home in an urban setting, this is enough new channels that a big bandwidth demand ought to be able to grab a full 160 MHz channel. This is going to increase the perceived speeds of WiFi routers significantly.
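
The channel counts are easy to sanity-check with simple division, assuming the roughly 1,200 MHz of new 6 GHz spectrum the FCC opened for unlicensed use. The simple math slightly overstates some totals because the actual channel plan sets aside a little spectrum, but it lands close to the figures above.

```python
# Back-of-the-envelope check on the channel counts, assuming roughly 1,200 MHz
# of new 6 GHz spectrum. Simple division slightly overstates some totals
# because the real channel plan reserves a little spectrum.
new_spectrum_mhz = 1200

for width_mhz in (160, 80, 40, 20):
    print(f"{width_mhz} MHz channels: about {new_spectrum_mhz // width_mhz}")
```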

When the extra bandwidth is paired with OFDMA technology, interference ought to be a thing of the past, except perhaps in super-busy environments like a business hotel or a stadium. Undoubtedly, we’ll find ways over the next decade to fill up WiFi 6 routers and we’ll eventually be begging the FCC for even more WiFi spectrum. But for now, this should solve WiFi interference in all but the toughest WiFi environments.

It’s worth a word of caution that this improvement isn’t going to happen overnight. You need both a WiFi 6 router and WiFi 6-capable devices to take advantage of the new WiFi 6 technology. You’ll also need devices capable of using the 6 GHz spectrum. Unless you’re willing to throw away every WiFi device in your home and start over, it’s going to take most homes years to migrate into the combined benefits of WiFi 6 and 6 GHz spectrum.

There is No Artificial Intelligence

It seems like most new technology today comes with a lot of hype. Just a few years ago, the press was full of predictions that we’d be awash with Internet of Things sensors that would transform the way we live. We’ve heard similar claims for technologies like virtual reality, blockchain, and self-driving cars. I’ve written a lot about the massive hype surrounding 5G – in my way of measuring things, there isn’t any 5G in the world yet, but the cellular carriers are loudly proclaiming it’s everywhere.

The other technology with a hype that nearly equals 5G is artificial intelligence. I see articles every day talking about the ways that artificial intelligence is already changing our world, with predictions about the big changes on the horizon due to AI. A majority of large corporations claim to now be using AI. Unfortunately, this is all hype and there is no artificial intelligence today, just like there is not yet any 5G.

It’s easy to understand what real 5G will be like – it will include the many innovations embedded in the 5G specifications like network slicing and dynamic spectrum sharing. We’ll finally have 5G when a half dozen new 5G technologies are on my phone. Defining artificial intelligence is harder because there is no specification for AI. Artificial intelligence will be here when a computer can solve problems in much the way that humans do. Our brains evaluate the data on hand to see if we know enough to solve a problem, and if not, we seek the additional data we need. Our brains can consider data from disparate and unrelated sources to solve problems. There is no computer today that is within a light-year of that ability – there are not yet any computers that can ask for the specific additional data needed to solve a problem. An AI computer doesn’t need to be self-aware – it just has to be able to ask the questions and seek the right data needed to solve a given problem.

We use computer tools today that get labeled as artificial intelligence such as complex algorithms, machine learning, and deep learning. We’ve paired these techniques with faster and larger computers (such as in data centers) to quickly process vast amounts of data.

One of the techniques we think of as artificial intelligence is nothing more than using brute force to process large amounts of data. This is how IBM’s Deep Blue works. It can produce impressive results and shocked the world in 1997 when the computer was able to beat Garry Kasparov, the world chess champion. Since then, the IBM Watson system has beaten the best Jeopardy players and is being used to diagnose illnesses. These computers achieve their results by processing vast amounts of data quickly. A chess computer can consider huge numbers of possible moves and put a value on the ones with the best outcome. The Jeopardy computer had massive databases of human knowledge available like Wikipedia and Google search – it looks up the answer to a question faster than a human mind can pull it out of memory.
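
The core of that brute-force approach can be shown with a tiny minimax sketch – enumerate the possible move sequences and value the outcomes, assuming the opponent always picks the reply that’s worst for you. Deep Blue’s actual search was enormously deeper and more refined; this toy game tree only illustrates the idea.

```python
def minimax(node, maximizing=True):
    """node is either a numeric leaf score or a list of child nodes."""
    if isinstance(node, (int, float)):  # a final position: return its value
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# A toy game tree: the computer picks a branch, then the opponent picks the
# reply that is worst for the computer.
tree = [
    [3, 5],  # branch A: opponent will hold us to 3
    [2, 9],  # branch B: opponent will hold us to 2
    [6, 4],  # branch C: opponent will hold us to 4  <- best guaranteed outcome
]
print("Best guaranteed score:", minimax(tree))  # prints 4
```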

Much of what is thought of as AI today uses machine learning. Perhaps the easiest way to describe machine learning is with an example. Machine learning uses complex algorithms to analyze and rank data. Netflix uses machine learning to suggest shows that it thinks a given customer will like. Netflix knows what a viewer has already watched. Netflix also knows what millions of others who watch the same shows seem to like, and it looks at what those millions of others watched to make a recommendation. The algorithm is far from perfect because the data set of what any individual viewer has watched is small. I know in my case, I look at the shows recommended for my wife and see all sorts of shows that interest me, but which I am not offered. This highlights one of the problems of machine learning – it can easily be biased and draw wrong conclusions instead of right ones. Netflix’s suggestion algorithm can become a self-fulfilling prophecy unless a viewer makes the effort to look outside of the recommended shows – the more a viewer watches what is suggested, the more they are pigeonholed into a specific type of content.
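
A toy version of that ‘viewers like you also watched’ logic might look like the sketch below. To be clear, this is not Netflix’s actual algorithm – it’s a minimal user-overlap recommender with made-up viewing data, just to show how a suggestion can fall out of other people’s watch histories.

```python
# A toy user-overlap recommender with made-up viewing data. Not Netflix's
# algorithm, just the basic "people who watched what you watched also
# watched..." idea.
viewing = {
    "alice": {"Show A", "Show B", "Show C"},
    "bob":   {"Show A", "Show B", "Show D"},
    "carol": {"Show B", "Show C", "Show E"},
    "dave":  {"Show A", "Show D", "Show F"},
}

def recommend(user, history, top_n=3):
    """Suggest unwatched shows, weighted by how similar the other viewers are."""
    watched = history[user]
    scores = {}
    for other, other_watched in history.items():
        if other == user:
            continue
        overlap = len(watched & other_watched)  # similarity = shows in common
        for show in other_watched - watched:
            scores[show] = scores.get(show, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("alice", viewing))  # ['Show D', 'Show E', 'Show F']
```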

Deep learning is a form of machine learning that can produce better results by passing data through multiple algorithms. For example, there are numerous forms of English spoken around the world. A customer service bot can begin each conversation in standard English, and then use layered algorithms to analyze the speaker’s dialect to switch to more closely match a given speaker.

I’m not implying that today’s techniques are not worthwhile. They are being used to create numerous automated applications that could not be done otherwise. However, almost every algorithm-based technique in use today will become instantly obsolete when a real AI is created.

I’ve read several experts who predict that we are only a few years away from an AI desert – meaning that we will have milked about all that can be had out of machine learning and deep learning. Developments with those techniques are not leading towards a breakthrough to real AI – machine learning is not part of the evolutionary path to AI. At least for today, both AI and 5G are largely non-existent, and the things passed off as these two technologies are pale versions of the real thing.

5G and Rural America

FCC Chairman Ajit Pai recently told the crowd at CES that 5G would be a huge benefit to rural America and would help to close the rural broadband divide. I have to imagine he’s saying this to keep rural legislators on board to support the FCC’s emphasis on promoting 5G. I’ve thought hard about the topic and I have a hard time seeing how 5G will make much difference in rural America – particularly with broadband.

There is more than one use of 5G, and I’ve thought through each one of them. Let me start with 5G cellular service. The major benefit of 5G cellular is that a cell site will be able to handle up to 100,000 simultaneous connections. 5G also promises slightly faster cellular data speeds. The specification calls for speeds up to 100 Mbps with the normal cellular frequencies – which happens to also have been the specification for 4G, although it was never realized.

I can’t picture a scenario where a rural cell site might need 100,000 simultaneous connections within a circle of a few miles. There aren’t many urban places that need that many connections today other than stadiums and other crowded locations where a lot of people want connectivity at the same time. I’ve heard farm sensors mentioned as a reason for needing 5G, but I don’t buy it. The normal crop sensor might dribble out tiny amounts of data a few times per day. These sensors cost close to $1,000 today, but even if they somehow get reduced to a cost of pennies, it’s hard to imagine a situation where any given rural cell site is going to need more capacity than is available with 4G.
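
Some rough arithmetic makes the point. Using assumed numbers – say 10,000 sensors on one cell site, each sending a few one-kilobyte reports per day – the total load comes to a few kilobits per second, a rounding error for a 4G site.

```python
# Rough arithmetic with assumed numbers: even a dense deployment of low-rate
# crop sensors adds up to a trivial load on one cell site.
sensors_per_cell = 10_000   # assumption: a very dense rural deployment
reports_per_day = 4         # assumption: a few small reports per day
bytes_per_report = 1_000    # assumption: about 1 kB per report

daily_bytes = sensors_per_cell * reports_per_day * bytes_per_report
average_bps = daily_bytes * 8 / 86_400  # spread across a 24-hour day
print(f"Average load: about {average_bps / 1_000:.1f} kbps for the whole cell site")
```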

It’s great if rural cell sites get upgraded, but there can’t be many rural cell sites that are overloaded enough to demand 5G. There is also the economics. It’s hard to imagine the cellular carriers being willing to invest in a rural cell site that might support only a few farmers – and it’s hard to think the farmers are willing to pay enough to justify their own cell site.

There has also been talk of lower frequencies benefitting rural America, and there is some validity to that. For example, T-Mobile’s 600 MHz frequency travels farther and penetrates obstacles better than higher frequencies. Using this frequency might extend good cellular data coverage as much as an extra mile and might support voice for several additional miles from a cell site. However, low frequencies don’t require 5G to operate. There is nothing stopping these carriers from introducing low frequencies with 4G (and in fact, that’s what they have done in the first-generation cellphones capable of using the lower frequencies). The cellular carriers are loudly claiming that their introduction of new frequencies is the same thing as 5G – it’s not.

5G can also be used to provide faster data using millimeter wave spectrum. The big carriers are all deploying 5G hot spots with millimeter wave technology in dense urban centers. This technology broadcasts super-fast broadband for up to 1,000 feet. The spectrum is also super-squirrely in that it doesn’t pass through anything, even a pane of glass. Try as I might, I can’t find a profitable application for this technology in suburbs, let alone rural places. If a farmer wants fast broadband in the barnyard, I suspect we’re only a few years away from people being able to buy a 5G/WiFi 6 hot spot that could satisfy this purpose without paying a monthly fee to a cellular company.

Finally, 5G can be used to provide gigabit wireless loops from a fiber network. This is the technology trialed by Verizon in a few cities like Sacramento. In that trial, speeds were about 300 Mbps, but there is no reason speeds can’t climb to a gigabit. For this technology to work there has to be a transmitter on fiber within 1,000 feet of a customer. It seems unlikely to me that somebody spending the money to get fiber close to farms would use electronics for the last few hundred feet instead of a fiber drop. The electronics are always going to have problems and require truck rolls, and the electronics will likely have to be replaced at least once per decade. The small telcos and electric coops I know would scoff at the idea of adding another set of electronics into a rural fiber network.

I expect some of the 5G benefits to find uses in larger county seats – but those towns have the same characteristics as suburbia. It’s hard to think that rural America outside of county seats will ever need 5G.

I’m at a total loss as to why Chairman Pai and many politicians keep extolling the virtues of rural 5G. I have no doubt that rural cell sites will be updated to 5G over time, but the carriers will be in no hurry to do so. It’s hard to find situations in rural America that demand a 5G solution that can’t be handled with 4G – and it’s even harder to justify the cost of 5G upgrades that benefit only a few customers. I can’t find a business case, or even an engineering case, for pushing 5G into rural America. I most definitely can’t foresee a 5G application that will solve the rural broadband divide.

Is 5G Radiation Safe?

There is a lot of public sentiment against placing small cell sites on residential streets. There is a particular fear of broadcasting higher millimeter wave frequencies near homes since these frequencies have never been in widespread use before. In the public’s mind, higher frequencies mean a higher danger of health problems related to exposure to radiofrequency emissions. The public’s fears are further stoked when they hear that Switzerland and Belgium are limiting the deployment of millimeter wave radios until there is better proof that they are safe.

The FCC released a report and order on December 4 that is likely to add fuel to the fire. The agency rejected all claims that there is any public danger from radiofrequency emissions and affirmed the existing frequency exposure rules. The FCC said that none of the thousands of filings made in the docket provided any scientific evidence that millimeter wave and other 5G frequencies are dangerous.

The FCC is right in their assertion that there are no definitive scientific studies linking cellular frequencies to cancer or other health issues. However, the FCC misses the point that most of those asking for caution, including scientists, agree with that. The public has several specific fears about the new frequencies being used:

  • First is the overall range of new frequencies. In the recent past, the public was widely exposed to relatively low frequencies from radio and TV stations, to a fairly narrow range of cellular frequencies, and two bands of WiFi. The FCC is in the process of approving dozens of new bands of frequency that will be widely used where people live and work. The fear is not so much about any given frequency being dangerous, but rather a fear that being bombarded by a large range of frequencies will create unforeseen problems.
  • People are also concerned that cellular transmitters are moving from tall towers, which normally have been located away from housing, to small cell sites on poles that are located on residential streets. The fear is that these transmitters are generating a lot of radiation close to the transmitter – which is true. The amount of radiated energy that strikes a given area decreases rapidly with distance from a transmitter, as the sketch after this list illustrates. The anecdote that I’ve seen repeated on social media is of placing a cell site fifteen feet from the bedroom of a child. I have no idea if there is a real small cell site that is the genesis of this claim – but there could be. In dense urban neighborhoods, there are plenty of streets where telephone poles are within a few feet of homes. I admit that I would be leery about having a small cell site directly outside one of my windows.
  • The public worries when they know that there will always be devices that don’t meet the FCC guidelines. As an example, the Chicago Tribune tested eleven smartphones in August and found that a few of them were issuing radiation at twice the FCC maximum-allowable limit. The public understands that vendors play loose with regulatory rules and that the FCC largely ignores such violations.
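
For those curious about how quickly exposure falls off, here is a simple free-space sketch of the inverse-square relationship mentioned in the second bullet. Real-world propagation and antenna patterns are more complicated, so treat this as nothing more than the basic physics.

```python
# Free-space illustration only: received power density falls off with the
# square of the distance from the transmitter. Real propagation and antenna
# patterns are more complicated.

def relative_power_density(distance_ft, reference_ft=15.0):
    """Power density at distance_ft relative to the density at reference_ft."""
    return (reference_ft / distance_ft) ** 2

for d in (15, 30, 60, 150, 500):
    print(f"{d:>4} ft: {relative_power_density(d):.4f} of the 15-foot exposure")
```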

The public has no particular reason to trust this FCC. The FCC under Chairman Pai has sided with the large carriers on practically every issue in front of the Commission. This is not to say that the FCC didn’t give this docket the full consideration that should be given to all dockets – but the public perception is that this FCC would side with the cellular carriers even if there was a public health danger.

The FCC order is also not particularly helped by citing the buy-in from the Food and Drug Administration on the safety of radiation. That agency has licensed dozens of medicines that later proved to be harmful, so that agency also doesn’t garner a lot of public trust.

The FCC made a few changes with this order. They have mandated a new set of warning signs to be posted around transmitters. It’s doubtful that anybody outside of the industry will understand the meaning of the color-coded warnings. The FCC is also seeking comments on whether exposure standards should be changed for frequencies below 100 kHz and above 6 GHz. The agency is also going to exempt certain kinds of transmitters from FCC testing.

I’ve read extensively on both sides of the issue and it’s impossible to know the full story. For example, a majority of scientists in the field signed a petition to the United Nations warning against using higher frequencies without more testing. But it’s also easy to be persuaded by other scientists who say that higher frequencies don’t even penetrate the skin. I’ve not heard of any studies that look at exposing people to a huge range of different low-power frequencies.

This FCC is in a no-win position. The public properly perceives the agency as being pro-carrier, and anything the FCC says is not going to persuade those worried about radiation risks. I tend to side with the likelihood that the radiation is not a big danger, but I also have to wonder if there will be any impact from expanding by tenfold the range of frequencies we’re exposed to. The fact is that we’re not likely to know until after we’ve all been exposed for a decade.