Using Fiber as a Sensor

I am an admitted science nerd. I love to spend time crawling through scientific research papers. I don’t always understand the nuances, since scientific papers are often written in dense jargon, but I’ve always been fascinated by scientific research because it presages the technology of a few decades from now.

I ran across research by Nokia Bell Labs on using fiber as a sensor. Scientists there have been exploring ways to interpret the subtle changes that happen in a long strand of fiber. The world is suddenly full of fiber strands, and scientists want to know if they can discern any usable real-life data from measuring changes in fiber.

They are not looking at the transmission of the light inside the fiber. Fiber electronics have been designed to isolate the light signal from external stimuli; we don’t get a degraded signal when a fiber cable is swaying in the wind. We probably don’t marvel enough at the steady and predictable nature of a fiber light signal.

The research is exploring if the physical attributes of the fiber can be used to predict problems in the network before they occur. If a network operator knows that a certain stretch of fiber is under duress, then steps can be taken to address the issues long before there is a fiber outage. Developing ways to interpret the stresses on fiber would alone justify the research many times over.

But scientists can foresee a much wider range of sensor capabilities. Consider a fiber strung across a bridge. It’s hard to measure tiny shifts in the steel infrastructure of a bridge. However, a fiber cable across the bridge can sense and measure subtle changes in the tension on the bridge and might reveal how the structure is shifting long before it becomes physically obvious.
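The physics that makes this plausible is standard fiber interferometry: stretching a fiber changes its optical path length, which shifts the phase of light traveling through it. The sketch below is a generic textbook illustration of that relationship, not Bell Labs’ actual method, and the constants are typical assumed values rather than figures from the research.

```python
import math

# Simplified sketch of fiber strain sensing via optical phase shift.
# All constants are typical textbook values (assumptions, not from the article).
N_FIBER = 1.468        # refractive index of a silica fiber core
WAVELENGTH = 1550e-9   # common telecom wavelength, in meters
XI = 0.78              # photoelastic scaling factor for silica (~0.78)

def phase_shift_rad(gauge_length_m, strain):
    """Optical phase shift (radians) for a given strain over a sensing length."""
    return 4 * math.pi * N_FIBER * gauge_length_m / WAVELENGTH * strain * XI

# Even one microstrain (a part-per-million stretch) over a 10-meter span
# produces a phase shift of roughly 90 radians -- easily detectable.
print(f"{phase_shift_rad(10, 1e-6):.1f} rad")
```

Because the wavelength of light is so tiny compared to the fiber, even minuscule stretching shows up as a large, measurable phase change, which is why a fiber can register movements in a bridge that conventional instruments would miss.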

There is already some physical sensing used to monitor undersea fibers – but more can be done. Fiber could potentially measure changes in temperature, current flows, and seismic activity along the full length of these long cables. Scientists have developed decent sensors for measuring underground faults on land, but it’s much harder to do in the depths of the open ocean.

To test the capability to measure and interpret changes to fiber, Bell Labs scientists built a 524-kilometer fiber route between Gothenburg and Karlstad in Sweden as the first test bed for the technology. This will allow them to try to measure a wide range of environmental data to see what can or cannot be done with the sensing technology.

It’s hard to know where this research might go, which is always the case with pure research. It’s not hard to imagine uses if the technology works as hoped. Fiber might be able to identify and pinpoint small forest fires long before they’ve spread and grown larger. Fibers might serve as an early warning system for underground earthquakes long before we’d know about them in the traditional way. The sensing might also be useful for identifying minor damage to fiber – we know about fiber cuts, but today there is often no feedback from lesser damage that can still grow until it finally results in an outage.

Are You Ready for 6G?

The first 6G summit convenes this coming weekend in Levi, Lapland, Finland, sponsored by the University of Oulu. The summit will end with a closed-door, invitation-only assembly of wireless researchers and vendors with the goal to create a draft vision statement defining the goals of 6G research. Attendees include all of the major wireless vendors like Huawei, Ericsson, Samsung, and NTT, along with researchers from numerous universities and groups like Nokia Bell Labs.

As you would expect, even as 5G standards were being finalized there were already private and academic research labs working on what will come next. So far, some of the vision for 6G includes concepts like:

  • Use of higher frequencies between 100 GHz and 1 THz, introducing the world to the idea of terahertz spectrum. The upper end of this range lies between radio waves and infrared light. The FCC just approved research above 95 GHz.
  • Researchers believe this next generation of wireless will be needed to finally enable the 3D holograms required for lifelike telepresence.
  • The higher frequencies would also allow for densification and for the simultaneous transmission of multiple large-bandwidth transmissions. Researchers already believe that with the higher frequencies the capacity of a wireless network could be as much as 1,000 times that of 5G – but even 10 times faster would be a major breakthrough.
  • Scientists anticipate within a decade that we’ll have advanced far enough with artificial intelligence to enable AI-powered routing that will choose the best path in real time for each packet and will significantly decrease latency.
  • Various researchers from Brown University and universities in Australia have said that 5G will be inadequate to satisfy our future needs for both bandwidth and for the overall number of IoT connections. One of the goals of 6G will be to increase the number of connected devices from a given transmitter by one to two orders of magnitude.
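The “terahertz spectrum” framing in the first bullet above can be made concrete with a quick wavelength calculation. The sketch below simply converts the proposed 100 GHz to 1 THz range into free-space wavelengths, showing why these signals behave less like traditional radio and more like light (the specific frequencies chosen are just the endpoints named above).

```python
# Convert the proposed 6G frequency range to free-space wavelengths,
# illustrating why the upper end sits between radio waves and infrared light.
C = 299_792_458  # speed of light, meters per second

def wavelength_mm(freq_hz):
    """Return free-space wavelength in millimeters for a given frequency."""
    return C / freq_hz * 1000

print(f"100 GHz: {wavelength_mm(100e9):.2f} mm")  # about 3 mm
print(f"1 THz:   {wavelength_mm(1e12):.2f} mm")   # about 0.3 mm
```

At a fraction of a millimeter, these wavelengths are closer to far-infrared light than to the centimeter-scale waves used by today’s cellular networks, which is why the propagation behaves so differently.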

The higher frequencies will allow for even faster data transmission, as much as 10 times faster than the gigabit speeds envisioned for point-to-multipoint 5G using millimeter wave radios.

There are a number of issues to be overcome with the higher frequencies, the primary one being that radio waves at those frequencies won’t pass through barriers. However, scientists already think there might be strategies for bouncing the waves around obstacles.

The other shortcoming of these frequencies is the short distance a signal travels before it dissipates. This is likely to limit the highest frequencies to indoor use, enabling wireless networks with speeds as fast as 10 Gbps.
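The distance problem falls straight out of the free-space path loss formula, which grows with frequency. The sketch below compares loss at a mid-band 5G frequency against a terahertz-range frequency over the same path; the specific frequencies and distance are illustrative assumptions, not figures from the research.

```python
import math

C = 299_792_458  # speed of light, meters per second

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Compare a mid-band 5G frequency with a terahertz-range 6G frequency
# over the same 100-meter path (example values, not from the article).
print(f"3.5 GHz over 100 m: {fspl_db(100, 3.5e9):.1f} dB")
print(f"300 GHz over 100 m: {fspl_db(100, 300e9):.1f} dB")
```

Going from 3.5 GHz to 300 GHz adds nearly 40 dB of loss before counting atmospheric absorption, which gets severe in parts of the terahertz range. That extra loss is a big part of why the highest frequencies are expected to be confined to short indoor links.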

Interestingly, researchers in China say that this vision of 6G is the end of the line in terms of major platform upgrades and that there will never be a 7G. After 6G the goal over time will be to improve the performance of the various aspects of the technologies involved. Apparently, the Chinese have never met any AT&T and Verizon marketing staff.

Many of the groups researching these topics are already talking about having a set of 6G standards by 2030. But there is a lot of research to be done, including fundamental steps like developing chips capable of handling the higher speeds. We will also hit regulatory barriers – governments all regulate the use of radio waves, but it might be harder to regulate the use of the light-like frequencies at the base of the infrared spectrum.

Telecom R&D

In January AT&T announced the creation of the WarnerMedia Innovation Lab, a research group that will try to combine AT&T’s technology advances with the company’s huge new trove of media content. The lab, based in New York City, will consider how 5G, the Internet of Things, artificial intelligence, machine learning and virtual reality can work together to create new viewer entertainment experiences.

This is an example of a highly directed R&D effort aimed at specific results – in this case, the lab will be working on next-generation technologies for entertainment. This contrasts with labs that engage in basic research, which allows scientists to explore scientific theories. The closest we’ve ever come to basic research from a commercial company was Bell Labs, operated by the old Ma Bell monopoly.

Bell Labs was partially funded by the government and also got research funds from ratepayers of the nationwide monopoly telco. Bell Labs research was cutting-edge and resulted in breakthroughs like the transistor, the charge-coupled device, Unix, fiber optics, lasers, data networking and the discovery of the cosmic microwave background radiation that confirmed the big bang theory. The Lab created over 33,000 patents and its scientists won eight Nobel Prizes. I was lucky enough to tour Bell Labs in the 80s, and I was a bit sad today when I had to look on the Internet to see if it still exists; it does, now called Nokia Bell Labs, and operates at a much smaller scale than the original lab.

Another successor to Bell Labs is AT&T Labs, the research division of AT&T. The lab engages in a lot of directed research, but also in basic research. AT&T Labs is investigating topics such as the physics of optical transmission and the physics of computing. Since its creation in 1996 AT&T Labs has been issued over 2,000 US patents. The lab’s directed research concentrates on technologies involved in the technical challenges of large networks and of working with huge datasets. The Lab was the first to be able to transmit 100 gigabits per second over fiber.

Verizon has also been doing directed research since the spin-off of Nynex with the divestiture of the Bell system. Rather than operate one big public laboratory the company has research groups engaged in topics of specific interest to the company. Recently the company chose a more public profile and announced the creation of its 5G Lab in various locations. The Manhattan 5G Lab will focus on media and finance tech; the Los Angeles lab will work with augmented reality (AR) and holograms; the Washington DC lab will work on public safety, first responders, cybersecurity, and hospitality tech; the Palo Alto lab will look at emerging technologies, education, and big data; and its Waltham, Massachusetts, lab will focus on robotics, healthcare, and real-time enterprise services.

Our industry has other labs engaged in directed research. The best known of these is CableLabs, the research lab outside Denver that was founded in 1988 and is jointly funded by the world’s major cable companies. This lab is largely responsible for the cable industry’s success in broadband since the lab created the various generations of DOCSIS technology that have been used to operate hybrid-fiber coaxial networks. CableLabs also explores other areas of wireless and wired communications.

While Comcast relies on CableLabs for its underlying technology, the company has also created Comcast Labs. This lab is highly focused on the customer experience and developed Comcast’s X1 set-top box and created the integrated smart home product being sold by Comcast. Comcast Labs doesn’t only develop consumer devices; it is also involved in software innovation efforts like OpenStack and GitHub development. The lab most recently announced a breakthrough that allows cable networks to deliver data speeds up to 10 Gbps.