Categories
The Industry

Telecom R&D

In January AT&T announced the creation of the WarnerMedia Innovation Lab, a research group that will try to marry AT&T’s technology advances with the company’s huge new trove of media content. The lab, based in New York City, will consider how 5G, the Internet of Things, artificial intelligence, machine learning and virtual reality can be combined to create new viewer entertainment experiences.

This is an example of a highly directed R&D effort aimed at specific results – in this case the lab will be working on next-generation technologies for entertainment. This contrasts with labs that engage in basic research, where scientists are free to explore scientific theories. The closest we’ve ever come to basic research from a commercial company was Bell Labs, which was operated by the old Ma Bell monopoly.

Bell Labs was partially funded by the government and also got research funds from ratepayers of the nationwide monopoly telco. Bell Labs research was cutting edge and resulted in breakthroughs like the transistor, the charge-coupled device, Unix, fiber optics, lasers, data networking and the discovery of the cosmic microwave background radiation that confirmed the big bang theory. The lab produced over 33,000 patents and its scientists won eight Nobel Prizes. I was lucky enough to have a tour of Bell Labs in the 80s, and I was a bit sad today when I had to look on the Internet to see if it still exists; it does, now as Nokia Bell Labs, operating at a much smaller scale than the original lab.

Another successor to Bell Labs is AT&T Labs, the research division of AT&T. The lab engages in a lot of directed research, but also in basic research. AT&T Labs is investigating topics such as the physics of optical transmission and the physics of computing. Since its creation in 1996 AT&T Labs has been issued over 2,000 US patents. The lab’s directed research concentrates on technologies involved in the technical challenges of large networks and of working with huge datasets. The Lab was the first to be able to transmit 100 gigabits per second over fiber.

Verizon has also been doing directed research since it was spun off as Nynex in the divestiture of the Bell System. Rather than operate one big public laboratory, the company has research groups engaged in topics of specific interest to the company. Recently the company chose a more public profile and announced the creation of 5G Labs in various locations. The Manhattan 5G Lab will focus on media and finance tech; the Los Angeles lab will work with augmented reality (AR) and holograms; the Washington DC lab will work on public safety, first responders, cybersecurity, and hospitality tech; the Palo Alto lab will look at emerging technologies, education, and big data; and the Waltham, Massachusetts lab will focus on robotics, healthcare, and real-time enterprise services.

Our industry has other labs engaged in directed research. The best known of these is CableLabs, the research lab outside Denver that was founded in 1988 and is jointly funded by the world’s major cable companies. This lab is largely responsible for the cable industry’s success in broadband since the lab created the various generations of DOCSIS technology that have been used to operate hybrid-fiber coaxial networks. CableLabs also explores other areas of wireless and wired communications.

While Comcast relies on CableLabs for its underlying technology, the company has also created Comcast Labs. This lab is highly focused on the customer experience; it developed Comcast’s X1 set-top box and created the integrated smart home product sold by Comcast. Comcast Labs doesn’t only develop consumer devices; it is also involved in software innovation efforts like OpenStack and development on GitHub. The lab most recently announced a breakthrough that allows cable networks to deliver data speeds up to 10 Gbps.

Categories
Technology

The Anniversary of Fiber Optics

I recently saw an article noting that this month marks the fiftieth anniversary of the 1966 scientific paper by Charles Kao that kicked off the field of fiber optic communications. That paper eventually won him the Nobel Prize in Physics in 2009. He was assisted by George Hockham, a British engineer who was awarded the Rank Prize in Opto-electronics in 1978.

We are so surrounded by fiber optic technology today that it’s easy to forget what a relatively new technology this is. We’ve gone from theoretical paper to the world covered with fiber optic lines in only fifty years.

As is usual with most modern inventions, Kao and Hockham were not the only ones looking for a way to use lasers for communications. Bell Labs had considered using glass fiber but abandoned the idea due to the huge attenuation they saw in glass – meaning that the laser light signal scattered quickly and wouldn’t travel very far. Bell Labs was instead looking at shooting lasers through hollow metal tubes using focused lenses.

The big breakthrough came when Kao and Hockham showed that the high attenuation was caused by impurities in the glass rather than by the glass itself, and that a fiber with attenuation below 20 decibels per kilometer would be practical for carrying communications.
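To put that 20 dB threshold in perspective, attenuation in decibels converts to a power ratio of 10^(dB/10), so 20 dB per kilometer means only about 1% of the light survives each kilometer. The little sketch below works through that arithmetic; the 0.2 dB per kilometer figure I use for modern fiber is my own ballpark assumption, not something from the anniversary article.

```python
# Back-of-the-envelope: how much light survives a fiber span at a given loss rate?
def surviving_fraction(db_per_km: float, km: float) -> float:
    """Fraction of optical power remaining after `km` of fiber at `db_per_km` of loss."""
    total_loss_db = db_per_km * km
    return 10 ** (-total_loss_db / 10)

# Kao and Hockham's 20 dB/km target: about 1% of the light is left after 1 km.
print(surviving_fraction(20, 1))     # 0.01
# Modern fiber (assumed ~0.2 dB/km): you don't drop to 1% until roughly 100 km.
print(surviving_fraction(0.2, 100))  # ~0.01
```

That hundred-fold difference in how far a signal can travel per unit of loss is a big part of why fiber went from a lab curiosity to the backbone of world communications.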

It took a decade for the idea to be put to practical use. Corning Glass Works (now Corning Inc.) found ways to lower attenuation even further, and the first fiber optic cable was laid in Torino, Italy in 1977.

We didn’t see any widespread use of fiber optics in the U.S. until the early 1980s, when AT&T and a few other companies like the budding MCI began installing fiber as an alternative to copper for long-haul networks.

We’ve come a very long way since those first-generation fiber installations. The glass was expensive to manufacture, so early fiber cables generally did not contain very many strands of glass; it was not unusual to see 6- and 8-strand cables being installed.

Compared to today’s standards, the fiber produced from the 1980s into the early 1990s was dreadful stuff. Early fiber cables degraded over time, mostly due to microscopic cracks introduced into the cable during manufacturing and installation. These cracks grew over time and eventually caused the cables to become cloudy and unusable. Early splicing technologies were also a problem, and each splice introduced a significant amount of signal loss into the fiber run. I doubt that there is much, if any, functional fiber remaining from those early days.

But Corning and other companies have continually improved the quality of fiber optic cable, and today’s fiber is light-years ahead of the early cables. Splicing technology has also improved, and modern splices introduce very little loss into the transmission path. In fact, there is no good estimate today of how long a properly-installed fiber cable might last in the field. It’s possible that fiber installed today might still be functional 75 to 100 years from now. The major issue with the life of fiber today is no longer failure of the glass itself, but rather the damage done to fiber over time by cable cuts and storm damage.

The speeds achieved in modern fiber optics are incredible. The newly commissioned undersea fiber that Google and others built between Japan and the west coast of the US can pass 60 terabits per second of data. Laser technology has probably improved even faster than fiber glass manufacturing. We’ve gotten to the point where fiber optic cable is taken for granted as something that is reliable and relatively easy to install and use. We certainly would be having a very different discussion about broadband today had fiber optic cables not improved so quickly over the last several decades.

Categories
Technology The Industry

Broadband Shorts for December

Following are a few topics I found interesting but which are too short for a whole blog.

Petabit Fiber Speeds: Bell Labs announced a successful lab trial of 1-petabit speeds on a single fiber (that is 1,000 terabits). This was done using real-time space-division multiplexed optical multiple-input-multiple-output (MIMO-SDM) technology. The technology is able to send six separate signals through the fiber without having them interfere with each other. Within each of those signals is a full set of additional channels that use separate light frequencies.

It always takes a few years to go from lab to finished product, but this is a major breakthrough, with the fastest commercial fiber systems today operating at 20 terabits – a 50-fold increase in capacity. This kind of technology would be used on long-haul routes to move massive amounts of data between two points. The lasers and electronics for this are bound to be very expensive, but in places where that much capacity will be needed it will probably be much cheaper than operating large numbers of fiber pairs. In a world where data usage is growing at a geometric rate, a 50-fold increase in capacity only moves us a decade or two ahead of demand, as the quick calculation below shows.
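To see why a 50-fold jump only buys a decade or two, assume demand compounds at somewhere around 30% to 50% per year (my assumption for illustration, not a figure from the Bell Labs announcement) and work out how long demand takes to grow 50-fold:

```python
import math

def years_to_grow(factor: float, annual_growth: float) -> float:
    """Years for demand to grow by `factor` at a compound `annual_growth` rate."""
    return math.log(factor) / math.log(1 + annual_growth)

for growth in (0.30, 0.40, 0.50):
    print(f"{growth:.0%} annual growth: {years_to_grow(50, growth):4.1f} years to use up a 50x capacity gain")
# 30% -> ~14.9 years, 40% -> ~11.6 years, 50% -> ~9.6 years
```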

Comcast Data Caps: A leak of internal documents used to train customer service reps shows that Comcast has dropped all pretense that its new ‘trial’ of data caps is based upon network congestion. In fact, Comcast is now training its employees to emphatically deny that congestion is an issue and instead wants customer service reps to tell customers that caps are all about ‘fairness’ and about offering ‘a more flexible policy’ for customers.

I could buy that new story line if they were using the caps to somehow give customers the ability to buy a cheaper connection by agreeing to have a cap. That would indeed be flexible and fair to small users. But it’s hard to see any flexibility when nobody’s price goes down and customers that actually use the data to which they have subscribed must pay more. But like most huge companies, Comcast is now in full double-speak mode, telling customers the exact opposite of what it is actually doing.

Fewer Ads on Cable: While cable companies will not publicly acknowledge that their major competition might be Netflix, a number of major content providers like Time Warner, Fox, and Viacom are quietly cutting back on the number of ad slots they cram into a given hour of television.

For example, the CEO of Time Warner pledged to try to cut advertising slots during prime time in half – a major reduction. They will start with a trial after New Year’s on TruTV, which carries programming that is largely reality TV aimed at younger viewers. If the trial is successful they plan to move this same idea to other networks like TBS, TNT, and CNN.

It’s becoming obvious that the average person is tiring of intrusive advertising. The spread of ad blockers on the web shows how much people hate ads. As a cord cutter I have almost entirely eliminated video ads from my life and it feels like this has given me more time (which of course I then use to watch more time-wasting programming – but it still feels good).

Media Usage by Kids: Common Sense Media did a major survey of the media usage of kids and gave us a detailed look into how kids use various kinds of digital media. For instance, they found that tweens (kids between 8 and 12) use digital media of some sort an average of six hours per day. This might be streaming music, watching television and videos, using social media, playing games, texting, or posting to web sites. Older teens use digital media over 9 hours per day.

Kids often use digital media while doing other things, so they might not be concentrating on it the whole time but simply have it on in the background. Tweens still use television as their most common digital activity, but for older teens music has bypassed television.

25% of teens who go online say that their parents don’t understand what they do online. 30% say that their parents don’t understand or know about the social media they use. 53% of teens and 72% of tweens say that their parents have talked to them about the time they spend using media and the content they view.

The biggest problem identified in the study was the continuing digital divide. As schoolwork goes online, kids without adequate broadband are finding it impossible to keep up with kids that have access. The report showed that 10% of lower-income kids still have dial-up Internet access and only half of lower-income kids have smartphones – both very different statistics than kids from more affluent households.

Categories
Current News The Industry

ALU Sells to Nokia

It was just announced that Nokia will be buying Alcatel-Lucent. It seems this was done so that Nokia can pick up the latest 4G technology from ALU. As someone who has been in the industry for a while, I have a long memory of the history of Lucent.

Before the Lucent name, the business was part of AT&T and was the combination of Western Electric and Bell Labs. Bell Labs was always a wonderment for techies like me because it employed some of the smartest minds in the world. The lab traces its roots back to Alexander Graham Bell, and over the years its researchers developed such things as the transistor, the laser, information theory, the UNIX operating system, and the C and C++ programming languages. There were eight Nobel Prize winners from Bell Labs. I worked in the Bell System for a few years pre-divestiture and it was a point of pride to work for the same company that operated Bell Labs.

There was a time when Western Electric was the sole manufacturer of telephones and telecommunications devices. I recall that when I was a kid the only option for a home phone was the ponderously heavy, black Western Electric phone. These were hard-wired and didn’t have long cords, so when you talked you had to stand close to the phone. Over the years, Western Electric introduced smaller phones like the Princess phone and introduced longer cords that provided a little more freedom when using the phone. But all of the Western Electric phones were solid and they rarely had problems or broke. They were solid American technology, made in America.

The first big change I remember for Western Electric was when AT&T started licensing other companies to make some handsets. I remember when the Mickey Mouse phone, the Sculptura phone and other colorful phones hit the market. Within a few years, the FCC began to widely license handsets made by numerous companies as long as they passed Bell Labs certification, and Western Electric lost its monopoly on handsets.

Western Electric also made the bulk of the electronics used by AT&T. These included voice switches, line repeaters, and various kinds of carriers used to carry more than one call at a time across a piece of copper. But Western Electric never had a total monopoly and companies like Nortel often sold equipment to non-AT&T telcos.

The big change for both companies came with the divestiture of AT&T in 1984, when Western Electric and Bell Labs were placed into the AT&T Technologies subsidiary. They went on, largely unchanged, until they were spun off from AT&T as Lucent, a standalone corporation, in 1996. Most of Lucent’s business was still with the various Bell companies, but the company was branching out into numerous fields of telephony technology. At its peak Lucent was the most widely held stock in the US, with a stock price of $84 and a market capitalization of $258 billion.

Lucent fell onto hard times at the end of 2000 and was one of the first companies to be hurt by the telephony and dot-com crash. The industry as a whole had heavily pursued the new competitive local exchange carriers (CLECs) that had been authorized by Congress and the FCC in 1996. Unfortunately, large companies like Lucent and Nortel provided significant vendor financing to the fledgling CLEC industry, and when those companies started folding, all of the large manufacturers were thrown into financial trouble.

Lucent never fully recovered from that crash (like many other tech companies that disappeared at that time). Their stock lost significant capitalization from the crash, but then really got slammed when it was revealed that the company had been using dubious accounting methods for recognizing sales and revenues. By May of 2001, the company’s stock had fallen to $9. I remember at the time that everybody in the industry could quote the Lucent stock price and we all watched in wonder as the company crashed and burned.

Over the next few years Lucent tried to regain some value by spinning off business units. It spun off its business systems unit into Avaya and its microelectronics unit into Agere Systems. By 2003 the Lucent stock price was down to just over $2 per share and the company had shed over 130,000 employees. Lucent merged with Alcatel in 2006 and became Alcatel-Lucent (ALU). That company did well for a while but then had a long string of losses until it recently returned to profitability.

And now the business has been absorbed by Nokia, mostly to pick up the division that makes 4G wireless equipment. There is not much of the old company left. Bell Labs is still around and one has to wonder if Nokia will continue to operate it. The Lucent history is not unusual for high tech companies. Western Electric had a near-monopoly for decades, but over time everything made by them changed drastically and newer companies ate away at the old giant. Today we have new giant companies like Apple and Samsung, and if history is any indicator they will someday be supplanted by somebody new as well.

Categories
Technology

New Technology – Telecom and Computing Breakthroughs

Today I look at some breakthroughs that will result in better fiber networks and faster computers – all components needed to help our networks be faster and more efficient.

Increasing Fiber Capacity. A study from Bell Labs suggests that existing fiber networks could be made 40% more efficient by changing to IP transit routing. Today operators divvy up networks into discrete components. For example, the capacity on a given route may be segmented into distinct dedicated 100 Gig paths that are then used for various discrete purposes. This takes the available bandwidth on a given long-haul fiber and breaks it into pieces, much in the same manner as was done in the past with TDM technology to break data into T1s and DS3s.

The Bell Labs study suggests a significant improvement if the entire bandwidth on a given fiber is treated as one huge data pipe, much in the same manner as might be done with the WAN inside a large business. This makes sense because there is always spare or unused capacity on each segment of the fiber’s bandwidth, and pooling it all together into one large pipe makes that spare capacity available. Currently Alcatel-Lucent, Telefonica, and Deutsche Telekom are working on gear that will enable the concept.
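The intuition here is the familiar statistical multiplexing argument: spare capacity stranded inside one dedicated 100 Gig path can’t help a neighboring path that happens to be overloaded at that moment, but a single shared pipe can absorb those peaks. The toy simulation below is my own illustration of the effect, not anything from the Bell Labs study.

```python
import random

random.seed(1)

CHANNELS = 10        # ten dedicated 100 Gbps paths...
CHANNEL_CAP = 100.0  # ...versus one shared 1,000 Gbps pipe
TRIALS = 10_000

segmented_drop = pooled_drop = 0.0
for _ in range(TRIALS):
    # Random demand per path, averaging ~70 Gbps but occasionally bursting above 100 Gbps.
    demands = [random.uniform(20, 120) for _ in range(CHANNELS)]
    # Segmented: each path loses whatever exceeds its own 100 Gbps cap.
    segmented_drop += sum(max(0.0, d - CHANNEL_CAP) for d in demands)
    # Pooled: traffic is only lost if the combined demand exceeds the whole pipe.
    pooled_drop += max(0.0, sum(demands) - CHANNELS * CHANNEL_CAP)

print(f"Average traffic lost with segmented paths: {segmented_drop / TRIALS:.1f} Gbps")
print(f"Average traffic lost with one shared pipe: {pooled_drop / TRIALS:.1f} Gbps")
```

With these made-up demand numbers the shared pipe drops essentially nothing while the segmented paths routinely clip their bursts, which is the spare-capacity effect the study is counting on.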

Reducing Interference on Fiber. Researchers at University College London have developed a new set of techniques that reduce interference between different light wave frequencies on fiber. It is the accumulation of interference that requires optical repeaters to be placed on networks to refresh optical signals.

The research team took a fresh approach to how signals are generated onto the fiber, passing the optical signals through a comb generator to create seven equidistantly-spaced and frequency-locked signals, each in the form of a 16 QAM super-channel. This reduces the number of different light signals on the fiber to these seven channels, which drastically reduces the interference.

The results were spectacular and they were able to generate a signal that could travel without re-amplification for 5,890 kilometers, or 3,660 miles. This has immediate benefit for undersea cables since finding ways to repeat these signals is costly. But there are applications beyond long-haul fiber and the team is now looking at ways to use the dense super-channels for cable TV systems, cable modems, and Ethernet connections.

Faster Computer Chips. A research team at MIT has found a way to make multicore chips faster. Multicore chips contain more than one processor and are used today for intense computing needs in places like data centers and in supercomputers.

The improvement comes through the creation of a new scheduling technique they call CDCS (computation and data co-scheduling). This technique more efficiently distributes data flow and the timing of computations on the chips. The new algorithm allows data to be placed near where calculations are performed, reducing the movement of data within the chip. This results in a 46% increase in computing capacity while also reducing power consumption by 36%. That, in turn, will reduce the need for cooling, which is becoming a major concern and one of the biggest costs at data centers.

Faster Cellphones. Researchers at the University of Texas have found a way to double the speed at which cellphones and other wireless devices can send and receive data. The circuit they have developed will let the cellphone radio operate in ‘full-duplex’ mode, meaning that the radio can send and receive signals at the same time.

Today a cellphone radio can do one or the other, and your phone’s radio constantly flips between sending and receiving data. Radios have always done this so that the frequencies from the transmitting part of the phone, which are normally the stronger of the two signals, don’t interfere with and drown out the incoming signals.

The new circuit, which they are calling a circulator, can isolate the incoming and outgoing signals and acts as a filter to keep the two separate. Circulators have been in use for a long time in devices like radar, but they have required large, bulky magnets made from expensive rare-earth metals. The new circulator devised by the team performs the same function using standard chip components.

This circulator is a tiny standalone device that can be added to any radio chip and it acts like a traffic manager to monitor and control the incoming and outgoing signals. This simple, new component is perfect for cellphones, but will benefit any two-way radio, such as WiFi routers. Since a lot of the power used in a cellphone goes to flipping between send and receive mode, this new technology ought to also provide a significant improvement to battery life.

Million-Fold Increase in Hard Drive Capacity? Researchers at the Naval Research Laboratory have developed a way to magnetize graphene, and this could lead to data storage devices with a million-fold increase in storage for a given device size. Graphene is a one-atom-thick sheet of carbon that can be layered to make multi-dimensional stacked chips.

The scientists have been able to magnetize the graphene by placing it on a layer of silicon and submerging it in a pool of cryogenic ammonia and lithium for about a minute. They then introduce hydrogen, which renders the graphene magnetic. The process is adjustable, and with an electron beam you can shave off hydrogen atoms and effectively write on the graphene chip. Today we already have terabyte flash drives. Anybody have a need for an exabyte flash drive?

Categories
The Industry

The Start of the Information Age

A few weeks ago I wrote a blog about the key events in the history of telecom. Today I am going to look at one of those events – how today’s information age sprang out of a paper published in 1948 titled “A Mathematical Theory of Communication” by Claude Shannon. At the time of publication he was a 32-year-old researcher at Bell Laboratories.

But even prior to that paper he had made a name for himself at MIT. His Master’s thesis there, “A Symbolic Analysis of Relay and Switching Circuits,” pointed out that the logical values of true and false could be represented by a one and a zero, and that this would allow physical relays to perform logical calculations. Many have called it the most important Master’s thesis of the 1900s.

The thesis was a profound breakthrough at the time and came a decade before the development of computer components. Shannon showed how a machine could be made to perform logical calculations rather than being limited to mathematical calculations. This made Shannon the first to realize that a machine could be made to mimic the actions of human thought, and some call this paper the genesis of artificial intelligence. It provided the push to develop computers since it made it clear that machines could do a lot more than merely calculate.
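As a tiny illustration of the substitution Shannon made, here is Boolean logic expressed as arithmetic on ones and zeros – exactly the kind of operation a bank of relays (or today’s transistors) can carry out. This is a sketch of the idea, not anything taken from the thesis itself.

```python
# Treat a closed relay as 1 and an open relay as 0; logic becomes simple arithmetic.
def AND(a: int, b: int) -> int:
    return a * b        # relays in series: current flows only if both are closed

def OR(a: int, b: int) -> int:
    return max(a, b)    # relays in parallel: current flows if either is closed

def NOT(a: int) -> int:
    return 1 - a        # a relay wired to invert its input

# Any logical statement can now be 'computed' by a circuit, e.g. (A AND NOT B) OR C:
for A in (0, 1):
    for B in (0, 1):
        for C in (0, 1):
            print(A, B, C, "->", OR(AND(A, NOT(B)), C))
```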

Shannon joined Bell Labs as WWII was looming and went to work immediately on military projects like cryptography and fire control for antiaircraft guns. But in his spare time Shannon worked on an idea he referred to as a fundamental theory of communications. He saw that it was possible to ‘quantify’ information through the use of binary digits.

This paper was one of those rare breakthroughs in science that is truly original rather than a refinement of earlier work. Shannon saw information in a way that nobody else had ever thought of it before. He showed that information could be quantified in a very precise way, and his paper was the first to use the word ‘bit’ to describe a discrete piece of information.
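The heart of that quantification is Shannon’s formula for the information content, or entropy, of a source, measured in bits:

```latex
H = -\sum_{i} p_i \log_2 p_i
```

For a fair coin flip, where each outcome has probability 1/2, the formula gives H = 1: a single flip carries exactly one bit of information, and rarer, more surprising outcomes carry more bits than common ones.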

For those who might be interested, a copy of this paper is here. I read this many years ago and I still find it well worth reading. The paper was unique and so clearly written that it is still used today to teach at MIT.

What Shannon had done was show how we could measure and quantify the world around us. He made it clear how all measurable data in the world could be captured precisely and then transmitted without losing any precision. Since this was developed at Bell Labs, one of the first applications of the concept was telephone signals. In the lab they were able to convert a voice signal into a digital code of 1s and 0s and then transmit it to be decoded somewhere else. The results were just as predicted: the voice signal that came out at the receiving end was as good as what was recorded at the transmitting end. Until then, voice signals had been analog, which meant that any interference on the line between callers affected the quality of the call.
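Here is a minimal sketch of that digitization idea – not the actual Bell Labs equipment, just an illustration – showing a waveform being sampled, quantized into a stream of bit codes and then rebuilt. As long as the bits arrive intact, the reconstruction at the far end is as faithful as the encoding at the near end, no matter how noisy the line in between was.

```python
import math

SAMPLE_RATE = 8000   # samples per second, the classic telephony rate
BITS = 8             # bits per sample
LEVELS = 2 ** BITS   # 256 quantization levels

def encode(samples):
    """Quantize samples in the range [-1, 1] into 8-bit integer codes (a stream of bits)."""
    return [min(LEVELS - 1, int((s + 1) / 2 * LEVELS)) for s in samples]

def decode(codes):
    """Rebuild an approximation of the original samples from the integer codes."""
    return [(c + 0.5) / LEVELS * 2 - 1 for c in codes]

# A 1 kHz test tone standing in for a voice signal.
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE) for n in range(40)]
rebuilt = decode(encode(tone))

worst_error = max(abs(a - b) for a, b in zip(tone, rebuilt))
print(f"Worst quantization error with {BITS} bits per sample: {worst_error:.4f}")  # ~0.004
```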

But of course, voice is not the only thing that can be encoded as digital signals, and as a society we have converted just about everything imaginable into 1s and 0s. Over time we applied digital coding to music, pictures, film and text, and today everything on the Internet has been digitized.

The world reacted quickly to Shannon’s paper and accolades were everywhere. Within two years everybody in science was talking about information theory and applying it to their particular fields of research. Shannon was not comfortable with the fame that came from his paper and he slowly withdrew from public life. He left Bell Labs and returned to teach at MIT, but he slowly withdrew even from there and had stopped teaching by the mid-1960s.

We owe a huge debt to Claude Shannon. His original thought gave rise to the components that let computers ‘think’, which gave a push to the nascent computer industry and was the genesis of the field of artificial intelligence. And he also developed information theory which is the basis for everything digital that we do today. His work was unique and probably has more real-world applications than anything else developed in the 20th century.

Categories
Technology The Industry

What’s Next?

I had the opportunity this week to visit CableLabs, a non-profit research laboratory founded in 1988 that is funded by the largest cable companies in the US and Europe. CableLabs works on practical applications for cable networks while also looking ahead to see what is coming next. It developed the DOCSIS standards that are now the basis for cable modems on coaxial networks, holds numerous patents, and has developed such things as orthogonal frequency division multiplexing and VoIP.

I also had the opportunity over the years to visit Bell Labs a few times. Bell Labs has a storied history. It was founded by Alexander Graham Bell as Volta Laboratories and eventually became part of AT&T, where it became known as Bell Labs. The lab is credited with developing some of the innovations that have shaped our electronic world, such as the transistor, the laser and radio astronomy. It developed information theory, which led to the ability to encode and send data and is the basis for the Internet. It also developed a lot of software, including UNIX, C and C++. Bell Labs employed scientists who went on to win seven Nobel Prizes for their inventions.

Both of these organizations are full of really bright, really innovative people. Visiting either one you can feel the energy, which I think comes from the fact that the scientists and engineers who work there are free to follow good ideas.

When you visit places like these labs it makes you think about what is coming in the future. It’s a natural human tendency to get wrapped up in what is happening today and to not look into the future, but these places are tasked with looking both five years and twenty years into the future and trying to develop the networking technologies that are going to be needed then.

Some of the work done in these labs is practical. For example, both labs today are working on ways to distribute fast Internet throughout existing homes and businesses using the existing wires. Google has helped push the world into looking at delivering a gigabit of bandwidth to homes, businesses and schools, and yet the wiring that exists in those places is not capable of delivering that much bandwidth with today’s technology, short of expensive rewiring with Category 5 cable. So both places are looking at technologies that will allow the existing wires to carry more data.

It’s easy sometimes to take for granted the way that new technologies work. What the general public probably doesn’t realize is the hard work that goes into solving the problems associated with any new technology. The process of electronic innovation is two-fold. First, scientists develop new ideas and work in the lab to create a working demonstration. Then the hard work comes when the engineers get involved and are tasked with turning a good lab idea into practical products. This means finding ways to solve all the little bugs and challenges that are part of every complicated electronic medium. There are always interference issues, unexpected harmonics and all sorts of problems that must be tweaked and fixed before a new technology is ready to hit the street.

And then there are the practical issues associated with making new technology affordable. It’s generally much easier to make something work when there are no constraints on size or materials. But in the world of electronics we always want to make things smaller, faster, cheaper to manufacture and more reliable. And so engineers work on turning good ideas into workable products that can be profitable in the real world.

There are several big trends that we know will be affecting our industry over the next decade, and these labs are knee-deep in looking at them. Yesterday I talked about how the low price of the cloud is bringing much of our industry to a tipping point where functions that were done locally will all move to the cloud. Everyone also predicts a revolution in the interface between people and technology due to the Internet of Things. And as mentioned earlier, we are on the cusp of bringing really fast Internet speeds to most people. Each of these three changes is transformational, and collectively they are almost overwhelming. Almost everything that we have taken for granted in the electronic world is going to change over the next decade. I for one am glad that there are some smart scientists and engineers who are going to help make sure that everything still works.
