Network Timing

One element that is key to every network rarely gets discussed. Network timing (or network clocks) involves the hardware and processes that make sure all parts of a network are in sync.

Timing and synchronization are critical for network services that depend on precise, coordinated clocks across devices. Accurate and reliable synchronization helps maintain the security, availability, and efficiency of a network. Timing is essential to the function of telephone, cellular, and broadband networks.

There are multiple kinds of timing in use.

Frequency Synchronization. This ensures that all of the electronics inside a network operate at the same clock rate, or frequency. Many kinds of network gear come with built-in clocks, and different parts of a network running on different clocks will result in data loss, corruption, or misinterpretation of bits. Frequency synchronization forces all of the clocks inside the network to operate in unison by matching the frequency of each clock to a source clock. There are several sources for frequency synchronization:

  • Synchronous Ethernet (SyncE) designates one clock as the reference and forces the other clocks to match it.
  • Networks can be synchronized to external references such as a Building Integrated Timing Supply (BITS) or the GPS satellites. A BITS can take its timing from any reliable external clock.
  • Many networks use Precision Time Protocol (PTP), which eliminates the danger of losing the connection to an external clock.
  • A network can use a free-running internal oscillator chip that holds an accurate clock.

Many networks have used GPS for frequency synchronization. A GPS satellite carries a highly stable atomic clock that provides precise time signals, which a GPS receiver can convert into frequency references. While the atomic clocks provide highly precise time and frequency information, GPS is less reliable when a receiver doesn't have a clear view of the sky, such as during severe weather.

Phase Synchronization makes sure that the phase of network signals is consistent throughout the network. Phase refers to a specific point in time on a waveform cycle. Phase synchronization ensures that electronics agree on the timing of the start and end of each bit in a data stream. This is critical in applications where data from multiple sources have to be combined or compared, such as in a cellular network.

Time Synchronization, also called Time of Day (ToD), ensures that all electronics agree on the current time, which is critical for functions like time-stamping, event logging, and billing. Networks differ in their need for precise time. Network Time Protocol (NTP) can provide millisecond accuracy, while PTP can provide nanosecond accuracy along with phase synchronization.
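For readers who want to see the mechanics, below is a minimal sketch of how an NTP client computes its clock offset using the standard four-timestamp exchange. The server name is just an example of a public NTP pool, and real implementations add error handling and repeated polling:

```python
# Minimal sketch of an NTP clock-offset query (illustrative, not production code).
import socket
import struct
import time

NTP_EPOCH_OFFSET = 2208988800  # seconds between the NTP epoch (1900) and Unix epoch (1970)

def ntp_offset(server="pool.ntp.org"):
    packet = b"\x1b" + 47 * b"\x00"   # LI=0, version 3, mode 3 (client request)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(5)
        t0 = time.time()              # client transmit time
        s.sendto(packet, (server, 123))
        data, _ = s.recvfrom(512)
        t3 = time.time()              # client receive time
    # Server receive (t1) and transmit (t2) timestamps: 32-bit seconds + 32-bit fraction
    secs1, frac1 = struct.unpack("!II", data[32:40])
    secs2, frac2 = struct.unpack("!II", data[40:48])
    t1 = secs1 + frac1 / 2**32 - NTP_EPOCH_OFFSET
    t2 = secs2 + frac2 / 2**32 - NTP_EPOCH_OFFSET
    # The standard NTP formula: how far the local clock is from the server's clock
    return ((t1 - t0) + (t2 - t3)) / 2

print(f"Local clock offset: {ntp_offset() * 1000:.2f} ms")
```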

A New Security Risk

A new security risk has recently been brought to my attention. I was on a Teams call that included an attorney who would not let the call continue while an AI notetaker was present. His comment was that the notetaker is listening to everything that is said, transmitting everything verbatim to a data center somewhere in the cloud. He said he was aghast that people would hold meetings about sensitive topics and then give everything that was said to unknown parties outside of the call. He used the analogy that having an AI notetaker is the equivalent of inviting a reporter into a meeting.

It didn’t take much research to realize he is right. An AI notetaker records everything that is said in a meeting so that AI servers somewhere in the cloud can make a transcript or summary of the meeting. Every word said in a meeting, from the brilliant to the mundane, is sent to a data center out of the control of the people on the call.

There is no way to know what the folks who control the recording will do. At a minimum, it’s almost certain they are using the data to further train AI models, which are voracious for more data. A record of the meeting could be sold to others. It’s possible, and even likely, that somebody really good at AI prompts can figure out what is discussed at a corporate meeting.

Of course, the AI notetaker companies can all swear that they don’t use the data for purposes other than making a summary of the meeting. But I have to ask: does anybody have the slightest idea who owns and works at these businesses, and do you trust them? Nobody would let an unknown stranger into a work meeting, but that’s exactly what companies are doing with AI notetakers. Companies have suddenly begun willingly sharing conversations with the cloud that they might not even want to share with everybody else inside their own company. It’s hard to see this as anything but a self-inflicted data breach.

Before writing this blog, I asked a few people about this. One friend who is an AI expert said that it would be too tempting for anybody in this kind of business to monetize the data they are gathering by selling it to others to train AI models. He said that most AI companies are struggling to be profitable, and that secondary revenue streams have to be tempting (just as it is tempting for ISPs to sell user data). He thought that it’s too expensive for companies to routinely sift through the data for tidbits of corporate espionage, but that it would be possible for anybody willing to spend the processing time, or who is interested in a specific business or a specific person. He also said he would be worried that AI companies could be using the data to gather a voice print of meeting participants, something that they might otherwise have a hard time finding for most people.

I don’t have any knowledge that the companies in this line of business are doing anything nefarious with the data gathered, and perhaps they are not. But letting key information out of a closed circle of people on a call is practically the definition of a security risk. There is no way to know if this might harm a business.

A few companies sell notetakers that they say keep all data on the user’s computer and don’t share it in the cloud. But the AI engine that summarizes a call is still going to be in the cloud, so unless that claim can be proven somehow, it still feels like a risk. Tech companies have been lying to the public about how they use the data they gather ever since AOL and the early web companies figured out how to monetize user data.

This is one of the oddest blogs I’ve ever written because it makes me wonder if I’m being paranoid. But that feeling is probably a sign that this is a real concern.

Broadband in a Hurry

There is an interesting new twist on wireless backhaul. The Swedish company TERASi has developed a technology that enables backhaul networks to be configured on the fly. The company makes a small, lightweight, portable microwave radio that can quickly be mounted anywhere on a tripod, a pole, or any object with line-of-sight to a neighboring radio.

The radios use frequencies in the 70 GHz range. They can deliver 2 Gbps of bandwidth at distances up to 5 miles, or 10 Gbps over a few miles. Latency is a super-low 5 milliseconds.
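To get a feel for why millimeter-wave links like this are distance-limited, here is a back-of-envelope sketch using the standard free-space path loss formula. The distances come from the paragraph above; real 70 GHz links must also budget for rain fade and atmospheric absorption:

```python
# Free-space path loss for a 70 GHz link (back-of-envelope only).
import math

def fspl_db(distance_km: float, freq_ghz: float) -> float:
    # Standard formula with distance in km and frequency in GHz
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

print(f"5-mile (8 km) link at 70 GHz:   {fspl_db(8.0, 70.0):.1f} dB")   # ~147 dB
print(f"2-mile (3.2 km) link at 70 GHz: {fspl_db(3.2, 70.0):.1f} dB")   # ~139 dB
```

Every extra 8 dB or so of path loss has to be made up with antenna gain or transmit power, which is why the higher 10 Gbps tier works only at shorter distances.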

The selling point for these portable radios is that they can be installed and configured in minutes, thanks to their small size of 3x3x1 inches. The company says a radio can be mounted on a photography tripod or even on a drone to create a quick wireless link. The small radios are being touted as a solution for quick links in the field for the military or for an ISP that needs a fast temporary connection.

The radios are now in beta testing mode, and the company would like to hear from ISPs or local governments that might have a unique use case for radios that can create a quick link.

It’s not hard to imagine numerous uses for a microwave network that can be installed quickly.

  • The company is marketing this to the military as an alternative to using Starlink on the battlefield. There have been several times in Ukraine when the Starlink network went down, at least once intentionally, and once recently when Starlink had a worldwide outage. Microwave radios are hard to jam or intercept since it’s nearly impossible to tap into the tiny beam between two devices. These radios also have the upside of delivering higher bandwidth than satellite.
  • The technology could be a boon for disaster recovery. ISPs and utilities could string together a backhaul network that would allow them to reestablish a quick bandwidth link to substations, cell towers, or powered electronics hubs. The devices could be in place quickly to establish connections for critical first responders. Local governments could use the radios to power public hotspots to give quick connectivity to the public.
  • These radios could be an instant patch for damaged networks, particularly where repairs will be slow. They could be a quick fix for fiber cuts in places that are hard to repair, like bridges and railroad crossings, and could leapfrog landslides, fires, or flooded areas to keep a network functioning.
  • Temporary wireless networks make sense for places like construction sites that need bandwidth today, but not permanently.
  • Commercial firms might consider this as a quick fix between nearby buildings for emergency redundancy.

The downside is the expense of buying units that might never be used. But the huge upside is having the ability to create a quick broadband connection for emergencies and critical needs.

A New Major Telecom Vendor

Many folks in the industry will already recognize Amphenol, a company that is poised to become one of the major new vendors in telecom. The company has decided to grow quickly by acquisition. It recently purchased the Connectivity and Cable Solutions business from CommScope for $10.5 billion. Amphenol also bought Trexon, a cable assembly business, for $1 billion.

Amphenol is a worldwide business with manufacturing facilities in forty countries. The company is in a wide range of markets, including military-aerospace, industrial, automotive, information technology, mobile phones, wireless infrastructure, broadband, medical, and pro audio. The largest division of Amphenol is Amphenol Aerospace (formerly Bendix Corporation).

In the telecom world, Amphenol Fiber Systems International (AFSI) was started in 1993 to manufacture fiber optic connectivity products and systems in Allen, Texas. In July 2024, Amphenol purchased two other businesses from CommScope, paying $2.1 billion for the Outdoor Wireless Networks (OWN) and Distributed Antenna Systems (DAS) units. Amphenol also resurrected the Andrew Corporation brand name, a company CommScope had acquired years earlier, for its business that manufactures tower and rooftop systems and cable management accessories.

Amphenol’s acquisitions are not just focused on telecom, and recent acquisitions include Carlisle Interconnect Technologies (CIT) which makes antennas and sensors for harsh environments; Lutze, a railway technology company; LifeSync, a manufacturer of connectors, antennas, and sensors for the medical industry; Narda-MITEQ, a maker of RF and microwave equipment for the military; XMA, a manufacturer of passive microwave components; and Q Microwave, which specializes in RF filters and subsystems for the military and space sectors.

The acquisitions have already boosted results for the first half of 2025, contributing 15% of first-half revenues. Reported revenues for the half jumped 52% to $10.46 billion, with organic growth of 37% excluding acquisition-related contributions. In second-quarter 2025, revenues jumped 57% year over year on a reported basis, and 41% organically, to $5.65 billion.

The acquisition of CommScope’s fiber business makes Amphenol a major player in the broadband business. This puts Amphenol in competition with companies like Corning, Belden, and Prysmian. The company is also hoping for a big boost from selling fiber to supply the current AI explosion.

The CommScope sale might surprise some, but the company was in trouble due to a massive debt load of over $7 billion and slower-than-expected sales that led to inventory build-ups in its broadband and cable access segments.

Space Shorts September 2025

Space has been part of communications networks since the Telstar satellite was first put into orbit in 1962. I remember tracking Telstar across the sky as a kid. Space today is an increasingly important part of communications. The following are a few pieces of space news I recently found interesting.

Low-Orbit LEO. The Spanish startup Kreios Space is working to develop a new type of satellite that can fly at lower altitudes. LEO satellites today typically fly at altitudes from 220 to 350 miles above the Earth. Kreios is working on satellites that would fly at an altitude of 125 miles. LEO satellites for companies like Starlink are parked high enough to avoid the drag caused by the upper atmosphere. Kreios would be able to fly lower by using air intake to drive electric motors that would generate enough thrust to maintain altitude. This would allow for long-duration orbits and the ability to move the satellite without needing any traditional fuel.

It’s not hard to understand the advantage of flying at lower altitudes. The satellites would be able to observe the ground in much greater detail. For communications and broadband satellites, a lower altitude means lower latency and faster response times. The company thinks the improvement in performance would be between 3 and 16 times better than current LEO satellites flying at higher altitudes.
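A little arithmetic shows where the latency gain comes from. Here is a rough sketch of the best-case one-way propagation delay to a satellite directly overhead at the altitudes mentioned above; real-world latency adds processing and routing time on top of this:

```python
# One-way propagation delay to a satellite directly overhead (best case).
C_KM_PER_MS = 299.792  # speed of light in km per millisecond

def overhead_delay_ms(altitude_miles: float) -> float:
    return altitude_miles * 1.609 / C_KM_PER_MS

for alt in (125, 220, 350):
    print(f"{alt} miles: {overhead_delay_ms(alt):.2f} ms one-way")
# 125 miles: ~0.67 ms, versus ~1.18 ms at 220 miles and ~1.88 ms at 350 miles
```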

Bluetooth Satellites. Hubble Network is a startup that is building a fleet of satellites to communicate with Bluetooth devices. These are different from typical Bluetooth devices, which are designed to send a lot of data over a short distance; instead, Hubble will connect to low-power Bluetooth sensors that transmit only a small amount of data. Hubble launched its first two satellites in 2024, now has seven satellites in orbit, and plans to have a full satellite constellation in place by 2028.

The advantage of the technology for Hubble customers is the use of low-power Bluetooth devices that are far less costly than cellular connections. Sensors can be placed anywhere on the planet, even out of reach of cellular networks, and can be used for functions like tracking the movement of cargo ships. Hubble is already tracking millions of devices and expects to eventually keep track of billions. The company today is working with customers like Life360, which has a location-based safety service that lets families and friends share real-time locations with each other. The sensors can also be used to track vehicle fleets and provide instant feedback on things like driving speeds.

Space Robots. I can’t think of a space sci-fi movie that didn’t have worker robots in the background taking care of the maintenance required to work in space. I saw an article about Icarus, a startup that is raising money to develop robot workers to replace astronauts on the International Space Station. That set me on a search to understand the space robotics market, and there is a space robot race underway. Established companies like Maxar Technologies, Northrop Grumman, MDA, Honeybee Robotics, and Motiv Space Systems have been active in the field. They are joined by numerous startups, including Astrobotic Technology, GITAI, Rovial Space, BigDipper Exploration Technologies, Space 11, and Novium.

We’ve already seen space robots for many years in the form of the Mars rovers: NASA’s Sojourner, Spirit, Opportunity, Curiosity, and Perseverance, and China’s Zhurong. The companies listed are working on robots of all sizes, from the inchworm robots being developed by GITAI to moon rovers being developed by several companies.

Asteroid Mining. There have now been several missions to explore asteroids and bring back samples, including NASA’s OSIRIS-REx mission, which returned samples from the asteroid Bennu in 2023, and the Japanese Hayabusa2 mission, which returned samples from the asteroid Ryugu in 2020. These government missions cost hundreds of millions of dollars and were funded for scientific research.

Startup Karman+ is working to fund a round trip to an asteroid for roughly $10 million, with the cost hopefully dropping in the future. This is the first step in developing an asteroid mining industry that would use robots to mine valuable metals from asteroids and round-trip rockets to ferry materials back to Earth orbit. The first mission only plans to bring back one kilogram of material as a proof of concept for the technology. The ultimate technology will need to mine materials in space to create the fuel required to return heavier payloads to Earth.

Broadband Technology Improving

As has happened continuously since the introduction of DSL and 1 Mbps cable modems, the major broadband technologies continue to evolve and get faster.

Cable HFC technology is getting faster. Harmonic, one of the makers of core cable broadband technology, recently announced that it had achieved a 14 Gbps speed with DOCSIS 4.0. The demonstration took place during a CableLabs interoperability event, in a mock-up network that combined Harmonic gear with technology from multiple other vendors.

The test used an updated CMTS (the main hub router in a cable modem network). The speed beats the old record of 10 Gbps, also achieved by Harmonic. It’s unlikely that any cable company will try to deploy that speed soon, since it would mean sacrificing some upload speed with current DOCSIS 4.0 technology. But a faster CMTS would allow a cable company to offer a true 10 Gbps download product. These kinds of breakthroughs are also important as the first step toward developing the next generation of electronics.

Home broadband from fiber is also getting faster. Earlier this year, Nokia announced the availability of two different 25 Gbps customer modems, making it realistic for ISPs to offer 25 Gbps service on a PON fiber network.

Nokia also recently announced the release of a 25G PON card for the network core that can simultaneously support all of the flavors of PON, including GPON, XGS-PON, and 25G PON. The company said the card will easily be able to handle the upcoming 50G PON. Having a core with this flexibility will allow ISPs to keep customers on older GPON technology without having to force an upgrade when newer technologies are introduced to the network.

Finally, Nokia announced the release of new WiFi 7 gateways for the home. The Beacon 4 gateway can reach speeds of 3.6 Gbps, and the tri-band Beacon 9 gateway offers 9.4 Gbps. These join a line of gateways that tops out with the Beacon 24, which can achieve home WiFi speeds of 24 Gbps. The new generation of WiFi 7 routers offers superfast speeds inside the home using 6 GHz spectrum, while still connecting to older devices using the 2.4 and 5 GHz bands.

Another major announcement is the new generation of Tarana radios for fixed wireless. The specifications on the new radios are a leap forward in capacity and performance. The first-generation G1 radio platform could support up to 1,000 customers per tower, or 250 per sector, with each sector able to accept up to 2.5 gigabits of backhaul bandwidth. The new G2 platform can support up to 512 customers per sector (2,048 for a tower), and each sector can accept as much as 6 gigabits of backhaul bandwidth.
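Some quick arithmetic on the G2 numbers shows how sector capacity translates into per-customer bandwidth. This is just an illustration of oversubscription math, not vendor specifications:

```python
# Worst-case per-customer share of a fully loaded G2 sector (illustrative).
sector_gbps = 6.0
customers_per_sector = 512
worst_case_mbps = sector_gbps * 1000 / customers_per_sector
print(f"{worst_case_mbps:.1f} Mbps each if all {customers_per_sector} customers transmit at once")
# ~11.7 Mbps; since customers rarely peak simultaneously, ISPs rely on
# statistical multiplexing to advertise and deliver much faster speeds.
```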

We can’t leave out satellite technology. The first-generation Starlink satellite weighed around 570 pounds and had a total downlink budget of about 20 Gbps. Starlink is introducing its third-generation satellite, which weighs almost 4,200 pounds and has a downlink budget of 1 Tbps along with 160 Gbps of aggregate uplink capacity.

This is a sampling of technology improvements and is not meant to exclude improvements being introduced by other vendors. There are many other important improvements, including faster lasers for long-haul fiber routes and point-to-point broadband connections using light.

Multi-core Fiber

There is a relatively new fiber technology that most readers will not have heard about. Multi-core fiber (MCF) is a technology that packs multiple light-carrying cores inside a single strand that is about the same size as a standard strand of fiber today. The benefit of packing more light paths into a tiny strand is obvious – it means a lot more bandwidth can be sent through a single physical strand of fiber.

It may surprise you to learn that only a small fraction of a strand of fiber is used to transmit light. In today’s fiber, the light path in the center of a fiber is tiny and represents only about 0.5% of the area of a fiber’s cross-section. The rest of the strand consists of cladding that keeps the light confined to the core and coatings that protect the glass. Fiber could be made a lot thinner, but the industry has standardized on a strand of 125 microns because going any smaller makes it hard for technicians to handle a single strand. This means there is a lot of unused real estate inside a 125-micron strand for additional light paths.
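A quick calculation confirms the 0.5% figure, assuming the roughly 9-micron light-carrying core of standard single-mode fiber:

```python
# Share of a fiber's cross-section occupied by the light-carrying core.
core_um, strand_um = 9.0, 125.0        # assumed single-mode core vs. standard strand
ratio = (core_um / strand_um) ** 2     # area scales with the square of the diameter
print(f"Core occupies {ratio:.2%} of the cross-section")   # ~0.52%
```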

Early prototypes of multi-core fiber have packed 7, 12, and 19 cores into a single strand, with the possibility of getting even more cores into a strand. Each core is equivalent to a single strand of fiber today. A 24-strand cable that uses 12-core multi-core fiber would contain 288 separate fiber paths. Future networks using multi-core fibers will be lighter and easier to handle than the cables they would replace using current technologies.

There are some obvious issues with using multi-core fibers. One is cost; MCF is a lot more expensive today than traditional fiber, though that difference might be eliminated if MCF becomes common and is produced in volume, and the extra cost might be easily offset by the increased ease of working with smaller fiber bundles. There are major challenges in splicing MCF into an existing network of single-core fiber. MCF also interfaces with fiber electronics in a whole new way, since the light in each core must be separated out and connected to its own transceiver. There is also a size issue, because MCF with a lot of cores will be larger than 125 microns, meaning all-new tools are needed to work with the fiber.

There are already a few trials of MCF fiber in use. This is a natural improvement for undersea fibers, where getting the most bandwidth possible in a fiber bundle is desired. There is also MCF fiber installed in some data centers to facilitate moving huge amounts of data from device to device.

Multiple vendors are manufacturing or testing multi-core fiber, and it will become more available over time. This seems like a natural upgrade for long-haul fiber routes between major cities. There has been a lot of industry concern that the explosion of data centers means these long-haul routes are filling up soon after being constructed. MCF multiplies the bandwidth that can be delivered through existing conduits.

One of the concerns with having many tightly packed cores side-by-side is crosstalk and interference between cores. However, fiber designers seem to have solved this problem with isolating materials built around each core.

It may be a long time before this makes sense in last-mile networks, since we can already deliver far more bandwidth than almost any customer needs with current fiber technology. However, MCF answers the question of whether fiber technology will ever be obsolete. No wireless technology will ever be able to outcompete a small fiber strand that carries multiple cores.

The Space Cannon

As if low-orbit space isn’t already getting overcrowded, there is a startup that may send huge numbers of additional satellites into orbit. The California company is SpinLaunch.

SpinLaunch plans to shoot microsatellites into orbit using what it calls a centrifugal cannon. The cannon spins and accelerates a small rocket that holds multiple satellites, using a spinning arm inside a vacuum chamber that reaches a force of 10,000 G and a speed of 5,000 miles per hour, fast enough to reach suborbital altitude. From there, the rocket engine fires to finish the trip to space. The company has done ten test launches that successfully reached suborbital heights.
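The physics behind those numbers is easy to check. Basic circular-motion arithmetic (not company specifications) implies a spin arm of roughly 50 meters, meaning a vacuum chamber on the order of 100 meters across:

```python
# Spin radius implied by reaching 5,000 mph at 10,000 G (pure arithmetic).
v = 5000 * 0.44704      # 5,000 mph in m/s (~2,235 m/s)
a = 10000 * 9.81        # 10,000 G in m/s^2
r = v ** 2 / a          # from centripetal acceleration: a = v^2 / r
print(f"Implied spin-arm radius: {r:.0f} m")   # ~51 m, so a chamber ~100 m across
```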

The launches will be done from Adak Island, near the western end of the Aleutian Islands off Alaska. SpinLaunch’s partner is the Aleut Corporation, an Alaska Native business. They chose the Aleutians because the location provides a clear launch path over the Pacific Ocean with minimal disruption to commercial flights. The area also has steady winds, which allow for the use of cheap renewable wind power, and the site takes advantage of an abandoned U.S. Navy base on the island.

The company plans to start shooting satellites into orbit in 2026. The disk-shaped satellites are 7.5 feet across and weigh about 154 pounds, significantly lighter than a Starlink satellite, with the current V2 satellites weighing in at 1,760 pounds. The plan is to place 250 satellites into orbit in a single launch, which would be the most ever; the current record is 143 satellites on a single SpaceX launch.

The satellite fleet will be owned by a sister company, Meridian Space, which currently holds an FCC license to launch 1,190 satellites. SpinLaunch has raised $150 million, including a recent infusion of $12 million from Kongsberg Defence & Aerospace, which will manufacture the satellites. Meridian Space plans to compete head-to-head with Starlink and Kuiper in selling broadband.

SpinLaunch thinks it has a number of advantages over other launch technologies. The system requires 70% less fuel to put a satellite into orbit, meaning a much lower cost of deployment, and the launch cannon should be reusable for many years of future launches.

On the flip side, placing even more satellites into space increases the problems that have been identified with the proliferation of low-orbit satellites. That includes an increased risk of space collisions and resulting debris that could make low-orbit space a dead zone. It means more light pollution and interference with astronomy. It also means more satellites falling back to Earth, which can degrade the ozone layer.

But like it or not, the satellite age is upon us, and is going to accelerate as companies find clever ways to launch more satellites into low-orbit space.

Free-space Quantum Optics

In a joint effort by Yale University, Stony Brook University, and Brookhaven National Laboratory, scientists have been working on a project that uses free-space optics to transmit quantum signals through the air with lasers. The specific project is called Q-LATS, for Quantum Laser Across the Sound; signals are being transmitted 27 miles from a tower at Yale across Long Island Sound to a tower at Stony Brook University.

Quantum computers work with qubits, which have the ability to exist as both a one and a zero simultaneously. This allows a quantum computer to explore multiple scenarios at the same time. Qubits can be created using trapped ions or photons. Quantum communications work by generating entangled photon pairs. One photon is retained, while the other is sent to a distant location. Measurements made on the two photons are correlated, and those shared correlations can be used for purposes like establishing encryption keys between the two endpoints.

The project is looking for alternatives to using fiber cables. It turns out to be very challenging to transmit quantum signals through fiber since some of the qubits are lost during the transmission. Fiber cables used to transmit quantum signals work best when heavily shielded and buried, making it costly to establish paths for future quantum communications.

The Q-LATS project hopes to show a reasonable alternative to fiber in places where costly fiber routes are impractical, such as across the Sound. Free-space optics can be used to transmit quantum signals in urban locations, across water, and even into space.

Transmitting light signals through the air has limitations due to rain, fog, and atmospheric turbulence, but the scientists believe these shortcomings can be more easily overcome than finding the funding to build specialty fiber routes.

For now, quantum transmissions are mostly of interest in academia for sending signals between universities. But quantum communications holds some interesting properties that should eventually make it useful for data centers, large businesses, the military, and others. Quantum signals are seemingly nearly impossible to intercept or hack, because any attempt to do so instantly disturbs the qubits and reveals the intrusion. That’s even more so with free-space optics, where a hacker would have to somehow intercept a line-of-sight transmission through the air.
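For the technically curious, here is a toy simulation of why interception is detectable. It models the older prepare-and-measure BB84 scheme rather than the entanglement-based approach used by Q-LATS, but the core property is the same: an eavesdropper who measures photons in a randomly guessed basis corrupts about a quarter of the shared key, which the endpoints can detect by comparing a sample of bits:

```python
# Toy BB84 simulation: eavesdropping raises the error rate from ~0% to ~25%.
import random

def bb84_error_rate(n_photons: int, eavesdropper: bool) -> float:
    errors, sifted = 0, 0
    for _ in range(n_photons):
        bit = random.randint(0, 1)           # Alice's key bit
        alice_basis = random.randint(0, 1)   # 0 = rectilinear, 1 = diagonal
        value, basis = bit, alice_basis      # state of the photon in flight
        if eavesdropper:
            eve_basis = random.randint(0, 1)
            if eve_basis != basis:           # a wrong-basis measurement randomizes the bit
                value = random.randint(0, 1)
            basis = eve_basis                # Eve re-sends the photon in her basis
        bob_basis = random.randint(0, 1)
        if bob_basis != basis:               # same randomization at Bob's end
            value = random.randint(0, 1)
        if bob_basis == alice_basis:         # sifting: keep only matching-basis bits
            sifted += 1
            errors += value != bit
    return errors / sifted

print(f"No eavesdropper:   {bb84_error_rate(100_000, False):.1%} errors")  # ~0%
print(f"With eavesdropper: {bb84_error_rate(100_000, True):.1%} errors")   # ~25%
```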

Quantum communications might eventually become the standard for sending highly confidential or sensitive information. It’s not hard to imagine using free-space lasers to transmit quantum signals between Wall Street firms, between large businesses and data centers, and between government and military locations that require a secure path.

Cellular Upload Speeds

T-Mobile recently announced a speed test in which the company achieved an upload speed of 550 Mbps on a live cellular link. The test was clearly done under ideal conditions in order to achieve that speed, but T-Mobile acknowledges that upload speeds are increasingly important to customers. Fierce Network quoted T-Mobile President of Technology Ulf Ewaldsson as saying, “uplink is the next big thing.”

This is something the broadband industry has known for many years. Fiber companies set a standard of symmetrical download and upload speeds, which frankly provide more upload speed than people need. But the public complained loudly about the slow upload speeds from cable companies during the pandemic, and cable companies have scrambled to increase upload speeds using mid-split upgrades. Cable companies have upgraded many markets to upload speeds of 100 to 200 Mbps.

This new speed test record seems to have been released to complement T-Mobile’s press release in April, where it announced that it now offers the first nationwide 5G Advanced network. By that, T-Mobile means its 5G network has begun to incorporate the latest industry 5G standards included in 3GPP Releases 17 and 18. According to the press release, T-Mobile has implemented 5G Advanced nationwide, although the Fierce Network article questions whether that is likely.

There is no doubt that T-Mobile has upgraded networks to a greater degree than the competition, as documented in the latest report from Ookla for the end of 2024, which showed T-Mobile with a median download speed of 281.5 Mbps, compared to 199.1 Mbps for Verizon and 140.1 Mbps for AT&T. However, during that same period, T-Mobile’s median upload speed, as measured by Ookla, was a much slower 21.3 Mbps. In the April press release, T-Mobile said its typical upload speeds are between 6 and 31 Mbps.

Upload speeds likely matter a lot more to T-Mobile now that it has passed the 6 million customer mark with its FWA home broadband product. Folks who use broadband for gaming, working from home, online schooling, and conferencing are not going to be enamored with a broadband product where poor upload speeds can degrade performance. The current median upload speed of 21 Mbps is basically the same as the upload speeds customers disliked on cable company networks.

Upload speeds are probably the biggest long-term weakness of FWA broadband. FWA customers who live in rural areas might not have an alternative other than Starlink, which also has slow upload speeds. But a lot of FWA’s growth is coming from suburbs and cities where customers have a broadband alternative. Cable companies are scrambling to get much faster upload speeds, and fiber generally has symmetrical speeds. Ookla points out in its latest quarterly report that upload usage is growing at a much faster pace than download usage. T-Mobile is being smart in looking for ways to improve upload speeds.