Categories
Technology

Hollow Core Fiber

BT, formerly known as British Telecom, has been working with Lumenisity to greatly improve the performance of hollow core fiber. This fiber takes advantage of the fact that light travels faster through air than it does through glass. In a hollow core fiber, air fills central tubes surrounded by glass. As can be seen in the picture accompanying this blog, multiple tubes of glass and air are created inside a single fiber, creating a honeycomb effect.

There was news about hollow core fiber a decade ago when a lab at DARPA worked with Honeywell to improve the performance of the fiber. They found then that they could create a single straight path of light in the tubes that was perfect for military applications. The light could carry more bandwidth for greater distances without having to be regenerated. By not bouncing through glass, the signal maintained intensity for longer distances. DARPA found the fixed orientation of light inside the tubes to be of great value for applications like military-grade gyroscopes.

Until the recent breakthrough, hollow core fiber was plagued by periodic high signal loss when the light signal lost its straight-path coherence. Lumenisity has been able to lower signal loss to 1 dB per kilometer, which is still higher than the 0.2 dB per kilometer expected for traditional fiber. However, the lab trials indicate that better manufacturing processes should be able to lower signal loss significantly.
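
To put those attenuation figures in perspective, optical loss in dB accumulates linearly with distance, so the loss rate translates directly into reach for a given optical loss budget. Here's a minimal sketch; the 20 dB loss budget is my illustrative assumption, not a figure from the trials.

```python
# Compare the reach of hollow core vs. traditional fiber for a given loss budget.
# Total loss in dB is linear in distance: total_loss = loss_per_km * distance.

def remaining_power_fraction(loss_db: float) -> float:
    """Fraction of optical power left after a given total loss in dB."""
    return 10 ** (-loss_db / 10)

def max_reach_km(loss_budget_db: float, loss_per_km_db: float) -> float:
    """Distance a signal can travel before exhausting the loss budget."""
    return loss_budget_db / loss_per_km_db

LOSS_BUDGET_DB = 20.0        # illustrative assumption
HOLLOW_CORE_DB_PER_KM = 1.0  # Lumenisity's current result
TRADITIONAL_DB_PER_KM = 0.2  # typical single-mode fiber

print(f"Hollow core reach: {max_reach_km(LOSS_BUDGET_DB, HOLLOW_CORE_DB_PER_KM):.0f} km")  # 20 km
print(f"Traditional reach: {max_reach_km(LOSS_BUDGET_DB, TRADITIONAL_DB_PER_KM):.0f} km")  # 100 km
print(f"Power left after 10 km of hollow core: {remaining_power_fraction(10.0):.0%}")      # 10%
```

At today's 1 dB per kilometer, hollow core fiber reaches only a fifth as far as traditional fiber before the signal must be regenerated – which is why further manufacturing improvements matter so much.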

The Lumenisity breakthrough comes from the ability to combine multiple wavelengths of light while avoiding the phenomenon known as interwave mixing, where different light frequencies interfere with each other. By minimizing signal dispersion, Lumenisity has eliminated the need for the digital signal processors that are used in other fiber to compensate for chromatic dispersion. This means repeater sites can be placed farther apart and require simpler, cheaper electronics.
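
To see why chromatic dispersion normally forces the use of DSPs, consider the standard pulse-spread formula Δt = D × L × Δλ. The dispersion coefficient below is typical for standard single-mode fiber; the span length and spectral width are illustrative assumptions.

```python
# Chromatic dispersion pulse spread: delta_t = D * L * delta_lambda
D_PS_PER_NM_KM = 17.0    # typical dispersion for standard single-mode fiber
SPAN_KM = 80.0           # illustrative span length (assumption)
SPECTRAL_WIDTH_NM = 0.1  # illustrative source spectral width (assumption)

spread_ps = D_PS_PER_NM_KM * SPAN_KM * SPECTRAL_WIDTH_NM
bit_period_ps = 1e12 / 10e9  # bit period at 10 Gbps, in picoseconds

print(f"Pulse spread over {SPAN_KM:.0f} km: {spread_ps:.0f} ps")  # 136 ps
print(f"Bit period at 10 Gbps: {bit_period_ps:.0f} ps")           # 100 ps
# Once the spread approaches the bit period, adjacent bits smear together,
# which is the problem dispersion-compensating DSPs exist to undo.
```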

Lumenisity doesn’t see hollow core fiber being used as a replacement on most fiber routes. The real benefits come in situations that require low latency along with high bandwidth. For example, hollow core fiber might be used to feed the trading desks on Wall Street, or to improve performance on routes leaving big data centers.

Lumenisity is building a factory in the U.K. to manufacture hollow core fiber and expects to have it in mass production by 2023.

Categories
The Industry

Beavers Kill Fiber Route

An article from CBC earlier this year reported that beavers had chewed through an underground fiber and knocked 900 customers in Tumbler Ridge, British Columbia, off broadband for 36 hours. The beavers had chewed through a 4.5-inch conduit that was buried three feet underground. This was an unusual fiber cut because it was due to beavers – but animals damaging fiber is a common occurrence.

Squirrels are the number one source of animal damage to fiber. It’s believed that rodents love to chew on fiber as a way to sharpen their teeth, which grow continuously throughout their lives. For example, squirrel teeth grow as much as eight inches per year, and the animals are forced to gnaw to keep their teeth short and sharp. For some reason, squirrels seem to prefer fiber cables over other kinds of wires hanging on poles.

I remember reading a few years ago that Level 3 reported that 17% of their aerial fiber outages were caused by squirrels. A Google search turns up numerous network outages caused by squirrels. I have a client with a new fiber network, and the only outage over the last year came from a squirrel chewing through the middle-mile fiber route that carried broadband to and from the community.

ISPs use a wide variety of techniques to try to prevent squirrel damage – but anybody who has ever put out a bird feeder knows how persistent squirrels can be. One deterrent is to use hardened cables that are a challenge for squirrels to chew through. However, there have been reported cases where squirrels still partially chew through such cables, causing enough damage to let in water and create future problems.

A more common solution is to use some sort of physical barrier to keep squirrels away from the cable. There are barrier devices that can be mounted on the pole to block squirrels from moving higher – but these can also be an impediment for technicians. Another kind of barrier is mounted where the fiber connects to a pole to keep squirrels away from the fiber. There are more exotic solutions like deploying ultrasonic blasters to drive squirrels away from fiber. In other countries, ISPs sometimes deploy poison or obnoxious chemicals to keep squirrels away from the fiber, but such techniques are frowned upon or illegal in the US.

What was most interesting about the beaver fiber cut was that the cut was far underground – supposedly out of any danger. In parts of the country there are similar threats to buried fiber from pocket gophers. There are thirteen species of pocket gophers in the US that range from 5 to 13 inches in length. The two regions of the country with pocket gophers are the Midwest plains and the Southwest. Gophers live on plants and either eat roots or pull plants down through the soil.

Pocket gophers can cause considerable damage to buried fiber. These rodents will chew through almost anything and there have been reported outages from gophers chewing through gas, water, and buried electric lines. Gophers typically live between 6 and 12 inches below the surface and are a particular threat to buried drops.

There are several ways to protect against gophers. The best protection is to bury fiber deep enough to be out of gopher range, but that can add a lot of cost to buried drops. I have a few clients that bore drops to keep them away from gophers. Another protection is to enclose the fiber in a sheath that is over 3 inches in diameter. Anything that large and tubular is generally too big for a gopher to bite. Another solution is to surround the buried fiber with 6 – 8 inches of gravel of at least 1-inch size – anything smaller gets pushed to the side by the gophers. Unfortunately, all of these solutions add a lot of cost to fiber drops.

An unexpected risk for aerial fiber is birds. Large birds with sharp talons can create small cuts in the sheath and introduce water. Flocks of birds sitting on a fiber can stretch the cable and cause sag. That may sound like a small risk, but when I lived in Florida it was common to see hundreds of birds sitting shoulder to shoulder on the wires between two poles. While most people would find that many birds to be an interesting sight, being a broadband guy, my first reaction was always to check which wire they were sitting on.

Categories
The Industry

Satellites and Cellular Backhaul

Elon Musk recently announced that Starlink would provide cellular backhaul from its constellation of satellites. This makes a lot of sense from a financial perspective in that it avoids building the costly fiber networks needed to reach rural cell sites.

This is clearly a shot across the bow for companies that currently bring fiber connectivity to rural cell sites. There are numerous rural middle-mile networks that mostly survive by providing backhaul to cell sites. While there has been downward pressure from the cellular carriers on transport rates – it’s likely that Starlink or other satellite providers could drop the bottom out of the market pricing for transport.

Since we hear so much about how the US is losing the 5G war, people may not realize how far the cellular networks around the world are behind those in the US and other developed countries. According to statistics from GSMA, in 2020 there were 7.9 billion cellular users in the world, 48% of whom were still using 2G or 3G cellular technology. The percentage of users on older technologies is expected to drop to about 23% by 2025, with a big transition to 4G.

But even then, cellular data speeds are likely to remain slow in many countries due to the lack of fiber backhaul and to the fact that in many countries the vast majority of people get almost all of their broadband from cellphones.

It’s been predicted for many years that satellites would play a big role in supporting cell sites. The worldwide consulting firm NSR predicted last year that there would be 800,000 cell sites worldwide connected via satellite by 2029. Over that same time period, NSR predicts the US market for satellite backhaul at $39 billion.

But it’s still a bit of a surprise to hear Starlink talking about providing cellular backhaul. A rural cell site is a large data user and requires far more bandwidth than the average residential or business customer. It would be a big challenge for Starlink or any satellite network to carry both cellular backhaul and residential broadband – because the cellular backhaul would consume a large share of the capacity of any one satellite. One would think that cell sites would get priority routing, which means other broadband users would suffer.
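
A rough sketch shows the tension. None of the numbers below are published Starlink figures – the per-satellite capacity, the cell-site backhaul demand, and the residential busy-hour demand are all illustrative assumptions.

```python
# How much residential capacity does one rural cell site displace on a satellite?
SATELLITE_CAPACITY_GBPS = 20.0     # assumed usable capacity of one satellite
CELL_SITE_BACKHAUL_GBPS = 1.0      # assumed demand for one rural cell site
RESIDENTIAL_BUSY_HOUR_MBPS = 10.0  # assumed per-subscriber busy-hour demand

subscribers_displaced = CELL_SITE_BACKHAUL_GBPS * 1000 / RESIDENTIAL_BUSY_HOUR_MBPS
share_of_satellite = CELL_SITE_BACKHAUL_GBPS / SATELLITE_CAPACITY_GBPS

print(f"One cell site displaces roughly {subscribers_displaced:.0f} residential subscribers")  # 100
print(f"One cell site consumes {share_of_satellite:.0%} of one satellite's capacity")          # 5%
```

Under these assumptions, even a handful of cell sites with priority routing would noticeably dilute what remains for residential users served by the same satellite.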

It’s been less than five years since the new generation of satellite companies said they would be launching big constellations overhead. My first thought when I heard of the new satellite technology was that these companies would be far better off financially supporting a handful of cellular companies rather than millions of residential customers. What I never expected is that somebody would try to handle both on the same network.

And perhaps that’s not Starlink’s plan. The company has been talking about launching 30,000 satellites over time (currently at 1,500). It would be possible to have different satellites for different customers with a constellation that large. But Elon Musk’s disclosure made it sound like discussions about cellular backhaul are already in the works.

I think we’re many years away from fully understanding how satellite companies will operate. It’s possible that cellular companies and big corporate users will make it worthwhile for the satellite companies to give them priority over residential broadband customers. It’s not hard to envision satellites providing connectivity to large fleets like FedEx, UPS, or the US Postal Service. Satellite broadband could become the connectivity of choice for the large trucking companies. It’s going to be hard for a constellation owner to say no to those kinds of opportunities – but saying yes to big commercial opportunities will mean diluting the broadband available for residential customers.

Categories
Technology

The Natural Evolution of Technology

I’ve been thinking lately about the future of current broadband technologies. What might the broadband world look like in twenty years?

The future of broadband technology will be driven by the continued growth in broadband demand, both in the amount of bandwidth we use and in the broadband speeds the public will demand. Technologies that can’t evolve to keep up with future demand will fade away – some slowly and some virtually overnight.

I don’t think it’s a big stretch to say that within twenty years fiber will be king. There is a huge national push to build fiber now, with huge funding from federal and state grants, but also unprecedented amounts of commercial investment in fiber. Fiber will be built in a lot of rural America through subsidies and in a lot of small and medium towns because it makes financial sense. The big challenge will continue to be urban neighborhoods where fiber construction costs are high. Twenty years from now we’ll look back on today as the time when we finally embraced fiber, much like we now look back at the time twenty years ago when DSL and cable modems quickly killed dial-up.

It goes without saying that telephone copper will be dead in twenty years. To the extent copper is still on poles, it will be used to support overlashed fiber. DSL will serve as the textbook posterchild for how technologies come and go. DSL is already considered obsolete, a mere twenty years after its introduction to the market. In twenty more years, it will be a distant memory.

I don’t see a big future for rural WISPs. These companies will not fare well in the fierce upcoming competition with fiber, low-orbit satellites, and even fixed cellular. Some stubborn WISPs will hang on with small market penetrations, but research into new and better radios will cease as demand for WISP services fades. The smart WISPs are going to move into towns and cities. WISPs willing to adapt to millimeter-wave radios can grab a decent market share in towns by offering low prices to consumers who value price over big bandwidth. I predict that WISPs will replace DSL as the low-price competitor against the large ISPs in towns and cities.

Low-orbit satellites will still serve the most remote customers in twenty years – but this won’t be the technology of choice, because its bandwidth will be considered very slow by then. Two decades from now, a 150 Mbps download connection is going to feel like today’s DSL. The satellite companies will thrive in the third world, where they will be the ISP of choice for most rural customers. Interestingly, when I look out forty years, I think it’s likely that residential satellite broadband will fade into history. It’s hard to envision this technology having a forty-year shelf life in a world where broadband demand continues to grow.

The technology that is hard to predict is cable broadband. From a technology perspective, it’s hard to see cable companies still wanting to maintain coaxial copper networks. In twenty years, these networks will be 70 years old. We don’t talk about it much, but age affects coaxial networks even more than telephone copper networks. Over the next decade, cable companies face a hard choice – convert to fiber or take one more swing at upgrading to DOCSIS 4.0 and its successors. It’s hard to imagine the giant cable companies like Comcast or Charter making the decision to go all fiber – they will worry too much about how the huge capital outlay will hurt their stock prices.

I expect there will still be plenty of coaxial networks around in twenty years. Unfortunately, I foresee that coaxial copper will stay in the poorest urban neighborhoods and smaller rural towns while suburbs and more affluent urban neighborhoods see a conversion to fiber. For anybody who doesn’t think that can happen, I point to AT&T’s history of DSL redlining. Cable companies might even decide to largely abandon poorer neighborhoods to WISPs and municipal fiber overbuilders, similar to the way that AT&T recently walked away from DSL.

It’s easy to think of technologies as being permanent and that any broadband technology used today will be around for a long time. One only has to look at the history of DSL to see that broadband technologies can reach great success only to be obsolete within just a few decades. We’re going to see the evolution of technology for as long as the demand for broadband continues to grow. Much of the technology being touted today as broadband solutions will quietly fade into obscurity over the next twenty years.

This is the biggest reason why I think that only technologies that can be relevant a decade or two from now should be eligible for federal grant funding. It’s shortsighted to give tax dollars to technologies that are not likely to be relevant in the somewhat near future. We saw a great example of that with the CAF II program that funded already-obsolete DSL. More recently, we saw federal grant money going to Viasat and to rural WISPs in the CAF II reverse auction. There are smarter ways to spend valuable tax dollars.

Categories
The Industry

Standalone Broadband

Parks Associates recently announced the release of its Home Services Dashboard, a for-pay service that tracks consumer adoption of telecom services like Internet, pay-TV, and cellphones. As part of the announcement, the company released a blog showing that, at the end of the first quarter of 2021, 41% of US homes were buying standalone broadband – meaning broadband that’s not bundled with cable TV or a home telephone.

This is further evidence of cord-cutting, since in 2018 only 33% of homes had standalone broadband. This means that in just three years, 8% of all US homes ditched every service from their ISP other than broadband. One doesn’t have to trend this very far into the future before over half of all homes will be buying only broadband.
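
A simple linear extrapolation from the two Parks Associates data points above illustrates how soon that crossover could come; treating the trend as linear is my simplifying assumption.

```python
# Extrapolate standalone-broadband adoption from two data points.
year1, share1 = 2018.0, 33.0   # Parks Associates: 33% of homes in 2018
year2, share2 = 2021.25, 41.0  # Parks Associates: 41% at the end of Q1 2021

points_per_year = (share2 - share1) / (year2 - year1)
years_to_half = (50.0 - share2) / points_per_year

print(f"Growth: {points_per_year:.1f} percentage points per year")                     # 2.5
print(f"Standalone broadband passes 50% of homes around {year2 + years_to_half:.0f}")  # ~2025
```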

ISPs are working hard to keep some kind of bundle because they understand the power of the bundle to control churn. It’s much easier for a customer who is only buying broadband to switch to another ISP. As the bundle with cable TV is losing appeal, the biggest cable companies, Comcast and Charter, have been busy bundling customers with cellular service. Comcast has also been successful in bundling millions of homes with smart home packages. For a short time, Comcast even tested the idea of bundling with solar power.

My consulting firm does broadband surveys and we’ve seen a wide range of success of bundling in different markets across the country. Just in the last two years, we’ve seen communities where the level of bundling is as low as 45% or as high as 80%. We’ve noticed that cities with older populations seem to have the highest percentage of homes still buying a bundle. Bundling is also still fairly common in rural America, although the rural bundle is most typically a DSL connection and a telephone landline. A few telcos like CenturyLink still bundle with satellite TV.

The one statistic from Parks Associates that I have to challenge is the average price for unbundled broadband of $64. I have to wonder what is included in that figure. Consider Comcast as an example. The Comcast price for its most popular broadband product, Performance, is $76. There is currently a web special where a new customer can buy the product for as little as $34.99 – but that price is only good for 12 months and then reverts to the higher price. More importantly, Comcast charges $14 for the modem. That means the price of standalone broadband without a promotional price is $90, and even the introductory product comes to $49 when including the modem. The same is true for Charter and the other major cable companies – the standalone price without a special discount at every big cable company is more than $64, sometimes a lot more.
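
The arithmetic behind those figures is easy to reproduce, and it shows how much a reported "average" depends on whether the modem fee and promotional pricing are counted.

```python
# Effective standalone broadband price under different counting assumptions,
# using the Comcast Performance figures cited above.
LIST_PRICE = 76.00   # standard price for the Performance product
PROMO_PRICE = 34.99  # 12-month web special for new customers
MODEM_FEE = 14.00    # monthly modem rental

print(f"List price + modem:  ${LIST_PRICE + MODEM_FEE:.2f}")   # $90.00
print(f"Promo price + modem: ${PROMO_PRICE + MODEM_FEE:.2f}")  # $48.99
# A $64 'average' only works if promotional rates are mixed in
# and/or the modem fee is excluded.
```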

Over the last year, I’ve done rate studies in over a dozen communities, and I’ve never seen the average price for standalone broadband below the $75 range. I could accept the Parks Associates price if it doesn’t include the extra modem fee. Parks Associates is obviously looking at a larger database of numbers than I am, but I’ve seen the $70+ average price in urban, suburban, and rural markets.

I agree with Parks Associates that broadband prices have been climbing steadily; they observed that standalone broadband prices have increased by 64% since 2011. I have no way to judge the percentage increase, but I agree there have been substantial rate increases.

What might surprise a lot of people is how many households still cling to their bundles. After accounting for homes that don’t buy broadband, there are still slightly more homes that buy broadband through a bundle than homes that buy standalone broadband. However, at the rate that homes are cutting the cord, it seems likely that within a year or so there will be more homes without a bundle. In 2020, the traditional cable industry lost almost 6 million customers, or almost 7.4% of its base of cable customers – and most of those customers broke a bundle when they ditched cable.

Categories
Technology

Charging our Future

One of the hottest areas of scientific research, one that will peripherally affect every tech industry, is battery research. It seems like every year brings big breakthroughs in battery capability. Today I look at four recent announcements.

New Batteries for Robots. Nicholas Kotov at the University of Michigan announced the development of rechargeable zinc batteries that could power robots of all sizes. The batteries use a biomorphic design in which a Kevlar membrane between the anode and cathode mimics cartilage.

The batteries have huge advantages over current lithium batteries in that the materials in the battery are non-toxic and there is no danger of overheating or fires. The batteries are far more efficient, which reduces the weight of the battery needed to power a robot. These batteries can be scaled to power micro-robots or can save weight for large delivery robots. One of the most interesting properties of the batteries is that they can be woven into the outer cover of a robot – freeing space or decreasing the size of the robot.

Safer Lithium Batteries. Scientists at the Applied Physics Laboratory at Johns Hopkins have developed a safer lithium-ion battery. Traditional lithium batteries use a flammable combination of lithium salts and toxic chemicals for the electrolyte, and if the membrane between anode and cathode leaks, the batteries can catch fire. The safer batteries instead use water-based electrolytes, which are non-flammable and non-toxic. The new batteries are also about three times more powerful than traditional lithium batteries. The most interesting characteristic of the batteries is that the material can be manufactured to be clear and placed in transparent, flexible housing – meaning the battery could be integrated into smart clothing.

Stable Lithium Batteries. Scientists at the John A. Paulson School of Engineering and Applied Science at Harvard have developed a solid-state lithium battery that eliminates the problems with current lithium batteries. They’ve created a solid lithium-metal battery that is stable and won’t overheat. The battery can be recharged up to 10,000 times, meaning it could work in a vehicle for over ten years. The solid battery also charges much faster than today’s car batteries.

The batteries are constructed with multiple alternating layers of lithium, graphite, and a solid electrolyte. One of the most interesting properties of the batteries is that they are self-healing. All lithium batteries deteriorate over time as lithium gets deposited as dendrites – metallic crystals. The structure of these batteries inhibits, and even reverses, dendrite formation.

Aluminum-based Batteries. Using technology developed by the University of Queensland Australian Institute for Bioengineering and Nanotechnology, the Graphene Manufacturing Group plans to start manufacturing aluminum-based batteries. The graphene aluminum ion batteries are about three times more powerful than traditional lithium batteries and eliminate any possibility of starting fires. The big upside to these batteries is that they can be recharged much faster than lithium batteries.

The key to making these batteries work was finding a way to make holes in graphene that allow the aluminum ions to be stored closer together. Probably the best characteristic of the battery is that it uses widely available and low-cost aluminum and aluminum chloride rather than the more costly lithium, which mostly comes from China and Chile. The batteries are effective as tiny coin-sized batteries or can scale up as a replacement for car batteries.

Categories
Regulation - What is it Good For?

The White House Encourages Broadband Reform

President Joe Biden signed an executive order last week that covered 72 issues across a wide range of industries including healthcare, agriculture, transportation, technology, and broadband. In the broadband arena it’s hard to call this an executive order because nothing specific was ordered to change – instead, the White House is encouraging the FCC to take four specific steps:

  • It asks the FCC to limit excessive termination fees on customers who want to change Internet Service Providers.
  • The order asks the FCC to eliminate exclusive deals between landlords and ISPs in order to give renters more options for broadband, and in many cases to lower prices.
  • The executive order asks the FCC to revive efforts to require a ‘broadband nutrition label’ that would better inform consumers about the broadband they are buying and would enable the comparison of different broadband plans.
  • Finally, the order encourages the reinstatement of net neutrality.

The executive order doesn’t implement any of these changes, making this more of an executive wish list.

The odd thing about this request is that the FCC is not in a position to implement any of these changes. The primary reason is that the White House has yet to nominate a fifth FCC Commissioner to replace Ajit Pai, who departed at the start of the new administration. With an FCC deadlocked two to two along party lines, no controversial issue is going to pass at this FCC until a new Commissioner is added.

Even if a new Commissioner started tomorrow, the FCC’s hands are tied for making any sweeping changes concerning broadband, since the last FCC under Chairman Pai killed the agency’s Title II authority over broadband. The FCC can’t tackle topics like net neutrality without first reinstating Title II authority or something like it. The process of reinstating Title II authority will not be quick, because the agency will have to run through its slow docket process first.

Of course, Congress could shortcut this process by giving the FCC explicit authority to regulate broadband. The agency doesn’t have that authority today because there hasn’t been a Telecom Act from Congress since 1996 and the days of dial-up Internet access. If Congress provided direct authority to regulate broadband, we could stop the never-ending battle of changing the FCC’s rules with every change of party in the administration.

Perhaps this executive order is a signal that a new FCC Commissioner is on the way – but even that will take time since any nominee must go through the Senate confirmation process.

In my mind, the most important idea on this list is the broadband nutrition label. ISPs regularly mislead consumers about the broadband products they sell. The big ISPs never talk about the difference between marketing speeds and actual speeds, and millions of rural customers receive broadband that is far slower than the speeds they were promised. Customers are also sold broadband at low special pricing and are only warned in the small print that prices will rise drastically at the end of the special period. It’s hard to think of any other major consumer product in the US that is sold with as much deception as broadband and cable TV.

One important issue is missing from the list – I’m surprised not to see data caps. Data caps mostly penalize families that use the Internet in the ways it was intended to be used. ISPs that impose data caps charge outlandish fees that are far out of proportion to the extra broadband being delivered.

I’m sure the current FCC will put these issues onto a docket since the White House told them to do so. But it’s hard to envision any progress being made until a fifth FCC Commissioner is on board.

Categories
The Industry

The Dumb Pipe Question

Every few years I read something that resurrects the old question of whether ISPs should be dumb pipe providers or something more. Some ISPs have fought against the idea of being dumb pipe providers and want to believe they are far more than that. The latest event that raises this question anew is AT&T’s debacle in ditching DirecTV and WarnerMedia. AT&T was clearly not content with being considered only a dumb pipe provider. The company was lured by the perceived higher earnings of both cable companies and media companies, and AT&T went on a buying spree and purchased both DirecTV and WarnerMedia.

At the time of the DirecTV purchase, when AT&T paid $67 billion for the satellite company, there were already rumblings in the industry about cord-cutting. There wasn’t yet evidence of large numbers of customers dropping traditional cable TV, but the industry was already in a holding pattern of zero net growth, with new customers roughly equaling customers who were ditching traditional TV. Since the DirecTV purchase, cord-cutting has materialized with a fury, and the traditional cable industry has lost over 13 million cable subscribers.

The lure for an ISP to become a media company has hovered over the industry for over twenty years. Those of us who were in the industry in 2000 still remember being flabbergasted by the merger of AOL and Time Warner. The merger was blessed by Wall Street and by the consensus of analysts that the Internet was going to subsume media and that the merger was a defensive move by Time Warner. But it was hard to picture a path where the combined companies could grow to justify the astronomical $350 billion valuation awarded by the stock market at the merger. And sure enough, the wheels quickly came off in what was possibly the worst merger of all time.

AT&T was also lured by the continued growth in the valuation of media companies. The stocks of media companies like Disney climbed in value year after year while AT&T’s value stagnated. AT&T was convinced that the merger with Time Warner would put the company’s stock on an upward trajectory like other media companies.

Underlying AT&T’s decision in both purchases was a dissatisfaction with being viewed by Wall Street as a dumb pipe provider. Yet AT&T is the ultimate dumb pipe provider, with a huge base of cellular and broadband customers – all of whom buy basic connectivity from the company.

AT&T was obviously jealous after watching companies like Apple and Google profit by putting apps on AT&T’s phones. AT&T was equally unhappy to see companies like Disney prosper from sending video signals over AT&T copper and fiber. I believe the entire AT&T debacle boils down to a company that did not want to be perceived as only providing dumb pipes. I think it’s that simple.

But something happened in the industry while AT&T lost over $90 billion on the two acquisitions in just five years: the valuation of fiber-based dumb pipe providers rose significantly. In the last year, the industry has seen transactions where fiber-based ISPs fetched huge valuations. I honestly can’t fathom some of these high valuations any more than I could understand the AOL / Time Warner valuation. But the current high valuation for fiber networks is real, since there are investors willing to pay big prices to get fiber companies.

All of the big ISPs have grasped this fundamental market shift. Most of the big ISPs have announced strategies to build significant amounts of fiber this year and next year. AT&T is building fiber past 3 million more homes this year. Verizon is on a tear and says it will build fiber-to-the-curb past 25 million homes by 2025. We see big fiber expansion plans from Charter, CenturyLink, Altice, Frontier, Windstream, and a long list of others. All of a sudden, everybody wants to be a bigger dumb pipe provider.

It’s going to be interesting to see if this trend continues. For now, investors are betting that fiber companies will beat the cable companies in the broadband market – there is no other way to explain the higher valuations. The cable companies have thrived during a decade of lopsided competition against telephone DSL. Are the cable companies now facing the opposite side of that competitive battle, with fiber becoming the consumer choice? As always, this industry continues to provide interesting trends to watch.

Categories
Technology

A Rural Broadband Laboratory

The National Science Foundation, along with the US Department of Agriculture, is creating a broadband testbed in and around Ames, Iowa. The program is part of NSF’s Platforms for Advanced Wireless Research (PAWR) program. This is the fourth wireless test site in the country and the first to be placed in a rural setting. The PAWR programs are a great example of public/private partnerships that to date have attracted over $100 million in private and government research investments.

This project will provide an outdoor laboratory for engineers and scientists to explore ways to maximize the benefit of new wireless technologies for agriculture. Additionally, new technologies will be deployed throughout the college community of Ames.

The PAWR projects to date have included the participation of over 35 wireless providers and vendors. This project has already attracted several universities in addition to Iowa State University, including the University of California at Irvine and the Ohio State University. John Deere will be participating in the testbed along with U.S. Cellular, the Iowa Regional Utilities Association, and the Iowa Department of Transportation. The experiments will include participation from students from Iowa State as well as from local schools. Also participating will be Woodland Farms and the Meskwaki Tribal Nation.

Formal testbeds are always interesting because the FCC generally grants test licenses for scientists to experiment with radio frequencies in ways that may not be on the radar for the big carriers. The project includes $8 million to construct a wireless network that will cover nearly 600 square miles in and around Ames. One of the concepts to be explored is the collaboration potential and interaction between satellite broadband, existing wireless networks, and new wireless technologies.

Scientists will be experimenting with technologies involved in precision agriculture, including drones, self-driving farm machinery, and an array of environmental sensors. One of the first experiments will involve identifying weeds for automatic eradication using high-resolution video. Field sensors will transmit live pictures to the cloud to allow for accurate identification of weeds. Training robots to eliminate weeds mechanically would mean a drastic reduction in the use of herbicides in the food chain.

The project will also step outside of agriculture and look at technologies and applications that can expand wireless coverage in rural areas. This will involve experimenting with hybrid networks that use different frequencies and wireless technologies in unison to bring stronger broadband signals to the fields and areas where it is most needed.

These kinds of experimental sites are always interesting and exciting because ideas tested in programs like this end up as everyday technology a decade from now. Giving scientists and engineers a large outdoor laboratory provides them with a way to test ideas in ways that can’t be explored in the lab or in small testbeds.

Categories
Regulation - What is it Good For?

Grant Money Should Build for the Future

I filed comments with the Department of Treasury concerning the questions they asked about broadband speeds in the Interim Final Rules for how states, cities, and counties can spend the ARPA grant funds. The following is the gist of my comments.

Question 22: What are the advantages and disadvantages of setting minimum symmetrical download and upload speeds of 100 Mbps? What other minimum standards would be appropriate and why?

Treasury is asking the wrong question when asking about current speed requirements. Federal grant money should only be used to build technology that will be capable of meeting broadband demands at least a decade from now. A technology deployment built to meet today’s speed requirements starts being obsolete almost immediately after it’s constructed.

I think 100 Mbps download is an adequate definition of broadband in 2021, and I doubt that there will be many arguments against a 100 Mbps requirement since most currently deployed technologies can deliver this speed.

There will be a huge outcry against 100 Mbps upload speeds since major technologies like cable company HFC networks, fixed wireless, and fixed cellular can’t deliver fast upload speeds. Treasury can’t be swayed by this argument – grant money should only be used to deploy technology that meets public broadband demand both today and into the future. ISPs are free to use their own money to deploy any technology – but federal money is precious and should be held to a higher standard.

In looking out only a decade, and applying a conservative 21% annual growth rate in broadband speeds, the definition of broadband a decade from now should be at least 600/200 Mbps. That’s what I recommend as a reasonable goal for federal grant funding.
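
The math behind those targets is simple compound growth. This sketch reproduces the figures from the 21% annual growth assumption stated above.

```python
# Project today's broadband speed requirements forward at 21% annual growth.
def projected_speed(current_mbps: float, annual_growth: float = 0.21, years: int = 10) -> float:
    return current_mbps * (1 + annual_growth) ** years

print(f"100 Mbps download today -> {projected_speed(100):.0f} Mbps in a decade")  # ~673
print(f"30 Mbps upload today    -> {projected_speed(30):.0f} Mbps in a decade")   # ~202
print(f"40 Mbps upload today    -> {projected_speed(40):.0f} Mbps in a decade")   # ~269
```

Rounding the download figure down gives the conservative 600 Mbps target; the 200-270 Mbps upload range in the answer to Question 24 below is derived the same way.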

Question 24: What are the advantages and disadvantages of setting a minimum level of service at 100 Mbps download and 20 Mbps upload in projects where it is impracticable to set minimum symmetrical download and upload speeds of 100 Mbps? What are the advantages and disadvantages of setting a scalability requirement in these cases? What other minimum standards would be appropriate and why?

As described earlier, I think federal grant funding ought to support a network that will still be viable a decade from now. My best guess of the upload requirement for a family of four today is between 30 Mbps and 40 Mbps. Looking forward a decade, that means the upload speed requirement for a federal grant should be between 200 Mbps and 270 Mbps.

I think ISPs using major technologies like cable HFC networks, fixed wireless networks, and fixed cellular networks are going to suggest that the proper upload speed for a grant should be 20 Mbps or less. Our extensive experience conducting broadband surveys all over the country during the pandemic showed us that a 20 Mbps upload path is already not always adequate for a home with multiple people working or schooling from home at the same time. If Treasury sets the definition as low as major industry players are likely to suggest, then those networks will launch already inadequate and will become badly obsolete as time goes by.

Question 25: What are the advantages and disadvantages of focusing these investments on those without access to a wireline connection that reliably delivers 25 Mbps download by 3 Mbps upload? Would another threshold be appropriate and why?

I think Treasury has identified one of the biggest problems with previous federal broadband grants by now saying that the test for grant eligibility is that an ISP can “reliably” deliver at least 25/3 Mbps. The reality is that much of the technology that is reported to the FCC today as being capable of 25/3 Mbps delivers far slower speeds.

We need to talk about real-life networks using three different definitions of speed – maximum speed, minimum speed, and marketing speed. The maximum speed is the fastest speed that a given technology can achieve in ideal conditions. But network conditions are rarely ideal, except perhaps when a single home is using a node at 3:00 AM – and even then, there could be slowdowns from node congestion outside of the neighborhood.

Minimum speeds are something we’ve always referred to as actual speeds. These are the speeds we see on speed tests, and they rarely equal the maximum speed.

Marketing speeds are something else altogether, and some ISPs advertise numbers close to actual speeds while others advertise purely fictional speeds that are greater than the maximum speeds. Unfortunately, the FCC allows ISPs to report marketing speeds, and this is one of the big contributors to the lousy FCC mapping data.

Areas should qualify for federal grant funding based upon the minimum speeds actually delivered to customers and should not use the maximum theoretical speed or the advertised marketing speeds.
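
As a sketch of how that standard could be applied, the function below classifies an area as grant-eligible using measured minimum speeds rather than advertised speeds. The 25/3 Mbps threshold comes from Treasury’s question; the function itself and its field names are my own illustration, not anything proposed in the rules.

```python
# Grant eligibility based on measured (minimum) speeds, not marketing speeds.
def grant_eligible(measured_down_mbps: float, measured_up_mbps: float,
                   threshold_down: float = 25.0, threshold_up: float = 3.0) -> bool:
    """An area qualifies if the speeds actually delivered fall below either threshold."""
    return measured_down_mbps < threshold_down or measured_up_mbps < threshold_up

print(grant_eligible(12.0, 1.0))  # True: marketed as '25/3' but actually tests at 12/1
print(grant_eligible(30.0, 5.0))  # False: reliably tests above 25/3
```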

My full comments are here: CCG Consulting Comments on Broadband Speeds.