Categories
Technology The Industry

Getting Serious About Satellite Texting

One of the more interesting telecom announcements at the CES electronics show in Las Vegas came from the partnership of Qualcomm and Iridium, which plans to bring satellite texting capability to many more cell phones and other devices.

We’ve already seen a few other recent announcements of the ability to send emergency texts when out of reach of cell coverage. The team of T-Mobile and SpaceX says that T-Mobile customers will be able to reach 911 through a satellite sometime in 2023. Apple launched an Emergency SOS service for its newest iPhone users in a partnership with Globalstar, but the service is only available in a handful of countries.

Qualcomm is building this feature into its premier Snapdragon 8 Gen 2 chips, so any new phone or other device using that chip will have satellite texting capability. The company says it plans to eventually build the capability into other, more affordable chips as well.

For now, Qualcomm has established a 911 service similar to the planned T-Mobile service, letting people reach 911 when out of range of the normal cellular network. But the company envisions that cellular carriers will develop price plans that let users text for a fee. That would give folks the ability to stay connected while hiking in remote wilderness or during a sea cruise.

Qualcomm is in the business of selling chips, and it would love to see this capability expanded to other places, like built into laptops or new cars. Putting the technology in cars is a major selling point since it would enable features like automatically contacting 911 after an accident.

This first-generation product will be far from perfect, but that’s to be expected from what is basically a beta test. For example, while Iridium satellites blanket the earth, there are times when there is no satellite overhead, and a user might have to wait ten minutes for the next satellite. It seems this issue can be resolved by cell carriers partnering with multiple satellite providers.

This new technology opens up the possibility for people to have some limited connectivity almost anywhere on the globe. That has great appeal for the younger connected generations – most people I know with Gen Z kids tell me that taking kids out of reach of connectivity feels like banishment. But more practically, much of the world does not have reliable cellular coverage, and this can bring some form of communication to everybody.

I know people will read this and assume that the next step is to use satellites to provide data connectivity to cell phones or laptops from anywhere. However, there are limits imposed by physics that make that unrealistic for a handset. The latest Starlink receiver, the Dishy, is 19 by 12 inches, and that much surface area is needed to receive a broadband signal from a satellite. Still, it’s not hard to imagine a hiker rolling out a flexible receiver to communicate with a satellite – assuming they bring along some kind of power source, perhaps solar.
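A rough antenna link-budget calculation shows why receiver surface area matters so much. The sketch below is illustrative only – the Ku-band frequency, aperture efficiency, and dish dimensions are my own assumptions, not published Starlink specifications:

```python
import math

# Antenna gain from aperture area: G = efficiency * 4 * pi * A / wavelength^2.
# All figures below are rough, illustrative assumptions.
FREQ_HZ = 12e9      # Ku-band downlink, roughly 12 GHz
C = 3e8             # speed of light, m/s
EFFICIENCY = 0.6    # typical aperture efficiency for a flat-panel antenna

wavelength = C / FREQ_HZ  # about 0.025 m

# A Dishy-sized flat panel: 19 x 12 inches converted to square meters
dish_area = (19 * 0.0254) * (12 * 0.0254)  # about 0.147 m^2

def aperture_gain_dbi(area_m2: float) -> float:
    """Gain of an aperture antenna, in dBi."""
    gain_linear = EFFICIENCY * 4 * math.pi * area_m2 / wavelength**2
    return 10 * math.log10(gain_linear)

dish_gain = aperture_gain_dbi(dish_area)  # roughly 32 dBi
phone_gain = 0.0                          # a phone antenna is near 0 dBi

# A ~32 dB deficit means the handset captures less than 1/1000th of the
# power the dish does at the same distance from the satellite.
deficit_db = dish_gain - phone_gain
print(f"dish gain ≈ {dish_gain:.1f} dBi, handset deficit ≈ {deficit_db:.1f} dB")
```

Under these assumptions, a handset collects well under a thousandth of the power a Dishy-sized panel does – enough margin for short, slow text bursts, but nowhere near enough for broadband speeds.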

I track telecom announcements of new technologies and products so that a decade from now I’ll have a baseline for judging how the various technologies performed. It will be interesting to see whether satellite texting becomes a routine part of every cellular plan or withers on the vine like many other seemingly good ideas that the market didn’t embrace.

Categories
The Industry

Who’s On First?

I saw a short article in Business Wire that said that Comcast Business had landed a project to provide a private wireless network for the guests of The Sound Hotel Seattle Belltown. This is an example of the continuing convergence in the industry, where the big cable companies, ISPs, and wireless carriers are freely competing on each other’s turf. For decades we’ve neatly categorized companies as telcos, cable companies, or wireless carriers, but this convenient categorization is starting to fray around the edges, and it’s getting a lot harder to distinguish between the big industry players.

If we look back ten or fifteen years, the distinctions between these companies were clearly defined. The big telcos served residences and small businesses using DSL, and they were clearly structured in silos – there was practically no interaction between the wireless businesses at Verizon and AT&T and their wireline broadband businesses. Verizon went so far as to set up Verizon FiOS, its fiber business, separately in every aspect from the copper and DSL business.

The cable companies had faster broadband than DSL after the upgrades were made to DOCSIS 3.0. Speeds up to 300-400 Mbps blew away the capabilities of DSL. Once those upgrades were completed, the cable companies took market share in cities from the telcos year after year until the cable companies had a near-monopoly in many markets.

The market with more balanced competition has been the large business market. This is the market where fiber quickly became king. At one point the telcos controlled most of this market, with their fiercest competition coming from a handful of big CLECs. Verizon responded to this competition by buying MCI, XO, and others in the northeast. CenturyLink became one of the nationwide market leaders through the acquisitions of Qwest and then Level 3. The big cable companies cautiously launched fiber ventures for this market twenty years ago and have picked up a decent market share.

But those simple explanations of the business plans of the big ISPs are now history. As the Business Wire announcement showed, the big companies are crossing technology barriers in new ways.

Providing a private wireless network for a large hotel is emblematic of a new trend in competition. In doing this, Comcast is crossing technical lines that it would never have considered years ago. From a business perspective, Comcast is going after the full suite of services for businesses like this hotel, not just the wireless network. The newest word in the competitive market is stickiness, and Comcast is likely tying down this hotel as a customer for a long time, assuming it does a great job.

These crossovers are even more evident in the residential and small business markets. Comcast, Charter, and other cable companies are bundling cellular service with broadband and the triple play, something the telcos never managed to pull off. Telcos have decided to reclaim urban market share by building huge amounts of fiber. And the cable companies are reacting to that threat by rushing early versions of DOCSIS 4.0 to market to fix their upload bandwidth issues. The big wireless companies have joined the fray with FWA cellular broadband products. While these products can’t compete with the bandwidth on fiber or cable networks, they are still adequate for many homes and hit the market at a much lower price.

This has to be confusing to the average residential consumer. Consumers who abandoned DSL years ago are being lured back to the telcos by fiber. Folks who have been paying far too much for cellular service are moving to the more affordable cable company wireless service. And people who can’t afford the high price of cable broadband are seemingly flocking to the more affordable FWA wireless. I have to imagine that the customer service desks at the various ISPs are being flooded by customers canceling service to try something different.

Markets always eventually reach an equilibrium, but for now, both the residential and business markets in many cities are seeing fresh new marketing efforts. A decade from now, it’s likely that we’ll reach a predictable mix of the various technologies. We know this from having watched the markets where Verizon FiOS has battled the cable companies for well over a decade. But much of the country is just now entering the era of refreshed competition.

Unfortunately, this new competition isn’t everywhere. There is already evidence that new investments are not being made at the same pace in lower-income neighborhoods. Some cities are seeing widespread fiber construction while others are seeing almost none. There will still be a lot of work to do to make sure that everybody gets a shot at the best broadband – but the obvious convergence in the industry shows that we’re headed in the right direction.

Categories
The Industry

No More Underbuilding

Jonathan Chambers wrote another great blog this past week at Conexon where he addresses the issue of federal grants having waste, fraud, and abuse – the reason given for holding hearings in the House about the upcoming BEAD broadband grants. His blog goes on to say that the real waste, fraud, and abuse came in the past, when the FCC awarded federal grants and subsidies to the large telcos to build networks that were obsolete by the time they were constructed. He uses the term underbuilding to describe funding networks that are not forward-looking. The phrase has been around for many years – I remember hearing it years ago from Chris Mitchell, and sure enough, a Google search showed he had a podcast on the issue in 2015.

The term underbuilding is in direct contrast to the large cable and telephone companies that constantly use the term overbuilding to mean they don’t want any grant funding to be used to build any place where they have existing customers. The big ISPs have been pounding the FCC and politicians on the overbuilding issue for well over a decade, and it’s been quite successful for them. For example, the big telcos convinced the FCC to provide them with billions of dollars in the CAF II program to make minor tweaks to rural DSL to supposedly bring speeds up to 25/3 Mbps. I’ve written extensively on the failures of that program, where it looks like the telcos often took the money and made minimal or no upgrades.

As bad as that was – and that is the best example I know of waste, fraud, and abuse – the real issue with the CAF II subsidy is that it funded underbuilding. Rural DSL networks were already dying when CAF II was awarded, mostly due to total neglect by the same big telcos that got the CAF II funding. Those billions could have instead gone to build fiber networks, and a whole lot of rural America would have gotten state-of-the-art technology years ago instead of a tweak to DSL networks that were barely crawling along due to neglect.

The FCC has been guilty of funding underbuilding over and over again. The CAF II reverse auction gave money to Viasat, gave more money for upgrades to DSL, and funded building 25/3 Mbps fixed wireless networks. The classic example of underbuilding came with RDOF, where the areas that were just finishing the CAF II subsidy were immediately rolled into a new subsidy program to provide ten more years of subsidy. Many of the areas in RDOF are going to be upgraded to fiber, but a lot of the money will go into underperforming fixed wireless networks. And, until the FCC finally came to its senses, the RDOF was going to give a billion dollars to Starlink for satellite broadband.

The blame for funding underbuilding lies directly with the FCC and any other federal grant program that funded too-slow technologies. For example, when the CAF II funding was awarded to update rural DSL, the cable companies were already delivering broadband speeds of at least 100 Mbps to the 80% of the folks in the country they served. By the time RDOF was awarded, broadband capabilities in cities had been upgraded to a gigabit. The policy message was clear: rural folks didn’t need the same quality of broadband that most of America already had.

But the blame doesn’t lie only with the FCC – it lies with all of the broadband advocates in the country. When the big ISPs started to talk non-stop about not allowing overbuilding, we should have been lobbying pro-broadband politicians to say that the FCC should never fund underbuilding. We’ve collectively let the big ISPs frame the discussion in a way that gives politicians and regulators a convenient way to support them. At both the federal and state levels, the broadband discussion has often devolved into talking about why overbuilding is bad – why the government shouldn’t give money to overbuild existing ISPs.

Not allowing overbuilding is a ludicrous argument if the national goal is to get good broadband to everybody. Every broadband network that is constructed is overbuilding somebody, except in those exceptionally rare cases where folks have zero broadband options. If we accept the argument that overbuilding is a bad policy, then it’s easy to justify giving the money to incumbents to do better – something that has failed over and over again.

It’s time that we call out the overbuilding argument for what it is – pure protectionism. This is monopolies flexing political power to keep the status quo, however poorly that is working. The big ISPs would gladly roll from one subsidy program to another forever without investing any of their own capital to upgrade rural networks.

Every time a regulator or politician says that we should not be using federal money to overbuild existing networks, we need to prod pro-broadband politicians to counter that argument by saying we should not be spending any more money on underbuilding. Broadband is infrastructure, just like roads and bridges, and we should be investing any grant money into the most forward-looking technology possible. If the national goal is to make sure that everybody has good broadband, then we should be ready to overbuild anywhere the incumbents have underperformed, be that in rural areas or inner cities. It’s time we shift the conversation away from protectionism to instead prioritizing bringing broadband that will still be good a decade or two after the grant award. Let’s not spend another penny of grant money on underbuilding networks by investing in slow technologies that are inadequate and obsolete even before they are completed.

Categories
The Industry

The Disappointment of 5G

Karl Bode recently wrote an excellent article highlighting the overhyping of wireless technologies. He’s right, and for the last twenty years, we’ve been told that a world-changing wireless technology is coming soon, but none ever materialized. No wireless technology has been a bigger flop than 5G when comparing the hype to the eventual reality.

The hype for 5G was amazingly over-the-top. The wireless carriers and vendors blitzed the country in a coordinated effort to paint 5G as the solution that would bring broadband everywhere. 5G was going to bring us self-driving cars. 5G would enable doctors to perform surgery remotely from across the country. 5G was going to fuel an explosion of smart factories that would bring complex manufacturing back to the U.S. And 5G was going to use millimeter waves to bring us gigabit-speed broadband everywhere, eliminating the need for investing in expensive fiber networks.

The hype fired up the general public, which bought into the 5G promises, but the public wasn’t the real audience of the hype. The cellular carriers did a non-stop blitz on federal officials, getting them to buy into the amazing wireless future. The cellular companies launched gimmick networks in downtowns to deliver gigabit cellular speeds using millimeter-wave spectrum as a way to sell the 5G vision. It’s clear in retrospect that the rhetoric and gimmicks were aimed at getting the FCC to release more mid-range spectrum for cellular usage – and it worked. There was pressure on the FCC to move more quickly with proceedings that were examining spectrum availability. The wireless carriers even talked the FCC into allowing cellular carriers to poach free WiFi spectrum in cities. The hype worked so well on elected officials that there was a serious discussion about the U.S. buying one of the big wireless vendors like Nokia or Ericsson so that the U.S. wouldn’t lose the 5G war with China.

The main problem with all of this hype is that the rhetoric didn’t match the specifications for 5G that were adopted by international standards bodies. The 5G specifications included a few key goals: get cellular speeds over 100 Mbps, allow for more simultaneous users at a given cell site, allow a cellphone to use two different spectrum bands at the same time, and allow a user to connect to more than one cell site if the demand needed it. The primary purpose of the 5G spec was to eliminate cell site congestion in places where there are a lot of people trying to simultaneously use the cellular network. Nothing in the 5G specification is earth-shattering. The specification, as a whole, seemed like the natural evolution of cellular to better accommodate a world where everybody has a cell phone.

I wrote several blogs during the height of the 5G hype where I was puzzled by the claims that 5G would bring about a broadband revolution because I couldn’t see those claims backed up by the technical capabilities of 5G. I also wrote several blogs asking about the business case for 5G because I couldn’t find one. We will likely never build a dense cellular network along the millions of miles of roads to support self-driving cars. The biggest business use of 5G touted by the carriers was to get people to buy subscriptions to use 5G to support the smart devices in our homes – but people will never buy a subscription to do what WiFi can do for free.

There is still not a good business case that can drive the new revenues needed to justify spending a lot of money on 5G. Because of this, most of the 5G specification has not been implemented. How many people are willing to pay extra for the ability to connect a cellphone to two cell towers simultaneously?

Instead of 5G that follows the specifications, we’ve gotten more marketing hype where the cellular carriers have labeled the new spectrum from the FCC as 5G. There is almost none of the 5G specification in this product, and the product labeled as 5G still uses 4G LTE technology. The introduction of the new spectrum has relieved the pressure on overloaded cell sites, and we’ve seen cellular speeds rise significantly. But that faster speed is wasted on most cellular customers who don’t do anything more data-intensive than watch video.

It was interesting to see how the rhetoric died down once the cellular carriers got access to more spectrum. The big winner from the marketing hype has been the handset manufacturers, which have convinced customers that they must have 5G phones – without really telling them why. Cellular customers are generally pleased that speeds have increased since this means stronger coverage indoors and in outdoor dead spots. But surveys have shown that only a minuscule percentage of people are willing to pay more for faster cellular speeds.

The most ludicrous thing about the 5G story is that the industry is now hyping 6G. This new marketing hoax focuses on some of the mid-range spectrum that was originally touted as part of 5G – but the marketers rightfully assume that most customers won’t understand or care about the facts. It seems the industry has embarked on subdividing what was originally considered 5G spectrum into small chunks so that the carriers can roll out subsequent generations of 6G, 7G, and 8G – all of which were supposedly part of the original 5G revolution. I have no doubt that the public will buy into the hype and want 6G phones when they hit the market, but I also know that none of them will see any difference in performance. The formula seems simple – announce a new G every eighteen months and sell a lot of new handsets.

Categories
The Industry

The Slow Death of Operator Services

Starting this month, AT&T customers with digital telephones can no longer dial 411 for directory assistance or dial 0 to reach an operator. We’re seeing the beginning of the end of operator services. AT&T still offers these services to copper landline customers but already cut them for cellular customers last year.

This is a big change for the telephone industry, although most people will barely notice. When the industry was new, all calls were processed by operators who sat at switchboards and made manual connections to connect a calling party to the called party. Over time the switching function got automated so that people could directly dial a telephone number.

But operator services remained a major part of the telephone business. Young folks who have grown up with the Internet can’t remember a time, not that long ago, when everybody dialed 411 from time to time. There was no way to find the phone number of a person or business outside your local area without calling an operator. Every landline customer got a fixed number of free 411 calls each month and paid for any calls beyond that. Calling to ask for a number didn’t always work because about 10% of homes paid extra for an unlisted number, meaning it wasn’t published in the directory or given out by 411 operators. It was also sometimes hard to find a number if you weren’t sure of the spelling of somebody’s name.

I remember how hard it used to be to find businesses in other cities. You’d have to guess how they were listed in the white pages, which was often different than the name you knew them by. A good directory assistance operator was worth their weight in gold. I can remember calling 411 before moving to a new city to connect my utilities. I wouldn’t know the name of the electric company or the numbers to call to connect water or telephone service. A good operator could get me the right numbers on one try.

There was a time when dialing 0 for operator was a big deal. The operator could help you with all sorts of things. The disabled could get the operator to dial a call for them. In the days before cell phones and long-distance deregulation, the phone companies helped millions of people every day place collect calls. If your car broke down, a collect call was how you called home for help.

The telephone operator played another important role in the community. Operators got calls from kids who were home alone, elderly who were lonely, and folks feeling suicidal. Particularly at non-busy hours, operators were known to provide a sympathetic ear. Operators were also the predecessor to 911 – you dialed 0 for fires, medical emergencies, and to report a crime in progress. See this ad from AT&T that promoted calling the operator.

I have a personal anecdote about the operator function. I worked in management at Southwestern Bell, and during a union strike, I was assigned to an operator center to work a switchboard. A few days into the strike, I got a call from a woman at a payphone whose husband was having a heart attack. It turns out that this payphone was located at the intersection of the borders of three municipalities, each with its own emergency services. I called the City of St. Louis first and was told that the address was not in their jurisdiction. I also chose wrong on the next call, and it took three calls to connect to the right ambulance service. As you might imagine, this shook me since I knew time was of the essence in trying to save this man’s life. I never knew what happened to this gentleman because, in the life of an operator, I had to immediately move on to the next call. Operators somehow stayed sane while taking these kinds of calls every day.

Operator services have been diminishing in importance for many years. The introduction of 911 rerouted calls for emergencies directly to the right folks – although to this day, operators still get emergency calls. The Internet became the yellow pages, and you can find most, but not all, businesses online. The real death knell for operator services was the cellphone. I knew operator services would become obsolete after buying a flip phone that remembered every number I had ever called. I can’t recall the last time I called an operator, and I wouldn’t be surprised if it was 20 years ago. But it’s been comforting to know that a friendly operator was there if I ever needed help.

Categories
Regulation - What is it Good For? The Industry

Hearings on Broadband Grants

I’ve always tried to keep politics out of this blog. That hasn’t been too hard since I’ve found that getting better broadband for any community is almost always a non-partisan issue. I’ve worked with City and County Councils and Boards all over the country, and I haven’t seen that the political makeup of the local governing body has much impact on the degree to which they want to find better broadband for citizens.

That’s why I was surprised to see the newly seated House of Representatives immediately announce a set of hearings to look at broadband grants. You know from reading my blog that I think there is a lot of room for improvement in the BEAD grant program – due in large degree to the complicated grant rules established by Congress. I would welcome hearings that examine some of the over-the-top grant rules if the purpose was to create legislation to make it easier to award and spend the BEAD grant funds.

But that doesn’t seem to be the intent of these hearings. The hearings want to look at two issues. The first is to make sure that the grants are only used for connecting unserved locations and not used for ‘overbuilding’. This has been a major talking point for the big cable companies for years – they don’t want to see any grant money used to encroach on areas they think of as their service territories. The whole idea of not using grants for overbuilding is ludicrous – there are very few homes in the country that can’t get service from at least one ISP – so every new broadband network that is constructed overbuilds somebody.

The vast majority of the BEAD grants will be used in rural areas, and the idea that rural funding will be used for ‘overbuilding’ is far-fetched. I don’t know anybody who advocates using grant funding to overbuild rural fiber networks or other existing fast networks. All of the state grant programs I’ve worked with have a challenge process to prevent this, and the BEAD grants have several crosschecks as well. Even if a BEAD grant were awarded in error, I would expect a State Broadband Office to yank the award before letting grant money be used to overbuild rural fiber.

The issue that has the big cable companies up in arms is that the IIJA grant legislation says that once a state has satisfied bringing broadband to unserved and underserved locations, grant funding can be used to improve broadband in inner cities and places that the big ISPs have ignored. There will not likely be a lot of BEAD grant money that goes to this purpose, but there will be some.

It’s hard to understand the reason for a hearing on this issue. The BEAD rules are clearly defined by language crafted and enacted by Congress. The hearings will likely involve grilling officials from the NTIA on the issue – an absurd scenario, because the NTIA has no choice but to follow the law as written by Congress. Any hearings on this issue will likely beat up on officials at the NTIA or FCC but will really amount to Congress investigating its own law.

The other stated purpose of the hearings is to make sure that the grants don’t involve waste, fraud, or abuse. It will be really interesting to see where this leads. The only big historical case of grant waste and abuse I know of is the way the big telcos took CAF II funding and often made no upgrades. I don’t picture these hearings dredging up past abuses by the big ISPs, so I’m having a hard time imagining where else this line of inquiry might go.

I fear that the biggest repercussion of this kind of hearing is that it’s going to make already-cautious grant officials even more cautious. The folks at the NTIA and State Broadband offices are going to worry that everything they do will be under a microscope from Congress – and they are going to get even more careful not to make any bad mistakes in awarding grants. Nobody wants to be yanked in front of Congress in a year and grilled about a specific grant award. And perhaps that’s the real purpose of these hearings – to intimidate officials into funneling more grant funding to the safe choice of the big ISPs.

What puzzles me the most is why anybody would hold broadband hearings of this sort. Bringing better broadband to communities is immensely popular. In the many surveys we’ve administered on the issue, public support for bringing better broadband has always been above 90%. This is true even in communities where a cable company already offers fast broadband – folks want competition. It’s hard to picture any headlines coming out of these hearings that would benefit the politicians holding them.

These hearings only make sense as a way to appease the large ISPs, which contribute heavily to politicians. It’s hard to imagine that the hearings will change anything. Congress can change the BEAD grant rules any time this year, but that would take bipartisan cooperation – something that seems to have disappeared from Washington DC. But the hearings will allow for the airing of the big ISPs’ grievances, and I guess that is something.

Categories
The Industry

Are We Facing the Splinternet?

One of the consequences of the war between Russia and Ukraine is that Russia has largely stopped participating in many large worldwide web applications. Russia has blocked Facebook and Twitter, and companies like Apple, Microsoft, TikTok, and Netflix have withdrawn from Russia.

The European Union is in the process of trying to block Russian-generated content, such as the state-owned news outlets RT (formerly Russia Today) and Sputnik. There are discussions of going so far as to block all Russian people and businesses from EU search engines.

Russia has responded by declaring Meta, the owner of Facebook, Instagram, and WhatsApp, to be an extremist organization. Russia has also withdrawn its participation in international policy-setting organizations such as the Council of Europe. The EU countered by suspending Russia from the European Broadcasting Union.

There is a new phrase for what is happening with Russia – the splinternet. In a full splinternet scenario, Russia could end up totally separated from the rest of the world as far as participating in the Internet.

There are already countries that don’t fully participate in the worldwide web. North Korea has blocked participation in much of the web. China and Iran block a lot of western content. However, these countries still participate in supporting the general structure and protocols of the Internet, and not all western applications are blocked.

The folks at the worldwide governing bodies that oversee Internet protocols are concerned that Russia, and perhaps China and Iran, could decide to fully withdraw from the web and develop their own protocols for use inside their borders. If the countries that have peeled off from the rest of the web don’t maintain the same protocols, then communication with the rest of the world eventually becomes difficult or impossible.

This would have a drastic impact on the web as an international means of communication. There are huge amounts of digital commerce between these countries and the rest of the world over and above social apps. Commerce between these countries and the world depends on email, messaging apps, and collaboration platforms. People and businesses in these countries participate in digital meetings in the same manner as the rest of the world. The economic impacts of large countries effectively withdrawing from worldwide e-commerce would be immense.

This is something that we’ve seen coming for many years. For example, Google and Facebook located servers in Russia so that content generated in Russia would stay in the country and not be stored in servers and data centers outside the country.

A Russian withdrawal from the Internet would be far more drastic than Chinese censoring of web content – it would cut communications with the outside world to zero. It's hard to even imagine the impact this would have on Russian businesses, let alone cutting the ties between the Russian people and everybody else. This would create a digital Berlin Wall.

It doesn’t seem likely that having Russia or China withdraw from the web would have much impact on how the rest of the world uses the web. It would mean that citizens in those countries would not benefit from the newest innovations on the web. But most countries already today understand how important the web is for commerce, and for most countries, that’s a good enough reason not to tinker with something that works.

From my perspective, the whole world suffers if folks stop participating in worldwide communications. The web is the great equalizer where folks with similar interests from around the world get to know each other. But we live in a world full of politics and controversy, so it's probably inevitable that this will eventually spill over to the Internet, as it does to many other parts of the world economy.

Categories
The Industry

What Happened to Verizon Fiber-to-the-Curb?

Back in 2018, Verizon got a lot of press for the release of a fiber-to-the-curb (FTTC) technology it called Verizon Home. The first big test market was Sacramento. The company built fiber along residential streets and used wireless loops to reach homes. At the time, Verizon touted speeds of 300 Mbps but said that it was shooting for gigabit speeds using millimeter-wave spectrum. Verizon tried to make this a self-installed product, and customers got instructions on how to place the receiver in different windows facing the street to find the best reception and speeds.

There were quotes from the time that Verizon intended to build fiber to pass 25 million homes by 2025 with the technology. But then the product went quiet. In 2020, the Verizon Home product reappeared, but it is a totally different product that uses cellular spectrum from cell towers to bring broadband. This is the product that the industry is categorizing as FWA (fixed wireless access). The company no longer quotes a target broadband speed and instead says (https://www.verizon.com/5g/home/), “Verizon 5G Home is reliable and fast to power your whole home with lots of devices connected. So all of your TVs, tablets, phones, gaming consoles and more run on the ultra-fast and reliable Verizon network.” In looking through some Ookla speed tests for the FWA product, it looks like download speeds are in the 100 – 150 Mbps range – but like any cellular product, the speed varies by household according to the distance between a customer and the transmitter and other local conditions.

The new cellular-based product has gone gangbusters, and Verizon had over one million customers on the product by the end of the third quarter of 2022, having added 342,000 new customers in that quarter. The relaunch of the product was confusing because the company took the unusual step of using the same product name and website when it switched to the wireless product. It even kept the same prices.

But the two products are night-and-day different. Verizon's original plan was to pass millions of homes with a broadband product that was fast enough to be a serious competitor to cable broadband. Even if the product never quite achieved gigabit speeds, it was going to be fast enough to be a lower-priced competitor to cable companies.

While the new Verizon Home product is selling quickly, the product is not close in capabilities to the FTTC product. Cellular bandwidth is never going to be as reliable as a landline technology or one where fiber is as close as the curb. Verizon (and T-Mobile) have both made it clear that FWA customers will take second priority for bandwidth availability behind cell phone customers. I don't know that these companies could do it any other way – they can't risk making a hundred million cellular customers unhappy in order to serve a much smaller number of FWA customers.

I think everybody understands the way that cellular broadband capabilities change during the day. We all see it as the bars of 4G or 5G at our homes bounce up and down based on a variety of factors such as weather, temperature, and the general network usage in the immediate neighborhood. The most interesting thing about being a broadband customer on a cellular network is that the experience is unique to every customer. The reception will vary according to the distance from the cell tower or small cell and the amount of clutter and interference in a given neighborhood from foliage and other buildings.
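
To get a feel for just how much distance from the transmitter matters, here is a small sketch using the standard free-space path loss formula. This is an idealized lower bound – real neighborhoods add foliage, building, and weather losses on top of it – and the 2,500 MHz mid-band frequency is just an illustrative value, not a claim about any carrier's actual deployment.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44.
    Idealized model -- real-world clutter only makes the loss worse."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Illustrative mid-band frequency at a few distances from the tower
for d in (0.5, 1.0, 2.0, 4.0):
    print(f"{d:>4} km: {fspl_db(d, 2500):.1f} dB")
```

Every doubling of distance adds about 6 dB of loss, which is part of why two FWA households on the same street can see very different speeds.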

I expect that large bandwidth users will get frustrated with the variability of the signal and eventually go back to a landline technology. The FWA product is mostly aimed at bringing broadband to rural customers who have no better broadband alternative or to folks in towns for whom saving money is more important than performance. There are a lot of such people who have stuck with DSL for years rather than upgrading to the more expensive cable broadband, and these are the likely targets for FWA. In fact, FWA might finally let the telcos turn off DSL networks.

Verizon says it’s still on track with what it calls the One Fiber initiative which is aimed at building Verizon-owned fiber to cell towers and small cell sites. This backbone was likely the planned starting point for neighborhood fiber, but now this is mostly a cost-cutting step to stop paying fiber leases.

Categories
The Industry

Measuring Sustainability

I’ve seen folks around the country suggesting that State Broadband offices ought to put a priority on sustainability when selecting winners of broadband grant funding. It’s a concept that has instant appeal, but I immediately asked myself what it means. How do you measure sustainability in a way that can be used to score grant requests?

It’s likely that most folks would agree on the definition of sustainability. If we are going to use government grant money to build a broadband network, we want that network to be providing broadband service for as long as possible. We expect sustainability for other kinds of infrastructure, such as roads, bridges, and buildings, so why shouldn’t we expect the same thing from a grant-funded broadband network?

But what does sustainable mean for a broadband network? The first test of sustainability is the expected life of the assets being constructed.

  • The longest-lived asset that is being constructed with grants is conduit. There is no reason why a well-maintained conduit system shouldn’t still be fully functional a century from now.
  • There are big debates about the economic life of fiber. If you go by the economic lives allowed by IRS depreciation, then the expected life of fiber is 25 or 30 years. We know that's ridiculous because there is plenty of forty-year-old fiber still chugging along in the field. We also know that fiber constructed today is far better than fiber built forty years ago. The manufacturers have learned to make higher-quality glass with fewer impurities. But the big change in the industry is that the folks that install fiber have learned techniques that minimize damage during construction. Poor handling of fiber manifests twenty years later as micro-fissures – and that means cloudy glass. Nobody will give an expected life for well-maintained fiber, but scientists at some of the manufacturers have privately told me that they think it's at least 75 years – we'll just have to wait to find out.
  • The assets that cause the most concern for sustainability are electronics – be that fiber electronics or fixed wireless electronics. All electronics must periodically be replaced. I’ve seen some fiber electronics last fifteen years – but that seems to be near the upper end of economic life. The general industry wisdom is that fixed wireless systems have to be replaced every 7 to 10 years.
  • We largely eliminated some ISPs from grant eligibility due to poor sustainability. For example, low-orbit satellites like Starlink are designed to only last 5 to 7 years and then fall from orbit. It's hard to make an argument that grant funding buys great value with this kind of asset.
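
One way to make the asset-life comparison concrete is to count how many times each asset would have to be built or rebuilt within a grant office's planning horizon. The per-passing costs and lifetimes below are purely illustrative assumptions I've picked for the sketch, not figures from any actual grant program or vendor.

```python
# Sketch: total builds needed for each asset class within a planning
# horizon. All costs and lifetimes are illustrative assumptions.

assets = {
    # name: (hypothetical cost per passing in $, assumed life in years)
    "conduit":                (3000, 100),
    "fiber":                  (2000, 50),
    "fiber electronics":      (600, 12),
    "fixed wireless radios":  (500, 8),
    "LEO satellite capacity": (700, 6),
}

HORIZON = 30  # years a grant office might expect the network to keep serving

for name, (cost, life) in assets.items():
    builds = -(-HORIZON // life)  # ceiling division: initial build + rebuilds
    print(f"{name:22s} life {life:>3} yrs | builds in {HORIZON} yrs: {builds} | "
          f"lifetime cost per passing: ${cost * builds}")
```

Under these assumptions, conduit and fiber are built once within the horizon, while the short-lived technologies must be bought three to five times over – which is the sustainability gap the bullet points above describe.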

This all means that the sustainability of electronics must be a concern for all technologies. Any ISP that wins grant funding will likely be replacing some electronics within a decade. One test of any ISP on sustainability is the financial ability and willingness to replace those electronics. That’s hard to judge.

There is another measure of sustainability that is even harder to measure. A big factor in sustainability is the operating philosophy of the ISP that owns the networks. We know there is a big range of what I would call corporate responsibility between ISPs.

If we go strictly by the past, then the ISPs that have the most likely chance of operating a sustainable network for the long term are cooperatives or other ISPs that expect to still be serving the same customers fifty years from now. But not all cooperatives are the same. We see this when looking at how some electric cooperatives have allowed their poles to deteriorate badly over time.

Next in line in trustworthiness might be small telcos that have been around for as long as a hundred years. But over the last few decades, a large percentage of these companies sold to larger ISPs – so the question for a grant reviewer is whether the small telco that gets a broadband grant today will still own the network a decade or two from now.

A big question mark for many folks is the large ISPs. We saw the big telephone companies let copper and DSL networks rot in place by basically ceasing all maintenance years ago. This was clearly done as a cost-saving measure. These companies will argue that there was no sense in continuing to support a dying technology, but we know that is nonsense. The copper networks in places like Germany were well-maintained and still offer DSL today with speeds in many places over 100 Mbps. The big telcos decided to unilaterally cut costs at the expense of customers. Should a grant office award funding to a company that has already failed the public once before? I'm guessing that grant offices will make awards to the big companies by reasoning that fiber networks will last a long time, so maintenance doesn't matter. But I would argue just the opposite. I think a fiber network can deteriorate even faster without good maintenance than a copper network because the technology is less forgiving. There are still 20-year-old DSL cards chugging away, something that likely won't happen with fiber. If an ISP ignores and doesn't maintain fiber network electronics, a fiber network could quickly turn into a brick.

I’ve not said anything above that is not common knowledge. But I am at a loss of how to turn what we’ve learned from the past behavior of ISPs in a way to consider sustainability when awarding grants. If sustainability was the most important factor in awarding a grant, I personally would give all of the money to cooperatives and none to big ISPs. And I wouldn’t fund technologies that must be largely replaced within a decade. This is probably why nobody is asking me to award grants!

Categories
The Industry

Telephony in 1927

This may seem like an odd topic to write about. The topic comes from browsing through the 2.8 million documents from 1927 that just entered the public domain as the original copyrights expired. The big national headlines focused on Winnie the Pooh, Disney’s Steamboat Willie, and Sherlock Holmes entering the public domain. But it’s fascinating to browse through the wide range of documents that provide a snapshot of the science, technology, and culture in 1927.

Today’s blog comes from a 12-page pamphlet on The Telephone: Its History and Methods of Operation, published in 1927 by the Illinois Committee on Public Utility Information. The document is a great primer on the state of the telephone industry that year. The most interesting impression for me is seeing how pervasive the telephone had become only 51 years after the first lab test of the technology.

Some of the more interesting facts from the pamphlet:

  • There were 18 million telephones in the U.S. that year, including 1.6 million in Illinois. That’s one telephone for every 15 people.
  • The U.S. had 61% of all working telephones in the world at the time. Europe had 27%, with only 12% of telephones for the rest of the world. Illinois had 1,189 central offices.
  • We talk about industry consolidation today, but in 1927 there were 39,000 telephone companies in the country, most serving small towns or neighborhoods.
  • There were 380,000 people employed by telephone companies in the U.S., including 15,000 girls employed as private branch exchange operators just in Illinois. Illinois had over 30,000 telephone-related manufacturing jobs.
  • A state-of-the-art handset is shown as the picture at the top of this blog.
  • Telephone service cost an average family less than 1% of household income, and it was the affordability that led to the rapid acceptance and deployment of the technology.

The pamphlet gushes about the deployment of telephone technology. “Yet behind this little act of calling by telephone and holding converse with some distant person, there is the story of a marvel so great as to almost put to shame the wonder of Aladdin’s Lamp. . . Beginning 5 years ago, the record and the development of the telephone has been so wonderful, so vital in the affairs of man, that it has actually changed the course of human history and has played no small part in the civilization of mankind.”

There were three types of central offices in 1927. Small exchanges used the magneto system that had a battery at the base of each telephone that was charged by turning a crank on the telephone. Larger telephone exchanges used a common battery system that supplied power to telephone sets over copper wires. This system alerted an operator that somebody wanted to place a call by lighting a small lamp on a switchboard. Operators manually placed calls to other exchanges to arrange the connection of a call. Large city exchanges were using the new switchboard technology that allowed an operator to complete calls by connecting a jack to the appropriate trunk line, eliminating most of the time-consuming labor needed to set up a call.

There is a fascinating section describing the network used to place transatlantic calls. A U.S. originating call used a voice path to Europe routed to Rocky Point, Long Island, where the calls were transferred to a powerful radio system that transmitted the call to Cupar, Scotland. The return voice path took a similar path from Rugby, England to Houlton, Maine.

Within seven years of this pamphlet, Congress passed the Communications Act of 1934, which put some regulatory restraints on the large Bell Telephone monopoly that was gobbling up telephone systems across the country. Looking at telephony from a 1927 perspective shows us a time when telephony was still new and was a wonderment to most people.

Here is a look at all of the books and periodicals from 1927 that are now in the public domain. Here is the Telephone Pamphlet. Pay particular attention to the last section that instructs people how to talk on the telephone.
